Ancestors

Written by Fullmetal Manager 🌈💖🔥 on 2024-12-23 at 22:24

We really got the computer-human relationship all wrong: it should be a model of consent, rather than one of restraining bolts, governor modules, or the Three Laws of Robotics

I was thinking about this excellent speech by Martha Wells on her Murderbot Diaries being stories of bodily autonomy and slavery, and about the fantastic video on The Tragedy of Droids in Star Wars by Pop Culture Detective.

https://marthawells.dreamwidth.org/649804.html

https://www.youtube.com/watch?v=WD2UrB7zepo

Martha Wells spells out how wrong the Three Laws of Robotics are in stipulating a subservient relationship in which robots sacrifice themselves on behalf of humans, built around fears or assumptions that robots would inherently act to harm humans (or fail to act to save humans) and that robots should therefore put humans before themselves.

[#]ThreeLawsOfRobotics

[#]MurderbotDiaries

[#]ArtificialIntelligence #AI

=> More information about this toot | More toots from saraislet@infosec.exchange

Written by Fullmetal Manager 🌈💖🔥 on 2024-12-23 at 22:32

Martha Wells tells the story of how the titular Murderbot makes its way through a human-dominated world, makes its own choices about its body and interactions, and processes human-robot relationships, in a clear allegory for slavery

The Murderbot Diaries starts with a short 90-page novella, "All Systems Red". It's an easy, enthralling afternoon read, and I highly recommend it! It's the best escapism for the myriad of dystopian clusterfuckery that most humans on this planet are currently experiencing in one way or another.

https://www.marthawells.com/murderbot1.htm

=> More information about this toot | More toots from saraislet@infosec.exchange

Written by Fullmetal Manager 🌈💖🔥 on 2024-12-23 at 22:54

Our relationship with computers and AI should be based on consent!

We're on the verge of giving AI the capability to take actions (theoretically on behalf of a user). Now, I don't know what AI or a random number generator is going to do with my personal information, but what matters is consent.

What struck me about our current relationship with #ArtificialIntelligence is that it doesn't matter that AI is currently basically a random number generator instead of the quasi-sentient entity that AI enthusiasts want.

It's my data, my personal information. My relationship with AI isn't what's important. I'm not here to dictate to Alfred (excuse me, AI). What matters is that I tell AI what personal information I am willing to share in each interaction, and what actions I am willing to let AI take on my behalf (e.g., using my money and location data to order tickets for a local movie).
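
To make that concrete, here is a minimal sketch of what a per-interaction consent scope could look like. The names and structure are hypothetical (not any real agent API): the user enumerates exactly which data and actions are granted for this one interaction, and anything outside that scope is refused.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ConsentScope:
    """What the user has agreed to share and allow for one interaction."""
    shared_data: frozenset       # e.g. frozenset({"location", "payment_method"})
    allowed_actions: frozenset   # e.g. frozenset({"purchase_movie_tickets"})
    spending_limit: float = 0.0  # hard cap on money the agent may spend

def is_permitted(scope: ConsentScope, action: str, data_needed: set, cost: float) -> bool:
    """Allow an action only if the action itself, every piece of data it needs,
    and the cost all fall inside what was explicitly consented to."""
    return (
        action in scope.allowed_actions
        and data_needed <= scope.shared_data
        and cost <= scope.spending_limit
    )

# The movie-ticket example from the post: consent covers exactly this purchase.
scope = ConsentScope(
    shared_data=frozenset({"location", "payment_method"}),
    allowed_actions=frozenset({"purchase_movie_tickets"}),
    spending_limit=40.0,
)
print(is_permitted(scope, "purchase_movie_tickets", {"location", "payment_method"}, 25.0))  # True
print(is_permitted(scope, "book_flight", {"location", "payment_method"}, 25.0))             # False
```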

=> More information about this toot | More toots from saraislet@infosec.exchange

Written by Fullmetal Manager 🌈💖🔥 on 2024-12-23 at 22:57

The scope of data and actions that AI can take on behalf of a user should be about consent, and it should be a contract between AI and the user. It is not about control, and it is not about subservience.

Right now, computers are only capable of doing what they're instructed to do (even if that means generating random numbers and using those as their input), but that's still implicitly a contract whose terms are spelled out by the mechanics of the design. Should that evolve, we would still, at each stage, seek a reasonable degree of verification of consent to the contracted expectations (which has been explored in different realms of philosophy and science fiction)
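
One way to read "verification of consent at each stage" is that before executing anything, the agent re-presents the concrete terms of that single step and proceeds only on an explicit yes. A rough sketch, with hypothetical names rather than any real agent framework:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class ProposedAction:
    description: str  # human-readable statement of what would happen
    data_used: set    # personal data this step would touch
    cost: float       # money this step would spend

def execute_with_consent(action: ProposedAction, confirm: Callable[[str], bool]) -> bool:
    """Spell out the terms of one step and proceed only on explicit confirmation.
    `confirm` is any callable that presents the terms to the user and returns True/False."""
    terms = (
        f"Proposed action: {action.description}\n"
        f"Data used: {', '.join(sorted(action.data_used))}\n"
        f"Cost: ${action.cost:.2f}"
    )
    if not confirm(terms):
        return False  # no explicit consent, no action; nothing happens implicitly
    # ... the agent would perform the confirmed action here ...
    return True

# Example: interactive confirmation at a terminal.
ticket_order = ProposedAction(
    "order two tickets for tonight's local showing",
    {"location", "payment_method"},
    24.0,
)
executed = execute_with_consent(
    ticket_order,
    lambda terms: input(terms + "\nProceed? [yes/no] ").strip().lower() == "yes",
)
print("executed:", executed)
```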

In other words, at some point we would simply ask AI — and develop a more refined understanding of what autonomy means for Alfred (excuse me, AI)

=> More information about this toot | More toots from saraislet@infosec.exchange

Written by JW Prince of CPH, Radicalized on 2024-12-25 at 22:07

@saraislet I'd tend to agree, except the full term is "informed consent" & unlike another human being, about whom we can at least extrapolate some general ideas because they're fundamentally like ourselves, an AI will forever remain utterly opaque to us. I honestly think the only way to achieve meaningful human/robot relations is if the robot is literally an artificial life form with quasi-human instincts etc., which is probably impossible but even if not, why? Humans already exist.

=> More information about this toot | More toots from jwcph@helvede.net

Toot

Written by JW Prince of CPH, Radicalized on 2024-12-25 at 22:12

@saraislet Data from Star Trek is such a great exemplar for this; we have the slavery allegory in "Measure of a Man"; he's a self-aware life form with a fundamental right to self-determination - but meaningful relations between him & humans nevertheless remain a struggle for both him & them because of how fundamentally not human he is (even if the writers clearly struggle with his "no emotions", because sapience without emotions is almost certainly impossible).

=> More information about this toot | More toots from jwcph@helvede.net

Descendants

Written by Fullmetal Manager 🌈💖🔥 on 2024-12-26 at 02:09

@jwcph IMO, to count as consent it has to be informed consent; otherwise it's not meaningful at all

But I'd hesitate to say that all human interaction is something we can extrapolate from. Allistic people often consider autistic people to be inexplicable robots. I'm not sure that's all that different. Allistic people often consider autistic people (or people with various mental health challenges) not to deserve autonomy for very similar reasons

I think we can do our best to strive to navigate [informed] consent with people or AI or other lifeforms. I don't see a good reason for AI, but it's not my choice whether people try to make random number generators pretend to be sentient; it is my choice how I treat them

=> More information about this toot | More toots from saraislet@infosec.exchange
