this post was submitted on 08 Jun 2024
361 points (97.9% liked)

Technology

[–] FrostyCaveman@lemm.ee 57 points 5 months ago (3 children)

I think Asimov had some thoughts on this subject

Wild that we’re at this point now

[–] leftzero@lemmynsfw.com 42 points 5 months ago (1 children)

Asimov didn't design the three laws to make robots safe.

He designed them to make robots break in ways that'd make Powell and Donovan's lives miserable in particularly hilarious (for the reader, not the victims) ways.

(They weren't even designed for actual safety in-world; they were designed for the appearance of safety, to get people to buy robots despite the Frankenstein complex.)

[–] FaceDeer@fedia.io 30 points 5 months ago (1 children)

I wish more people realized that science fiction authors aren't even trying to make good predictions about the future, even if that were something they could do. They're trying to tell stories that people will enjoy reading, and that will therefore sell well. Stories where nothing goes particularly wrong tend not to have a compelling plot, so they write about technology going awry so that there'll be something to write about. They put in scary stuff because people find reading about scary stuff fun.

There might actually be nothing bad about the Torment Nexus, and the classic sci-fi novel "Don't Create The Torment Nexus" was nonsense. We shouldn't be making policy decisions based off of that.

[–] afraid_of_zombies@lemmy.world 2 points 5 months ago

Philip K. Dick wrote a short story from a dog's point of view about living in a home and thinking about the trash can. According to the dog, the humans were doing exactly what they were supposed to do: burying excess food for when they were hungry later. The clever humans even had a metal box for it. And twice a week the dog would be furious at the mean men who took the box of yummy food away. The dog couldn't understand why the humans, who were normally so clever, didn't stop the mean men from taking the food.

He mentioned the story a great deal, not because he thought it was well written, but because he was of the opinion that he was the dog: he saw visions of possible futures, understood them from his own point of view, and wrote them down.

[–] Voroxpete@sh.itjust.works 13 points 5 months ago (1 children)

Asimov's stories were mostly about how it would be a terrible idea to put kill switches on AI, because he assumed that perfectly rational machines would be better, more moral decision makers than human beings.

[–] Nomecks@lemmy.ca 18 points 5 months ago (3 children)

This guy didn't read the robot series.

[–] grrgyle@slrpnk.net 13 points 5 months ago (2 children)

I mean I can see it both ways.

It kind of depends which of the robot stories you focus on. If you keep reading up to the zeroth-law stuff, it starts portraying certain androids as downright messianic, but a lot of his other (especially earlier) stories are about robots constantly suffering alignment problems (basically what amount to philosophical computer bugs) that cause them to commit crimes.

[–] Nomecks@lemmy.ca 12 points 5 months ago* (last edited 5 months ago)

The point of the first three books was that arbitrary rules like the Three Laws of Robotics were pointless. There was a ton of grey area not covered by the seemingly ironclad rules, and robots could either logically choose to break them or be manipulated into doing so. Robots, in all of the books, operate in a purely amoral manner.

[–] leftzero@lemmynsfw.com 3 points 5 months ago (2 children)

downright messianic

Yeah, tell that to the rest of the intelligent life in the galaxy...

Oh, wait, you can't, because by the time humans got there these downright messianic robots had already murdered everything and hidden the evidence...

[–] blanketswithsmallpox@lemmy.world 2 points 5 months ago (1 children)

Praise be to R. Daneel Olivaw!

[–] grrgyle@slrpnk.net 2 points 5 months ago

Praise be! What a storied character. I also really like Asimov's fake names; they sound good in the ear.

[–] grrgyle@slrpnk.net 1 points 5 months ago* (last edited 5 months ago)

Oh man I forgot that

That is pretty eschatological, in a wrathful, human-centric way, so my point unintentionally stands

[–] Voroxpete@sh.itjust.works 6 points 5 months ago

This guy apparently stopped reading the robot series before they got to The Evitable Conflict.

[–] afraid_of_zombies@lemmy.world 6 points 5 months ago

All you people are talking Asimov, and I'm thinking of the Sprawl trilogy.

In that series you could build an AGI that was smarter than any human, but it took insane amounts of money, and no one trusted them. By law and custom, they all had an EMP gun pointed at their hard drives.

It's a dumb idea. It wouldn't work. And in the novels it didn't work.

Say I build a nuclear plant. A nuclear plant is potentially very dangerous, and it is definitely very expensive. I don't build it just to have it; I build it to make money. If some wild-haired hippy breaks into my office and demands the emergency shutdown switch, I'm going to kick him out. The only way the plant is getting shut off is if there's a situation where I, the owner, agree I need to stop making money for a little while. And if I do install an emergency shut-off switch, it's not going to blow up the plant; it's just going to stop it from running.

Well, all of this applies to these AI companies too. Shutting them down is going to be a political decision or a business decision, not the act of some self-appointed group or person. And if it's going to be that way, you don't need an EMP gun; all you need to do is cut the power, figure out what went wrong, and restore power.

It's such a dumb idea I am pretty sure the author put it in because he was trying to point out how superstitious people were about these things.