this post was submitted on 26 Jan 2024
430 points (83.1% liked)
Technology
you are viewing a single comment's thread
"I didn't kill him, officer, my murder robot did. Oh, sure, I built it and programmed it to stab Jenkins to death for an hour. Oh, yes, I charged it, set it up in his house, and made sure all the programming was set. Ah, but your honor, I didn't press the on switch! Jenkins did, after I put a note on it that said 'not an illegal murderbot' next to the power button. So really, the murderbot killed him, and if you like, maybe even Jenkins did it! But me? No, sir, I'm innocent!"
How is this example relevant? You created the programming.
And someone created the AI programming too.
Then someone trained that AI.
It didn't just come out of the aether, there's a manual on how to do it.
Yes, but in ~~your~~ the previous example ~~you~~ a person specifically created a machine to stab a specific person.
The example would be apt if you created a program that generates code for industrial machines to insert things into stuff, then uploaded a generated program without checking the code, and it stabbed some random guy.
That was not my example. The murder machine was someone else.
Sorry, my bad. Fixed
The liability of industrial machines is actually quite apt.
If you design a machine that kills someone during reasonable use, you are liable.
Aircraft engineers carry a 25-year liability on their work. A mistake they make could kill hundreds.
There is always a human responsible for the actions of a machine. Even unintended results have liability.
If you upload a program to a machine and someone dies as a result you're in hot water.
Moving away from life and death, unintended copyright infringement by a machine hasn't been tested. But it's likely it will be ruled that at least some of the builders of that machine are responsible.
AI "self-driving" cars are getting away with it by only offering driver assistance, keeping the driver responsible. But that's possible because you need a license to drive a car in the first place.
AI images like this are the equivalent of a fully self-driving car. You set the destination, and it drives you there. The liability falls on the process of driving, or the process of creating; when the machine does that, its designers are then liable.
Let's call it assisted image creation then.
AI owners would love to do that.
Copyright owners would not.
Hence the legal battles.