this post was submitted on 26 Jan 2024
430 points (83.1% liked)
Technology
Can a tool create? It generated.
Anyway, in a case like this, is creation even a factor in liability?
In my opinion, whoever first derives monetary value from the piece should be liable.
NYTimes?
"I didn't kill him, officer, my murder robot did. Oh, sure, I built it and programmed it to stab Jenkins to death for an hour. Oh, yes, I charged it, set it up in his house, and made sure all the programming was set. Ah, but your honor, I didn't press the on switch! Jenkins did, after I put a note on it that said 'not an illegal murderbot' next to the power button. So really, the murderbot killed him, and if you like maybe even Jenkins did it! But me? No, sir, I'm innocent!"
How is this example relevant? You created the programming.
And someone created the AI programming too.
Then someone trained that AI.
It didn't just come out of the aether, there's a manual on how to do it.
Yes, but in ~~your~~ the previous example ~~you~~ the person specifically created a machine to stab a specific person.
The example would be apt if you created a program that generates programming for industrial machines to insert things into stuff, and then you uploaded a generated program without checking the code and it stabbed some random guy.
That was not my example. The murder machine was someone else.
Sorry, my bad. Fixed
The liability of industrial machines is actually quite apt.
If you design a machine that kills someone during reasonable use, you are liable.
Aircraft engineers carry a 25-year liability on their work; a mistake they make could kill hundreds.
There is always a human responsible for the actions of a machine. Even unintended results have liability.
If you upload a program to a machine and someone dies as a result you're in hot water.
Moving away from life and death, unintended copyright infringement by a machine hasn't been tested in court yet, but it's likely that at least some of the builders of that machine will be held responsible.
AI "self-driving" cars are getting away with it by only offering driver assistance, keeping the driver responsible. That's only possible because you need a license to drive a car in the first place.
AI images like this are the equivalent of a fully self-driving car: you set the destination, and it drives you there. Liability attaches to the process of driving, or of creating. When the machine performs that process, its designers become liable.
Let's call it assisted image creation, then.
AI owners would love to do that.
Copyright owners would not.
Hence the legal battles.
So, by that logic: I prompted you with a question. Did I create your comment?
I used you as a tool to generate language. If it was a Pulitzer winning response could I gain the plaudits and profit, or should you?
If it then turned out it was plagiarism by yourself, should I get the credit for that?
Am I liable for what you say when I have had no input into the generation of your personality and thoughts?
The creation of that image required building a machine learning model.
It required training a machine learning model.
It required prompting that machine learning model.
All 3 are required steps to produce that image and all part of its creation.
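The three steps above (build, train, prompt) can be sketched in miniature. This is a toy character-level Markov chain, not a real image model; every name here is illustrative, and the point is only that generation is inseparable from the build and training steps that precede it.

```python
import random

class ToyModel:
    def __init__(self):
        # Step 1: build the model (here, an empty transition table).
        self.table = {}

    def train(self, corpus):
        # Step 2: train on data -- the step copyright holders focus on,
        # since the training corpus may contain protected work.
        for a, b in zip(corpus, corpus[1:]):
            self.table.setdefault(a, []).append(b)

    def generate(self, prompt, n=20):
        # Step 3: prompt the trained model; output depends entirely
        # on what the training data contained.
        random.seed(0)  # deterministic for the example
        out = list(prompt)
        for _ in range(n):
            followers = self.table.get(out[-1])
            if not followers:
                break
            out.append(random.choice(followers))
        return "".join(out)

model = ToyModel()
model.train("the cat sat on the mat")
print(model.generate("th"))
```

All three steps are required before any output exists, which is why each is a plausible locus of liability.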
The part copyright holders will focus on is the training.
Human beings are held liable if they see and then copy an image for monetary gain.
An AI has done exactly this.
It could be argued that the most responsible and controlled element of the process, and therefore the most liable, is the input of training data.
Either the AI model is allowed to absorb the world and create work, and is held liable under the same rules as a human artist; in that case the AI itself is liable.
Or the AI model is assigned no responsibility itself but should never have been given copyrighted work without a license to reproduce it.
Either way the owners have a large chunk of liability.
If I ask a human artist to produce a picture of Donald Duck, they legally can't. If they broke the law anyway, Disney could take them to court and win.
The same would be true of any business.
The same is true of an AI as either its own entity, or the property of a business.
I'm not a non-sentient construct that creates stuff.
...and when copyright law was written, there were no non-sentient things generating stuff.
There is literally no way to prove whether you're sentient.
Descartes found that limitation.
The only definition in law is whether you have competency to be responsible. The law assumes you do as an adult unless it's proven you don't.
Given the limits of AI, the court is going to treat it as a machine. And a machine has operators, designers, and owners. Those are humans responsible for that machine.
It's perfectly legitimate to sue a company for using a copyright breaking machine.
You almost seem like you get the problem, but then you flounder away.
The law hasn't caught up with generative programs. AI will not be considered sentient, and they will have this same discussion in court.
It doesn't matter whether AI is sentient or not. It has a designer, trainer, and owner.
Once you prove the actions taken by the AI, even as just a machine, breach copyright liability is easily assigned.
Agree to disagree, and time will tell, but you must see there are factors that haven't existed before in the history of humanity.
Who knows how the laws will change because of AI. But as the law currently stands it's just a matter of proving it to a court. That's the main barrier.
This is strong evidence an AI is breaking the law.
That Joker could have been somebody's avatar picture with a matching username.
AI can't understand copyright, and useful AI can't be built by shielding it from every piece of material somebody thinks is their IP. It needs to learn to understand humans, and it needs human material to do so. A shitload of it. Who's up for some manual filtering?
If we go by NYTimes standards we better mothball the entire AI endeavor.
That's why it's a massive legal fight.
They'll delay a ruling as long as possible.
They're definitely developing a new model on vetted public domain data as we speak. They just need to delay legal action long enough to get that new model to launch.
This is the same thing YouTube did. Delay all copyright claims in court, blaming users, then put their copyright claim system in place that massively advantages IP owners.