Training is theft imo.
Then it appears we have nothing to discuss.
Christ this is a boring fucking debate. One side thinks companies like OpenAI are obviously stealing and feels no need to justify their position, instead painting anyone who disagrees as pro-theft.
But is it reasonable to have different standards for someone creating a picture with a paintbrush as opposed to someone creating the same picture with a machine learning model?
Copyright applies to reproduction of a work, so if they build any machine that is capable of doing that (they did), then they are liable for it.
That is for sure not the case. The modern world is bursting with machines capable of reproducing copyrighted works, and their manufacturers are not liable for copyright violations carried out by users of those machines. You're using at least one of those machines to read this comment. This stuff was decided around the time VCRs were invented.
What if I do it myself? Do I still need to get permission? And if so, why should I?
I don't believe the legality of doing something should depend on who's doing it.
Of course it sounds bad when you use the word "steal", but I'm far from convinced that training is theft, and using inflammatory language just makes me less inclined to listen to what you have to say.
Do you believe it's reasonable, in general, to develop technology that has the potential to replace some human labor?
Do you believe compensating copyright holders would benefit the individuals whose livelihood is at risk?
the true benefits of AI are overwhelmingly for corporations and investors
"True" is doing a lot of work here, I think. From my perspective the main beneficiaries of technology like LLMs and stable diffusion are people trying to do their work more efficiently, people paying around, and small-time creators who suddenly have custom graphics to illustrate their videos, articles, etc. Maybe you're talking about something different, like deep fakes? The downside of using a vague term like "AI" is that it's too easy to accidently conflate things that have little in common.
I don't understand why people are defending AI companies
Because it's not just big companies that are affected; it's the technology itself. People saying you can't train a model on copyrighted works are essentially saying nobody can develop those kinds of models at all. A lot of people here are naturally opposed to the idea that the development of a useful technology should be effectively illegal.
Copyright law only works because most violations are not feasible to prosecute. A world where copyright laws are fully enforced would be an authoritarian dystopia where all art and science is owned by wealthy corporations.
Copyright law is inherently authoritarian. The conversation we should have been having for the last 100 years isn't about how much we'll tolerate technical violations of copyright law; it's how much we'll tolerate the chilling effect of copyright law on sharing for the sake of promoting new creative works.
IMHO being able to "control your creations" isn't what copyright was created for; it's just an idea people came up with by analogy with physical property without really thinking through what purpose it's supposed to serve. I believe creators of intellectual "property" have no moral right to control what happens with their creations, and they only have a limited legal right to do so as a side-effect of their legal right to profit from their creations.
This is called assuming the conclusion. Either you're not trying to make a persuasive argument or you're doing it very, very badly.