I'm not a huge fan of Microsoft or even OpenAI by any means, but all these lawsuits just seem so... lazy and greedy?
It isn't like ChatGPT is spewing out the entirety of their works in a single chat. In that context, I fail to see how snippets of said work showing up in a Google summary are any different from ChatGPT or any other LLM returning the same.
Should OpenAI and other LLM creators use ethically sourced data in the future? Absolutely. They should've been doing so all along. But to me, rich chumps like George R. R. Martin complaining that their work was taken without their knowledge and profited from just feels a little ironic.
Welcome to the rest of the 6+ billion people on the Internet who've been spied on, data mined, and profited off of by large corps for the last two decades. Where's my goddamn check? Maybe regulators should've put tougher laws and regulations in place long ago to protect all of us against this sort of shit, not just the businesses and wealthy folk who can afford to launch civil suits on shaky grounds. It's not like deep learning models are anything new.
Edit:
Already seeing people come in to defend these suits. I just see it like this: AI is a tool, much like a computer or a pencil is a tool. You can use a computer to infringe copyright all day, just like you can with a pencil. To me, an AI is only going to plagiarize or infringe if you tell it to. How often does AI plagiarize without a user purposefully trying to get it to do so? That's a genuine question.
Regardless, the cat's out of the bag. Multiple LLMs are already out in the wild, more variations are made each week, and there's no way in hell they're all going to be reined in. I'd rather AI not exist, personally, as I don't see protections coming for normal workers over the next decade or two against further evolutions of the technology. But good luck to these companies fighting the new Pirate Bay-esque legal wars for the next couple of decades.
deleted
It's wild to me how so many people seem to have gotten it into their heads that cheering for the IP laws corporations fought so hard for is somehow left wing and sticking up for the little guy.
And your argument boils down to "Hitler was a vegetarian, therefore all vegetarians are fascists". IP laws are a huge drag on human creativity, designed to allow corporate entities to capture, control, and milk innate human culture for profit. The fact that some corporate interests sometimes end up opposing them when it suits them does not change that.
deleted
I already have:
I thought that was a prima facie reason why they're bad. And no, I don't believe all copyright law is bad with no nuance, as you would have seen if you'd stalked deeper into my profile rather than just picking the one comment you thought you could have fun with.
deleted
There are plenty from people who actually study this stuff.
I don't have a strong opinion on the Disney case, though I will note that it stems from the fact that corporations are able to buy and sell the rights to works as pieces of capital (in this case Disney buying them from Lucasfilm).
deleted
Stifling a writing tool because GRRM wants a payday, on the basis that it can spit out small parts of his work if you specifically ask it to, is the opposite of advancing the art.
...yet allowing individuals to build upon existing works. It's literally the rest of the statement you put in bold; stop deliberately refusing to see it.
deleted
I'm clearly talking about the technology when I say tool (large language models) and not the company itself.
If we can't freely use copyrighted material to train, it completely and unequivocally kills any kind of open-source or even small-to-medium model. Only a handful of companies would have enough data or funds to build LLMs. And since AI will be able to do virtually all desk jobs in the near future, that would guarantee Microsoft and Google owning the economy.
So no, I'm not taking the side of the corporations. The corporations want more barriers and more laws; it kills competition and broadens their moat.
I don't think GRRM is evil, just a greedy asshole who's willingly playing into their hands. I also don't think loss of potential profit, because the field has been made more competitive, equals stealing. Nothing was stolen; the barrier to entry has been lowered.
This isn't helping anyone except big-name authors, the owners of publishing houses, and Microsoft. The small-time authors and artists aren't getting a dime. Why should literally the rest of us get screwed so a couple of fat cats can have another payday? How is this advancing the arts?
deleted
It doesn't matter what the subject is; I'm clearly not referring to OpenAI the company when I use the term "writing tool".
I'm advocating for us and society as a whole. If only Google and Microsoft hold the keys to AI, we'll all end up paying a surtax on everything we buy, because every business will be forced into a subscription model to use it and stay competitive.
There is too much data involved to ask for consent; you would just end up with the big players trading with each other. The small artists wouldn't get a dime, only Getty and Adobe. It's literally not feasible.
Nothing was stolen except future potential jobs. You can't own a style or anything of the kind.
The small artists aren't going to get any kind of benefit out of these lawsuits. It sucks that the market is even more saturated, but the good ones learn to use these tools (LLMs and image/video generation) to elevate their own art and push the boundaries.