Good.
I think the answer is there: just do what DeepSeek did.
Then let it be over.
To be fair, copyright is a disease. But then so are billionaires, capitalism, business, etc.
I mean, if there's a war, and you shoot somebody, does that make you bad?
Yes and no.
But if you stop me from criming, how will I get better at crime!?!
I don't wanna be mean, but I always thought this guy had a weird face.
Maybe as a consumer product, but governments will still want it.
Good. I hope this is what happens.
- LLM algorithms can be maintained and sold to corpos to scrape their own data, so they can use them for in-house tools or re-sell them to their own clients.
- Open-source LLMs can be made available for end users to do the same with their own data, or to scrape what's available in the public domain for whatever they want, so long as they don't re-sell.
- Altman can go fuck himself
Time to sail the high seas.
No amigo, it's not fair if you're profiting from it in the long run.
These fuckers are the first ones to send tons of lawyers whenever you republish or use any of their IP. Fuck these idiots.
This is basically a veiled admission that OpenAI are falling behind in the very arms race they started. Good, fuck Altman. We need less ultra-corpo tech bro bullshit in prevailing technology.
Good. If I ever published anything, I would absolutely not want it to be pirated by AI so some asshole can plagiarize it later down the line and not even cite their sources.
Do you promise?!?!
I have conflicting feelings about this whole thing. If you are selling the result of training like OpenAI does (and every other company), then I feel like it’s absolutely and clearly not fair use. It’s just theft with extra steps.
On the other hand, what about open source projects and individuals who aren’t selling or competing with the owners of the training material? I feel like that would be fair use.
What keeps me up at night is that, if training is never fair use, the natural result is that AI becomes monopolized by big companies with deep pockets who can pay for endless content licensing, and then we are all forever at their mercy for this entire branch of technology.
The practical, socioeconomic, and ethical considerations are really complex, but all I ever see discussed are these hard-line binary stances that would only have awful corporate-empowering consequences, either because they can steal content freely or because they are the only ones that will have the resources to control the technology.
Japan already passed a law that explicitly allows training on copyrighted material. And many other countries just wouldn’t care. So if it becomes a real problem the companies will just move.
I think they need to figure out a middle ground where we can extract value from the for-profit AI companies but not actually restrict the competition.
Oh no, not the plagiarizing machine! How are rich hacks going to feign talent now? Pay an artist for it?! Crazy!
Fuck these psychos. They should pay for the copyrighted work they stole out of the billions they've already made. Governments should protect people, MDF.
TLDR: "we should be able to steal other people's work, or we'll go crying to daddy Trump. But DeepSeek shouldn't be able to steal from the stuff we stole, because China and open source"
At the end of the day, the fact that OpenAI lost their collective shit when a Chinese company used their data and model to make their own, more efficient model is all the proof I need that they don't care about being fair or equitable. They get mad at people doing the exact thing they did, and would aggressively oppose others using their work to advance their own.
OpenAI can suck some dick.
Sounds fair, shut it down.