this post was submitted on 05 Oct 2024
Not The Onion
I'm just happy someone at the copyright office knows what they're doing
This has been the copyright office's stance for quite a while now. In fact, most of the world's IP registrars and authorities do not grant IP rights to AI-generated material.
I'm glad about this, honestly.
If you want to use an AI model trained on vast amounts of publicly posted work, go for it, but be prepared for the result to become a truly public work that you don't own at the end of it all.
I agree. I think AI-generated material effectively entering the public domain, combined with the reporting/marking laws now coming online, is a strong incentive for large corporate actors who don't like releasing material from their control to keep a lot of it human-made.
What I'd like to see in addition to this is a requirement that content-producing models all be open source. Note, I don't think we need weird new IP rights that effectively amount to a "right to learn from" or the like.
I'm 100% in favor of requiring models to be open source, and I have been for a while now. If someone wants to build an AI model off the backs of other people's work, they clearly shouldn't be allowed to restrict or charge for access to that model for the very people whose work was used, let alone for anyone else.