this post was submitted on 10 Jan 2024
1237 points (96.5% liked)

Technology

[–] kibiz0r@lemmy.world 21 points 10 months ago (2 children)

We have a mechanism for people to make their work publicly visible while reserving certain rights for themselves.

Are you saying that creators cannot (or ought not be able to) reserve the right to ML training for themselves? What if they want to selectively permit that right to FOSS or non-profits?

[–] BURN@lemmy.world 11 points 10 months ago

That’s exactly what they’re saying. The AI proponents believe that copyright shouldn’t be respected and that they should be able to ignore any licensing because “it’s hard to find data otherwise.”

[–] Grimy@lemmy.world -4 points 10 months ago (1 children)

Essentially, yes. There isn't a happy solution where FOSS gets the best images and remains competitive. The amount of data needed is beyond what can be donated. Any open source model trained only on donated work will be so low in quality as to be unusable.

It also won't be up to them. The platforms where the images are posted will be selling and brokering. No individual is getting a call unless they are a household name.

None of the artists are getting paid either way so yeah, I'm thinking of society in general first.

[–] kibiz0r@lemmy.world 3 points 10 months ago

The artists (and the people who want to see them continue to have a livelihood, a distinct voice, and a healthy engaged fanbase) live in that society.

"The platforms where the images are posted will be selling and brokering"

Isn't this exactly the problem though?

From books to radio to TV, movies, and the internet, there's always:

  • One group of people who create valuable works
  • Another group of people who monopolize distribution of those works

The distributors hijack ownership (or de facto ownership) of the work through one means or another (logistical superiority, financing requirements, or IP law fuckery) and exploit their position to make themselves the only channel for creators to reach their audience, and vice versa.

That's the precise pattern that OpenAI is following, and they're doing it at a massive scale.

It's not new. YouTube, Reddit, Facebook, MySpace: all of these companies started with a public pitch about democratizing access to content. But a private pitch emerged: becoming the main way that people access content. When it became feasible for them to turn against their users and liquidate them, they did.

The difference is that they all had to wait for users to add the content over time. Imagine if Google knew they could've just seeded Google Video with every movie, episode, and clip ever aired or uploaded anywhere. Just say, "Mon Dieu! It's impossible for us to run our service without including copyrighted materials! Woe is us!" and all is forgiven.

But honestly, whichever way the courts decide, the legality of it doesn't matter to me. It's clearly a "Whose Line Is It Anyway?" situation where the rules are made up and ownership doesn't matter. So I'm looking at "Does this consolidate power, or distribute it?" And OpenAI is pulling off perhaps the biggest power grab that we've seen.

--

Unrelated: I love that there's a very distinct echo of something we saw with the previous era of tech grift, crypto. The grifters, when confronted, would always say, "Well, there's no way to undo it now! It's on the blockchain!" There's always this back-up argument of "it's inevitable, so you might as well let me do it."