this post was submitted on 19 Aug 2024
661 points (96.7% liked)
Technology
you are viewing a single comment's thread
This is absolutely wrong about how something like SD generates outputs. Relationships between atomic parts of an image are encoded into the model from across all training inputs. There is no copying and pasting. Now whether you think extracting these relationships from images you can otherwise access constitutes some sort of theft is one thing, but characterizing generative models as copying and pasting scraped image pieces is just utterly incorrect.
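To make the point concrete, here is a toy sketch (illustrative only, nothing like Stable Diffusion's real U-Net architecture) of what "generation" structurally is: a fixed set of learned weights iteratively refining pure noise. The weights, function names, and sizes below are invented for illustration; the point is that there is no image database being looked up or pasted from anywhere in the loop.

```python
import random

# Stand-in for billions of trained parameters: numbers distilled from
# across *all* training inputs, not copies of any one input.
LEARNED_WEIGHTS = [0.8, -0.3, 0.5, 0.1]

def denoise_step(latent, weights):
    """One refinement step: replace each value with a weighted combination
    of its neighbours. Real diffusion models do something far richer
    (a network predicting noise), but the shape is the same:
    output = f(noise, learned weights)."""
    n = len(latent)
    return [
        sum(w * latent[(i + j) % n] for j, w in enumerate(weights))
        for i in range(n)
    ]

def generate(seed, steps=20, size=8):
    """Start from pure random noise and repeatedly refine it.
    No training image is stored, retrieved, or pasted at any point."""
    rng = random.Random(seed)
    latent = [rng.gauss(0, 1) for _ in range(size)]
    for _ in range(steps):
        latent = denoise_step(latent, LEARNED_WEIGHTS)
    return latent

sample = generate(seed=42)
```

The same seed always yields the same output, and different seeds yield different outputs, because everything downstream of the noise is a deterministic function of the weights. That's the sense in which the training data is "in" the model: only as an influence on those weights.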
While, yes, it is not copy-and-paste in the literal sense, it still has the capacity to outright copy the style of an artist whose work was used to train it.
If tracing another artist's work is already frowned upon when trying to pass the trace off as one's own, then there's little difference when a computer does it more convincingly.
Maybe a bit off on a tangent here, since I'm not even sure this is strictly possible, but if a generative system were trained only on, say, Picasso's work, would you be able to pass the outputs off as Picasso pieces? Or would they be considered the work of the person who wrote the prompt, or the person who built the AI? And if the artist weren't Picasso but someone still alive, would they get a cut of the profits?
The outputs would be considered no one's, as no copyright is afforded to AI-generated content.
That feels like it's rather beside the point, innit? You've got AI companies showing off AI art and saying "look at what this model can do," you've got entire communities on Lemmy and Reddit dedicated to posting AI art, and they're all going "look at what I made with this AI, I'm so good at prompt engineering" as though they did all the work, while the millions of hours spent actually creating the art used to train the model get no mention at all, much less any compensation or permission for those works to be used in the training. Sure does seem like people are passing AI art off as their own, even if they're not claiming copyright.
I'm not sure how it could be beside the point, though it may not be entirely dispositive. I take ownership to be a question of who has a controlling and exclusionary right to something, and in this case that's copyright. Copyright allows you to license these things and extract money for their use. If there is no copyright, there is no secure monetization (something companies using AI-generated materials absolutely keep high in mind). The question was "who would own it," and I think it's pretty clear cut who would own it: no one.