this post was submitted on 05 Jul 2024

But just as Glaze's userbase is spiking, a bigger priority for the Glaze Project has emerged: protecting users from attacks that disable Glaze's protections—including attack methods exposed in June by online security researchers in Zurich, Switzerland. In a paper published on arXiv.org without peer review, the Zurich researchers, including Google DeepMind research scientist Nicholas Carlini, claimed that Glaze's protections could be "easily bypassed, leaving artists vulnerable to style mimicry."

[–] admin@lemmy.my-box.dev 4 points 4 months ago

you're wanting to give people the right to control other people's ability to analyze the things that they see on public display.

For the second time, that's not what I want to do - I pretty much said so explicitly with my example.

Human studying a piece of content - fine.
Training a Machine Learning model on that content without the creator's permission - not fine.

But if you honestly think that a human learning something and an ML model learning something are exactly the same, and should be treated as such, this conversation is pointless.

[–] FaceDeer@fedia.io 0 points 4 months ago

Again, it's fundamentally the same thing. You're just using different tools to perform the same action.

I remember back in the day when software patents were the big boogeyman of the Internet that everyone hated, and the phrase "...with a computer" was treated with great derision. People were taking out huge numbers of patents that were basically the same as things people had been doing since time immemorial, but by adding the magical "...with a computer" suffix they were treated like completely new innovations.

Suddenly we're on the other side of that?

Anyway, even if you do draw that distinction, you still end up outlawing huge swathes of things that we've depended on for years—search engines being the most obvious example.

[–] admin@lemmy.my-box.dev 3 points 4 months ago

And that's the third time you've tried to put words into my mouth, rather than arguing my points directly.

Have fun battling your straw men, I'm out.

[–] sab@lemmy.world 1 points 4 months ago

Not OP, but I also don't think it's the same thing. But even if it were, the consequences are nowhere near the same.

A person might be able to learn to replicate an artist's style, given enough practice and patience, but it would take them a long time, and the most "damage" they could do with that is create new content at roughly the same rate as the original creator.

It would take an AI vastly less time to acquire that same skill, and vastly less time to then create that content. So given those factors, I think there's an enormous difference between one person learning to copy your skill and a company that does it as a business model.

Btw, if you didn't know it yet - search engines don't need to create a large language model in order to find web content. They've been working fine (one might even say better) without doing that.