[–] nexussapphire@lemm.ee 6 points 5 months ago* (last edited 5 months ago) (2 children)

It's so fun to play with offline AI. It doesn't have the creepy underpinning of knowing that art, journalism, and musings from social media were blatantly stolen from the internet and sold as a service for profit.

Edit: I hate theft, and if you think theft is OK for training LLMs, go ahead and dislike this comment. I don't feel bad about what I said; local offline AI is just better because it doesn't work on the premise of backroom deals and blatant theft. I will never use an AI like DALL-E when there's a talented artist trying to put food on the table with a skill they honed for years. If you condone stealing, you are a cheap, heartless coward.

[–] Teanut@lemmy.world 16 points 5 months ago (1 children)

I hate to break it to you, but if you're running an LLM based on (for example) Llama, the training data (corpus) that went into it still consisted of large parts of the Internet.

Running the prompts locally doesn't change the fact that the model was still trained on data that could be considered protected under copyright law.

It's going to be interesting to see how the law shakes out on this one, because an artist going to an art museum and doing studies of those works (and let's say it's a contemporary art museum where the works wouldn't be in the public domain) for educational purposes is likely fair use - and possibly encouraged to help artists develop their talents. Musicians practicing (or even performing) other artists' songs is expected during their development. Consider some high school band practicing in a garage, playing some song to improve their skills.

I know the big difference is that it's people training vs. a machine/LLM training, but that seems to come down not so much to a copyright issue (though it is one in the immediate sense) as to a question of "should an algorithm be entitled to the same protections as a person? If not, what happens if real AI (not just an LLM) is developed? Should those entities be entitled to personhood?"

[–] nexussapphire@lemm.ee -1 points 5 months ago (2 children)

I hate to break it to you, but not all machine learning is LLM-based. I've been messing with neural TTS from a small project called Piper. I'm also looking into an image recognition neural network to write software for and train myself. I might try writing one myself for fun 🤔
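
If anyone wants to try Piper, this is roughly all it takes. Just a sketch, assuming you've installed the piper-tts package and downloaded a voice model locally; the model filename below is only an example:

```python
import subprocess

# Piper reads text on stdin and writes a wav file, all offline.
# Assumes: pip install piper-tts, plus a voice model downloaded locally
# (the .onnx filename here is just an example).
subprocess.run(
    ["piper", "--model", "en_US-lessac-medium.onnx", "--output_file", "hello.wav"],
    input="Hello from a fully local text to speech model.",
    text=True,
    check=True,
)
```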

I'm not interested in anything that uses stolen data like that, so my options are limited to incredibly focused, single-purpose tools or things I make myself with the tools available.

I'd love to play with image generation and large language models, but until all the legal stuff is worked out and individuals get paid for their work, I'm not touching them.

To me it's as cut and dried as this: if it's the difference between an individual becoming their own boss and making a better living, or a corporation growing its market cap, I'll always choose the individual. I know there's a possibility of that growth resulting in more jobs, but I'd rather have an environment where newly opened small businesses breed competition and improve everyone's life overall. Let's not hand the keys over to companies like Microsoft and close more doors.

I don't care about the discussion of true AI having rights. It's only going to be used to make the wealthy wealthier.

[–] hellofriend@lemmy.world 9 points 5 months ago (1 children)

All LLMs are based on neural networks. Furthermore, all neural networks need training, regardless of whether they're part of an LLM or some other form of machine learning. If you want to ensure there's no stolen material in the neural net, then you have to train it yourself with material that you have the copyright to.
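
For what it's worth, "training it yourself" doesn't have to be exotic. A rough sketch, assuming PyTorch/torchvision and a folder of images you actually hold the rights to, sorted into one subfolder per class (the "my_photos" path and the tiny network here are only placeholders):

```python
import torch
from torch import nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

# Every image under my_photos/ is assumed to be your own work,
# sorted like my_photos/cats/, my_photos/dogs/ (placeholder names).
transform = transforms.Compose([
    transforms.Resize((64, 64)),
    transforms.ToTensor(),
])
dataset = datasets.ImageFolder("my_photos", transform=transform)
loader = DataLoader(dataset, batch_size=32, shuffle=True)

# A deliberately tiny classifier with no pretrained weights, so nothing
# scraped from anyone else sneaks in through a downloaded checkpoint.
model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 16 * 16, len(dataset.classes)),
)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(10):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.3f}")
```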

[–] nexussapphire@lemm.ee -2 points 5 months ago (1 children)

Boy, I love it when people don't read.

[–] hellofriend@lemmy.world 5 points 5 months ago (1 children)

I was expanding on your point, you twat. But hey, just be a snarky cunt. I'm sure that'll get you far.

[–] nexussapphire@lemm.ee 5 points 5 months ago* (last edited 5 months ago) (1 children)

Sorry, I thought you were being a smartass and just skimmed through it. Truly my bad.

Edit: it's hard to tell intention sometimes, and I really do appreciate you summarizing what I said. It's true, and a more approachable answer than the one I gave.

[–] hellofriend@lemmy.world 6 points 5 months ago

My reaction was probably a bit heavy-handed as well. My apologies.

[–] nexussapphire@lemm.ee 3 points 5 months ago (1 children)

Sorry, I feel strongly about this. Play with it all you want; it's really cool shit! But please don't pay for access to it, and if you need some art or a professional write-up, please just pay someone to do it.

It'll mean so much to your fellow man in these uncertain times and the quality will be so much better.

[–] LainTrain@lemmy.dbzer0.com -3 points 5 months ago

I'm not paying anyone for anything: not OpenAI techbro grifters, and not any freelancer grifters.

[–] nexussapphire@lemm.ee -2 points 5 months ago

I'm on his side; I don't get the dislikes. Maybe he likes massive corporations stealing people's data and putting artists and journalists out of work.