[–] gravitas_deficiency@sh.itjust.works 34 points 10 months ago (1 children)

There are very valid philosophical and ethical reasons not to use it. We’re not just being Luddites for the hell of it. In many cases, we’re engineers and scientists with interest, experience, or expertise in neural nets and LLMs ourselves, and we don’t like how fast and loose (in a lot of really, really important ways) all these big companies are playing it with their training datasets, nor how they’re actively disregarding any sort of legal or ethical responsibility around the technology writ large.

[–] tsonfeir@lemm.ee -5 points 10 months ago (1 children)

Likewise. The same could be said about every technology.

[–] Feathercrown@lemmy.world 2 points 9 months ago

Uh, no. Why would that be the case? Every technology has unique upsides and downsides, and the downsides of this one are not being handled correctly; in fact, they’re being exacerbated.