[–] spujb@lemmy.cafe -1 points 8 months ago* (last edited 8 months ago) (1 children)

i miss when gpt was kept unpublished because it was “too dangerous.” i wish it could have been released in a more mature way.

because we were right. we couldn’t be trusted, and we immediately ruined the biggest wonder of humanity by having it generate thousands to millions of articles for a quick buck. the toothpaste is out of the tube now and it can never go back in.

[–] r3df0x@7.62x54r.ru 1 points 8 months ago* (last edited 8 months ago) (2 children)

Someone would have made one eventually. Unless the government monitors every computer in existence, AI is inevitable.

[–] JackGreenEarth@lemm.ee 4 points 8 months ago (2 children)

And just to make it clear, we should not give the government the ability to monitor every computer in existence, or even any computer not owned by them.

[–] spujb@lemmy.cafe 1 points 8 months ago* (last edited 8 months ago)

also, there are absolutely other ways to regulate technology, especially since it’s a tech that’s being bought and sold.

“monitor every computer” is emphatically not the only solution, and it’s weird that they suggested that lol

[–] r3df0x@7.62x54r.ru 0 points 8 months ago

That's why AI is inevitable without a massive surveillance state.

[–] spujb@lemmy.cafe 3 points 8 months ago* (last edited 8 months ago)

it’s not the “making one” that’s the problem. it’s the making, optimizing, and rabid marketing of one in the service of capital instead of humans.

if only a bunch of open-source, genuinely non-profit groups had released language models, the landscape might still suck, but it would be distinctly less toxic.

and if the government (or even a decently sized ngo standards entity) had worked proactively with computer scientists to find solutions like watermarking, labor replacement protections, and copyright protections, things might be arguably perfect. not one of those things happened and so further into the hellscape we descend.
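
(for anyone wondering what “watermarking” could even look like here: one idea from the research literature is to statistically bias generated text toward a pseudorandom “green list” of words so a detector can later check whether that bias shows up. a toy sketch of the detection side, with made-up names, not any real vendor’s scheme:)

```python
# Toy illustration of statistical text watermarking (purely illustrative, not a real scheme):
# the generator would nudge word choice toward a pseudorandom "green list" keyed on the
# previous word; the detector below checks whether green words are over-represented.
import hashlib
import math

def is_green(prev_word: str, word: str) -> bool:
    """Deterministically assign roughly half of all words to the 'green list' for a given context."""
    digest = hashlib.sha256(f"{prev_word}|{word}".encode()).digest()
    return digest[0] % 2 == 0

def watermark_score(text: str) -> float:
    """Return a z-score: large positive values suggest the text carries the watermark."""
    words = text.lower().split()
    if len(words) < 2:
        return 0.0
    hits = sum(is_green(prev, cur) for prev, cur in zip(words, words[1:]))
    n = len(words) - 1
    expected, stddev = n * 0.5, math.sqrt(n * 0.25)  # binomial(n, p=0.5) if there is no watermark
    return (hits - expected) / stddev

if __name__ == "__main__":
    # unwatermarked text should score near 0; watermarked text would score far above that
    print(watermark_score("the quick brown fox jumps over the lazy dog"))
```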