this post was submitted on 08 Oct 2024
136 points (95.9% liked)

cross-posted from: https://lemmy.ml/post/21121074

OpenAI, a non-profit AI company that will lose anywhere from $4 billion to $5 billion this year, will at some point in the next six or so months convert into a for-profit AI company, at which point it will continue to lose money in exactly the same way. Shortly after this news broke, Chief Technology Officer Mira Murati resigned, followed by Chief Research Officer Bob McGrew and VP of Research (Post-Training) Barret Zoph, leaving OpenAI with exactly three of its eleven cofounders remaining.

This coincides suspiciously with OpenAI's increasingly absurd fundraising efforts, in which (as I predicted in late July) OpenAI has closed the largest venture-backed fundraise of all time, $6.6 billion, at a valuation of $157 billion.

top 7 comments
[–] br3d@lemmy.world 31 points 1 month ago (2 children)

One of my big worries with the way people are using LLMs is that they're being trained to trust whatever the model spits out. "Hey Google, what's the nutritional content of peanuts?" And people are learning not to ask where the information came from or to check sources.

One of the many reasons this worries me is that very soon these businesses are going to need to recoup the billions they're spending, and I wonder how long it will be until these systems start feeding paid promotions to a population that's been trained to accept whatever it's told. Imagine what some businesses, or governments, would pay to have exactly their choice of words produced on demand in response to knowledge queries.

[–] LunarLoony@lemmy.sdf.org 15 points 1 month ago (1 children)

> And people are learning not to ask where the information came from or to check sources.

The worst of it is that this has been a problem for as long as I can remember, and now it's worse than ever.

[–] original_reader@lemm.ee 1 points 1 month ago* (last edited 1 month ago) (1 children)

Which search results or which queries could one show the average user to make that point?

[–] lime@feddit.nu 6 points 1 month ago* (last edited 1 month ago)

there's no real universal example. you need to show them that it is wrong about something you know they know, to avoid the Gell-Mann amnesia effect.

I say this from experience. unfortunately some people are just average and have interests that are entirely subjective, like makeup trends or alternative medicine, and the effect that "always check the sources" has on those people is to make them distrust every source, since nothing agrees with anything else on those topics.

[–] magikmw@lemm.ee 4 points 1 month ago

My search engine usage for 25 years has been just me going "yeah right" and changing the query to make it better. But I'm wired to distrust what I feel is bullshit, and I've learned that not many people are.

[–] NeoNachtwaechter@lemmy.world 6 points 1 month ago (1 children)

Imagine Sam Altman resigning now :)

[–] paraphrand@lemmy.world 1 points 1 month ago

But he gets to use that magenta room all by himself now.