this post was submitted on 17 May 2024
503 points (94.8% liked)

Technology

[–] FalseMyrmidon@kbin.run 61 points 6 months ago (4 children)

Who's ignoring hallucinations? It gets brought up in basically every conversation about LLMs.

[–] 14th_cylon@lemm.ee 80 points 6 months ago (3 children)

People who suggest, let's say, firing the employees of a crisis intervention hotline and replacing them with LLMs...

[–] SkyezOpen@lemmy.world 24 points 6 months ago

"Have you considered doing a flip as you leap off the building? That way your death is super memorable and cool, even if your life wasn't."

-Crisis hotline LLM, probably.

[–] Voroxpete@sh.itjust.works 17 points 6 months ago (1 children)

Less horrifying conceptually, but in Canada a major airline tried to replace their support services with a chatbot. The chatbot then invented discounts that didn't actually exist, and the courts ruled that the airline had to honour them. The chatbot was, for all intents and purposes, no more or less official a source of data than any other information they put out, such as their website and other documentation.

[–] 14th_cylon@lemm.ee 1 points 6 months ago

i approve of that. it is funny and there is no harm to anyone else other than the shareholders, so... 😆

[–] L_Acacia@lemmy.one 3 points 6 months ago

They know the tech is not good enough; they just don't care and want to maximise profit.

[–] nyan@lemmy.cafe 12 points 6 months ago (1 children)

The part that's being ignored is that it's a problem, not the existence of the hallucinations themselves. Currently a lot of enthusiasts are just brushing it off with the equivalent of ~~boys will be boys~~ AIs will be AIs, which is fine until an AI, say, gets someone jailed by providing garbage caselaw citations.

And, um, you're greatly overestimating what someone like my technophobic mother knows about AI (xkcd 2501: Average Familiarity seems apropos). There are a lot of people out there who never get into a conversation about LLMs.

[–] AnarchistArtificer@slrpnk.net 1 points 6 months ago

I was talking to a friend recently about this. They studied medieval English and aren't especially techy, beyond being a Millennial with techy friends. I said that merely knowing and using the term LLM correctly puts their AI knowledge above that of the vast majority of people (including a decent chunk of the people trying to make a quick buck off of AI hype).

[–] Neato@ttrpg.network 5 points 6 months ago (1 children)

It really needs to be a disqualifying factor for generative AI. Even using it for my hobbies is useless when I can't trust it knows dick about fuck. Every time I test the new version out it gets things so blatantly wrong and contradictory that I give up; it's not worth the effort. It's no surprise everywhere I've worked has outright banned its use for official work.

[–] DdCno1@kbin.social 3 points 6 months ago

I agree. The only application that is fine for this in my opinion is using it solely for entertainment, as a toy.

The problem is of course that everyone and their mother is pouring billions into what clearly should only be used as a toy, expecting it to perform miracles it currently cannot, and might never be able to, pull off.

[–] Teodomo@lemmy.world 1 points 6 months ago* (last edited 6 months ago)

Maybe on Lemmy and in some pockets of social media. Elsewhere it definitely doesn't.

EDIT: Also, I usually talk with non-tech people IRL about AI, just to check how they feel about it. Absolutely no one so far has known what hallucinations are.