this post was submitted on 08 Apr 2026
254 points (97.0% liked)

Technology

83632 readers
4162 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related news or articles.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, this includes using AI responses and summaries. To ask if your bot can be added please contact a mod.
  9. Check for duplicates before posting, duplicates may be removed
  10. Accounts 7 days and younger will have their posts automatically removed.

Approved Bots


founded 2 years ago
MODERATORS
you are viewing a single comment's thread
view the rest of the comments
[–] hesh@quokk.au 1 points 22 hours ago* (last edited 20 hours ago) (1 children)

OK, so I don't blame the GPUs crunching out the LLM lies, or the HTML on the page; I blame Google, the company that programmed them.

[–] supamanc@lemmy.world 1 points 22 hours ago (1 children)

The point is, the LLM is not 'lying' to you. It's showing you information. It doesn't 'know' whether the information is true or not, and it doesn't 'care', because it is a statistical model and is incapable of those things. And if you scroll back to my initial point, I said "technically, it's not lying, because lying requires intent to deceive, and LLMs don't have intent".

[–] hesh@quokk.au 1 points 21 hours ago (1 children)

What's the point of making this semantic difference though?

[–] supamanc@lemmy.world 1 points 15 hours ago

Because 1) it's true, and the article is a bit misleading as to who is actually doing the lying, and 2) it's important to remember that LLMs are not sentient, and to push back against the tide of language which subtly suggests they are.