this post was submitted on 05 Sep 2024
47 points (94.3% liked)

Technology

you are viewing a single comment's thread
[–] ContrarianTrail@lemm.ee 6 points 2 months ago (13 children)

Bullshitting implies an intention to do so. LLMs make mistakes, just like humans do.

[–] wewbull@feddit.uk 17 points 2 months ago (12 children)

An LLM's "intent" is always to give you a plausible response, even when it doesn't have the "knowledge". The same behaviour in a human would be classed as lying, IMHO.

[–] ContrarianTrail@lemm.ee 2 points 2 months ago (11 children)

But you wouldn't call it lying if a person tells you something they think is true that turns out to be false. Lying means intentionally giving false information. LLMs don't have intentions.

[–] wewbull@feddit.uk 1 point 2 months ago

...but if they don't know, I expect them to say so. An LLM isn't trustworthy until it can say "I don't know".

load more comments (10 replies)