this post was submitted on 28 Aug 2025
305 points (99.7% liked)

[–] Perspectivist@feddit.uk 18 points 6 hours ago (1 children)

This is exactly the use-case for an LLM

I don't think it is. An LLM is a language-generating tool, not a language-understanding one.

[–] iglou@programming.dev 0 points 6 hours ago (1 children)

That is actually incorrect. It is also a language-understanding tool. You don't have an LLM without NLP, and NLP includes processing and understanding natural language.

[–] Perspectivist@feddit.uk 18 points 5 hours ago (1 children)

But it doesn’t understand - at least not in the sense humans do. When you give it a prompt, it breaks it into tokens, runs them through weights learned from its training data, and generates the most statistically likely continuation. It doesn’t “know” what it’s saying; it’s just producing the next most probable output. That’s why it often fails at simple tasks like counting letters in a word - it isn’t actually reading and analyzing the word, just predicting text. In that sense it’s simulating understanding, not possessing it.
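
To make that concrete, here's a minimal sketch of that loop. It assumes the Hugging Face `transformers` library and the public GPT-2 checkpoint (neither of which anyone here named - purely illustrative), but any causal LLM works the same way: tokenize the prompt, score every possible next token, emit the likeliest.

```python
# Minimal sketch of next-token prediction (assumes the Hugging Face
# `transformers` library and the public "gpt2" checkpoint - illustrative only).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "How many r's are in the word strawberry?"

# The model never sees letters, only sub-word token IDs - "strawberry" may be
# split into several pieces, which is part of why letter-counting goes wrong.
print(tokenizer.tokenize(prompt))

inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits        # a score for every vocabulary token

next_token_id = logits[0, -1].argmax().item()   # greedy: the single most probable token
print(tokenizer.decode([next_token_id]))        # the "statistically likely continuation"
```

Real systems sample from that distribution instead of always taking the argmax, then append the chosen token and repeat - but at no point in that loop is the word "read" letter by letter.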

[–] iglou@programming.dev -3 points 5 hours ago (1 children)

You're entering a more philosophical debate than a technical one, because for this point to make any sense, you'd have to define what "understanding" language means for a human at a level as low as what you're describing for an LLM.

Can you affirm that what a human brain does to understand language is so different from what an LLM does?

I'm not saying an LLM is smart, but saying that it doesn't understand, when getting computers to "understand" natural language is the core of NLP, is meh.

[–] Feyd@programming.dev 11 points 5 hours ago

No they're not - they're talking purely at a technical level, and you're trying to apply mysticism to it.