this post was submitted on 26 Apr 2024
55 points (80.9% liked)

Technology

59534 readers
3143 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related content.
  3. Be excellent to each another!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, to ask if your bot can be added please contact us.
  9. Check for duplicates before posting, duplicates may be removed

Approved Bots


founded 1 year ago
MODERATORS
you are viewing a single comment's thread
view the rest of the comments
[–] db0@lemmy.dbzer0.com 45 points 6 months ago* (last edited 6 months ago) (1 children)

The model is just hallucinating. It has no capacity to execute code on its own, and most FOSS clients, of course, won't send anything back to Meta.

[–] Diplomjodler3@lemmy.world 16 points 6 months ago (2 children)

Damn thing doesn't even know it's running locally. Just ask it. And it can't tell the time.
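The model genuinely has no access to a clock; if a chat client wants it to answer time questions, the client has to inject the current time into the prompt itself. A minimal sketch of that idea (the helper name and prompt format are made up for illustration, not any particular client's API):

```python
from datetime import datetime

def build_prompt(user_prompt: str) -> str:
    """Prepend the wall-clock time so the model can appear to 'tell the time'.

    Without this injection, the model has nothing but its training data
    and the conversation text to go on.
    """
    now = datetime.now().strftime("%Y-%m-%d %H:%M")
    return f"Current local time: {now}\n\nUser: {user_prompt}"

print(build_prompt("What time is it?"))
```

Ask a bare local model the time and it will make something up; ask through a client that does this kind of injection and it simply reads the answer back out of the prompt.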

[–] DarkThoughts@fedia.io 12 points 6 months ago

Another easy test is to ask a question, note the answer, then clear the chat and ask the same question again. Repeat this a few times and you'll see varying responses, because the answer is being generated on the fly rather than pulled from some stored source of information. A lot of these LLMs are really only good for roleplaying. But even the large commercial models that actually were trained on a lot of potentially valuable information have this issue, which is why you should never blindly trust an LLM's answers.
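The run-to-run variation comes from sampling: at a temperature above zero, the model draws each token from a probability distribution rather than always taking the most likely one, so every fresh chat can wander down a different path. A toy sketch of temperature sampling (the logits and candidate answers are invented for illustration; no real model is involved):

```python
import math
import random

def softmax(logits, temperature=1.0):
    # Scale logits by temperature, then normalize into probabilities.
    # Higher temperature flattens the distribution -> more varied picks.
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Made-up next-token scores for some factual question the model
# doesn't actually "know" the answer to.
tokens = ["1998", "2004", "2010"]
logits = [2.0, 1.5, 0.5]
probs = softmax(logits, temperature=1.0)

# Each cleared chat is an independent draw, so the "answer" can change.
for seed in range(5):
    rng = random.Random(seed)
    answer = rng.choices(tokens, weights=probs)[0]
    print(f"fresh chat {seed}: {answer}")
```

This is why the clear-and-repeat test works: a retrieval system would return the same stored fact every time, while a sampler like this returns whatever token it happens to draw.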

[–] db0@lemmy.dbzer0.com 3 points 6 months ago

Of course not. They don't have any external information beyond what you provide them, and they have no concept of "running locally" at all.