
..without informed consent.

zapzap@lemmings.world | 8 points | 3 days ago

I think sometimes when we ask people something, we're not just seeking information. We're also engaging with other humans: connecting, signaling, communicating something with the question itself. I use LLMs when I literally just want to know something, but I also try to remember the value of talking to other human beings.

finitebanjo@lemmy.world | 5 points | 3 days ago

You should assume that anything a chatbot says is far more likely to be false than human-written content, which makes it effectively useless for your stated purpose.

zapzap@lemmings.world | 0 points | 1 day ago

That has not been my experience.

finitebanjo@lemmy.world | 1 point | 1 day ago

I gave advice. Advice rarely matches what you've already experienced; if it did, people wouldn't feel the need to give it.