this post was submitted on 07 Aug 2024
129 points (93.3% liked)

[–] MonkderVierte@lemmy.ml 5 points 3 months ago (2 children)

It's already established that the Turing test only measures how well a machine can simulate human behavior. It has nothing to do with intelligence.

Exactly. You could ask a human a lot of questions, build an "AI" that literally just looks up pre-written answers to common questions, and have it pass the Turing test, provided those pre-answered questions cover whatever the human proctoring the "test" asks.
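To make the point concrete, here's a minimal sketch of that canned-answer "AI" (all question/answer strings are made up for illustration). It only "passes" if the judge happens to ask questions in its table; there's no understanding anywhere.

```python
# Hypothetical lookup-table "chatbot": no model, no reasoning,
# just pre-written answers keyed by normalized question text.
CANNED_ANSWERS = {
    "how are you?": "Pretty good, a bit tired today.",
    "what's your favorite color?": "Probably blue, though it changes.",
}

def reply(question: str) -> str:
    # Normalize the question, then look it up; if it's not covered,
    # fall back to a vague dodge -- the point where the trick fails.
    key = question.strip().lower()
    return CANNED_ANSWERS.get(key, "Hmm, interesting question. What do you think?")

print(reply("How are you?"))            # canned hit
print(reply("Explain quantum gravity."))  # uncovered question -> generic dodge
```

The fallback line is exactly why this scheme says nothing about intelligence: any question outside the pre-answered set exposes it immediately.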

If we take it a step further and ask why an LLM can't be "conscious," there are plenty of expert analyses that address exactly that, so I'll refer OP to those.