[–] conciselyverbose@sh.itjust.works 1 points 6 months ago (1 children)

It wouldn't be, not if it comes with an unforgivable level of invasiveness.

Forcibly invading someone's mind after they were convicted beyond a reasonable doubt would make you a monster.

[–] kevincox@lemmy.ml 1 points 6 months ago (1 children)

Most trials and discovery processes are already incredibly invasive. I don't really see why the mind should be treated much differently. I would rather define what counts as acceptable invasiveness in general than set different standards for what's in my mind versus what's written down in my diary.

Also, why would you do this after they are convicted beyond a reasonable doubt? It should only be done when required to reach that conclusion. Just as with avoiding physical searches, you can plead guilty if you don't want to be investigated.

If used properly, this could actually be less invasive. Imagine an automated machine that quickly checks a few facts you believe and returns only the basic required information; you could be removed from the suspect list before other searches need to be done (like lawyers combing through your emails or personal notes).

I agree that this is a very dangerous thing to consider, and it needs to be applied very carefully. But I don't think it is, in the abstract, any more morally wrong than the methods of evidence gathering we currently use. In many ways it could potentially be less harmful to the person being investigated. However, it will be impossible to know for sure until we know exactly how this technology (once it is developed) works.

[–] conciselyverbose@sh.itjust.works 2 points 6 months ago* (last edited 6 months ago)

No, mind reading is a hundred orders of magnitude more invasive than any possible search.

There is no possible scenario in which it could ever be justified or excused. Your brain is unconditionally sacred. There is no theoretical version of such technology that would be anything other than pure, unforgivable evil to use without completely uncoerced consent.