this post was submitted on 25 Feb 2026
462 points (95.8% liked)

Technology

[–] hector@lemmy.today 1 points 11 hours ago (1 children)

Oh, how far from the blast would they have to be, and how badly would it mess them up, do you know? I should know that, I guess; I've just heard about the EMP, and I'm not sure how a neutron bomb would affect electronics either.

[–] Jax@sh.itjust.works 2 points 10 hours ago* (last edited 10 hours ago) (1 children)

No, that I can't answer — it would depend entirely on the level of fallout and where it happens to land.

You would need to be able to perfectly, and I mean perfectly, predict weather months in advance in order to prepare accordingly.

The reality is that for an AI, or rather an AGI, to choose to launch nukes, it would have to reach a point where it accepts the potential loss of its own 'life' in exchange for whatever value a nuclear war might hold. I struggle to believe that a 'true' AGI would make that choice. There are far too many variables to control, compared to something like a biological agent, which likely would not affect a machine at all.

Now, a modern AI making that choice? Absolutely possible; the things are fucking crazy, with literally no concept of what life is.

[–] unwarlikeExtortion@lemmy.ml 1 points 6 hours ago

An AI can easily start nuclear war, as can a human.

The only things preventing a nuclear disaster are the institutional measures limiting access to the weapons.

If you gave a single human (or a single AI) access to a magic no-strings-attached 'Send a Nuke' button, then either that human/AI is the second coming of Jesus Christ, or a nuke will befall some unlucky portion of the population sooner or later. Bonus points if people can talk to the AI, or if access to the button is hereditary.