this post was submitted on 19 Apr 2026
Technology
top 15 comments
[–] Passerby6497@lemmy.world 3 points 5 hours ago (1 children)

Sure, papers about an abacus and a dog are funny and can make you look smart and contrarian on forums. But that’s not the job, and those arguments betray a lack of expertise. As Scott Aaronson said:

Once you understand quantum fault-tolerance, asking “so when are you going to factor 35 with Shor’s algorithm?” becomes sort of like asking the Manhattan Project physicists in 1943, “so when are you going to produce at least a small nuclear explosion?”

L. O. L.

I love that this dude just casually dismisses the fact that QC hasn't been able to factor anything larger than 21 in the last 14 years without cheating, using numbers that are nothing close to the real-world-grade semiprimes used in crypto.

[–] FauxLiving@lemmy.world 2 points 5 hours ago (1 children)

It looks like you're doing this exact thing:

sort of like asking the Manhattan Project physicists in 1943, “so when are you going to produce at least a small nuclear explosion?”

There are a lot of engineering issues that need to be solved to get this to work. It isn't like they'll figure out how to factor a 2-digit number, then a year later have a breakthrough that lets them factor a 3-digit number, and then some scientists will find a tweak that allows a 4-digit number.

Expecting that kind of incremental advancement is kind of like expecting the Manhattan Project to make a tiny nuclear explosion and then work their way up to a larger nuclear explosion... it shows a fundamental misunderstanding of the technology.

You can't just make a tiny nuclear bomb. You either have critical mass and a big nuclear explosion, or you have no nuclear explosion at all. The early experiments where quantum computers factored small numbers are akin to the ORNL research showing that you could split an atom with neutron bombardment.

The Manhattan project wasn't simply taking that research and then trying to split 2 atoms, and then 3 atoms until they got to the Trinity device.

[–] Passerby6497@lemmy.world 0 points 5 hours ago* (last edited 5 hours ago) (1 children)

I commented on this issue a couple of days ago here and linked a study arguing that the current methods of "factoring" via QC are not scalable

https://lemmy.world/comment/23267756

https://www.nature.com/articles/s41598-022-11687-7

The issue at hand is that there's a fundamental limit on what we can effectively do at the moment, and a lot of the hype is being driven by "factorization methods" that ultimately only twiddle a few LSBs of a specially chosen number, cheating their way to a "solution" on something that's not even remotely close to a real-world example.
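For context on what a non-cheating run would have to do: the quantum part of Shor's algorithm exists only to find the multiplicative order r of a base a mod N; everything before and after is classical. A toy sketch of that classical post-processing for N = 21 (the order-finding here is done by brute force, which is exponential-time, so this only illustrates the structure, not a quantum speedup):

```python
from math import gcd

def order(a, n):
    """Multiplicative order of a mod n: smallest r > 0 with a^r = 1 (mod n)."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def factor_via_order(n, a):
    """Classical sketch of Shor's post-processing for a single base a."""
    g = gcd(a, n)
    if g != 1:
        return g, n // g  # lucky: a already shares a factor with n
    r = order(a, n)       # the step a quantum computer would accelerate
    if r % 2:
        return None       # odd order: try another base
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None       # trivial square root of 1: try another base
    p = gcd(y - 1, n)
    return (p, n // p) if 1 < p < n else None

print(factor_via_order(21, 2))  # order of 2 mod 21 is 6 -> (7, 3)
```

A genuine quantum demonstration has to find r without knowing the answer in advance; the "cheating" papers effectively bake knowledge of the factors into the circuit, which is why they don't scale.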

To use the Manhattan project analogy, this would be like saying "theoretically, if you smash enough radioactive stuff together into a critical mass it will fission, so we're going to compress these bananas until we hit that point".

[–] FauxLiving@lemmy.world 1 points 4 hours ago

I agree that those experiments are not scalable.

I just see them as demonstrating a proof of concept (like ORNL demonstrating the splitting of an atom via neutron bombardment) and not as an attempt to develop a path towards arbitrary prime factorization.

Whatever the future prototype will be, it won't be created by incrementally improving on those proof of concept demonstrations.

“theoretically, if you smash enough radioactive stuff together into a critical mass it will fission, so we’re going to compress these bananas until we hit that point”.

Potassium-40 does not produce neutrons as part of its decay process, so it is not even theoretically possible to achieve criticality in that manner.

The proof of concept ORNL tests used neutron bombardment which IS theoretically a method of achieving criticality, but there was no path for incremental improvements of those specific ORNL tests into anything resembling a weapon.

There actually were weapons tests that used neutron initiators but the source of those neutrons was not a particle accelerator. (Which is good because it's hard to carry an entire particle accelerator laboratory in an ICBM)

[–] Cocodapuf@lemmy.world 1 points 5 hours ago

Sigh... That timeline adjustment didn't go in the direction I was hoping.

I guess it was inevitable, the science showed that these quantum effects are real, so it was just a matter of time before these machines really work.

Time to rethink and replace everything.

[–] jobbies@lemmy.zip 7 points 10 hours ago (1 children)

In other words: who knows when it's coming, but it's a good idea to be prepared.

[–] Cocodapuf@lemmy.world 2 points 5 hours ago* (last edited 5 hours ago)

I mean sure, publicly "who knows", but the relevant indicators are pointing to "imminently".

[–] CoconutLove@lemmy.today 6 points 14 hours ago (3 children)

So, maybe soon, but we don't know.

[–] NoForwadSlashS@piefed.social 7 points 12 hours ago

Be careful not to observe it too closely

[–] Speculater@lemmy.world 2 points 11 hours ago (1 children)

I wrote my master's thesis on this in 2020. They weren't close then, and they're not closer now.

[–] CoconutLove@lemmy.today 1 points 1 hour ago

I watched a YouTube video from someone with a PhD in quantum computing who quit the field because they didn't think the tech was going anywhere soon.

[–] GreenShimada@lemmy.world 3 points 13 hours ago

TL;DR: 🤷‍♂️

[–] Thorry@feddit.org 3 points 12 hours ago (2 children)

I’ll be honest, I don’t actually know what all the physics in those papers means. That’s not my job and not my expertise.

Well maybe not the right dude to ask then?

My 2 cents: we'll never get to any sort of practical quantum computer size. As size increases, decoherence becomes a bigger problem. This is currently fixed by having more qubits to compensate. But as the size grows, the amount of qubits needed just to compensate for decoherence grows faster. So there's a practical limit on how large of a machine is possible. And it isn't like a smaller machine is just slower, it actually simply can't do any of the cryptography breaking stuff.

From my understanding, decoherence is a fundamental part of reality and can't be helped. But who knows, there might be some breakthrough that makes it work; it might also be impossible given the laws of nature. And from what I gather, it's also impossible to prove that it can't be done.

So that's why quantum computers have been in this limbo state for years now. They might be just around the corner or they might never exist.

In the security world, people worry about data being harvested today so it can be decrypted in 20 years' time. So there's a push to think about this and take precautions now. That seems smart: not because they're sure quantum computers will exist, but as a precaution in case it turns out they do.

[–] Cocodapuf@lemmy.world 1 points 5 hours ago

That's exactly the attitude the author was warning against. "Trust me, I know better, this is nothing"

[–] jungle@lemmy.world 1 points 11 hours ago* (last edited 11 hours ago)

Well maybe not the right dude to ask then?

Are you an expert in QC? You didn't read past the first few paragraphs, did you?