It is widely claimed, without any hesitation or disclaimer, that quantum computers (QCs) can, or will soon be able to, break cryptography by factoring large numbers. However, to my knowledge, the largest number ever factored by a quantum computer running Shor's algorithm is 21 (= 7 x 3), and to make things worse, this world record has not been improved for a whole decade!
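For context, Shor's algorithm reduces factoring N to finding the multiplicative order r of a base a modulo N; only that order-finding step requires a quantum computer. Here is a minimal sketch of the classical wrapper, with the quantum step replaced by brute-force search (the function names are my own, purely illustrative):

```python
from math import gcd

def order(a, N):
    """Multiplicative order of a mod N. This is the step a quantum
    computer would do with the period-finding circuit; here it is
    brute-forced classically for illustration."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_classical(N, a):
    """Classical pre- and post-processing around the period-finding step."""
    g = gcd(a, N)
    if g != 1:
        return g                  # lucky guess: a already shares a factor
    r = order(a, N)               # the (only) quantum subroutine
    if r % 2 == 1:
        return None               # odd period: retry with another a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None               # trivial square root of 1: retry
    return gcd(y - 1, N)          # non-trivial factor of N

print(shor_classical(21, 2))     # finds period r = 6, prints 7 (21 = 7 x 3)
```

Everything above except the call to order() is cheap on a classical machine; the entire difficulty of "quantum factoring" lives in that one subroutine.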

Some larger numbers have been factored too, but not in the sense of "give me a number and I will factor it". Rather, this was done by starting from a known solution and checking whether that particular case could be programmed into the QC, which is basically cheating, or at least not a useful factorization procedure (see the sketch below).
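To make the "cheating" concrete: in such compiled demonstrations the period r is known before the circuit is built, so the hard quantum step is skipped entirely and only the trivial post-processing remains. A toy illustration, assuming the commonly used case N = 15 with base a = 11:

```python
from math import gcd

N, a = 15, 11   # base chosen because its period mod 15 is known in advance
r = 2           # r is assumed known; the circuit is "compiled" around it
y = pow(a, r // 2, N)                 # 11
print(gcd(y - 1, N), gcd(y + 1, N))   # prints 5 3, i.e. 15 = 5 x 3
```

Since r was supplied by hand, no period finding, and hence nothing genuinely quantum, was actually performed.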

So what is holding things back? What factors limit the size (measured in number of bits) of an arbitrary number that can be factored by a QC? Are they merely technological annoyances, or is there a fundamental obstacle that prevents QCs from ever factoring large numbers?
