Phillip Hallam-Baker wrote:
> Witness the large number of implementations of Shor's algorithm for factoring that effectively make use of the factors as an input.
> I have a research degree in experimental particle physics.

Then, I hope you can understand that my draft does not deny Shor's algorithm for factoring. Shor made two major contributions to quantum computing. One is his algorithm for efficient factoring, which my draft does not deny. The other is quantum error correction, which my draft does deny. The latter was expected to make quantum computers scale; without it, we cannot construct a quantum computer that runs Shor's factoring algorithm on >1000-bit (or >100-bit, or maybe even >20-bit) numbers.

Masataka Ohta
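As an aside, the split between the two contributions is easy to see in code. Shor's factoring algorithm is classical except for one step, finding the order r of a mod N; everything after that is GCD arithmetic. The sketch below (function names are mine, for illustration) finds r by brute force instead of a quantum period-finding circuit, which is only feasible because N is tiny, exactly the situation in the small demos quoted above:

```python
# Classical post-processing of Shor's algorithm, sketched for N = 15.
# On a real quantum computer, the order r would come from the quantum
# period-finding subroutine; here it is found by brute force, which
# only works because N is tiny.
from math import gcd

def order(a, n):
    """Smallest r > 0 with a**r % n == 1 (classical brute force)."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n, a):
    """Recover factors of n from the order of a mod n, if possible."""
    g = gcd(a, n)
    if g != 1:
        return g, n // g  # the guess a already shares a factor with n
    r = order(a, n)
    if r % 2 == 1:
        return None  # odd order: retry with another a
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None  # trivial square root of 1: retry with another a
    return gcd(y - 1, n), gcd(y + 1, n)

print(shor_classical(15, 7))  # -> (3, 5)
```

For a = 7, N = 15 the order is r = 4, so the factors come from gcd(7^2 - 1, 15) = 3 and gcd(7^2 + 1, 15) = 5. The brute-force `order` loop takes exponential time in the bit length of N, which is precisely the part a scalable quantum computer would have to replace.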