Phillip Hallam-Baker wrote:
> It is my understanding of the work on Quantum error correction that
> it is correcting errors in the measurement of quantum states rather
> than trying to compensate for decoherence so the base assumption of
> the paper seems to be off.

Read the draft and you will find that Shor's paper, the first paper on quantum error correction (reference [1] of the draft), is titled "Scheme for reducing decoherence in quantum computer memory". Shor's idea was that measurement resets a randomly decohered state to one of a finite set of states. Because the measurement result identifies which member of the set the state was reset to, the original state can be restored, which, Shor said, is a reduction of decoherence, where "reduction" means compensation for single, but not double, qubit errors. The problem is that if a state is a superposition of many terms, such reduction is impossible unless relative coherence between the terms is strictly retained, which is the impossible assumption Shor makes. Differently decohered terms need term-specific ways to reduce decoherence, which is impossible by measuring a fixed number of extra qubits.
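To make the mechanism under discussion concrete, below is a minimal sketch of the simplest case: the 3-qubit bit-flip repetition code in Python/numpy. This is an illustration only, not Shor's full 9-qubit scheme; it models a pure state with at most one X (bit-flip) error, so the stabilizer (syndrome) measurement projects the error onto a discrete set and identifies which qubit to flip back. Real decoherence and phase errors are not modeled.

import numpy as np

# Single-qubit operators.
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def kron(*ops):
    """Tensor product of a list of operators (first factor = qubit 0)."""
    out = np.array([[1.0 + 0j]])
    for op in ops:
        out = np.kron(out, op)
    return out

def encode(a, b):
    """Encode a|0> + b|1> as a|000> + b|111>."""
    state = np.zeros(8, dtype=complex)
    state[0b000] = a
    state[0b111] = b
    return state

# Stabilizers Z1Z2 and Z2Z3; their +1/-1 eigenvalues form the syndrome.
S1 = kron(Z, Z, I)
S2 = kron(I, Z, Z)

def syndrome(state):
    # A state hit by at most one X error is an eigenvector of both
    # stabilizers, so each expectation value is exactly +1 or -1.
    s1 = np.real(state.conj() @ S1 @ state)
    s2 = np.real(state.conj() @ S2 @ state)
    return (int(round(s1)), int(round(s2)))

# Syndrome -> which qubit to flip back (None = no error detected).
CORRECTION = {(1, 1): None, (-1, 1): 0, (-1, -1): 1, (1, -1): 2}

# X errors on qubit 0, 1, 2 respectively.
ERRORS = [kron(X, I, I), kron(I, X, I), kron(I, I, X)]

state = encode(0.6, 0.8)
corrupted = ERRORS[1] @ state            # flip qubit 1
syn = syndrome(corrupted)
qubit = CORRECTION[syn]
recovered = ERRORS[qubit] @ corrupted if qubit is not None else corrupted
print("syndrome:", syn, "-> flip qubit", qubit)
print("recovered == original:", np.allclose(recovered, state))

The syndrome here identifies one member of the finite error set {no error, X on qubit 0, 1, or 2} without measuring the encoded amplitudes themselves; whether this picture survives when the state is a superposition of many differently decohered terms is exactly the point in dispute above.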
> There are good reasons to build quantum computers,
If only they scaled with quantum error correction.

Masataka Ohta