


Quantum error correction techniques are methods to protect quantum information from errors due to decoherence and other noise.

Introduction

Classical error correction employs redundancy: the simplest way is to store the information multiple times and, if these copies are later found to disagree, to take a majority vote; i.e. if one copy says the bit is a 0 and two others claim it to be a 1, then probably all three were a 1 and the first bit got corrupted.
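As a minimal sketch of this idea in Python (the encode/decode helpers are named for exposition only):

```python
def encode(bit: int) -> list[int]:
    """Store the bit three times (redundancy)."""
    return [bit, bit, bit]

def decode(copies: list[int]) -> int:
    """Majority vote: a single corrupted copy is outvoted."""
    return 1 if sum(copies) >= 2 else 0

codeword = encode(1)           # [1, 1, 1]
codeword[0] = 0                # noise corrupts the first copy
assert decode(codeword) == 1   # the majority recovers the original bit
```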
However, this is not possible with quantum information, as it cannot be copied: see the no-cloning theorem. It was therefore a relief when Peter Shor realized that, even though the information cannot be copied, the information of one qubit (now called a logical qubit) can be spread onto several (physical) qubits by using a quantum error correcting code.

Now, if noise or decoherence corrupts one qubit, the information is not lost. This is because a quantum error correcting code is designed such that a certain operation, called syndrome measurement, can determine whether a qubit has been corrupted, and if so, which one. What is more, the outcome of this operation (the syndrome) tells us not only which physical qubit was affected, but also in which of several possible ways it was affected.

The latter is counter-intuitive at first sight: since noise is arbitrary, how can its effect be one of only a few distinct possibilities? In most codes, the effect is either a bit flip, or a sign (phase) flip, or both, corresponding to the Pauli matrices X, Z, and Y. The reason is that the measurement of the syndrome has the projective effect of a quantum measurement: even if the error due to the noise was arbitrary, it can be expressed as a superposition of basis operations, the error basis, which here is given by the Pauli matrices and the identity. The syndrome measurement thus "forces" the qubit to "decide" on one specific Pauli error to "have happened", and the syndrome tells us which one, so that we can let the same Pauli operator act again on the corrupted qubit to revert the effect of the error.

The crucial point is that the syndrome measurement tells us as much as possible about the error that has happened, but nothing at all about the value that is stored in the logical qubit; otherwise the measurement would destroy any quantum superposition of this logical qubit with other qubits in the quantum computer.
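Concretely, since the Pauli matrices together with the identity form a basis of the 2x2 complex matrices, an arbitrary single-qubit error E can be written as E = c_I I + c_X X + c_Y Y + c_Z Z, and the syndrome measurement projects this superposition onto one of the terms. The sketch below illustrates the mechanism on the simpler three-qubit bit-flip code (the bit-flip half of the Shor code; it corrects bit flips only). It assumes numpy, and the helper names are purely illustrative:

```python
import numpy as np

# A toy state-vector simulation of the three-qubit bit-flip code.
a, b = 0.6, 0.8                      # logical amplitudes, |a|^2 + |b|^2 = 1
state = np.zeros(8, dtype=complex)   # amplitudes over basis states |q0 q1 q2>
state[0b000] = a                     # encode a|0> + b|1> as a|000> + b|111>
state[0b111] = b

def apply_x(state, qubit):
    """Pauli X (bit flip) on one qubit; qubit 0 is the leftmost."""
    out = np.zeros_like(state)
    for basis, amp in enumerate(state):
        out[basis ^ (1 << (2 - qubit))] += amp
    return out

def syndrome(state):
    """Parity checks q0 xor q1 and q1 xor q2 (the stabilizers Z Z I, I Z Z).
    After a single bit flip, every basis state in the superposition has the
    same parities, so this measurement is deterministic and reveals nothing
    about a and b."""
    basis = next(i for i, amp in enumerate(state) if abs(amp) > 1e-12)
    q = [(basis >> 2) & 1, (basis >> 1) & 1, basis & 1]
    return (q[0] ^ q[1], q[1] ^ q[2])

state = apply_x(state, 1)            # noise flips the middle qubit
assert syndrome(state) == (1, 1)     # (1, 1) pinpoints the middle qubit
state = apply_x(state, 1)            # applying X again reverts the error
assert abs(state[0b000] - a) < 1e-12 and abs(state[0b111] - b) < 1e-12
```

Note that the corrupted state a|010> + b|101> yields the syndrome (1, 1) from both of its basis states, which is exactly why measuring the syndrome reveals nothing about the amplitudes a and b.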
Over time, researchers have come up with several codes. The first, the Shor code, encodes 1 logical qubit in 9 physical qubits and can correct for one bit flip and one phase flip error. Andrew Steane found a code which does the same with 7 instead of 9 qubits, see the Steane code. A generalisation of this concept are the CSS codes, named for their inventors: A. R. Calderbank, Peter Shor and Andrew Steane. A more general class of codes (encompassing the former) are the stabilizer codes of Daniel Gottesman. A newer idea is Alexei Kitaev's topological quantum codes.

That these codes indeed allow for quantum computations of arbitrary length is the content of the threshold theorem, found by Michael Ben-Or and Dorit Aharonov. It asserts that you can correct for all errors if you concatenate quantum codes such as the CSS codes, i.e. re-encode each logical qubit by the same code again, and so on, on logarithmically many levels, provided the error rate of individual quantum gates is below a certain threshold; otherwise, the attempts to measure the syndrome and correct the errors would introduce more new errors than they correct for.

The key question that the threshold theorem resolves is whether quantum computers could in practice perform long computations without succumbing to noise. Since a quantum computer will not be able to perform gate operations perfectly, some small constant error per gate is inevitable; hypothetically, this could mean that quantum computers with imperfect gates can only apply a constant number of gates before the computation is destroyed by noise. Surprisingly, the quantum threshold theorem shows that if the error to perform each gate is a small enough constant, one can perform arbitrarily long quantum computations to arbitrarily good precision, with only a small added overhead in the number of gates. Recent (as of late 2004) estimates for this threshold indicate that it could be as high as 1-3% [1], provided that there are sufficiently many qubits available.
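To see where the threshold comes from, consider a toy model of concatenation (the constant c and the per-level rule p -> c*p^2 are standard simplifications assumed here, not properties of any specific code): one level of a distance-3 code fails only when at least two of its components fail, so k levels give p_k = (c*p)^(2^k)/c, which falls doubly exponentially when p < 1/c and grows when p > 1/c:

```python
def logical_error_rate(p: float, c: float, levels: int) -> float:
    """Iterate the per-level rule p -> c * p**2 (capped at 1)."""
    for _ in range(levels):
        p = min(1.0, c * p * p)
    return p

c = 100.0                    # toy constant: threshold p_th = 1/c = 1%
for p in (0.003, 0.03):      # one gate error rate below threshold, one above
    rates = [logical_error_rate(p, c, k) for k in range(4)]
    print(f"p = {p:.1%}: " + ", ".join(f"{r:.2e}" for r in rates))
```

With these assumed numbers, p = 0.3% shrinks to about 7e-7 after three levels of concatenation, while p = 3% saturates at 1: below the threshold, each added level of encoding helps; above it, the correction machinery does more harm than good.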
[taken from Wikipedia - needs major work]
