*Nature Communications* (open access... wheeeeee!). This sent the geeky technical press aflutter, with publications such as TechCrunch, Gizmodo, MIT Technology Review, and phys.org (saving the best for last) fawning over the latest revelation in quantum computing, joined by some mainstream press in the Wall Street Journal and Business Insider. The research concerns quantum error correction, one of the keys to realizing a functional quantum computer. Qubits are hampered by two kinds of errors. Bit-flip errors, detrimental to classical computers as well, are when a 0 turns into a 1 or vice versa. Quantum computers, unlike classical ones, are also susceptible to *phase*-flip errors. That's because qubits can be 0 and 1 at the same time, which we write as |0> + |1> or |0> - |1>, for example. A phase-flip error occurs when that plus sign changes to a minus, or the other way around. The way to detect these errors is to entangle the data qubits, the ones you want to store your quantum data in, with *ancilla* qubits, which you use just for error correction. By measuring the ancilla qubits (collapsing each to a |0> or |1>), you recover the information needed to correct your data qubits. Correcting for both of these kinds of errors at the same time is what has finally been accomplished.
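To make the two error types concrete, here's a toy numpy sketch (my own illustration, not anything from the paper): a qubit is just a 2-vector, a bit flip is the Pauli X matrix, and a phase flip is the Pauli Z matrix.

```python
import numpy as np

# Computational basis states |0> and |1> as 2-vectors.
ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])

# The superpositions (|0> + |1>)/sqrt(2) and (|0> - |1>)/sqrt(2).
plus = (ket0 + ket1) / np.sqrt(2)
minus = (ket0 - ket1) / np.sqrt(2)

# Bit-flip error: Pauli X swaps |0> and |1>.
X = np.array([[0.0, 1.0], [1.0, 0.0]])
# Phase-flip error: Pauli Z flips the sign in front of |1>.
Z = np.array([[1.0, 0.0], [0.0, -1.0]])

print(np.allclose(X @ ket0, ket1))   # True: bit flip turns |0> into |1>
print(np.allclose(Z @ plus, minus))  # True: phase flip turns |0>+|1> into |0>-|1>
```

Note that Z does nothing visible to |0> or |1> alone; the phase flip only matters once you're in a superposition, which is exactly why classical computers never have to worry about it.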

So why is this such a big deal? Well, some Negative Nancys say quantum computers will never work because the error rates are too high. And indeed, the error rates are high, with qubit lifetime/decoherence times on the order of 0.1 milliseconds for this particular style of quantum computing (superconducting transmon qubits, if you wanna get technical). This is exactly why error correction is so important in quantum computing. If you're sitting on some classical information, you can just copy it so that you have a layer of redundancy to protect against errors. For example, that awesome selfie you took last weekend is concurrently on your phone, Instagram, your laptop, and is your Facebook profile picture. So many backups ensure that even if the data making up the file containing your picture is corrupted, you've got a copy stored elsewhere. Thanks to the *no-cloning theorem* of quantum mechanics, we can't take the same approach with our quantum data. Let me repeat that so it sinks in: it is impossible to copy quantum information. As bad as this sounds, we're actually in luck, because other forms of error detection, such as parity checks, do work with quantum computers. A parity check loosely consists of adding up your data bits, seeing whether the result is odd or even, and comparing that against the parity you worked out beforehand. With some sophistication, this lets you correct errors rather than merely detect them, and the quantum version of this is what IBM has just demonstrated.
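Here's the classical version of that idea as a short sketch (my own toy example, not IBM's scheme): encode one bit as three copies, and use pairwise parities — the "syndrome" — to locate a single flipped bit without ever reading the data directly.

```python
# Classical parity-check toy: a 3-bit repetition code.
def encode(bit):
    """Encode one logical bit as three physical copies."""
    return [bit, bit, bit]

def syndrome(codeword):
    """Parities of neighboring pairs: (b0 XOR b1, b1 XOR b2)."""
    return (codeword[0] ^ codeword[1], codeword[1] ^ codeword[2])

def correct(codeword):
    """Use the syndrome to locate and undo a single bit flip."""
    flips = {(1, 0): 0, (1, 1): 1, (0, 1): 2}  # syndrome -> flipped position
    s = syndrome(codeword)
    if s in flips:
        codeword[flips[s]] ^= 1
    return codeword

word = encode(1)      # [1, 1, 1]
word[2] ^= 1          # a bit-flip error sneaks in: [1, 1, 0]
print(correct(word))  # [1, 1, 1] -- error found and fixed
```

The quantum trick is that you can measure these parities via ancilla qubits without measuring (and thus collapsing) the data qubits themselves.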

Quantum Error Correction: a) The tiling of the surface code into X (bit-flip) and Z (phase-flip) checks, and their mapping onto qubits, which interact with their nearest neighbors. b) False-colored device, where the qubits are inside the four square 'pockets' in the center of the image. c) Circuit diagram of the entanglement and error-measurement process. [A.D. Corcoles et al., Nat Comm 6, 6979 (2015)]

One popular architecture for quantum computing, and the one that IBM and Google (pretty much John Martinis and a third of his UCSB research group) are pursuing, utilizes the surface code for quantum error correction. (Microsoft has taken a different tack, instead attempting to make quantum computers out of unicorns.) The surface code consists of a two-dimensional lattice of qubits, where each data qubit is surrounded by two bit-flip-detecting ancilla qubits and two ancillas that detect phase-flip errors (see picture). Error in the data qubits is mapped (by quantum entanglement) onto the ancilla qubits, which are then measured, yielding the corrections that must be made to the data qubits. IBM implemented the simplest such surface code: two data qubits and two ancilla qubits, featured in the futuristic-looking microchip with all the squiggles. The surface code lets you get away with a ballsy 1% error rate in your (intentionally loosely-defined) quantum operations, which is the range IBM and other researchers are currently in. In fact, the Googs recently showed they can do repetitive error correction in one dimension (NYTimes for the layperson), but that only corrects for classical bit-flip errors. Some of the recent press, trying hard for an IBM-vs-Google narrative, stated that IBM's ideas had won out against Google's. That's quite rash, as both companies are generally pursuing the surface code with similar qubits, with IBM starting from a smaller (4-qubit) but two-dimensional lattice, and Google opting for a larger (9-qubit) but one-dimensional one. Other than that, the finer points of how they're different would likely make your eyes glaze over with boredom, so I'll leave it at that.
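To see how one set of checks catches bit flips while the other catches phase flips, here's a stripped-down sketch (again my own toy, not the actual four-qubit IBM circuit): two data qubits in an entangled state, with a ZZ parity standing in for the bit-flip ancilla and an XX parity standing in for the phase-flip ancilla.

```python
import numpy as np

I = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.array([[1.0, 0.0], [0.0, -1.0]])

# Two-qubit parity checks: ZZ flags bit flips, XX flags phase flips.
ZZ = np.kron(Z, Z)
XX = np.kron(X, X)

# A state that passes both checks: the Bell state (|00> + |11>)/sqrt(2).
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)

def parity(check, state):
    """+1 means the check passes; -1 means it flags an error."""
    return round(float(state.conj() @ check @ state))

print(parity(ZZ, psi), parity(XX, psi))  # 1 1: no error detected

bitflip = np.kron(X, I) @ psi            # bit-flip error on qubit 0
print(parity(ZZ, bitflip), parity(XX, bitflip))  # -1 1: ZZ catches it

phaseflip = np.kron(Z, I) @ psi          # phase-flip error on qubit 0
print(parity(ZZ, phaseflip), parity(XX, phaseflip))  # 1 -1: XX catches it
```

Each error trips exactly one of the two checks, which is the whole point of the surface-code plaquette layout: the pattern of tripped ancillas tells you what went wrong and where.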

Amazingly, even with this advance, quantum computers aren't landing under your Christmas tree anytime soon. So what's the next big step? I'm surely in good company in suggesting the next big milestone in quantum computing will arrive with a 13- to 17-qubit device, where error correction may be performed well enough that a quantum state can be maintained indefinitely. And that will be the glorious time when something besides error is the biggest impediment to building a quantum computer.
