Qubits Can Be as Safe as Bits, Researchers Show

Over the centuries, we’ve learned to put information into increasingly durable and useful forms, from stone tablets to paper to digital media. Beginning in the 1980s, researchers began theorizing about how to store information inside a quantum computer, where it is subject to all sorts of atomic-scale errors. By the 1990s they had found a few methods, but these methods fell short of their rivals from classical (regular) computers, which provided an incredible combination of reliability and efficiency.

Now, in a preprint posted on November 5, Pavel Panteleev and Gleb Kalachev of Moscow State University have shown that, at least in theory, quantum information can be protected from errors just as well as classical information can. They did it by combining two exceptionally compatible classical methods and inventing new techniques to prove their properties.

“It’s a huge achievement by Pavel and Gleb,” said Jens Eberhardt of the University of Wuppertal in Germany.

Today, quantum computers can use only around 100 qubits, the quantum equivalent of classical bits. They will need thousands or millions more to become truly useful. The new method for quantum data maintains constant performance as the number of qubits scales up, so it could help keep the size and complexity of future quantum computers to a minimum.

The authors also showed how their quantum method could play a long-sought role in making classical information testable for errors, at the same time that another group discovered the same capability in a classical method. “It’s amazing how a problem that was open for 30 years was solved practically at the same time by two different groups,” said Alex Lubotzky of the Weizmann Institute of Science in Israel.

Ultimately, we can never protect information perfectly from all errors. We know that we can mathematically represent classical information, such as a word or a number, as a sequence of binary digits, or bits: 1s and 0s. But when we actually build these bits, in the form of electrical circuits, we find that unwanted electrical interactions, often simply called noise, cause random bits to flip to the wrong value.

In the 1940s and ’50s, Claude Shannon and Richard Hamming figured out the first solutions, discovering methods to detect and correct errors before computation began.

Hamming’s method was particularly practical. Starting with an initial sequence of bits that represented raw data (for example, the sequence 110101 might represent the number 53), he added new bits to the sequence that acted like receipts, specifying how some of the initial bits should sum up. (For example, 110101 could have a digit appended to it, 0, which tells us the sum of all the other bits has an even value.) By checking the data bits against the receipt bits (in this example, making sure they really do sum to an even value), errors could be routinely detected, located and corrected.
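To make the receipt idea concrete, here is a minimal sketch in Python (our own illustration, not code from any of the papers) of a single parity-bit receipt: the encoder appends one bit so that the whole sequence sums to an even value, and the check fails whenever noise flips a bit.

```python
def encode(data_bits):
    """Append one 'receipt' bit so the whole sequence sums to an even value."""
    parity = sum(data_bits) % 2
    return data_bits + [parity]

def has_error(received_bits):
    """A single flipped bit makes the sum odd, so the check fails."""
    return sum(received_bits) % 2 != 0

# Usage: 110101 represents the number 53; its bits sum to 4 (even),
# so the appended receipt bit is 0.
codeword = encode([1, 1, 0, 1, 0, 1])   # -> [1, 1, 0, 1, 0, 1, 0]
assert not has_error(codeword)

corrupted = codeword.copy()
corrupted[2] ^= 1                        # noise flips one bit
assert has_error(corrupted)              # the receipt exposes the error
```

A single receipt can only detect that something went wrong; in Hamming’s full scheme, several overlapping receipts are added, and the pattern of failed checks also pinpoints which bit flipped so it can be corrected.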

These methods for correcting errors are now called error-correcting codes, or simply codes. A code forges a reedy sequence of bits into an iron chain that can be repaired, at the cost of being longer and less efficient.

But for quantum computers, creating a code proved harder. Instead of bits, quantum computers use qubits that are partly 0 and partly 1, all at once. These are susceptible to two kinds of errors, which either collapse them to a single value (either 0 or 1) or throw off the balance between them. Each kind of error can also interact with the other, making the protection of qubits much more complicated.

“These qubits are bad. They are really, really noisy,” said Panteleev.

In 1995, Peter Shor showed that the problem was, surprisingly, simpler than it seemed. He created a quantum code from a clever combination of two classical codes, one for each type of error. In other words, he forged the quantum ore of qubits into a strong chain as well. But this first quantum code was inefficient, requiring many receipt qubits for an initial sequence.

The technology of classical codes was much more advanced by comparison, with three specific properties known to be obtainable. A code that had all three of them was called simply “good.” First, it would be able to correct many errors (making the chain strong). Second, it would require few receipt bits to be added (making the chain light and efficient). Third, the strength and efficiency of the chain would remain constant, no matter how long a sequence of bits you started with. With that last property, called constant scaling, Shannon showed that you could always improve the ability to suppress errors by simply increasing the chain length. This remarkable finding was later reproduced in a quantum context.

After Shor’s work, researchers sought to create quantum codes with the same properties. And they succeeded, at least for those three. But there was an additional, fourth property of a code that they needed and could not obtain in addition to the other three. Known as low-density parity check (LDPC), it states that each receipt should only sum up a small number of bits (or qubits).
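For a sense of what the LDPC property means in practice, here is a toy sketch (our own illustrative example, not the code from the preprint): the receipts of a code can be written as rows of a parity-check matrix, with a 1 marking each bit that a receipt sums up. The LDPC condition simply requires every row to stay sparse, no matter how long the code becomes.

```python
import numpy as np

# Toy parity-check matrix: each row is one receipt, and a 1 marks a bit it sums up.
# The LDPC property: every row contains only a few 1s, even for very long codes.
H = np.array([
    [1, 1, 0, 1, 0, 0],   # receipt 0 checks bits 0, 1 and 3
    [0, 1, 1, 0, 1, 0],   # receipt 1 checks bits 1, 2 and 4
    [1, 0, 0, 0, 1, 1],   # receipt 2 checks bits 0, 4 and 5
])

def syndrome(bits):
    """Evaluate every receipt; a nonzero entry means that check failed."""
    return H @ bits % 2

codeword = np.array([0, 0, 0, 0, 0, 0])   # the all-zeros word satisfies every receipt
print(syndrome(codeword))                  # [0 0 0]

noisy = codeword.copy()
noisy[1] = 1                               # flip one bit
print(syndrome(noisy))                     # [1 1 0]: receipts 0 and 1 both fail
```

Because each receipt touches only a few bits, every check stays cheap to perform even as the code grows longer.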

“It’s a neat thing to have for classical codes,” said Nikolas Breuckmann of University College London. “But it’s absolutely indispensable for quantum codes.”

Unfortunately, Shor’s initial approach of combining classical codes broke down when trying to create a good quantum LDPC code. For mathematical reasons, good classical LDPC codes were incompatible and could not be combined in an optimal way. For over 20 years, no one could figure out how to obtain a quantum code that simultaneously possessed the LDPC property and constant scaling: As quantum LDPC codes increased in length, their strength degraded.

Then in 2020, a series of different researchers, including Panteleev and Kalachev, figured out radically new approaches to combining classical codes to make a quantum code. The quantum chains they forged still became weaker with increasing length, but not as rapidly as the codes that had come before. Breuckmann and Eberhardt even created a quantum code that they conjectured would possess constant scaling, but they were unable to prove it.

In 2021, Panteleev and Kalachev built on the surge of work to create a new quantum code, which they could prove possessed the elusive combination of all four properties. The distinguishing feature of the classical codes that combined to make their quantum code is their symmetry.

The symmetry of a code can be understood by conceiving of it as a graph, a collection of edges (lines) connected by vertices (dots), which is a common perspective in the mathematics of codes. Bits of information are represented as edges of the graph, and receipts are represented as vertices, which sum up all the edges (bits) that touch them. From this perspective, a code with a circular graph can be said to have rotational symmetry, for example. Remarkably, the geometric properties of a graph can be identified with properties of its code. For example, the length of the shortest path around a torus (a doughnut-shaped surface) can be identified with the corresponding code’s strength (the number of errors it can correct).
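As a small illustration of the graph picture (a toy example of our own, not the authors’ construction), take a circular graph with four vertices: the four edges carry the bits, and each vertex is a receipt summing its two incident edges. The only valid codewords are the ones where all edges agree, and rotating the circle maps codewords to codewords, which is the rotational symmetry described above.

```python
import numpy as np

# A circular graph with 4 vertices and 4 edges; edges carry bits, vertices are receipts.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
num_vertices = 4

H = np.zeros((num_vertices, len(edges)), dtype=int)
for e, (u, v) in enumerate(edges):
    H[u, e] = 1   # vertex u checks edge e
    H[v, e] = 1   # vertex v checks edge e

def is_codeword(bits):
    """A bit assignment is valid when every vertex receipt sums to an even value."""
    return not np.any(H @ bits % 2)

print(is_codeword(np.array([0, 0, 0, 0])))   # True: all edges agree
print(is_codeword(np.array([1, 1, 1, 1])))   # True: all edges agree
print(is_codeword(np.array([1, 0, 0, 0])))   # False: the receipts at vertices 0 and 1 fail
```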

Panteleev and Kalachev’s quantum code is analogous to a combination, or product, of graphs, each with exceptional symmetry. The quantum code is therefore itself highly symmetric, like a torus produced from two circles. By twisting the torus in various ways, the lengths on its surface can be continually increased as the number of qubits in the graph becomes larger. Ultimately, this provides constant scaling, in addition to the other three properties.

The result means that quantum codes now match classical codes in their combination of properties. It also provides a means to make quantum computers more efficient, since their ability to correct errors can now (theoretically) remain constant as they are made larger.

“It brings the theoretical quality of these quantum codes to the point that has existed in classical coding for a long time,” said Naomi Nickerson of the quantum computing company PsiQuantum.

In the course of achieving their result, Panteleev and Kalachev also became aware that their quantum code could be interpreted as a classical code with a special property. If the data encoded by their method is filled with a large proportion of errors, then checks of almost any receipt will reveal them. This property is called local testability, and along with the code’s strength and efficiency, it has constant scaling across all three properties, making for a new type of code that had also long evaded researchers.

Original content at: www.quantamagazine.org…
Author: Mordechai Rorvig
