Author: Ermakova M.

Google announces a new milestone in quantum computing

The company has overcome one of the biggest hurdles to making quantum computing truly possible.

Google researchers report that they have cleared one of the biggest obstacles on the path to making quantum computing usable in the future. They did so by developing a new system that keeps in check the high error rate inherent in this technology, which arises because quantum bits (qubits) hold their state and phase for only a fleeting moment, and because their physical components are sensitive to the slightest environmental disturbance.

While the technology has so far worked only in theory or in very limited, tightly controlled laboratory settings, Google has found a way to reduce the error rate as the system grows larger, something that until now had been impossible.

In classical computing, everything comes down to ones and zeros. Each bit can only be 0 or 1, true or false, and on that foundation everything from programming languages to video games has been built. However, there is another way to perform calculations, at least on paper: quantum computing, in which the bits are not binary.
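The original text does not spell out what a non-binary bit looks like, so as a brief aside (the notation is standard quantum-mechanics shorthand, not something taken from the article): a qubit is described by a superposition of the two classical values, |ψ⟩ = α|0⟩ + β|1⟩ with |α|² + |β|² = 1, where α and β are complex amplitudes and measuring the qubit yields 0 with probability |α|² and 1 with probability |β|².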

Google claims to have reached a milestone by finding a way to spread the information being processed across multiple qubits, so that a computer can hold on to it long enough to complete a computation even if individual qubits fall out of their quantum state.

In other words, many good qubits are traded for one excellent one: information is encoded across multiple physical qubits to build a single logical qubit that is more robust, more reliable, and capable of running large-scale quantum algorithms.
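The article stays at a conceptual level, so the following sketch (not from the original, and not Google's actual method, which is a quantum surface code) uses a deliberately simplified classical analogy: a repetition code with majority voting. All function names are illustrative. It shows why encoding one logical bit redundantly across several physical bits can suppress errors, as long as each physical bit fails rarely enough.

    import random

    # Simplified classical analogy of quantum error correction:
    # encode one "logical" bit across several "physical" bits,
    # let noise flip some of them, then recover by majority vote.

    def encode(logical_bit, n_physical):
        """Copy one logical bit onto n_physical physical bits."""
        return [logical_bit] * n_physical

    def apply_noise(bits, flip_probability):
        """Flip each physical bit independently with the given probability."""
        return [b ^ 1 if random.random() < flip_probability else b for b in bits]

    def decode(bits):
        """Recover the logical bit by majority vote."""
        return int(sum(bits) > len(bits) / 2)

    def logical_error_rate(n_physical, flip_probability, trials=20_000):
        """Estimate how often the decoded logical bit is wrong."""
        errors = 0
        for _ in range(trials):
            noisy = apply_noise(encode(0, n_physical), flip_probability)
            errors += decode(noisy) != 0
        return errors / trials

    # With a 5% physical error rate, using more physical bits per logical bit
    # drives the logical error rate down (roughly 5% -> 0.7% -> 0.1%).
    for n in (1, 3, 5, 7):
        print(n, logical_error_rate(n, 0.05))

In the quantum setting the difficulty is that qubits cannot simply be copied and read out like this, which is why real codes measure error syndromes instead; but the trade-off described in the quotation that follows (redundancy helps only if each extra qubit does not add more errors than it corrects) is the same.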

"Under the right conditions, the more physical qubits used to build a logic qubit, the better that logic qubit will be. However, this will not work if the added errors of each additional physical qubit outweigh the benefits of QEC. So far, a high level of physical errors has always prevailed," - explained Hartmut Neven, Google's vice president of engineering.

The team, according to their paper in Nature, has succeeded in developing an error-correcting code that, for the first time, lowers the error rate as more qubits are added to a logical qubit.
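For context, and not stated in the article itself: in the quantum error-correction literature this behaviour is usually summarized by an approximate scaling law, ε_d ∝ (p/p_th)^((d+1)/2), where p is the physical error rate, p_th is a threshold set by the code and hardware, d is the code distance (which grows with the number of physical qubits per logical qubit), and ε_d is the logical error rate. As long as p stays below p_th, each increase in d suppresses ε_d further; that is the regime in which adding qubits makes the logical qubit better rather than worse, and the regime the Google team reports having reached.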
