Google’s quantum division has achieved a significant breakthrough: a recent paper demonstrates that increasing the number of qubits can reduce errors.
Its researchers developed a quantum error correction scheme that encodes one logical qubit across a large number of physical qubits (the basic unit of quantum information) on a quantum processor in order to reduce errors.
The researchers used Google’s Sycamore quantum chip to compare the logical error rates of two sizes of logical qubit, one built from 17 physical qubits and one from 49. The larger logical qubit had the lower error rate, about 2.9%, compared with roughly 3.0% for the smaller one.
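For context on where the figures 17 and 49 come from: the article does not say so, but these counts are consistent with the standard surface-code layout, in which a distance-d logical qubit uses d² data qubits plus d² − 1 measurement qubits. A minimal sketch, assuming that layout (the function name is ours, not from the paper):

```python
def surface_code_qubits(d: int) -> int:
    # Standard surface-code layout (an assumption, not stated in the
    # article): d*d data qubits plus d*d - 1 measurement qubits.
    return d * d + (d * d - 1)

print(surface_code_qubits(3))  # 17 physical qubits (the smaller logical qubit)
print(surface_code_qubits(5))  # 49 physical qubits (the larger logical qubit)
```

Under this reading, the comparison in the paper is between a distance-3 and a distance-5 logical qubit.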
Qubits can occupy values beyond the traditional states of 0 and 1, whereas the basic information units of classical computers are confined to those states. To correct errors in classical computing, a chip copies some of its information into redundant ‘error correction’ bits. As a result, when an error occurs, whether from stray electrons crossing an imperfectly insulating barrier or a cosmic-ray particle disrupting the circuit, the chip can detect and correct the problem itself.
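The classical redundancy scheme described above can be sketched as a simple repetition code with majority-vote decoding. This is an illustrative sketch only; the function names are hypothetical, not taken from any real chip’s error-correction circuitry:

```python
import random
from collections import Counter

def encode(bit: int, copies: int = 3) -> list[int]:
    # Store the same bit in several redundant positions.
    return [bit] * copies

def apply_noise(bits: list[int], flip_prob: float) -> list[int]:
    # Each stored copy may flip independently (e.g. a stray electron).
    return [b ^ (random.random() < flip_prob) for b in bits]

def decode(bits: list[int]) -> int:
    # Majority vote recovers the original bit as long as
    # fewer than half of the copies were flipped.
    return Counter(bits).most_common(1)[0][0]

random.seed(0)
noisy = apply_noise(encode(1), flip_prob=0.05)
assert decode(noisy) == 1
```

It is exactly this copy-and-vote trick that the no-cloning constraint below rules out for qubits.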
In the quantum world, on the other hand, qubits exist in states that are a mixture of 0 and 1, and a qubit cannot be read without irretrievably losing its quantum state, which is why information cannot simply be copied onto redundant qubits. Qubits are also so sensitive that even stray light can cause calculation errors.
Google also announced that it is working to make quantum hardware, tools and applications available to its customers and partners via Google Cloud, so that they can leverage quantum capabilities in new ways.
This milestone comes three years after Google published a paper in Nature claiming the first ‘quantum advantage’: a demonstration that its quantum computer could perform a calculation that would take a classical computer thousands of years to complete.
Google’s Quantum AI project has thus been making significant leaps, bringing quantum slowly into the mainstream.