
Google’s Quantum Division Reaches A New Milestone, Reduces Error Rate

Qubits are so sensitive that even stray light can cause calculation errors.

Google’s quantum division has achieved a significant breakthrough: a recent paper demonstrates that increasing the number of qubits in an error-correcting code can reduce the error rate.

The researchers developed a quantum error correction scheme that encodes many physical qubits (the basic information units of quantum systems) into a single logical qubit on a quantum processor, reducing errors.

Using Google’s Sycamore quantum processor, the researchers compared the logical error rates of two sizes of logical qubit—one encoded in 17 physical qubits and one in 49. The larger logical qubit had a lower error rate, about 2.9% per error-correction cycle, compared with about 3.0% for the smaller one.
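To see why growing the code should help, here is a minimal Python sketch of the standard sub-threshold scaling law for surface codes, under which the logical error rate per cycle shrinks geometrically with the code distance. The constants `A` and `Lam` below are hypothetical values chosen so the outputs land near the reported ~3.0% and ~2.9% figures; they are illustrative, not numbers from the paper.

```python
# Minimal sketch of the standard sub-threshold scaling law for surface codes:
# logical error per cycle eps_d = A / Lambda**((d + 1) / 2), where d is the
# code distance and Lambda is the error-suppression factor per distance step.

def logical_error_per_cycle(d: int, A: float, Lam: float) -> float:
    """Expected logical error rate per error-correction cycle at distance d."""
    return A / Lam ** ((d + 1) / 2)

# Hypothetical constants (not from the paper), picked so distance-3 and
# distance-5 land near the reported ~3.0% and ~2.9% per-cycle rates.
A, Lam = 0.0325, 1.04

for d, n_physical in [(3, 17), (5, 49)]:
    eps = logical_error_per_cycle(d, A, Lam)
    print(f"distance-{d} logical qubit ({n_physical} qubits): {eps:.2%} per cycle")
```

On this model, each two-step increase in code distance divides the logical error rate by another factor of Lambda, which is why scaling up only pays off once the suppression factor climbs above 1.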

Qubits can occupy values beyond the traditional states of 0 and 1, whereas the basic information units of classical computers are confined to those two states. To correct errors, a classical chip copies some of its information into redundant ‘error correction’ bits. When an error occurs—say, a stray electron crossing an imperfectly insulating barrier, or a cosmic-ray particle disrupting the circuit—the chip can detect and fix the problem on its own.
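The classical redundancy trick is easy to sketch in code. The toy three-bit repetition code below copies each bit into redundant bits, passes them through a noisy channel, and recovers the original with a majority vote; the function names and flip probability are illustrative, not taken from any real chip.

```python
import random

def encode(bit: int) -> list[int]:
    """Copy the information into redundant 'error correction' bits."""
    return [bit, bit, bit]

def noisy_channel(bits: list[int], p: float) -> list[int]:
    """Flip each bit independently with probability p."""
    return [b ^ (random.random() < p) for b in bits]

def decode(bits: list[int]) -> int:
    """Majority vote: detects and fixes any single bit flip."""
    return int(sum(bits) >= 2)

random.seed(0)
message = [1, 0, 1, 1]
received = [decode(noisy_channel(encode(b), p=0.1)) for b in message]
print(message, "->", received)
```

With three copies, decoding fails only when two or more copies are corrupted, so a per-bit error probability p falls to roughly 3p² after correction.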

In the quantum world, by contrast, a qubit exists in a state that is a mixture of 0 and 1, and it cannot be read without irretrievably destroying that state—so its information cannot simply be copied onto redundant qubits. Qubits are also so sensitive that even stray light can cause calculation errors.
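Quantum error correction works around this by measuring parities of neighbouring qubits—the stabilizers—rather than the data qubits themselves. The classical sketch below, a simplified three-bit analogue of a bit-flip code, shows the idea: the parity pattern (the syndrome) locates a single error without ever revealing the encoded value. This is a simplification of the surface code Google uses, which measures many such checks in every cycle.

```python
# Simplified bit-flip code: read parities of neighbouring pairs (the
# "syndrome"), never the data values themselves, and use the parity
# pattern to locate and undo a single flipped bit.

def syndrome(q: list[int]) -> tuple[int, int]:
    return (q[0] ^ q[1], q[1] ^ q[2])   # pairwise parities, not raw values

def correct(q: list[int]) -> list[int]:
    flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(syndrome(q))
    if flip is not None:
        q = q.copy()
        q[flip] ^= 1                    # undo the located bit flip
    return q

# An error on the middle bit gives syndrome (1, 1) whether the encoded
# word is 000 or 111, so the checks never expose the stored value.
print(correct([0, 1, 0]))   # -> [0, 0, 0]
print(correct([1, 0, 1]))   # -> [1, 1, 1]
```

The same principle carries over to real quantum codes: stabilizer measurements collapse only the error information, leaving the encoded logical state untouched.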


Google also announced that it is working to make quantum hardware, tools and applications available to customers and partners via Google Cloud, so they can apply quantum computing in new ways.

This milestone comes three years after Google published a paper in Nature claiming it was the first to achieve ‘quantum supremacy’—demonstrating that its quantum computer could perform a calculation that would take a classical computer thousands of years to complete.

Google’s Quantum AI project has thus been making significant leaps, slowly bringing quantum computing into the mainstream.



Ayush Jain

Ayush is interested in knowing how technology shapes and defines our culture, and our understanding of the world. He believes in exploring reality at the intersections of technology and art, science, and politics.
