Google’s Quantum Division Reaches A New Milestone, Reduces Error Rate

Qubits are so sensitive that even stray light can cause calculation errors.

Google’s quantum division has achieved a significant breakthrough, demonstrating in a recent paper that increasing the number of qubits can reduce errors.

The researchers developed a quantum error-correction scheme that encodes many physical qubits (the basic information units of quantum systems) into a single logical qubit on a quantum processor, reducing the error rate.

Using Google’s Sycamore quantum chip, the researchers compared the logical error rates of two sizes of logical qubit: one built from 17 physical qubits and one from 49. The larger logical qubit had a lower per-cycle error rate, about 2.9%, compared with about 3.0% for the smaller one.
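
To put those numbers in perspective, here is a minimal Python sketch (an illustration, not code from the paper) that takes the two reported error rates, derives the error-suppression factor Lambda under the standard surface-code scaling assumption that the error rate shrinks by a constant factor every time the code distance grows by two, and projects what deeper codes might achieve.

```python
# Sketch: project logical error rates from the two reported values, assuming
# the standard surface-code scaling model eps(d) ~ C / Lambda**((d + 1) / 2),
# where d is the code distance. Illustrative only; not the paper's code.

eps_d3 = 0.030  # per-cycle error rate, distance-3 logical qubit (17 physical qubits)
eps_d5 = 0.029  # per-cycle error rate, distance-5 logical qubit (49 physical qubits)

# Lambda is the factor by which the error rate shrinks when d grows by 2.
lam = eps_d3 / eps_d5
print(f"error-suppression factor Lambda ~ {lam:.3f}")

# Project error rates for deeper codes under the same scaling assumption.
for d in (7, 9, 11):
    eps = eps_d5 * lam ** (-(d - 5) / 2)
    n_qubits = 2 * d * d - 1  # qubit count of a distance-d surface code (17 at d=3, 49 at d=5)
    print(f"d={d:2d} (~{n_qubits:3d} physical qubits): projected error rate {eps:.4f}")
```

Because Lambda here is only slightly above 1, each step up in code size buys only a small improvement, which is why the result matters as a proof of principle rather than as a large practical gain.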

Qubits can occupy values beyond the traditional states of 0 and 1, whereas the basic information units of classical computers are confined to those two states. To correct errors in classical computing, the chip copies some of the information into redundant ‘error correction’ bits. When an error occurs (stray electrons crossing an imperfectly insulating barrier, say, or a cosmic-ray particle disrupting the circuit), the chip can detect and correct it on its own.
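
As a toy illustration of that classical mechanism, the following Python sketch (hypothetical, not any real chip’s logic) encodes each bit into three redundant copies and recovers the original by majority vote, surviving any single bit flip.

```python
import random

def encode(bit):
    """Classical repetition code: copy the bit into 3 redundant bits."""
    return [bit] * 3

def decode(bits):
    """Majority vote recovers the bit if at most one copy flipped."""
    return 1 if sum(bits) >= 2 else 0

random.seed(0)
message = [1, 0, 1, 1]
received = []
for bit in message:
    codeword = encode(bit)
    # Simulate noise: flip one randomly chosen copy with 10% probability.
    if random.random() < 0.10:
        i = random.randrange(3)
        codeword[i] ^= 1
    received.append(decode(codeword))

print(message == received)  # True: any single flip is corrected
```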

In the quantum world, by contrast, a qubit exists in a state that mixes 0 and 1, and it cannot be read without irretrievably destroying that quantum state, which is why information cannot simply be copied onto redundant qubits. Qubits are also so sensitive that even stray light can cause calculation errors.
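
Quantum error correction works around this restriction by measuring parities between qubits (the error ‘syndrome’) instead of the qubits themselves. The sketch below is a purely classical toy of a 3-qubit bit-flip code, an assumption-laden stand-in for the surface code Google actually uses; it shows how two parity checks locate a single flipped qubit without ever revealing the encoded value.

```python
def syndrome(q):
    """Parity checks on neighbouring qubits; neither reveals the logical bit."""
    s1 = q[0] ^ q[1]
    s2 = q[1] ^ q[2]
    return (s1, s2)

# Map each syndrome to the qubit (if any) that must have flipped.
LOOKUP = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}

for logical in (0, 1):          # the encoded value stays hidden throughout
    for flipped in (None, 0, 1, 2):
        q = [logical] * 3       # 3-qubit bit-flip encoding of `logical`
        if flipped is not None:
            q[flipped] ^= 1     # inject a single bit-flip error
        fix = LOOKUP[syndrome(q)]
        if fix is not None:
            q[fix] ^= 1         # apply the correction the syndrome indicates
        assert q == [logical] * 3
print("all single bit-flips corrected without reading the data qubits")
```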

Google also announced that it is working to make quantum hardware, tools, and applications available to customers and partners through Google Cloud, so they can leverage quantum capabilities in new ways.

This milestone comes three years after Google first published a paper in Nature claiming ‘quantum supremacy’: a demonstration that its quantum computer could perform a calculation that would take a classical computer thousands of years to complete.

Google’s Quantum AI project has thus been making significant leaps, slowly bringing quantum computing into the mainstream.

Ayush Jain
Ayush is interested in knowing how technology shapes and defines our culture, and our understanding of the world. He believes in exploring reality at the intersections of technology and art, science, and politics.
