High-Fidelity Quantum Computing Is Now Possible, Thanks to AI

To read electron spin states on quantum dots, SANKEN researchers use machine learning.

Researchers at Osaka University’s Institute of Scientific and Industrial Research (SANKEN) have developed a deep neural network that reliably predicts the output state of quantum bits in the presence of external noise. The team’s new strategy paves the way for the widespread adoption of quantum computers.

SANKEN researchers have significantly improved the accuracy of measuring electron spin states on quantum dots using machine-learning classification, which may pave the way for more robust and practical quantum computing. The researchers used a machine learning technique called a deep neural network to decipher the signal produced by the spin orientation of electrons on quantum dots. “We created a classifier based on a deep neural network to correctly measure the state of a qubit even when the input signals are noisy,” co-author Takafumi Fujita explains.

System Development

The researchers trained the machine learning system to discriminate between readout signals and noise. Their deep neural network combined a convolutional neural network, which detects essential signal features, with a recurrent neural network, which tracks the time-series structure of the data.
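
The article does not specify the exact architecture, so the following is only a minimal sketch of such a hybrid CNN+RNN classifier, written in PyTorch (an assumed framework); the trace length, channel counts, and layer sizes are illustrative guesses, not the team’s values:

```python
import torch
import torch.nn as nn

class SpinReadoutClassifier(nn.Module):
    """A 1-D CNN front end extracts local signal features; an LSTM then
    models the time-series structure of the readout trace."""
    def __init__(self, n_classes: int = 2):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
        )
        self.rnn = nn.LSTM(input_size=32, hidden_size=64, batch_first=True)
        self.head = nn.Linear(64, n_classes)

    def forward(self, x):
        # x: (batch, 1, trace_length) single-channel charge-sensor trace
        feats = self.conv(x)            # (batch, 32, trace_length // 4)
        feats = feats.transpose(1, 2)   # (batch, time, features) for the LSTM
        _, (h_n, _) = self.rnn(feats)
        return self.head(h_n[-1])       # logits for spin-up vs. spin-down

model = SpinReadoutClassifier()
logits = model(torch.randn(8, 1, 1024))  # 8 example traces, 1024 samples each
```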

“Our technique streamlined the process of responding to heavy interference, which varies depending on the situation,” senior scientist Akira Oiwa explains. The team first evaluated the classifier’s robustness using simulated noise and drift. The system was then trained on actual data from an array of quantum dots, achieving accuracy rates of over 95%. The findings pave the way for high-fidelity measurements of large-scale qubit arrays in future quantum computers.
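
The article does not describe the team’s noise model, but here is a toy sketch of how such synthetic training traces might be generated, assuming Gaussian noise plus a random linear drift on a two-level step signal:

```python
import numpy as np

def synthetic_trace(length=1024, spin_up=True, noise_sigma=0.3,
                    drift_scale=0.0005, rng=None):
    """Toy readout trace: a spin-up electron produces a step (tunneling
    event), spin-down stays flat; both get Gaussian noise and slow drift."""
    if rng is None:
        rng = np.random.default_rng()
    signal = np.zeros(length)
    if spin_up:
        t_event = rng.integers(length // 8, length // 2)  # random event time
        signal[t_event:] = 1.0                            # sensor level jumps
    drift = drift_scale * rng.standard_normal() * np.arange(length)
    noise = noise_sigma * rng.standard_normal(length)
    return signal + drift + noise

# Labelled training set: 1 = spin-up (step present), 0 = spin-down (flat)
rng = np.random.default_rng(0)
X = np.stack([synthetic_trace(spin_up=(i % 2 == 0), rng=rng) for i in range(1000)])
y = np.array([1 if i % 2 == 0 else 0 for i in range(1000)])
```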

Potential Quantum Applications

Quantum Machine Learning (QML), Quantum Simulation (QS), and Quantum-enhanced Optimisation (QEO) are three of the most promising applications for near-term devices. Recent progress in these three domains has mostly been driven by a class of hybrid quantum-classical variational algorithms. In these methods, a classical computer assists the quantum computer in searching through a parameterised class of quantum circuits. These parameterised quantum circuits are occasionally referred to as quantum neural networks.
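
A minimal sketch of this hybrid loop, using PennyLane as an assumed framework: a small parameterised circuit (the “quantum neural network”) runs on a simulator, and a classical gradient-descent optimiser tunes its rotation angles to minimise an expectation value.

```python
import pennylane as qml
from pennylane import numpy as np

dev = qml.device("default.qubit", wires=2)

@qml.qnode(dev)
def circuit(params):
    # Parameterised circuit: single-qubit rotations plus an entangling gate
    qml.RY(params[0], wires=0)
    qml.RY(params[1], wires=1)
    qml.CNOT(wires=[0, 1])
    qml.RY(params[2], wires=1)
    return qml.expval(qml.PauliZ(0) @ qml.PauliZ(1))  # cost to minimise

# Classical outer loop: gradient descent over the circuit parameters
opt = qml.GradientDescentOptimizer(stepsize=0.2)
params = np.array([0.1, 0.2, 0.3], requires_grad=True)
for _ in range(100):
    params = opt.step(circuit, params)

print("optimised cost:", circuit(params))
```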


Certain machine learning tasks can be made faster using quantum-enhanced algorithms. However, the great majority of quantum-enhanced algorithms have been designed for fault-tolerant quantum computing, whereas near-term quantum devices will not be error corrected and will therefore operate with large levels of noise and uncertainty. This raises the question of whether quantum devices with pre-fault-tolerance noise levels are suitable for industrial applications.

Nevertheless, some optimisation and quantum chemistry applications can benefit from a quantum advantage that is partially resilient to noise when using classical-quantum hybrid algorithms such as the Variational Quantum Eigensolver (VQE). For machine-learning applications, quantum annealers have been shown to handle certain tasks, but it is still unclear whether a near-term, circuit-model quantum computer can do the same.
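
To make the VQE idea concrete, here is a toy, framework-free illustration under strong simplifying assumptions: a single-qubit ansatz and a fixed 2x2 Hamiltonian, with the expectation value computed exactly in NumPy (a real VQE would estimate it from repeated measurements on noisy quantum hardware).

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Toy single-qubit Hamiltonian H = Z + 0.5 * X (an illustrative choice)
Z = np.array([[1.0, 0.0], [0.0, -1.0]])
X = np.array([[0.0, 1.0], [1.0, 0.0]])
H = Z + 0.5 * X

def energy(theta):
    """<psi(theta)|H|psi(theta)> for the ansatz psi = RY(theta)|0>.
    On hardware, this expectation would come from measurement statistics."""
    psi = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    return psi @ H @ psi

res = minimize_scalar(energy)         # the classical outer loop
exact = np.linalg.eigvalsh(H).min()   # exact ground energy, for comparison
print(f"VQE estimate: {res.fun:.6f}  exact: {exact:.6f}")
```

Because this simple ansatz can represent the true ground state of the real-valued toy Hamiltonian, the estimate converges to the exact ground energy.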

Notable Research Contributions

Variational quantum algorithms have grown in popularity thanks to the Variational Quantum Eigensolver, which triggered a Cambrian explosion of work on near-term algorithms after it was developed and implemented at the Centre for Quantum Photonics, University of Bristol.

At Microsoft Research in Redmond, Washington, a group of researchers devised two algorithms for efficiently training restricted Boltzmann machines (RBMs), based on amplitude amplification and quantum Gibbs sampling.

According to researchers at University College London’s Gatsby Computational Neuroscience Unit, QML algorithms quadratically reduce the number of examples needed to train an RBM, and their scaling in the number of edges improves quadratically over that of contrastive divergence.
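
For context, here is a minimal sketch of the classical baseline that both of the above results are measured against: one step of contrastive divergence (CD-1) for a tiny binary RBM in NumPy. The sizes and learning rate are illustrative, and bias terms are omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)
n_visible, n_hidden, lr = 6, 4, 0.1
W = 0.01 * rng.standard_normal((n_visible, n_hidden))  # one weight per edge

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(v0):
    """One CD-1 step: the Gibbs sampling below is the costly part that
    quantum Gibbs sampling and amplitude amplification aim to speed up."""
    global W
    ph0 = sigmoid(v0 @ W)                                # P(h = 1 | v0)
    h0 = (rng.random(n_hidden) < ph0).astype(float)      # sample hidden units
    pv1 = sigmoid(h0 @ W.T)                              # reconstruct visibles
    v1 = (rng.random(n_visible) < pv1).astype(float)
    ph1 = sigmoid(v1 @ W)
    W += lr * (np.outer(v0, ph0) - np.outer(v1, ph1))    # positive - negative phase

for _ in range(200):
    cd1_update(rng.integers(0, 2, n_visible).astype(float))
```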

Conclusion

As indicated previously, quantum computing can aid in the solution of scientific problems. However, numerous issues in quantum machine learning must be solved on both the hardware and software sides. First, to reap the benefits of the quantum algorithms discussed here, quantum hardware must become viable. Second, QML requires interface devices that encode classical data in quantum mechanical form; these hardware difficulties are not minor and must be overcome. Third, to fully realise QML techniques, the limits on where quantum algorithms can be applied must be resolved.

