High-Fidelity Quantum Computing Is Now Possible, Thanks to AI

  • To read electron spin states on quantum dots, SANKEN researchers use machine learning.

Researchers at Osaka University's Institute of Scientific and Industrial Research (SANKEN) have developed a deep neural network that accurately predicts the output state of quantum bits in the presence of external noise. The team's new strategy paves the way for the widespread adoption of quantum computers.

Using machine learning classification, the SANKEN team significantly improved the accuracy of measuring the spin states of electrons on quantum dots, a step towards more robust and practical quantum computing. The researchers used a deep neural network to decipher the signal produced by the spin orientation of electrons on quantum dots. "We created a classifier based on a deep neural network to correctly measure the state of a qubit even when the input signals are noisy," co-author Takafumi Fujita explains.

System Development

The researchers taught the machine learning system to discriminate between these signals and noise. Their deep neural network combines a convolutional neural network, which extracts the essential features of the signal, with a recurrent neural network, which tracks the time-series data.
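The architecture described above can be sketched as a simple forward pass: a 1-D convolution extracts local features from the readout trace, and a recurrent layer summarises them over time before a binary read-out. This is a minimal NumPy illustration with random, untrained weights; the layer sizes and the logistic output head are assumptions for illustration, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d(x, kernels, bias):
    """1-D convolution with ReLU: x (T,), kernels (F, K) -> (T-K+1, F)."""
    F, K = kernels.shape
    windows = np.stack([x[t:t + K] for t in range(len(x) - K + 1)])
    return np.maximum(windows @ kernels.T + bias, 0.0)

def rnn(features, Wx, Wh, b):
    """Plain tanh RNN; returns the final hidden state."""
    h = np.zeros(Wh.shape[0])
    for f in features:
        h = np.tanh(Wx @ f + Wh @ h + b)
    return h

def classify(signal, params):
    """CNN feature extraction, RNN over time, logistic head."""
    feats = conv1d(signal, params["k"], params["kb"])
    h = rnn(feats, params["Wx"], params["Wh"], params["rb"])
    logit = params["w"] @ h + params["wb"]
    return 1.0 / (1.0 + np.exp(-logit))  # P(spin-up), in (0, 1)

F, K, H, T = 4, 5, 8, 50  # arbitrary toy sizes
params = {
    "k": rng.normal(size=(F, K)), "kb": rng.normal(size=F),
    "Wx": rng.normal(size=(H, F)), "Wh": rng.normal(size=(H, H)),
    "rb": rng.normal(size=H),
    "w": rng.normal(size=H), "wb": 0.0,
}
p = classify(rng.normal(size=T), params)
```

In practice the weights would be trained on labelled readout traces; here the forward pass only demonstrates the data flow.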

"Our technique streamlined the process of responding to heavy interference, which varies depending on the situation," senior scientist Akira Oiwa explains. The team first evaluated the classifier's robustness against synthetically generated noise and drift. The system was then trained on actual data from an array of quantum dots, achieving accuracy rates of over 95%. The findings pave the way for high-fidelity measurements of large-scale qubit arrays in future quantum computers.
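A robustness test of this kind starts from synthetic readout traces that mix a clean signal with noise and drift. The step response, Gaussian noise, and linear drift below are assumed functional forms chosen purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

def synthetic_readout(spin_up, T=200, noise=0.3, drift=0.5):
    """Toy charge-sensor trace: a step for spin-up, plus Gaussian
    noise and a slow linear baseline drift (assumed forms)."""
    t = np.arange(T)
    base = np.where(t > T // 2, 1.0, 0.0) if spin_up else np.zeros(T)
    return base + noise * rng.normal(size=T) + drift * t / T

# One noisy trace per spin state, as a classifier would see them
traces = [synthetic_readout(s) for s in (True, False)]
```

Sweeping the `noise` and `drift` parameters over a grid gives a simple stress test of any classifier before it ever sees experimental data.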

Potential Quantum Applications

Quantum Machine Learning (QML), Quantum Simulation (QS), and Quantum-enhanced Optimization (QEO) are three of the most promising applications for near-term devices. Recent progress in these three domains has mostly been driven by a class of hybrid quantum-classical variational algorithms, in which a classical computer assists the quantum computer in searching through a parameterised class of quantum circuits. These parameterised quantum circuits are sometimes called quantum neural networks.
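A hybrid variational loop of this kind can be illustrated with a single simulated qubit: the "quantum" part evaluates an expectation value for a parameterised rotation, and a classical optimiser updates the parameter using the parameter-shift rule. This is a toy sketch of the general scheme, not any specific algorithm from the text:

```python
import numpy as np

def ry(theta):
    """Single-qubit RY rotation matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

Z = np.diag([1.0, -1.0])  # Pauli-Z observable

def expectation(theta):
    """<psi|Z|psi> for |psi> = RY(theta)|0>  (equals cos(theta))."""
    psi = ry(theta) @ np.array([1.0, 0.0])
    return psi @ Z @ psi

# Classical optimiser drives the circuit parameter via the
# parameter-shift gradient: dE/dtheta = (E(t+pi/2) - E(t-pi/2)) / 2.
theta = 0.1
for _ in range(100):
    grad = (expectation(theta + np.pi / 2) - expectation(theta - np.pi / 2)) / 2
    theta -= 0.3 * grad  # gradient descent towards <Z> = -1 at theta = pi
```

On hardware, `expectation` would be estimated from repeated circuit executions; the classical update loop stays exactly the same.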

Certain machine learning tasks can be made faster using improved quantum-enhanced algorithms. The great majority of quantum-enhanced algorithms have been designed for fault-tolerant quantum computing. In the near term, however, quantum devices will operate with significant noise because they will not be error corrected. This raises the question of whether quantum devices with pre-fault-tolerance noise levels are suitable for industrial applications.

Nevertheless, some optimisation and quantum chemistry applications can benefit from a quantum advantage that is partially resilient to noise when using classical-quantum hybrid algorithms such as the Variational Quantum Eigensolver (VQE). For machine learning applications, quantum annealers have been shown to handle certain tasks, but it remains unclear whether a near-term circuit model of quantum computing can do the same.
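The idea behind VQE can be shown on a toy two-level Hamiltonian (chosen here only for illustration): a parameterised ansatz state is scanned by the classical side, and the minimum expected energy approaches the exact ground-state energy.

```python
import numpy as np

# Toy Hamiltonian, an assumed example: H = Z + 0.5 X
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.diag([1.0, -1.0])
H = Z + 0.5 * X

def ansatz(theta):
    """RY(theta)|0> — a one-parameter real-amplitude ansatz."""
    return np.array([np.cos(theta / 2), np.sin(theta / 2)])

def energy(theta):
    """Expected energy <psi(theta)|H|psi(theta)>."""
    psi = ansatz(theta)
    return psi @ H @ psi

# A brute-force parameter scan stands in for the classical optimiser
thetas = np.linspace(0, 2 * np.pi, 1000)
vqe_min = min(energy(t) for t in thetas)
exact = np.linalg.eigvalsh(H).min()  # exact ground-state energy
```

For this small real-valued Hamiltonian the ansatz can reach the true ground state, so `vqe_min` matches `exact` closely; on real problems the ansatz expressivity and the noise in energy estimates are the limiting factors.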

Notable Research Contributions

Variational quantum algorithms have grown in popularity thanks to the Variational Quantum Eigensolver, which triggered a Cambrian explosion of work on near-term algorithms when it was first developed and implemented at the Centre for Quantum Photonics, University of Bristol. (Read here)

See Also

A group of researchers at Microsoft Research in Redmond, Washington, has devised two algorithms for effectively training restricted Boltzmann machines (RBMs), based on amplitude amplification and quantum Gibbs sampling. (Read here)

According to researchers at University College London's Gatsby Computational Neuroscience Unit, their QML algorithm quadratically reduces the number of examples needed to train the RBM, and its scaling in the number of edges improves quadratically on that of contrastive divergence. (Read here)
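Contrastive divergence, the classical baseline these quantum algorithms are compared against, can be sketched for a tiny binary RBM (biases omitted for brevity; the layer sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(W, v0, lr=0.1):
    """One CD-1 weight update for a binary RBM (no bias terms)."""
    ph0 = sigmoid(v0 @ W)                      # P(h=1 | v0)
    h0 = (rng.random(ph0.shape) < ph0) * 1.0   # sample hidden units
    pv1 = sigmoid(h0 @ W.T)                    # one-step reconstruction
    ph1 = sigmoid(pv1 @ W)
    # Positive phase (data) minus negative phase (reconstruction)
    return W + lr * (np.outer(v0, ph0) - np.outer(pv1, ph1))

W = rng.normal(scale=0.1, size=(6, 4))  # 6 visible, 4 hidden units
v = (rng.random(6) < 0.5) * 1.0         # one toy training vector
for _ in range(50):
    W = cd1_step(W, v)
```

The quantum proposals above target the expensive sampling step in this loop; the surrounding gradient update is unchanged.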


As indicated previously, quantum computing can aid in solving scientific problems. However, numerous issues in quantum machine learning remain to be solved on both the hardware and the software side. First, to reap the benefits of the quantum algorithms discussed here, viable quantum hardware is needed. Second, QML requires interface devices that encode classical data in quantum mechanical form; these hardware difficulties are not minor and must be overcome. Third, to fully realise QML techniques, the limitations on the applicability of quantum algorithms must be resolved.


Copyright Analytics India Magazine Pvt Ltd
