Quantum computing is an expensive and tricky affair. Even for those with the funds to build a quantum computer, the exponential scaling in the number of quantum variables makes exact solutions difficult to find. The challenge is even greater for quantum chemical equations, whose solutions remain out of reach for modern classical computers. To address this, the AI Quantum team at Google, in their latest work, performed the largest chemical simulation on a quantum computer to date.
About The Experiment
Quantum computers have already established that they are necessary for computations that would take an eternity to solve on traditional machines. However, when it comes to simulating chemical experiments, quantum computers haven't been pushed to the limit.
In a paper titled "Hartree-Fock on a superconducting qubit quantum computer", Google's Quantum AI team used the variational quantum eigensolver (VQE) to simulate chemical mechanisms with quantum algorithms. VQE is a hybrid quantum-classical algorithm and one of the most popular approaches for working around hardware errors. The calculations in this experiment are twice as large as previous chemistry calculations on a quantum computer and contain ten times as many quantum gate operations.
“We validate that algorithms being developed for currently available quantum computers can achieve the precision required for experimental predictions,” said the researchers. They believe that this can reveal pathways towards realistic simulations of quantum chemical systems.
The Hartree-Fock method is a popular approximation for solving the Schrödinger equation, with applications ranging from quantum chemistry to nuclear physics. Mounting this method on a quantum computer is tricky: quantum computers are error-prone, and mitigating those errors is one of the major challenges of the paradigm.
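To make the method concrete, here is a minimal sketch of the classical Hartree-Fock self-consistent field (SCF) loop for a toy two-orbital, two-electron system. The integrals below are made-up illustrative numbers (not a real molecule, and not the paper's hydrogen-chain setup); the structure of the loop — build the Fock matrix from the current density, diagonalise, rebuild the density, repeat — is the standard restricted Hartree-Fock recipe.

```python
import numpy as np

# Toy restricted Hartree-Fock SCF loop for a 2-orbital, 2-electron model.
# All integrals are invented illustrative numbers, not a real molecule.
h = np.array([[-1.0, -0.5],
              [-0.5, -1.0]])          # core (one-electron) Hamiltonian
eri = np.zeros((2, 2, 2, 2))          # two-electron integrals (pq|rs)
eri[0, 0, 0, 0] = eri[1, 1, 1, 1] = 0.8
eri[0, 0, 1, 1] = eri[1, 1, 0, 0] = 0.5
eri[0, 1, 0, 1] = eri[1, 0, 1, 0] = 0.2
n_occ = 1                             # one doubly occupied orbital

P = np.zeros((2, 2))                  # initial density-matrix guess
for iteration in range(50):
    # Fock matrix F = h + 2J - K, built from the current density P
    J = np.einsum('pqrs,rs->pq', eri, P)   # Coulomb term
    K = np.einsum('prqs,rs->pq', eri, P)   # exchange term
    F = h + 2 * J - K
    eps, C = np.linalg.eigh(F)        # orthonormal basis assumed (S = I)
    C_occ = C[:, :n_occ]
    P_new = C_occ @ C_occ.T           # new density from occupied orbitals
    if np.max(np.abs(P_new - P)) < 1e-8:
        break                         # self-consistency reached
    P = P_new

E = np.sum(P * (h + F))               # electronic energy: tr[P(h + F)]
print(round(E, 6))
```

On real hardware the expensive part — preparing the state and measuring the energy — is what gets handed to the quantum processor, while a classical computer keeps driving the outer loop.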
Errors in this context can surface from various sources, such as interactions of the quantum circuitry with its environment or minor temperature fluctuations, any of which can corrupt the qubits.
So the researchers faced a major challenge: reducing these errors with low overhead if they wanted to run chemistry-simulation algorithms on near-term quantum devices.
Like a neural network, VQE can tolerate imperfections through optimisation: it dynamically adjusts the quantum circuit parameters to account for errors that occur during the quantum computation.
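The idea can be sketched in a few lines. The toy example below (a classical simulation, not Google's actual experiment) runs VQE on the simplest possible Hamiltonian, a single-qubit Pauli-Z: a one-parameter circuit Ry(theta) prepares a trial state, and a classical optimiser nudges theta to minimise the measured energy, just as gradient descent tunes a neural network's weights.

```python
import numpy as np

# Minimal VQE sketch: find the ground-state energy of a single-qubit
# Hamiltonian H = Z by tuning one circuit parameter, theta.
Z = np.array([[1.0, 0.0], [0.0, -1.0]])   # Pauli-Z Hamiltonian

def ansatz(theta):
    """Trial state prepared by the gate Ry(theta) acting on |0>."""
    return np.array([np.cos(theta / 2), np.sin(theta / 2)])

def energy(theta):
    """Expectation value <psi(theta)|H|psi(theta)> -- the VQE cost."""
    psi = ansatz(theta)
    return float(psi @ Z @ psi)

# Classical outer loop: simple gradient descent on theta.
theta, lr = 0.3, 0.4
for _ in range(200):
    grad = (energy(theta + 1e-4) - energy(theta - 1e-4)) / 2e-4
    theta -= lr * grad

print(round(energy(theta), 4))   # approaches the exact ground energy, -1
```

Because the optimiser only cares about driving the measured energy down, systematic imperfections in how the circuit is executed get absorbed into the parameter values — which is what makes the variational approach attractive on noisy hardware.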
Where Sycamore Comes Into Play
Google’s Sycamore, which made a lot of noise late last year for demonstrating quantum supremacy, comes to the rescue of this largest quantum chemical experiment. The researchers believe that a quantum-enabled paradigm shift from qualitative/descriptive chemistry simulations to quantitative/predictive chemistry simulations could modernise the field so dramatically that the examples imaginable today are just the tip of the iceberg.
As part of the experiment, Google used the Sycamore quantum processor to simulate the binding energy of large hydrogen chains. Google claims that Sycamore has 54 qubits and consists of over 140 individually tunable elements, each controlled with high-speed analogue electrical pulses. Achieving precise control over the whole device requires fine-tuning more than 2,000 control parameters.
To accurately control the device, they used an automated framework that maps the control problem onto a graph with thousands of nodes. With their high-fidelity quantum processor, traversing these graphs for unknown parameters can be done within a day. The results show that these techniques reduced the errors significantly.
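The graph-based idea can be illustrated with a toy sketch: treat each calibration step as a node whose result depends on steps calibrated earlier, then visit the nodes in an order that respects those dependencies. The node names and dependencies below are invented for illustration — Google's real graph has thousands of nodes — but the traversal pattern is the same.

```python
from graphlib import TopologicalSorter  # Python 3.9+ standard library

# Hypothetical calibration-dependency graph: each key is a calibration
# step, and its value is the set of steps that must be tuned first.
calibration_deps = {
    "readout_frequency": set(),
    "qubit_frequency": {"readout_frequency"},
    "pulse_amplitude": {"qubit_frequency"},
    "single_qubit_gate": {"pulse_amplitude"},
    "two_qubit_gate": {"single_qubit_gate"},
}

# Traverse the graph so every step is calibrated after its prerequisites.
order = list(TopologicalSorter(calibration_deps).static_order())
print(order)
```

Framing calibration this way lets an automated scheduler decide what to tune next, and lets independent branches of the graph be calibrated in parallel.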
These improvements might seem trivial on the surface, but even a small gain in control over a quantum processor translates into orders of magnitude more power to simulate the real world. Think molecular properties, protein synthesis and drug discovery, or running partial differential equations to simulate weather patterns and spot the next catastrophe. The applications are endless.