Technology evolves constantly, and professionals need to stay up to date with the latest tools and trends in their field to meet the needs of their organisations. One technology gaining widespread attention for its potential to shape the future of data processing and handling is quantum computing.
Computing plays a critical role in the entire data science pipeline, from capturing and maintaining data, to processing and analysing it, and ultimately communicating or acting on the insights. Many of the associated computational challenges arise in statistical analysis.
Yazhen Wang, Chair and Professor of the Department of Statistics at the University of Wisconsin, explains in an article that statistical approaches that are mathematically optimal may not always be computationally feasible, while data analysis methods that are computationally efficient may not be statistically optimal.
But data continues to increase in both scale and complexity, and models used in fields such as deep learning are growing more intricate. As a result, Wang says, developing the computational techniques involved in data science, from chips to software to systems, is becoming increasingly challenging.
“As the amount of data available to generate an effective analysis and recommendation is increasing, new models will be required to enable integrating data from an increasing variety of sources,” said Sanjay Pandit, Senior Engineering Director in Unisys India. “And new analytics and mathematical computations are necessary to improve the output speed and quality of recommendations.”
But to make quantum computing a business reality, Pandit says, there must be a paradigm shift in mindset to integrate compute-intensive algorithms for the future. This can be seen from two perspectives. First, from the creators of data models: the expanded capacity and timely, optimised output of quantum computing allow them to use more complex algorithms and expect quicker results than before. A grounding in quantum-relevant mathematical concepts can help align data models with compute processing.
Second, from the consumers of data models, who can consider leveraging data from multiple sources, challenging current algorithms, and striving for faster business outcomes.
But it is not just quantum computing advancing data science; the potential of data science to advance quantum computing is also being explored.
Quantum certification is the process of ensuring that quantum devices perform correctly, using protocols that test and assess their properties. It relies on data science to help calibrate and validate those properties.
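As a toy illustration of the statistical side of this work (a hypothetical setup, not a protocol described in the article), the sketch below estimates a device's readout fidelity from repeated preparation-and-measurement shots and attaches a 95% confidence interval, the kind of calibration task where data science enters:

```python
import math
import random

def estimate_readout_fidelity(n_shots=10_000, true_p=0.97, seed=1):
    """Toy calibration: estimate readout fidelity from repeated shots.

    true_p stands in for the device's unknown fidelity; each simulated
    shot succeeds with that probability. Returns the point estimate and
    a 95% normal-approximation confidence interval.
    """
    rng = random.Random(seed)
    hits = sum(rng.random() < true_p for _ in range(n_shots))
    p_hat = hits / n_shots
    half_width = 1.96 * math.sqrt(p_hat * (1 - p_hat) / n_shots)
    return p_hat, (p_hat - half_width, p_hat + half_width)

p_hat, (lo, hi) = estimate_readout_fidelity()
print(f"fidelity ~ {p_hat:.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```

More shots shrink the interval at the usual 1/sqrt(n) rate, which is precisely a question of statistical efficiency.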
One class of quantum algorithms, quantum walk-based algorithms such as Grover's algorithm, has larger variance than its classical counterparts. Because these algorithms are inherently random, they can be treated as statistical problems, which exposes a tradeoff between statistical and computational efficiency: quantum algorithms gain computational efficiency (faster computation) at the expense of statistical efficiency (larger variance) compared with classical algorithms.
Data science approaches can help understand and optimise this tradeoff, and identify the general resources that achieve computational speedup in quantum algorithms. These resources can be physical materials, digital content, or mathematical elements, and data science can help study how efficiently each contributes to the speedup.
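To make the speedup side of the tradeoff concrete, here is a minimal classical simulation of Grover's search (assuming NumPy is available; the function name is illustrative). On 64 items, roughly pi/4 * sqrt(64), about 6 iterations, concentrate the probability on the marked item, whereas a classical search expects around 32 queries; the result is still a random measurement, which is where the variance enters:

```python
import numpy as np

def grover_search(n_items: int, marked: int) -> np.ndarray:
    """Simulate Grover's algorithm on a classical state vector.

    Returns the probability distribution over items after the
    near-optimal number of Grover iterations, ~ pi/4 * sqrt(N).
    """
    # Start in the uniform superposition over all items.
    amps = np.full(n_items, 1.0 / np.sqrt(n_items))
    iterations = int(np.floor(np.pi / 4 * np.sqrt(n_items)))
    for _ in range(iterations):
        amps[marked] *= -1.0            # oracle: flip the marked amplitude
        amps = 2 * amps.mean() - amps   # diffusion: reflect about the mean
    return amps ** 2                    # measurement probabilities

probs = grover_search(64, marked=17)
print(probs[17])  # the marked item dominates the distribution
```

Sampling from this distribution is what a real device does; repeated runs are needed to estimate the answer reliably, which is exactly the statistical cost that accompanies the quadratic query speedup.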
Quantum is not absolute
According to Barry Sanders, a professor at the University of Calgary, the quantum computers we have now are small and noisy, and whether or not quantum computers are advantageous for data science might only be known through an empirical approach. This approach involves building quantum computers, testing algorithmic performance on such computers, and seeing whether an advantage has been found or not.
But the open question of whether there is an advantage does not seem to deter organisations from betting on quantum as the future of computing. As per a Gartner report, 90% of organisations will partner with consulting companies or full-stack providers to accelerate quantum computing innovation through 2023.
Some early advantages are, however, found in quantum-adjacent computing, which applies knowledge gained from quantum algorithm development to create better algorithms for today's non-quantum computers. Specifically, the early-stage quantum-adjacent advantages are in discrete combinatorial optimisation.
Sanders cites an analysis in an article published earlier this year which identifies that quantum computing is commercially viable for combinatorial optimisation problems, which deal with finding optimal solutions from a finite set of objects. The analysis specifies three areas in which businesses leverage quantum: financial portfolio management, computing lowest-energy configurations with applications to material design and pharmaceuticals, and predicting rare failures in advanced manufacturing.
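As a purely classical sketch of the kind of discrete combinatorial optimisation these applications involve (an illustrative toy, not any vendor's method), the snippet below runs simulated annealing, a classical relative of quantum annealing, on a four-node max-cut instance:

```python
import math
import random

def anneal_maxcut(edges, n_nodes, steps=5000, seed=0):
    """Heuristic max-cut via simulated annealing: partition the nodes so
    that as many edges as possible cross the partition."""
    rng = random.Random(seed)
    side = [rng.choice([0, 1]) for _ in range(n_nodes)]

    def cut_size(assign):
        return sum(1 for u, v in edges if assign[u] != assign[v])

    current = best = cut_size(side)
    best_side = side[:]
    for step in range(steps):
        temp = max(0.01, 1.0 - step / steps)   # linear cooling schedule
        node = rng.randrange(n_nodes)
        side[node] ^= 1                        # move one node across the cut
        candidate = cut_size(side)
        # Always accept non-worsening moves; accept worse ones with
        # Boltzmann probability, letting the search escape local optima.
        if candidate >= current or rng.random() < math.exp((candidate - current) / temp):
            current = candidate
            if current > best:
                best, best_side = current, side[:]
        else:
            side[node] ^= 1                    # undo the rejected move
    return best, best_side

# A 4-cycle: the optimum alternates sides, cutting all 4 edges.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
best, partition = anneal_maxcut(edges, 4)
print(best, partition)
```

Quantum annealers and quantum-inspired solvers target the same Ising/QUBO formulation of such problems, which is why algorithmic advances on either side tend to transfer.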