It was a revolution a long time in the making. We are talking about quantum computing, and Mountain View-headquartered search giant Google's bid to develop a quantum computer that can outperform any supercomputer. "IBM and Google are investing heavily in quantum computing, and it is the next big thing on the horizon," said Dr Prithwis Mukerjee, Program Director for the Business Analytics Programs at Praxis Business School, adding that it is a complicated concept that will have a tremendous impact on Machine Learning, image analysis and pattern recognition.
Canada-based D-Wave Systems has made several breakthroughs in quantum computing that could revolutionize engineering, modeling and simulation, healthcare, financial analysis, logistics, and national defense applications. According to the company, which bills itself as the world's foremost quantum computing firm, machines built from quantum bits, or qubits, can solve industry-scale classification, machine learning and optimization problems across enterprises.
Google is not alone in the quantum race; IBM and Microsoft are in it as well. And when the talk turns to chips, Intel can never be far behind. According to news reports, Intel is tweaking its existing hardware, the silicon transistor, and has a team of quantum hardware engineers collaborating with the Netherlands-based QuTech quantum research institute on silicon qubits.
Quantum computing has the power to advance Artificial Intelligence and Machine Learning, says Dr Mukerjee. According to Phys.org, quantum computing can impact ML through its superior processing power and could prove vital to automated cars and smart factories as well. Researchers point out that quantum computing can dramatically improve the two main branches of machine learning, supervised and unsupervised learning, and can affect reinforcement learning as well.
Why is Microsoft doubling down on quantum computing?
According to recent news reports, Microsoft has doubled its investment in quantum computing and has made four key appointments in its effort to create "a scalable quantum computer". One of the big names is physicist Leo Kouwenhoven, a professor of Applied Physics specializing in Quantum NanoScience, under whose charge Microsoft is building its own lab on the Delft University of Technology campus. According to the company's blog, the other key appointments are Charles Marcus, Matthias Troyer and David Reilly.
What Microsoft hopes to get out of this academic partnership is to "create dependable tools that scientists without a quantum background can use to solve some of the world's most difficult problems." In the process, the Redmond-headquartered computing giant wants to usher in a "quantum economy" that could revolutionize industries such as medicine and materials science.
Quantum computing – the future of computing
Unlike classical computers, which use bits for encoding, quantum computers use qubits, and this makes certain computations dramatically faster. MIT mathematician Peter Shor is the inventor of Shor's factoring algorithm. According to Shor, there are problems you can solve on a quantum computer in far fewer steps than on a classical computer. One of these is factoring: his algorithm factors a number in roughly n² steps on a quantum computer, where n is the number of bits of the number you want to factor. This problem of factoring large numbers will take an exponentially long time on a classical computer.
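To make that contrast concrete, here is a minimal Python sketch (our illustration, not code from Shor) of the most naive classical approach, trial division. It checks divisors up to the square root of N, so its step count grows like 2^(n/2) in the bit-length n, which is the kind of exponential blow-up the quantum algorithm avoids:

```python
def trial_division(N):
    """Classical factoring by trial division.

    Checks candidate divisors up to sqrt(N); since sqrt(N) is about
    2**(n/2) for an n-bit number, the running time is exponential
    in the bit-length n.
    """
    d = 2
    while d * d <= N:
        if N % d == 0:
            return d  # smallest non-trivial factor found
        d += 1
    return N  # no divisor found: N is prime

print(trial_division(35))  # -> 5
print(trial_division(13))  # -> 13 (prime)
```

Better classical algorithms exist, but all known ones still scale super-polynomially, which is why the roughly n²-step quantum bound is such a dramatic improvement.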
So how does a quantum computer work? Hear it from Peter Shor
Well, in the words of Shor, it is essentially a physics experiment. Instead of bits, a quantum computer runs on qubits, two-level quantum systems that can be zero, one, or a superposition of zero and one; they can also be entangled. The way the quantum factoring algorithm works is that you take the number you want to factor and, using number theory, turn the problem into one of finding the period of a sequence. You then use the quantum computer as a computational interferometer. "It gives you a pattern that tells you the spacing of the grating, so we have a periodic pattern and we put the information through a computational interferometer which gives you the period. So, once you have the period you can use number theory on a classical computer to factor the number," Shor explains.
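The classical number-theory step Shor mentions, recovering factors once the period is known, can be sketched in a few lines of Python. This is our own illustrative sketch: the period is found here by slow brute force, standing in for the quantum interferometer, which is the only step a quantum computer actually speeds up:

```python
from math import gcd

def find_period(a, N):
    """Smallest r > 0 with a**r % N == 1.

    Classically this brute-force search is exponentially slow;
    it is the step the quantum 'computational interferometer'
    performs efficiently.
    """
    r, value = 1, a % N
    while value != 1:
        value = (value * a) % N
        r += 1
    return r

def factor_from_period(N, a):
    """Classical post-processing in Shor's algorithm: turn the
    period of a**x mod N into a non-trivial factor of N."""
    g = gcd(a, N)
    if g != 1:
        return g                # lucky: a already shares a factor with N
    r = find_period(a, N)
    if r % 2 != 0:
        return None             # odd period: retry with another a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None             # trivial square root: retry with another a
    return gcd(y - 1, N)

# Period of 7**x mod 15 is 4; 7**2 mod 15 = 4, and gcd(3, 15) = 3.
print(factor_from_period(15, 7))  # -> 3
```

The names `find_period` and `factor_from_period` are our own; the point is that everything outside `find_period` runs comfortably on a classical computer, exactly as Shor describes.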