NVIDIA Unleashes Quantum Computing Prowess With a CUDA Q-wist

NVIDIA's Quantum Cloud and CUDA-Q are spearheading the quantum revolution with a hybrid approach.

Quantum computing, once a realm confined to theoretical speculation, is now transitioning into practical reality, thanks to NVIDIA’s pioneering efforts. Through a series of developments announced at GTC 2024, NVIDIA is not just envisioning the future of computing, but actively shaping it.

In Canada and the US, scientists employed LLMs to streamline quantum simulations, aiding in the exploration of molecular structures. “This new quantum algorithm opens the avenue to a new way of combining quantum algorithms with machine learning,” said Alan Aspuru-Guzik, a professor of chemistry and computer science at the University of Toronto, who led the team.

The team was the first to identify a lead candidate using a quantum computer working in tandem with a classical computer. The endeavour employed NVIDIA's CUDA-Q, a hybrid programming model designed for GPUs, CPUs, and the QPUs used by quantum systems. The research team ran its experiments on Eos, NVIDIA's H100 GPU supercomputer.
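CUDA-Q is programmable from C++ and Python. As a rough illustration of the hybrid model, and not the research team's actual workflow, the sketch below defines a simple two-qubit kernel and samples it on NVIDIA's GPU-accelerated simulator; it assumes a standard CUDA-Q installation where the "nvidia" simulator target is available.

```python
import cudaq

# Run on the GPU-accelerated state-vector simulator; this string can be
# swapped for a CPU simulator or a QPU-backed target without touching the kernel.
cudaq.set_target("nvidia")

@cudaq.kernel
def bell():
    # Allocate two qubits and prepare a Bell state.
    qubits = cudaq.qvector(2)
    h(qubits[0])
    x.ctrl(qubits[0], qubits[1])
    mz(qubits)  # measure both qubits

# Sample the kernel; the same call works whether the target is a simulator or a QPU.
counts = cudaq.sample(bell, shots_count=1000)
print(counts)
```

Because the kernel is written once and retargeted by name, the same code path can move between CPU, GPU and quantum hardware, which is the portability CUDA-Q is built around.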

At GTC, Aspuru-Guzik presented the algorithm his team developed, which combines machine learning and quantum computing to simulate chemical systems. The algorithm is now available to researchers and is already being applied in healthcare and chemistry. He added that if researchers keep pairing GPT-like models with such algorithms, a GPT-like model for quantum computing could emerge.

NVIDIA introduced the NVIDIA Quantum Cloud at GTC, aimed at supporting researchers in fields like biopharma and various scientific disciplines in pushing forward quantum computing and algorithmic research. 

According to NVIDIA, the cloud platform lets users develop and experiment with novel quantum algorithms and applications, providing simulators and tools for hybrid quantum-classical programming, and marks a significant advance in accessibility and capability.

Fraud detection and hybrid computing

An interesting client championing NVIDIA's quantum ambitions is HSBC, one of the largest banks in the world. Its researchers developed a quantum machine learning application capable of identifying fraudulent activity in digital payment systems.

Using NVIDIA GPUs, the bank's quantum machine learning algorithm simulated an impressive 165 qubits; research papers typically work with fewer than 40.

Mekena Metcalf, a quantum computing research scientist at HSBC, discussed her findings during a session at GTC. HSBC combined machine learning methods with CUDA-Q and the cuTensorNet software on NVIDIA GPUs to tackle the difficulty of scaling quantum circuit simulations, with the aim of classifying fraudulent transactions in digital payments.
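One way such qubit counts become tractable on GPUs is CUDA-Q's cuTensorNet-backed tensor-network simulator, which avoids the exponential memory cost of state-vector simulation for circuits with suitable structure. The following is a generic sketch, not HSBC's fraud-detection model; the "tensornet" target name and the toy circuit are illustrative assumptions.

```python
import cudaq

# cuTensorNet-backed tensor-network simulator: suited to wide but shallow
# circuits that would not fit in state-vector memory.
cudaq.set_target("tensornet")

@cudaq.kernel
def shallow_circuit(n: int):
    qubits = cudaq.qvector(n)
    h(qubits)  # Hadamard on every qubit in the register
    for i in range(n - 1):
        x.ctrl(qubits[i], qubits[i + 1])  # entangle neighbouring qubits
    mz(qubits)

# 100 qubits is far beyond what a dense state vector (~2^100 amplitudes) could hold.
print(cudaq.sample(shallow_circuit, 100, shots_count=100))
```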

Moreover, at GTC, two recent deployments showcased the expanding landscape for hybrid quantum-classical computing. 

The first, ABCI-Q at Japan’s National Institute of Advanced Industrial Science and Technology, is one of the largest supercomputers solely dedicated to quantum computing research. It leverages CUDA-Q on NVIDIA H100 GPUs to bolster the nation’s endeavours in this field.

Meanwhile, in Denmark, the Novo Nordisk Foundation is spearheading the deployment of an NVIDIA DGX SuperPOD, with a significant portion allocated to quantum computing research, aligning with the country’s strategic plan to advance the technology.

These new systems complement Australia’s Pawsey Supercomputing Research Centre, which recently announced its adoption of CUDA-Q on NVIDIA Grace Hopper Superchips at its National Supercomputing and Quantum Computing Innovation Hub.

Partnerships and Collaborations

At the heart of NVIDIA’s quantum computing journey lies a dedication to research excellence and collaboration. By forging strategic partnerships with leading academic institutions, NVIDIA is cultivating the next generation of quantum scientists and engineers. 

For example, Israeli startup Classiq unveiled a new integration with CUDA-Q at GTC. Classiq's quantum circuit synthesis enables the automatic generation of optimised quantum programs from high-level functional models. This advancement empowers researchers to maximise the efficiency of current quantum hardware and expand the scope of their work towards future algorithms.

Rolls-Royce, the aviation company, also simulated the world's largest circuit for computational fluid dynamics using cuQuantum's multi-node quantum circuit simulation, through a partnership with NVIDIA and Classiq. Another notable example is QC Ware, a software and services provider, which is integrating its Promethium quantum chemistry package with the recently announced NVIDIA Quantum Cloud.

ORCA Computing, headquartered in London and specialising in quantum systems development, showcased results of running quantum machine learning on its photonics processor using CUDA-Q. Additionally, ORCA has been chosen to construct and supply a quantum computing testbed for the UK's National Quantum Computing Centre, which will feature an NVIDIA GPU cluster utilising CUDA-Q.

NVIDIA also partnered with Infleqtion, a leader in quantum technology, to deliver cutting-edge quantum-enabled solutions for Europe's largest cyber-defense exercise through the NVIDIA-enabled Superstaq software.

qBraid, a cloud-based platform for quantum computing, is integrating CUDA-Q into its developer environment. Furthermore, California-based BlueQubit detailed in a blog post how NVIDIA’s quantum technology, utilised in its research and GPU service, facilitates the fastest and most extensive quantum emulations feasible on GPUs.

These are just a few of the developments announced at GTC. As the quantum revolution unfolds, NVIDIA stands as a beacon of progress, leading the charge towards a future where the impossible becomes achievable.

Mohit Pandey

Mohit dives deep into the AI world to bring out information in simple, explainable, and sometimes funny words. He also holds a keen interest in photography, filmmaking, and the gaming industry.