Quantum computing, once a realm confined to theoretical speculation, is now transitioning into practical reality, thanks to NVIDIA’s pioneering efforts. Through a series of developments announced at GTC 2024, NVIDIA is not just envisioning the future of computing, but actively shaping it.
In Canada and the US, scientists employed LLMs to streamline quantum simulations, aiding in the exploration of molecular structures. “This new quantum algorithm opens the avenue to a new way of combining quantum algorithms with machine learning,” said Alan Aspuru-Guzik, a professor of chemistry and computer science at the University of Toronto, who led the team.
The team was the first to discover a drug lead candidate using a combination of quantum and classical computing. The endeavour employed NVIDIA’s CUDA-Q, a hybrid programming model designed for GPUs, CPUs, and the QPUs used by quantum systems. The research team conducted their experiments on Eos, NVIDIA’s H100 GPU supercomputer.
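The core pattern behind hybrid programming models like CUDA-Q is a loop in which a classical optimizer repeatedly tunes the parameters of a quantum circuit that runs on a simulator or QPU. As a library-free sketch of that idea (this is not the CUDA-Q API; the one-qubit circuit, Z observable, and gradient-descent optimizer are illustrative choices), a minimal hybrid loop looks like:

```python
import numpy as np

def ry(theta):
    """Single-qubit Y-rotation gate matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def expectation_z(theta):
    """'Quantum' step: prepare RY(theta)|0> and return <Z>."""
    state = ry(theta) @ np.array([1.0, 0.0])
    probs = np.abs(state) ** 2
    return probs[0] - probs[1]  # <Z> = P(0) - P(1)

# Classical outer loop: gradient descent on theta to minimize <Z>.
theta, lr = 0.5, 0.2
for _ in range(100):
    # Parameter-shift rule gives the exact gradient for an RY gate.
    grad = (expectation_z(theta + np.pi / 2)
            - expectation_z(theta - np.pi / 2)) / 2
    theta -= lr * grad

print(round(expectation_z(theta), 4))  # -> -1.0, the minimum of <Z>
```

In a real CUDA-Q workload the inner `expectation_z` call would dispatch a parameterized kernel to a GPU simulator or quantum processor, while the outer optimization stays on the CPU; only the scale differs, not the structure.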
At GTC, Aspuru-Guzik presented the algorithm he developed, which combines machine learning and quantum computing to simulate chemical systems. The algorithm is now available to researchers and is being applied in healthcare and chemistry. He added that continuing to pair GPT-like models with such algorithms could eventually yield a GPT-like model for quantum computing.
NVIDIA introduced the NVIDIA Quantum Cloud at GTC, aimed at supporting researchers in fields like biopharma and various scientific disciplines in pushing forward quantum computing and algorithmic research.
According to NVIDIA, the cloud platform lets users develop and experiment with novel quantum algorithms and applications, offering simulators and tools for hybrid quantum-classical programming, a significant advance in accessibility and capability.
Fraud detection and hybrid computing
One notable client leveraging and spearheading NVIDIA’s quantum ambitions is HSBC, one of the world’s largest banks. Its researchers developed a quantum machine learning application capable of identifying fraudulent activity in digital payment systems.
Using NVIDIA GPUs, the bank’s quantum machine learning algorithm simulated an impressive 165 qubits. Typically, published research simulates fewer than 40.
Mekena Metcalf, a quantum computing research scientist at HSBC, discussed her findings during a session at GTC. HSBC combined machine learning methods with CUDA-Q and cuTensorNet software on NVIDIA GPUs to tackle the difficulty of scaling quantum circuit simulations, applying the resulting models to classify fraudulent transactions in digital payments.
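The reason 165 qubits is remarkable is memory: a dense statevector needs 2^n complex amplitudes, which is astronomically beyond any computer at n = 165, whereas tensor-network methods like those in cuTensorNet store a circuit as a network of small tensors whose size tracks the entanglement, not the qubit count. A deliberately simplified sketch (an unentangled product state, which is the easiest case; real workloads contract bonded tensors, and this toy is not NVIDIA’s implementation) shows the scaling argument:

```python
import numpy as np

n = 165  # qubits, as in HSBC's simulation

# A dense statevector needs 2**n complex128 amplitudes -- at n = 165
# that is roughly 16 * 2**165 bytes, far beyond any machine on Earth.

# A product-state tensor network instead keeps one 2-vector per qubit:
# memory grows linearly in n rather than exponentially.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
tensors = [np.array([1.0, 0.0]) for _ in range(n)]  # all qubits in |0>
tensors = [H @ t for t in tensors]                  # apply H everywhere

# The amplitude of |00...0> is the product of per-qubit components,
# computed from just 165 tiny tensors instead of 2**165 numbers.
amp = np.prod([t[0] for t in tensors])
print(amp)  # equals (1/sqrt(2))**165, about 2e-25
```

Entangling gates force neighbouring tensors to share "bond" indices, and contracting those larger networks efficiently on GPUs is precisely what cuTensorNet is built for; this sketch only illustrates why the qubit count alone is not the bottleneck.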
Moreover, at GTC, two recent deployments showcased the expanding landscape for hybrid quantum-classical computing.
The first, ABCI-Q at Japan’s National Institute of Advanced Industrial Science and Technology, is one of the largest supercomputers solely dedicated to quantum computing research. It leverages CUDA-Q on NVIDIA H100 GPUs to bolster the nation’s endeavours in this field.
Meanwhile, in Denmark, the Novo Nordisk Foundation is spearheading the deployment of an NVIDIA DGX SuperPOD, with a significant portion allocated to quantum computing research, aligning with the country’s strategic plan to advance the technology.
These new systems complement Australia’s Pawsey Supercomputing Research Centre, which recently announced its adoption of CUDA-Q on NVIDIA Grace Hopper Superchips at its National Supercomputing and Quantum Computing Innovation Hub.
Partnerships and collaborations
At the heart of NVIDIA’s quantum computing journey lies a dedication to research excellence and collaboration. By forging strategic partnerships with leading academic institutions, NVIDIA is cultivating the next generation of quantum scientists and engineers.
For example, Israeli startup Classiq unveiled a new integration with CUDA-Q at GTC. Classiq’s quantum circuit synthesis enables the automatic generation of optimised quantum programs from high-level functional models. This advancement empowers researchers to maximise the efficiency of current quantum hardware and expand the scope of their work towards future algorithms.
Rolls-Royce, the aviation company, also simulated the world’s largest circuit for computational fluid dynamics using cuQuantum multi-node quantum-circuit simulation, through a partnership with NVIDIA and Classiq. Another example is QC Ware, a software and services provider, which is integrating its Promethium quantum chemistry package with the newly announced NVIDIA Quantum Cloud.
ORCA Computing, headquartered in London and specialising in quantum systems development, showcased results of running quantum machine learning on its photonics processor using CUDA-Q. Additionally, ORCA has been chosen to construct and supply a quantum computing testbed for the UK’s National Quantum Computing Centre, which will feature an NVIDIA GPU cluster utilising CUDA-Q.
NVIDIA also partnered with Infleqtion, a leader in quantum technology, to deliver cutting-edge quantum-enabled solutions for Europe’s largest cyber-defense exercise through the NVIDIA-enabled Superstaq software.
qBraid, a cloud-based platform for quantum computing, is integrating CUDA-Q into its developer environment. Furthermore, California-based BlueQubit detailed in a blog post how NVIDIA’s quantum technology, utilised in its research and GPU service, facilitates the fastest and most extensive quantum emulations feasible on GPUs.
These are just a few of the developments announced at GTC. As the quantum revolution unfolds, NVIDIA stands as a beacon of progress, leading the charge towards a future where the impossible becomes achievable.