IBM Study Charts Future of Superconducting-based Quantum Computing

Posted: October 19, 2022 at 3:48 pm

Few companies have worked as long or as broadly as IBM to develop superconducting-based qubits and quantum computers. Last month IBM posted a perspective paper, "The Future of Quantum Computing with Superconducting Qubits," on arXiv; it has also been accepted, but not yet published, by the Journal of Applied Physics. Not a quick read, the work presents a fairly comprehensive review of what IBM thinks is needed to advance superconducting-based quantum computing, warts and all.

There are, of course, many qubit technologies being explored (superconducting, neutral atoms, trapped ions, photonics, etc.) upon which to base quantum computing platforms. It's not yet clear which, if any, will become dominant or which may emerge as more effective for particular applications. Among the challenges common to all of these qubit technologies are: how to scale up quantum system size (qubit counts); how to develop and deploy effective error correction and error mitigation; and how to build hybrid architectures that leverage both classical and quantum systems.

The IBM authors[i] note, for example, that "For quantum computing to succeed in changing what it means to compute, we need to change the architecture of computing. Quantum computing is not going to replace classical computing but rather become an essential part of it. We see the future of computing being a quantum-centric supercomputer where QPUs, CPUs, and GPUs all work together to accelerate computations."

This latest perspective from IBM presents a deeper dive into many of the specific issues facing superconducting-based quantum computing, many of which are shared by other qubit modalities. For close watchers of the unfolding quantum technology landscape, the study is well worth reading.

Capturing the key points of the paper in sufficient detail in a short article is impractical. With apologies for the length of the excerpt, here is the study's conclusion, which summarizes most of the central ideas:

We have charted how we believe that quantum advantage in some scientifically relevant problems can be achieved in the next few years. This milestone will be reached through (1) focusing on problems that admit a super-polynomial quantum speedup and advancing theory to design algorithms, possibly heuristic, based on intermediate depth circuits that can outperform state-of-the-art classical methods, (2) the use of a suite of error mitigation techniques and improvements in hardware-aware software to maximize the quality of the hardware results and extract useful data from the output of noisy quantum circuits, (3) improvements in hardware to increase the fidelity of QPUs to 99.99% or higher, and (4) modular architecture designs that allow parallelization (with classical communication) of circuit execution. Error mitigation techniques with mathematical performance guarantees, like PEC (probabilistic error cancellation), albeit carrying an exponential classical processing cost, provide a means to quantify both the expected run time and the quality of processors needed for quantum advantage. This is the near-term future of quantum computing.

Progress in the quality and speed of quantum systems will improve the exponential cost of classical processing required for error mitigation schemes, and a combination of error mitigation and error correction will drive a gradual transition toward fault-tolerance. Classical and quantum computations will be tightly integrated, orchestrated, and managed through a serverless environment that allows developers to focus only on code and not infrastructure. This is the mid-term future of quantum computing.

Finally, we have seen how realizing large-scale quantum algorithms with polynomial run times to enable the full range of practical applications requires quantum error correction, and how error correction approaches like the surface code fall short of the long-term needs owing to their inefficiency in implementing non-Clifford gates and poor encoding rate. We outlined a way forward provided by the development of more efficient LDPC codes with a high error threshold, and the need for modular hardware with non-2D topologies to allow the investigation of these codes. This more efficient error correction is the long-term future of quantum computing.
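To make the "exponential classical processing cost" of PEC in point (2), and its link to the fidelity target in point (3), concrete, here is a minimal back-of-the-envelope sketch. It assumes a simple illustrative per-gate quasi-probability norm of (1 + 2e)/(1 - 2e) for gate error rate e, not the noise model IBM actually uses, so the numbers are indicative only:

```python
def pec_sampling_overhead(error_rate: float, num_noisy_gates: int) -> float:
    """Multiplicative increase in shots required by probabilistic error
    cancellation (PEC). Assumes each mitigated gate contributes a
    quasi-probability 1-norm of roughly gamma = (1 + 2e)/(1 - 2e) for
    gate error rate e (an illustrative small-error model, not the
    paper's noise model). Estimator variance, and hence shot count,
    grows as the square of the product of the per-gate norms, i.e.
    exponentially in circuit size.
    """
    gamma = (1 + 2 * error_rate) / (1 - 2 * error_rate)
    return gamma ** (2 * num_noisy_gates)

# Why the paper's 99.99% fidelity target matters: mitigating a
# 100-gate circuit costs roughly 2.2x extra shots at 99.9% gate
# fidelity, but only about 1.08x at 99.99%.
for fidelity in (0.999, 0.9999):
    print(fidelity, round(pec_sampling_overhead(1 - fidelity, 100), 2))
```

Under this toy model, each factor of ten in gate error rate buys an exponential reduction in mitigation overhead, which is why the authors pair error mitigation with the 99.99% fidelity goal.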
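Similarly, the "poor encoding rate" the authors attribute to the surface code can be quantified. A rotated distance-d surface code patch encodes one logical qubit in d^2 data qubits plus d^2 - 1 syndrome qubits, so the rate k/n shrinks quadratically with distance. A sketch using these standard counts (not figures from the paper):

```python
def surface_code_rate(distance: int) -> float:
    """Encoding rate k/n of a rotated surface code patch: one logical
    qubit (k = 1) per distance**2 data qubits plus distance**2 - 1
    syndrome-measurement qubits."""
    n_physical = 2 * distance ** 2 - 1
    return 1.0 / n_physical

# The rate vanishes as the distance (and thus error suppression) grows:
# d = 5  -> ~1/49, d = 15 -> ~1/449, d = 25 -> ~1/1249 physical qubits
# per logical qubit. A "good" LDPC code instead keeps k/n bounded away
# from zero as the block length grows, which is the efficiency gain the
# authors point to.
for d in (5, 15, 25):
    print(d, f"{surface_code_rate(d):.5f}")
```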

The full paper contains a fair amount of detail on key topics and is best read in full.

Link to the pre-print of the IBM paper, "The Future of Quantum Computing with Superconducting Qubits": https://arxiv.org/abs/2209.06841

[i] Sergey Bravyi, Oliver Dial, Jay M. Gambetta, Darío Gil, and Zaira Nazario; IBM Quantum, IBM T.J. Watson Research Center, Yorktown Heights, NY 10598, USA
