ISC 2024: A Few Quantum Gems and Slides from a Packed QC Agenda

Posted: May 29, 2024 at 2:06 am

If you were looking for quantum computing content, ISC 2024 was a good place to be; there were around 20 quantum computing-related sessions last week. QC even earned a slide in Kathy Yelick's opening keynote, Beyond Exascale. Many of the quantum sessions (and, of course, others) were video-recorded, and ISC has now made them freely accessible.

Not all were recorded, though. One example was what sounded like a tantalizing BOF panel, Toward Hardware-Agnostic Standards in Hybrid HPC/Quantum Computing, featuring Bill Gropp (NCSA, University of Illinois), Philippe Deniel (Commissariat à l'Energie Atomique (CEA)), Mitsuhisa Sato (RIKEN), Travis Humble (ORNL), Venkatesh Kannan (Ireland's High Performance Centre), and Kristel Michielsen (Jülich Supercomputing Centre). Was sorry to miss that.

Regardless, there's a wealth of material online, and it's worth looking through the ISC 2024 inventory for subjects, speakers, and companies of interest (registration may be required). Compiled below are a few QC soundbites from ISC.

Yelick, vice chancellor for research at the University of California, Berkeley, covered a lot of ground in her keynote, examining the tensions and opportunities emerging from the clash of traditional FP64 HPC and mixed-precision AI, and how the commercial supply line of advanced chips is changing. Quantum computing earned a much smaller slice.

"I really just have this one slide about quantum. There's been some really exciting progress, if you have been following this, [in] things like error correction over the last year, with really significant improvements in terms of the ability to build error-corrected quantum systems. On the other hand, I would say we don't yet have an integrated circuit kind of transistor model yet, right. We've got a bunch of transistors, [i.e.] we've got a whole bunch of different kinds of qubits that you can build, [and] there's still some debate [over them]," said Yelick.

"In fact, one of the latest big error correction results was actually not for the superconducting qubits, which is what a lot of the early startups were in, but for the AMO (atomic, molecular, optical) physics [approach]. So this is really looking at the fact that we're not yet at a place where we can rely on this for the next generation of computing, which is not to say that we should be ignoring it. I'm really interested to see how [quantum computing evolves and] also thinking about how much classical computing we're going to need with quantum, because that's also going to be a big challenge with quantum. [It's] very exciting, but it's not replacing the general-purpose kind of computing that we do for science and engineering."

Not sure if that's a glass half-full or half-empty perspective. Actually, many of the remaining sessions tackled the questions she posed, including the best way to implement hybrid HPC-quantum systems, error correction and error mitigation, and the jostling among competing qubit types.

It was easy to sympathize (sort of) with speakers presenting at the Quantum Computing Status of Technologies session, moderated by Valeria Bartsch of Fraunhofer CFL. The speakers came from companies developing different qubit modalities and, naturally, at least a small portion of their brief talks touted their companies' technologies.

She asked, "Here's another [submitted question]: What is the most promising quantum computing technology that your company is not developing yourself? I love that one. And everybody has to answer it now. You can think for a few seconds."

Very broadly speaking, neutral atom, trapped ion, and superconducting are perhaps the most advanced qubit modalities currently, and each speaker presented a bit of background on their company's technology and progress. Trapped ions boast long coherence times but somewhat slower switching speeds. Superconducting qubits are fast, and perhaps easier to scale, but error-prone. Neutral atoms also have long coherence times but have so far been used mostly for analog computing, though efforts are moving quickly to implement gate-based computing. To Hayes' point, Majorana (topological) qubits would be inherently resistant to error.

Not officially part of the ISC program, Hyperion delivered its mid-year HPC market update online just before the conference. The full HPCwire coverage is here, and Hyperion said it planned to make its recorded presentation and slides available on its website. Chief Quantum Analyst Bob Sorensen provided a brief QC snapshot during the update, predicting the worldwide QC market will surpass $1 billion in 2025.

Sorensen noted, "So this is a quick chart (above) that just shows the combination of the last four estimates that we made. You can see, starting in 2019, all the way up to this 2023 estimate that reaches that $1.5 billion in 2026 I talked about earlier. Now my concern here is always [that] it's dangerous to project out too far. So we do tend to limit the forecast to these kinds of short ranges, simply because a nascent sector like quantum, which has so much potential but at the same time has some significant technical hurdles to overcome, means that there can be an inflection point, most likely though in the upward direction."

He also pointed out that a new use case, a breakthrough in modality or algorithms, or any kind of significant driver that brings more interest in and performance to quantum can significantly change the trajectory here on the upside.

Sorensen said, "Just to give you a sense of how these vendors that we spoke to looked at algorithms, we see the big three are still the big three in mod-sim, optimization, and AI, with some interest in cybersecurity aspects, post-quantum encryption kinds of research, as well as Monte Carlo processes taking advantage of quantum's ability to generate provable random numbers to support the Monte Carlo processing."

"Interesting here is that we're seeing a lot more 'other' (17%). This is the first time we've seen that. We think it is [not so much] about new algorithms, but perhaps hybrid mod-sim [and] optimization, or machine learning that feeds into the optimization process. So we think we're seeing more hybrid applications emerging as people take a look at the algorithms and decide what solves the use case that they have in hand," he said.
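
For readers curious about that last use case, the snippet below shows the basic pattern: qubit measurements supply the random bits, which then drive an otherwise ordinary Monte Carlo estimate. This is a minimal sketch, not anything Hyperion presented, and it assumes Qiskit with its Aer simulator standing in for a real QPU.

    # Minimal sketch: quantum random bits feeding a Monte Carlo estimate of pi.
    # Assumes Qiskit + qiskit-aer; a real deployment would target QPU hardware.
    from qiskit import QuantumCircuit
    from qiskit_aer import AerSimulator

    _SIM = AerSimulator()

    def quantum_random_bits(n_bits: int) -> list[int]:
        """Sample n_bits uniform bits by measuring a qubit in superposition."""
        qc = QuantumCircuit(1, 1)
        qc.h(0)            # Hadamard puts the qubit in an equal superposition
        qc.measure(0, 0)   # measurement collapses it to 0 or 1 with p = 0.5
        result = _SIM.run(qc, shots=n_bits, memory=True).result()
        return [int(b) for b in result.get_memory()]

    def bits_to_unit_float(bits: list[int]) -> float:
        """Interpret a bit list as a binary fraction in [0, 1)."""
        return sum(b / 2 ** (i + 1) for i, b in enumerate(bits))

    # Monte Carlo estimate of pi from quantum-sourced coordinates.
    n_samples, hits, precision = 500, 0, 16
    for _ in range(n_samples):
        x = bits_to_unit_float(quantum_random_bits(precision))
        y = bits_to_unit_float(quantum_random_bits(precision))
        hits += (x * x + y * y) <= 1.0
    print(f"pi estimate: {4 * hits / n_samples:.3f}")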

Satoshi Matsuoka, director of the RIKEN Center for Computational Science, provided a quick overview of Fugaku plans for incorporating quantum computing, as well as touching on the status of the ABCI-Q project. He, of course, has been instrumental in both systems. Both efforts emphasize creating a hybrid HPC-AI-quantum infrastructure.

The ABCI-Q infrastructure (slide below) will comprise a variety of quantum-inspired and actual quantum hardware; Fujitsu will supply the former systems. Currently, quantum computers based on neutral atoms, superconducting qubits, and photonics are planned. Matsuoka noted this is well-funded, at a few hundred million dollars, with much of the work geared toward industry.

Rollout of the integrated quantum-HPC hybrid infrastructure at Fugaku is aimed at the 2024/25 timeframe. It's also an ambitious effort.

About the Fugaku effort, Matsuoka said, "[This] project is funded by a different ministry, in which we have several real quantum computers: IBM's Heron (superconducting QPU), a Quantinuum [machine] (trapped-ion qubits), and quantum simulators. So real quantum computers and simulators to be coupled with Fugaku."

"The objective of the project [is to] come up with a comprehensive software stack, such that when the real quantum computers that are more useful come online, we can move the entire infrastructure along with any of those quantum computers, along with their successors, to be deployed to solve real problems. This will be one of the largest hybrid supercomputers."
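
For a sense of what such a stack has to orchestrate, consider the canonical hybrid pattern: a classical optimizer running on the HPC side repeatedly dispatches parameterized circuits to a quantum backend and folds the measured results into the next iteration. The sketch below is a generic illustration of that division of labor, not RIKEN's actual software; submit_to_qpu is a hypothetical placeholder, faked classically here so the loop runs.

    # Generic sketch of a hybrid HPC-quantum variational loop.
    # submit_to_qpu() is a hypothetical stand-in for a real QPU/scheduler API;
    # it is faked with a classical cost function so the example executes.
    import math
    import random

    def submit_to_qpu(params):
        """Placeholder: dispatch a parameterized circuit to a QPU and return
        the measured expectation value (faked classically for illustration)."""
        return sum(math.sin(p) for p in params)  # stand-in "energy"

    def optimizer_step(params, cost, step=0.1):
        """HPC side: propose new parameters (random search, for brevity)."""
        candidate = [p + random.uniform(-step, step) for p in params]
        return candidate if submit_to_qpu(candidate) < cost else params

    params = [random.uniform(0, 2 * math.pi) for _ in range(4)]
    for _ in range(200):  # classical outer loop on the HPC side
        params = optimizer_step(params, submit_to_qpu(params))
    print("final cost:", round(submit_to_qpu(params), 3))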

The aggressive quantum-HPC integration sounds a lot like what's going on in Europe. (See HPCwire coverage, Europe's Race towards Quantum-HPC Integration and Quantum Advantage.)

The topic of benchmarking also came up during Q&A at one session. A single metric along the lines of the Top500 is generally not preferred. But what then, even now during the so-called NISQ (noisy intermediate-scale quantum) computing era?

One questioner said, "Let's say interesting algorithms and problems. Is there anything like, and I'm not talking about a Top500 list for quantum computers, like an algorithm where we can compare systems? For example, Shor's algorithm. So who did it, and what is the best performance, or the largest numbers you were able to factorize?"

Hayes (Quantinuum) said, "So we haven't attempted to run Shor's algorithm, and interesting implementations of Shor's algorithm are going to require fault tolerance to factor a number that a classical computer can't. But you know, that doesn't mean it can't be a nice benchmark to see which company can factor the largest one. I did show some data on the quantum Fourier transform. That's a primitive in Shor's algorithm. I would say that that'd be a great candidate for benchmarking the progress in fault tolerance."

"More interesting benchmarks for the NISQ era are things like quantum volume, and there are some other ones that can be standardized, and you can make fair comparisons. So we try to do that. You know, they're not widely or universally adopted, but there are organizations out there trying to standardize them. It's difficult getting everybody marching in the same direction."
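
For context, the quantum Fourier transform Hayes mentions is a compact, well-defined circuit, which is part of its appeal as a benchmarking primitive. Below is a minimal textbook construction (written in Qiskit as an assumption; this is not how Quantinuum produced its data): Hadamards, controlled phase rotations, and a final qubit-order reversal.

    # Textbook n-qubit quantum Fourier transform, the Shor's-algorithm
    # primitive Hayes cites as a benchmark candidate. Qiskit, for illustration.
    import math
    from qiskit import QuantumCircuit

    def qft(n: int) -> QuantumCircuit:
        qc = QuantumCircuit(n, name="QFT")
        for target in range(n - 1, -1, -1):
            qc.h(target)  # Hadamard on the current qubit
            for control in range(target):  # controlled phase rotations
                qc.cp(math.pi / 2 ** (target - control), control, target)
        for q in range(n // 2):  # reverse qubit order (standard convention)
            qc.swap(q, n - 1 - q)
        return qc

    print(qft(4).draw())  # inspect the circuit; gate count grows as O(n^2)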

Corcoles (IBM) added, "I think benchmarking in quantum has an entire community around it, and they have been working on it for more than a decade. I read your question as focusing on application-oriented benchmarks versus system-oriented benchmarks. There are layers of subtlety there as well. If we think about Shor's algorithm, for example, there were recent works last year suggesting there's more than one way to run Shor's. Depending on the architecture, you might choose one or another way."

"An architecture that is faster might choose to run many circuits in parallel that can capture Shor's algorithm and then do [some post-]processing, or an architecture that might take more time [might] just want to run one single circuit [and] with high probability measure the right [answer]. You could compare run times, but there are probably going to be differences that add to the uncertainty of what technology you will use, meaning that there might be a regime of factoring where you might want to choose one aspect or another, but then [it depends on] your particular physical implementation," he said.

Macri (QuEra) said, "My point is we're not yet at the point where we can really [compare systems]. You know, we don't want to compete directly with our technologies. I would say that, especially in what concerns applications, we need to adopt a collaborative approach. So, for example, there are certain areas where these benchmarks that you mentioned are not really applicable. One of them is quantum simulation, and we have seen really a lot of fantastic results from our technology, as well as from ion traps and superconducting qubits."

"It doesn't really make sense to compare the basic features of the technologies so that, you know, we can a priori identify what is the specific application [or] the result that you want to achieve. I would say let's focus on advancing the technology. We already know that there are certain types of devices that outperform others for specific applications, and we will decide these perhaps at a later stage. But I agree [benchmarking makes sense] for very complex tasks, such as the quantum Fourier transform, or perhaps Shor's algorithm. But I think, to be honest, it's still too preliminary [for effective system comparisons]."

As noted, this was a break-out year for quantum at ISC, which has long had quantum sessions, just not as many. Europe's aggressive funding, procurements, and HPC-quantum integration efforts make it clear it does not intend to be left behind in the quantum computing land rush, with, hopefully, a gold rush to follow.

Stay tuned.
