QC Investment Today
Quantum Computing (QC) proof of concept (POC) projects are growing in Q4 2021, with commercialization pilots expected by 2025 and broader adoption before 2030. Accelerated digital transformation and digital reshaping from the pandemic are driving investments and early IPOs (e.g., the Q1 announcement by IonQ). In my daily pro bono engagements with global communities (across governments, industry, computing and research organizations, NGOs, UN agencies, innovation hubs, think tanks) of more than 60K CEOs, 30K investors, and 10K innovation leaders, I'm finding nearly 50% are planning QC pilots within five years. There's an understanding that the exponential lead provided by a breakthrough in QC warrants the early investment and learnings now, since practical adoption will take years.
As a measure of progress and to stimulate collaboration and sharing in QC, the non-profit IEEE held its first Quantum Week in October 2020 and is holding its second conference, IEEE Quantum Week 2021, October 18-22, 2021. I'll provide a follow-up article after the conference.
Quantum physics describes quantum effects through quantum mechanics, and these underpin Quantum Information Science, which includes quantum computing, quantum communications, quantum sensing, quantum measurement, quantum-safe cryptography, and more. For simplicity, I often use QC in this article as the general term for quantum-effects work within Quantum Information Science, though Quantum Information Science is the better umbrella term.
Learn From the QC Top Leaders
In this article, I highlight the best QC insights of 2021 from my chats with top QC leaders. The full pro bono video interviews can be found with non-profits such as IEEE TEMS and ACM (see the Stephen Ibaraki interview series). IEEE is the largest non-profit electrical engineering organization and is responsible for many of the global technology standards in use today.
The QC interviewees include:
Michele Mosca: Co-founder, Institute for Quantum Computing, University of Waterloo; Founder of Quantum-Safe Canada and Quantum Industry Canada; Co-founder and CEO of the quantum-safe cybersecurity company, evolutionQ.
William Hurley, who goes by the name whurley: Innovator; Serial Entrepreneur; Founder & CEO of Strangeworks; about quantum computing.
Scott Aaronson: David J. Bruton Centennial Professor of Computer Science at the University of Texas at Austin; recipient of the ACM Prize in Computing; about theoretical computer science and quantum computing. The ACM Prize is the second-highest award from ACM, the largest non-profit computing science organization.
Stefan Woerner: IBM Quantum Applications Research & Software Lead. Stefan is considered one of the top researchers in QC applications.
QC Top Leaders' Best Pointers
Michele Mosca details quantum history and being at the founding of world-leading physics and quantum research groups at the University of Waterloo. We discuss the future of quantum, probable timelines for success, and quantum risk assessment. In addition, Michele and his students have founded companies in this area, so the entrepreneurship journey is shared.
We discuss categories of quantum:
Quantum computing (QC), the focus of my January Forbes article, where Google in 2019 and China in 2020 provided examples of Quantum Supremacy: problems solved in seconds that would take thousands or even billions of years on classical digital computers.
Quantum-safe cryptography, with designs to be safe from quantum-enabled attacks. NIST (the National Institute of Standards and Technology) is working on standards in this area. Current encryption is vulnerable to quantum computing capabilities, including scenarios where data can be stored now and decrypted later by quantum computers.
Quantum communications, where China is leading and where the UN agency ITU has programs such as Quantum Information Technology for Networks.
Quantum sensing, providing ultrasensitive capabilities to detect underwater deposits, seismic events, and much more.
William Hurley (whurley) shares his experiences as a serial entrepreneur, including having several startups exit within the same year. whurley then shares turning his attention to QC by authoring the book Quantum Computing for Babies and launching his startup Strangeworks, which provides a platform with developer tools and systems management. In our chat, whurley states: "I think if you look at IBM's public roadmap, if you look at IBM Q, and Rigetti, and all of the companies and what they're doing, Microsoft, Google; Google even announced they think they'll have their machine in 2029...and I think that they will actually do it before. So I predict Google will have a machine online closer to the 2025, 2026 range...There's over 500 startups involving quantum right now today. When I started three years ago, there were like 12...And you're going to see a big inflection point driven by the government investment worldwide..." whurley talks about billions invested in France, Germany, China, and the USA: "...you've got Norway, Finland, Russia, you've got everybody in this game now."
Scott Aaronson received the 2020 ACM Prize in Computing in April 2021 for his contributions to QC. In our chat, we talk about his work and his views on QC today and into the future. It's worth viewing our chat. As noted in the ACM prize citation, Aaronson "helped develop the concept of quantum supremacy, which denotes the milestone that is achieved when a quantum device can solve a problem that no classical computer can solve in a reasonable amount of time. Aaronson established many of the theoretical foundations of quantum supremacy experiments. Such experiments allow scientists to give convincing evidence that quantum computers provide exponential speedups without having to first build a full fault-tolerant quantum computer." The ACM citation lists notable contributions in Boson Sampling, the Fundamental Limits of Quantum Computers, Classical Complexity Theory, his respected QC book Quantum Computing Since Democritus, and Scott's work Making Quantum Computing Accessible (e.g., his popular blog, Shtetl-Optimized).
Here are excerpts from my extensive chat with Stefan Woerner. The interview has been edited for clarity and brevity, and I used AI to generate the transcript (which has its limits). I recommend going directly to the video interview for our nuanced discussion.
Stephen Ibaraki
I ask how Stefan got into quantum computing.
Stefan Woerner
And then I started to look into how we can apply this to problems I had looked into before, for example in optimization or in finance, and it turned out that there are many things that can be done...quantum computing gave me a new toolbox to look at the problems that I had already studied for quite a while, and it opened up completely new directions. It also came with quite new challenges. But I think it's extremely exciting for me, now having these additional tools, these additional possibilities, to try to solve relevant problems and eventually have an impact with optimization or with Monte Carlo simulation and things like that.
Stephen Ibaraki
That's fascinating, your grandfather sort of stimulating this interest in mathematics and the sciences in general...And then in your early work using mathematics, did you use supercomputers at that time for your optimization problems?
Stefan Woerner
We did some optimization on the cloud, and we used some cloud solvers. But these were not supercomputers. Our approach was more to try to find good formulations that are accessible to the solvers. We were, for example, writing our own simulations for supply chains that could be leveraged in an optimization setting.
Stephen Ibaraki
Quantum computing is still a mystery to a lot of people, and especially to developers, so there are more and more tools coming out. You have the IBM challenge to try to make it easier for the broader community to start experimenting with quantum computing. But before we delve into the tools you have and how you make it accessible for proofs of concept, let's go back to basics: what is quantum computing?
Stefan Woerner
So quantum computing is a completely new computational paradigm where you leverage the laws of quantum mechanics. And that means, if we now really go to the basics: classically, you have a bit that's either zero or one. In quantum computing, you have the quantum bit, the qubit, which can be in a superposition of zero and one. Sometimes this is explained as if it's 50% zero or 50% one, but that's not 100% true; it's really a superposition, a state in between, so you can think of it as a continuous variable, in a way a continuous value.

If you have two qubits, they can also be entangled. In a way, this means that the states of the two qubits can be correlated. You can construct states that are perfectly correlated, where the state of the one qubit perfectly determines the state of the other qubit. So if the one [qubit] is zero, you know the other one is also zero. And the other way around: if the one [qubit] is one, the other is also one. This correlation of two particles, of two qubits, is something that's purely quantum mechanical. This doesn't exist in classical computing and classical electronics. And if you scale this, it means that the state space of a system of qubits scales exponentially. The state space needed to describe the system grows extremely fast, to something way beyond what you can handle classically.

That alone would not be enough. There's one more feature, let's call it interference. You know it from sound or from water: you can have constructive and destructive interference, where waves are adding up or cancelling out. And this is something that we leverage in quantum computing as well. You can have these high-dimensional states, and then you can let them interfere, and that's what actually amplifies the probabilities of good solutions. Now, this also tells you one important thing about the way a quantum computer works: the way you program a quantum computer is completely different from how you would do this classically, because you need to translate your problem into something that leverages this interference.
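To make superposition and entanglement concrete, here is a minimal sketch using Qiskit, the open-source Python framework Stefan discusses later in the interview. The toy circuit is my illustration, not from the interview; it assumes Qiskit is installed (pip install qiskit).

```python
# A minimal Bell-state example: qubit 0 is put into superposition, then
# entangled with qubit 1, so the two qubits are perfectly correlated.
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(2)
qc.h(0)      # Hadamard: equal superposition of |0> and |1> on qubit 0
qc.cx(0, 1)  # CNOT: entangles qubit 1 with qubit 0

state = Statevector.from_instruction(qc)
print(state.probabilities_dict())  # expected: {'00': 0.5, '11': 0.5}
```

Only the correlated outcomes 00 and 11 appear; the mixed outcomes 01 and 10 have zero probability, which is the perfect correlation Stefan describes.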
Stephen Ibaraki
There's this idea, early on in quantum computing, of measuring capability by the number of relatively stable qubits, or logical qubits. And then IBM came up with this idea of quantum volume, saying maybe qubit count is not a great way of representing the capability of a quantum computer. Can you explain IBM's concept of quantum volume?
Stefan Woerner
Qubits that we build today, let's refer to them as physical qubits. They are noisy, so after a while they lose their state, and the operations that we can use to control or modify their state are not perfect. So there's an error. And that means it's so difficult to really operate with these qubits that you have to imagine this is really trying to harness nature at its extreme. In our case, it's superconducting qubits, so they are in a very cold environment, shielded from external disturbances and so on.

These physical qubits are kind of fragile, and you can have lots of qubits, but if they have very high error rates, you won't be able to use all of them. Because once you have operated on all of them and entangled them and so on, you introduce so much noise that you're not getting anything meaningful out anymore. So you really need to take into account the number of qubits; that is an important factor, but, as you said, not the only one. There are also the error rates and the decoherence time, so how long the qubit keeps its state, and things like that.

Now, the quantum volume is a single number that's determined by a benchmarking circuit. You run some operations on your quantum hardware where you know the result, or you can evaluate the result, and you can then say whether it is above a certain threshold or not. If you can run this on a certain number of qubits and with a certain number of operations, this determines the quantum volume. So the quantum volume in a way determines how many qubits you can use with a certain number of operations, where the number of operations that you run sequentially is about the number of qubits. It's a single-number benchmark that takes into account the number of qubits, the noise, and all these factors that actually impact the power of a quantum computer.

Now, this is for physical qubits. Looking forward, once we reach a certain size and a certain quality, we can leverage error correction, and we can get to fault-tolerant quantum computing. Here we take many physical qubits and encode them as one logical qubit, so there's an abstract, logical error correction layer on top. This overhead is relatively large: it's estimated that you need a few hundred to a thousand physical qubits to get to one logical qubit. But this logical qubit has a significantly suppressed error, and then you can start to work in this clean, theoretical computational paradigm where you more or less ignore the noise from the hardware.
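For readers curious about the shape of the benchmark Stefan describes, here is a rough sketch of the heavy-output test behind quantum volume, using Qiskit's built-in QuantumVolume circuit. It samples the ideal statevector for simplicity, so the test passes trivially; a real benchmark samples the transpiled circuit on noisy hardware. Treat this as an illustration under those assumptions, not IBM's exact measurement procedure.

```python
# Sketch of the heavy-output test behind quantum volume.
# Assumes qiskit is installed; real benchmarks run on noisy hardware.
import numpy as np
from qiskit.circuit.library import QuantumVolume
from qiskit.quantum_info import Statevector

n = 4  # test width and depth n; quantum volume is 2**n if the test passes
qc = QuantumVolume(n, depth=n, seed=42)

state = Statevector.from_instruction(qc)
median = np.median(state.probabilities())  # median over all 2**n outcomes

# "Heavy" outputs are bitstrings whose ideal probability exceeds the median.
heavy = {b for b, p in state.probabilities_dict().items() if p > median}

# Sample and check the heavy-output fraction (pass threshold is 2/3).
shots = 2000
counts = state.sample_counts(shots)
heavy_fraction = sum(c for b, c in counts.items() if b in heavy) / shots
print(f"heavy-output fraction: {heavy_fraction:.3f} (pass if > 2/3)")
```

On noisy hardware the sampled distribution drifts toward uniform, the heavy-output fraction drops toward 1/2, and the test fails for circuits that are too wide or too deep, which is exactly how noise caps the quantum volume.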
Stephen Ibaraki
IBM announced their roadmap toward a million qubits by 2030. What does that roadmap mean? I know you've got some interim targets: 2023, 2025, and so on to 2030. What are the implications of this roadmap?
Stefan Woerner
So I think the next couple of years will be very, very exciting, for different reasons. The roadmap that we announced says that by 2023 we reach a quantum chip with more than 1,000 qubits, and it also gives some specifications on the error rates these qubits should have, because, as I mentioned before, the number of qubits alone doesn't mean too much. The quality needs to be improved as well to really get a more powerful quantum computer. And so we get over 100 qubits: currently we have 65 [qubits]; last year we released the 65-qubit device that can be accessed through the cloud; this year we plan to get to 127, I think next year 433, and then after 2023, over 1,000. The roadmap also gives the technical details: what leads to this improvement, what are the changes that help us grow these chips.

Now, getting to 1,000 qubits is kind of an inflection point. This is because, as I mentioned before, this is about the number that you need to build a logical qubit. So that's where you can really start to study fault-tolerant quantum computing, maybe at a small scale, but that will be the first time this can really be investigated in depth. And then the next step is to scale to the millions; also then not to have more and more qubits on a single chip, but, for example, you could imagine combining multiple chips with 1,000 qubits and that way getting a larger quantum computer.

From the technological development side, this is extremely fascinating. And I think this path to the 1,000 qubits will also be extremely interesting for applications and algorithms researchers like myself, because right now we run algorithms on real hardware and also simulate them classically, which is very, very expensive computationally, because it scales exponentially in the number of qubits...and once we scale to 60, 100, 400, 1,000 qubits, this is really where we can see the asymptotic behavior of these heuristics, where we can start to make forecasts about how they will perform for interesting problems. And I think this will result in us getting a way better understanding of what we can do with near-term quantum computers for optimization, for machine learning, for things like that.
Stephen Ibaraki
Different companies and research groups come out with different claims; a group out of China recently claimed they've achieved a kind of quantum supremacy, that it would take a supercomputer over 2 billion years to do this kind of quantum problem of Gaussian boson sampling. Google made some buzz in 2019 when they released the Sycamore system and claimed quantum supremacy on a quantum problem. It's not a practical problem, but really just to illustrate that it can do something that maybe supercomputers can't do. And yet IBM looked at that and said, maybe that's not as big a breakthrough as you're indicating, because really, we can get that done on a supercomputer, just by improving our algorithms, in maybe a few days. So maybe it's not quantum supremacy. So what is supremacy, and what is quantum advantage? These words are being thrown around; what is real?
Stefan Woerner
So we don't use the term quantum supremacy, for multiple reasons. One is we don't believe that quantum computers will become superior to classical computers across the board; a quantum computer cannot speed up everything. A quantum computer can be used as an accelerator for some tasks. So I think it will always be a combination of classical and quantum computers that work in harmony to solve some problems. You won't write your emails with a quantum computer [you will NOT be using quantum computers to write emails]; you might solve some computationally heavy quantum chemical simulations or control optimization problems with a quantum computer.

And now, what do we mean by quantum advantage? That's if you can do something with the help of a quantum computer that has some practical value. I think what you mentioned are very nice experimental demonstrations and important steps in the development of quantum computers. What we're looking for is really practical value achieved with a quantum computer. And I think that's still a bit out in the future.
Stephen Ibaraki
You're an expert in quantum computing, but there are different kinds of quantum computing. What I mean by that: the trapped-ion concept, the topological quantum computer that Microsoft has been chasing for some time, very-low-temperature spin, photonic. Can you give a summary of the different categories and why IBM has chosen your particular way of doing quantum computing?
Stefan Woerner
There are trapped ions, spins, photonics, and also within superconducting there are different designs. We look into superconducting qubits because we think that's, particularly in the near term, the most promising to scale...superconducting qubits operate at very low temperature...about 50 milli-Kelvin, which is, I think, 100 or 1,000 times colder than outer space. So this is really just above absolute zero. That sounds very challenging, but the dilution refrigerators that get down to these temperatures are actually quite reliable and well-understood technology. So it first sounds like a big problem, but I think that's something that's quite well understood. And if you have that solved, if you have the environment where you can operate them, then you can process these chips, you can come up with different designs; superconducting qubits can, for example, be fabricated at a larger scale than spins. So to get these near-term systems, there might be an advantage in processing, in fabricating them.

And we came up with a design that is also accessible to error correction. Here's an example where the theoretical research in error correction and the people who design the devices collaborated very nicely: the design of the chip was chosen such that it's good to manufacture, and that led the error correction team to come up with new error-correcting codes that can eventually be run on this hardware. So these are all pieces that fit together and make us believe that we can scale this to 1,000 qubits and then, for example by connecting larger chips, to the millions.
Stephen Ibaraki
I've been in computing for such a long time, and I remember in the early days we would flick toggle switches and program literally in binary code; we moved to assembler, then we went to higher-level languages. We got to a stage where you had abstraction of the hardware through an operating system; you could write more generic code using a much higher-level language, and that made it much easier. So what work is being done in that area in quantum computing, to abstract the underlying hardware from an operating standpoint, using toolkits?
Stefan Woerner
So, we just released the development roadmap earlier this year, which addresses this to some extent: how the stack will grow, how levels of abstraction will be included, whether this is for pre-defined quantum circuits, so that you don't have to build the circuit yourself but have a library of circuits pre-compiled for the hardware, like an optimized instruction set, and things like that, up to actual application services.

Now, in terms of the actual languages, I think we are in a very interesting situation, which is a little bit different from what you described before. On the one side, we are at the stage of defining the new assembler standard, which is a quantum assembly language. But at the same time, we do have the classical languages; we have Python, for example, that we can embed all of this in. So we are in a situation where we can leverage the existing classical high-level languages and embed these new functionalities in them: we can write classical functions that compile or assemble or optimize some of this quantum stuff. And that allows us, for example, to build application modules.

I think you mentioned Qiskit before. Qiskit is our open-source Python framework to program quantum computers, to define quantum circuits, to simulate quantum circuits, and to also send them over the cloud to the real hardware. Within Qiskit, we are building application modules, and at the moment we're looking into four different application areas: optimization, natural sciences, machine learning, and finance. The optimization module was released in the middle of last year. What it does is allow you to use a classical high-level language to specify your optimization problem, because that's something that has been solved; there's nothing quantum-computing-specific about it. A classical optimization subject matter expert knows how to define an optimization problem using different modeling languages, for example an IBM modeling language. The Qiskit optimization module then takes this classical problem and automatically translates it into different representations that are accessible to different quantum optimization algorithms. So on the one side we still work on the assembly level, but on the other side we have the classical language that does all the translations for us, from a high-level problem down to an actual circuit.

These optimization modules are built in such a way that it's very easy to get started. If you are a subject matter expert in one of these domains, you can just download these modules, they are open source, and the tutorials actually allow you to use the quantum algorithm as a black box. So the entry barrier to run your first quantum optimization program on some illustrative example is very low. However, this whole thing is also built in such a modular and flexible way that you can use it as a black box, but you can also open the black box: you can look at every level, you can tear it apart, you can replace different pieces with your own implementation and see whether they improve, whether they change, how they compare. So it's built in a way that is easy to get started with, but that also really supports cutting-edge research in these areas.
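As a concrete illustration of the workflow Stefan describes (state the problem classically, let the module translate it), here is a minimal sketch assuming the qiskit-optimization package. The toy problem and variable names are my own, not from the interview.

```python
# Minimal sketch of the Qiskit optimization workflow: define the problem
# in classical, high-level terms, then let the module translate it into a
# quantum-ready representation.
# Assumes: pip install qiskit qiskit-optimization
from qiskit_optimization import QuadraticProgram
from qiskit_optimization.converters import QuadraticProgramToQubo

# A toy binary optimization problem, stated classically.
problem = QuadraticProgram("toy_example")
problem.binary_var("x")
problem.binary_var("y")
problem.minimize(linear={"x": -1, "y": -2}, quadratic={("x", "y"): 3})

# Automatic translation to a QUBO, the representation that quantum
# optimization algorithms such as QAOA consume.
qubo = QuadraticProgramToQubo().convert(problem)
print(qubo.export_as_lp_string())
```

The subject matter expert never writes a circuit; the converter (and, further down the stack, the transpiler) handles the translation Stefan describes, and each layer can be swapped out for your own implementation.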
Stephen Ibaraki
But ultimately, if you want mass proliferation, or usage, you will have to work at this much higher abstraction level, so it's easier for people to get involved. And I guess that's the reason behind the IBM challenge, right, to get people involved. I read that last year your two biggest communities who tried it were in the data science area and in financial services, but you also had people like high school students trying and completing the program. Can you talk about this challenge and what you're trying to do? Typically, what does it involve? Maybe it's three or four stages that you put people through, and you actually get quite a few going through the entire program. Can you give us an example of what that is like?
Stefan Woerner
This challenge was a collection of problems and tasks that people could apply to and try to solve. This included problems using Qiskit to solve an optimization problem. We had different difficulty levels...many people really reached the highest score...If I remember correctly, some people even reached a score that was a little bit higher than anticipated. The challenge was one thing, but there was also a Qiskit summer school...a global summer school with, if I remember correctly, around 4,000 or 5,000 participants globally. So we provide these educational offerings because it's really important, as you say, for people to be able to get into how this works and what's different, and to grow the workforce in this area, because there will be increasing demand. And I think, because it is so different, because it is still new, we have just figured out the tip of the iceberg of what to use a quantum computer for, or how to use a quantum computer to solve problems. So it will be extremely important to educate more people about quantum computing, and you see universities picking that up and coming up with new quantum computing curricula and so on. This is important to really leverage the full potential of this technology.
Stephen Ibaraki
Microsoft had a blog post indicating that quantum computing is really not suitable right now for problems with heavy data requirements, getting data in and getting data out; it's really more for certain kinds of computational problems where you're taking advantage of the unique capabilities of quantum computing. And you've indicated that as well; it's not standalone; my iPhone isn't going to have a quantum computer in it; it's going to work in combination, in hybrid form in some way. You see that with D-Wave, which has a piece of this quantum capability with their quantum annealing, but they have these hybrid systems. That leads to this question: what kinds of industries are really suitable for quantum computing? What kinds of problems? What are the different categories where this whole quantum phenomenon is being exploited right now, or where you think we'll have some major advantages going into the future?
Stefan Woerner
Let me get back to the first point you mentioned, about data, because I think that's important. I indicated at the beginning that quantum computers can solve some problems better, and that's really important: not all problems. If the task you want to execute is not computationally very complex, but you want to run it on a tremendously large data set, then this is very likely not a quantum computing use case, because just loading this large data set into a quantum computer has a complexity on the order of the size of the data. Many of these big data algorithms, the classical algorithms, have about that complexity too, which means loading a big data set into a quantum computer (and here we're talking most likely about a fault-tolerant quantum computer) will have the same complexity as solving the problem you're interested in classically. For some problems this is just a fundamental limitation. The classic example is Grover's search, which is sometimes illustrated as searching an unstructured database. But the first thing you would have to do is load this database into the quantum computer, and when you load it you have to touch every element, at which point you could simply stop when you find what you're looking for. So you would not load the full database into the quantum computer only to then search it [see the sketch after this answer]. These things can happen; I think particularly in quantum machine learning algorithms this fact is often not considered. There are still some interesting theoretical results, but if you want to look into this from an application point of view, you really need to analyze it end to end, from loading the data to extracting the result. Only then can you make a statement about a potential practical quantum advantage.

Now, on your question about industries: we are actually working with quite a lot of companies; I think in the IBM Quantum Network we have over 130 members by now. There is of course the financial services sector; we're working with JPMorgan Chase and Goldman Sachs on things like options pricing, derivative pricing, and credit risk analysis, or risk analysis in general, and also optimization, portfolio optimization, things like that. I think the financial services sector has a lot of interest in quantum computing because it's a very compute-intensive industry. For example, many things are done by Monte Carlo simulation, where we might have some potential speedups with quantum computing. But there's also a lot of optimization and machine learning; if you think about credit card fraud, this is something that still causes a lot of costs for the credit card industry, and if they could reduce the false positives, they would significantly reduce costs and improve their reputation, because no customer likes the accidental blocking of their credit card.

So, this is one sector. Then, since quantum computing might speed up optimization problems, there eventually might be use cases around logistics, supply chains, all these things...I mean, the original idea for quantum computing, as Feynman formulated it, was simulating quantum systems...quantum chemistry, quantum physics, materials science...and eventually use cases in the life sciences and the chemical industry; these are certainly use cases that might really have a large potential...We have a lot of activities around quantum chemistry: how to eventually scale this to design new materials, or to understand how chemical reactions work, to build new catalysts that allow running some chemical reactions at ambient conditions where today we require lots of energy, and so on.
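To make the data-loading argument concrete, here is a back-of-the-envelope cost comparison (my illustration, not Stefan's notation) for searching an unstructured list of N items:

```latex
% End-to-end cost of "Grover search over an unstructured database of N items":
% the O(N) loading step dominates the quadratic query speedup.
\[
\underbrace{O(N)}_{\text{load the data}}
\;+\;
\underbrace{O(\sqrt{N})}_{\text{Grover iterations}}
\;=\; O(N)
\qquad\text{vs.}\qquad
\underbrace{O(N)}_{\text{classical linear scan}}
\]
\]
```

End to end, both approaches cost O(N); Grover's quadratic speedup only pays off when the search oracle can be computed on the fly rather than read from stored data.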
Stephen Ibaraki and Stefan Woerner
I ask about further POCs in the near term, and Stefan provides added examples. Stefan also looks longer term: "...opens up completely new ways of doing business, for example financial products; if you have real-time risk tracking, you can maybe even prevent different things because you can react way faster. So it can lead to way more informed decision-making in multiple businesses...I think quantum computing also has the potential to solve some of the really big problems that society may face in the coming years, whether this is fertilizers for food and so on, which can use a lot of energy these days. This is something where it might help, and there are a couple of examples where nature does something extremely efficiently and humans have no clue how to reproduce it. And I think with quantum computing, once we really figure out how to build this hardware, and then also, there are a lot of open questions on the algorithms, this might give us a completely new lens to look at nature, to look at how things actually work. So I would imagine that this helps us also to really push the fundamental understanding of how the world actually works, eventually."
We explore further areas: quantum cryptography, quantum encryption and decryption and Shor's algorithm, quantum accelerators, quantum sensing, quantum communications, quantum gravimeters, the roughly 20 million qubits at which Shor's algorithm becomes a real factor in breaking RSA encryption, and quantum key distribution.
We get into a discussion about quantum-inspired applications (applying quantum principles to solve real problems today, even though the quantum hardware isn't quite there yet; and when it's ready, it scales). Stefan provides his insights, including on improvements to classical software: "It's a nice term for classical algorithms. I think, in principle, it's very cool if quantum algorithm research can also inspire finding new classical algorithms. This can happen by kind of de-quantizing some quantum algorithms, as we have seen in recent years: there is a quantum algorithm that promises a certain advantage, and then people have found how to mimic some of the core parts of this algorithm using classical sampling techniques, and they could show similar performance. This is always a little bit disappointing if you were trying to show a quantum advantage with that algorithm and then classical algorithms can beat it, but I think it's a pretty cool development. Still, it stays a classical algorithm...that is not to forget. It doesn't give you any advantage coming from quantum; it's a classical algorithm that has been designed using some ideas from quantum computing, but it runs on classical computers. So it will not give you a quantum advantage, because it's classical."
We get into philosophical discussions about new kinds of computing and quantum effects, including on consciousness.