Quantum Computing In Finance: Where We Stand And Where We Could Go – Science 2.0

Posted: May 14, 2021 at 6:23 am

Quantum computers (QCs) operate totally differently than classical computers. Due to the quantum effects known as superposition and entanglement, quantum bits (called qubits) can take on non-binary states represented by complex numbers. This facilitates computational solutions to mathematical problems that cannot be solved by classical computers because they require sequentially computing an astronomical number of combinations or permutations.

This ability of QCs means that they particularly excel at optimization problems, where the optimal combination is only found after trying out an enormous number of possible combinations. Several important problems in finance are in essence optimization problems which meet this description. The portfolio-optimization problem in finance is one good example of such a problem. Asset pricing, credit scoring, and Monte Carlo-type risk analysis are other examples. For example, it is estimated that running a risk assessment of a large portfolio, which needs to be done overnight or can even take days with classical computers, could one day be done in real time by a full-scale QC. That explains the keen interest of the finance industry in quantum solutions.

The calculating power of a QC grows exponentially with the number of qubits. Quantum-computing roadmaps cite the number of qubits or competing metrics to indicate the rising power of these machines, with some setting thresholds for so-called quantum supremacy, the point at which QCs will surpass classical supercomputers. But there are still enormous technical challenges to solve before at-scale QCs can be commercialized, most notably the challenges of stability and error correction.

However, quantum-inspired software, which is software running on classical computers but based on novel algorithms that reframe mathematical problems in terms of quantum principles, is already here. Several quantum-inspired solutions are currently focused on portfolio-optimization problems, and seem well positioned for near-future adoption by the financial asset-management industry.

Even the limited-size, noisy QCs currently available lend themselves to portfolio-optimization solutions. Early proofs of concept (POCs) of hybrid or full quantum solutions to asset-portfolio optimizations such as stock selection have already been demonstrated with encouraging results. Many of the largest names in finance are already investing in quantum, or at least partnering with technology providers to explore finance applications. Financial services companies that wait too long to gain experience in the field run the risk of getting left behind.

Quantum computing exploits quantum mechanics, the properties and behavior of fundamental particles at the subatomic level, as predicted by our best current understanding of quantum physics. The goal of quantum computing is to build hardware and develop suitable algorithms that process information in ways that are superior to so-called classical computers, i.e. the ubiquitous digital computers that the Information Age was built on.

The essential elements of a QC were postulated in the early 1980s, but of late work in this area has accelerated, with several large established companies and start-ups building quantum-computing hardware. An even larger ecosystem of software platforms and solution providers exists around the hardware providers. Collaboration models such as alliances and partnerships are common. Many universities are involved, while governments are also supporting quantum-computing research.

Typical of a new industry, standards and metrics are still in flux, and competing architectures, which leverage different mechanisms and implementations of quantum principles, vie for technical supremacy and investment dollars. Announcements of new breakthroughs are made almost daily, which makes it important to distinguish the hype from real progress.

This paper attempts to demystify the technology by explaining the basic principles of quantum computing and the competing technologies vying for quantum supremacy. An overview of the current quantum computing industry and the main players is provided, as well as a look at the first applications and the different industries that could benefit. The focus then turns to the finance industry, with an overview of the most important computational problems in finance that lend themselves to quantum computing, and a deeper dive into portfolio optimization. Notable recent case studies and their participants are reviewed. The paper concludes with an assessment of the current state of quantum computing and the business impact that can be expected in the short and medium term.

What we now call classical (or conventional) digital computers perform all their calculations in an aggregate of individual bits that are either 0 or 1 in value, because they are implemented by transistors that are each either switched completely on or off. This is called binary logic, which is the essence of any digital computer, and is implemented in a longstanding computer-science paradigm originating with Turing and Von Neumann. Conventional computers operate by switching billions of little transistors on and off, with all state changes governed by the computer's clock cycle. With n transistors, there are 2^n possible states for the computer to be in at any given time. Importantly, the computer can only be in one of these states at a time. Digital computers are highly complex, with typical computer chips holding 20×10^19 bits, yet incredibly reliable at the semiconductor level, with fewer than one error in 10^24 operations. (Software and mechanical-related errors are far more common in computers.)

Analog computers precede digital computers. In contrast to digital computers, classical analog computers perform calculations with electrical parameters (voltage or current) that take a full range of values along a continuous linear scale. Analog computers do not necessarily need to be electrical; they can be mechanical too, such as the first ones built by the ancient Greeks, but the most sophisticated ones from the 20th century were electrical. Unlike digital computers, analog computers do not need a clock cycle, and all values change continuously. Before the digital revolution was enabled through the mass integration of transistors on chips, analog computers were used in several applications, for example, to calculate flight trajectories or in early autopilot systems. But since the 1960s analog computers have largely fallen into disuse due to the dominance of digital computers over the last few decades.

Both classical digital and analog computers are at their core electrical devices, in the sense that they perform logic operations that are reflected by the electrical state of devices, typically semiconductor devices such as transistors (or vacuum tubes for mid-20th century analog computers), which comes about because of voltage differences and current flow. Current flow is physically manifested in terms of the flow of electrons in an electrical circuit.

Quantum computers (QCs), on the other hand, directly exploit the strange and counterintuitive behavior of sub-atomic particles (electrons, nuclei or photons) as predicted by quantum theory to implement a new type of mathematics. In a QC, quantum bits called qubits can be measured as |0⟩ or |1⟩, which are the quantum equivalents of the binary 0 and 1 in classical computers. However, due to a quantum property called superposition, qubits can be in a non-binary superposition state and interact with one another in that state during processing. It is this special property that allows QCs to theoretically offer exponentially more processing power than classical computers in some applications. Once the processing is complete, the result can only be measured in the binary states, |0⟩ or |1⟩, because superposition is always collapsed by the measurement process.

Because of another curious quantum property called entanglement, the behavior of two or more quantum objects is correlated even if they are physically separated. According to the laws of quantum mechanics, this pattern is consistent whether a millimeter, a kilometer, or an astronomical distance separates them. While one qubit is situated in a superposition between two basis states, 10 qubits utilizing entanglement could be in a superposition of 1,024 basis states.

Unlike the linearity of classical computers, the calculating power of a QC grows exponentially with the number of qubits. It is this ability that gives QCs the extraordinary power of processing a huge number of possible outcomes simultaneously. When in the unobserved state of superposition, n qubits can contain the same amount of information as 2^n classical bits. So, four qubits are equivalent to 16 classical bits, which might not sound like a big improvement. But 16 qubits are equivalent to 65,536 classical bits, and 300 qubits can contain more states than all the atoms estimated to be in the universe. That is not only an astronomical number; it is beyond astronomical. This exponential effect is why there is so much hope for the future of quantum computing. With single- or double-digit numbers of qubits, the advantage over classical computing is not immediately clear, but the power of quantum computing scales exponentially beyond that in ways that are truly hard to imagine. This explains why there is so much anticipation about the technology exploding once a certain number of qubits have been reached in a reliable QC.
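To make the 2^n scaling concrete, the short Python sketch below (an illustration added here, not taken from the article) computes how many classical values are needed to describe n qubits and compares 2^300 with the commonly quoted estimate of roughly 10^80 atoms in the observable universe.

```python
# Illustrative arithmetic: the state space of n qubits grows as 2**n.
ATOMS_IN_UNIVERSE = 10**80  # rough, commonly quoted estimate

for n in (4, 16, 64, 300):
    states = 2**n  # number of basis states describing n qubits
    print(f"{n:>3} qubits -> 2^{n} = {states}")

# 300 qubits already exceed the estimated number of atoms in the universe.
print(2**300 > ATOMS_IN_UNIVERSE)  # True
```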

However, to reliably encode information and expect it to be returned upon measurement, there are only two acceptable states for a qubit: 0 and 1. This means a qubit can only store 1 bit of information at a time. Even with many qubits, the scaling of information storage doesn't improve beyond what you'd get classically: ten qubits can store 10 bits of information and one thousand qubits can store 1,000 bits. Because a qubit can only be measured in one of these two states, qubits cannot store any more data than conventional computer bits. There is thus no quantum advantage in data storage. The advantage is in information processing, and that advantage comes from the special quantum property of a qubit that it can occupy a superposition of states when not being measured.

Another point to keep in mind is that due to the probabilistic waveform properties of qubits, QCs do not typically deliver one answer, but rather a narrow range of possible answers. Multiple runs of the same calculation can further narrow the range, but at the expense of lessening speed gains.

Classical computers will not be replaced by QCs. A primary reason for this is that QCs cannot run the if/then/else logic functions that are a cornerstone of the classical Von Neumann computer architecture. Instead, QCs will be used alongside classical computers to solve those problems that they are particularly good at, such as optimization problems.

The strengths of QCs in simultaneous calculations mean that they excel at finding optimal solutions to problems with a large number of variables, where the optimal combination is only found after trying out an enormous number of possible combinations or permutations. Such problems are found, for example, in optimizing any portfolio composition, or trying out millions of possible new molecular combinations for drugs, or in routing many aircraft between many hubs. In such problems there are typically 2^n possibilities and they all have to be tried out to find an optimal solution. If there are 100 elements to combine, it becomes a 2^100 computation, which is almost impossible to solve with a classical computer, but a 100-qubit computer could solve it in one operation.

Quite a few hard problems in finance are in essence optimization problems and therefore meet the description of problems that can be solved by QCs. The portfolio-optimization problem in finance is one good example of such a problem. Asset pricing, credit scoring, and Monte Carlo-type risk analysis are other examples. That explains the keen interest of the finance industry in quantum solutions. The finance industry is also well positioned to be an early adopter, because financial algorithms are much quicker to deploy than algorithms that drive industrial or other physical processes.
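To illustrate why portfolio selection becomes intractable classically, here is a minimal brute-force sketch (not from the article; the asset names, returns, and risk penalty are invented for illustration) that enumerates every subset of n candidate assets and scores each one. The scoring is trivial; the 2^n enumeration is what a classical computer cannot escape as n grows.

```python
from itertools import combinations

# Hypothetical candidate assets with (expected return, risk) pairs.
assets = {"A": (0.08, 0.10), "B": (0.12, 0.25), "C": (0.05, 0.04), "D": (0.09, 0.15)}
RISK_AVERSION = 0.5  # invented trade-off parameter

best_score, best_portfolio = float("-inf"), ()
for k in range(1, len(assets) + 1):
    for portfolio in combinations(assets, k):           # all 2^n - 1 non-empty subsets
        ret = sum(assets[a][0] for a in portfolio) / k   # naive equal-weight return
        risk = sum(assets[a][1] for a in portfolio) / k  # naive equal-weight risk proxy
        score = ret - RISK_AVERSION * risk
        if score > best_score:
            best_score, best_portfolio = score, portfolio

print(best_portfolio, round(best_score, 4))
# With 4 assets this is 15 evaluations; with 100 assets it would be ~1.3e30.
```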

A QC architecture can be seen as a stack with the following typical layers:

At the bottom is the actual quantum hardware (usually held at near-absolute zero temperatures to minimize thermal noise, and/or in a vacuum)

The next level up comprises the control systems that regulate the quantum hardware and enable the calculation

Above those comes the software layer that implements the algorithms (and, in the future, will also do the error correction). It includes a quantum-classical interface that compiles source code into executable programs

The top of the stack comprises the wider variety of services to utilize the QC, e.g. the operating systems and software platforms that help translate real-life problems into a format suitable for quantum computing

There are many different ways to physically realize qubits, from using trapped calcium ions to superconducting structures. In each case, quantum states are being manipulated to perform calculations. Quantum computers can entangle qubits by passing them through quantum logic gates. For example, a CNOT (conditional NOT) gate flips, or doesn't flip, a qubit based on the state of another qubit. Stringing multiple quantum logic gates together creates a quantum circuit.
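As a concrete illustration of gates being strung into a circuit, the short sketch below uses IBM's open-source Qiskit SDK (mentioned later in this article) to build the textbook two-qubit circuit that entangles a pair of qubits with a Hadamard gate followed by a CNOT. It is an added example, not a circuit taken from the article.

```python
from qiskit import QuantumCircuit

# Two qubits, two classical bits for the measurement results.
bell = QuantumCircuit(2, 2)
bell.h(0)       # Hadamard puts qubit 0 into an equal superposition of |0> and |1>
bell.cx(0, 1)   # CNOT flips qubit 1 only when qubit 0 is |1>, entangling the pair
bell.measure([0, 1], [0, 1])

print(bell.draw())  # ASCII drawing of the quantum circuit
# Run on a simulator or real backend, the measurements return "00" or "11"
# with roughly equal probability, never "01" or "10".
```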

The designers of QCs need to master and control both superposition and entanglement:

Without superposition, qubits would behave like classical bits, and would not be in the multiple states that allow quantum programmers to run the equivalent of many calculations at once. Without entanglement, the qubits would sit in superposition without generating additional insight by interacting. No calculation would take place because the state of each qubit would remain independent from the others. The key to creating business value from qubits is to manage superposition and entanglement effectively.[i]

The simplest and most typical physical property that can serve as a qubit is the electron's internal angular momentum, or spin for short. It has the quantum property of having only two possible projections on any coordinate axis, +1/2 or -1/2 in units of the Planck constant. For any chosen axis the two basic quantum states of the electron's spin can be denoted as ↑ (up) or ↓ (down). But these are not the only states possible for a quantum bit, because the spin state of an electron is described by a quantum-mechanical wave function. That function includes two complex numbers, called quantum amplitudes, α and β, each with its own magnitude. The rules of quantum mechanics dictate that |α|² + |β|² = 1. Both α and β have real and imaginary parts. The squared magnitudes |α|² and |β|² correspond to the probabilities of the spin of the electron being in the basic states ↑ or ↓ when it is measured. Since those are the only two outcomes possible, the squared magnitudes must sum to 1. In contrast to a classical bit, which can only be in one of its two binary states, a qubit can be in any of a continuum of possible states, as defined by the quantum amplitudes α and β. In the popular press this is often explained by the oversimplified, and somewhat mystical, statement that a qubit can exist simultaneously in both its ↑ and ↓ states. That is analogous to saying that a plane flying northwest is simultaneously flying both west and north, which is not incorrect strictly speaking, but not a particularly helpful mental model either.

Because a qubit can only be measured in one of these two states, qubits cannot store any more data than conventional computer bits. There is thus no quantum advantage in data storage. The advantage is in information processing, and that advantage comes from the special quantum property of a qubit meaning it can occupy a superposition of states when not being measured. During computation, qubits can interact with one another while in their superposition state. For example, a set of 6 qubits can occupy any linear combination of all the 2^6 = 64 different 6-bit strings. With 64 continuous variables describing this state, the space of configurations available to a QC during a calculation is much greater than a classical one. The measurement limitations of storing information do not apply during the runtime execution of a quantum algorithm: during processing every qubit in a quantum algorithm can occupy a superposition. Thus, in a superposition state, every possible bit string (in this example, 2^6 = 64 different strings) can be combined. Each bit string in the superposition has an independent complex-number coefficient with a magnitude (A) and a phase (φ):

α_i = A_i · e^(iφ_i)
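The following NumPy sketch (an illustration added here, not code from the article) builds such a 6-qubit state: 64 random complex coefficients α_i = A_i·e^(iφ_i), normalized so that the squared magnitudes, which are the measurement probabilities of the 64 bit strings, sum to 1.

```python
import numpy as np

n_qubits = 6
dim = 2**n_qubits                        # 64 basis bit strings

rng = np.random.default_rng(0)
A = rng.random(dim)                      # magnitudes A_i
phi = rng.uniform(0, 2 * np.pi, dim)     # phases phi_i
alpha = A * np.exp(1j * phi)             # coefficients alpha_i = A_i * e^(i*phi_i)

alpha /= np.linalg.norm(alpha)           # normalize so probabilities sum to 1
probs = np.abs(alpha)**2                 # probability of measuring each bit string

print(probs.sum())                       # ~1.0
print(format(np.argmax(probs), "06b"))   # most likely 6-bit string for this state
```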

A modern digital computer, with billions of transistors in its processors, typically works with 64 bits at a time, not 6 as in our quantum example above. This allows it to consider 64 bits at once, which allows for 2^64 states. While 2^64 is a large number, equal to approximately 2 × 10^19, quantum computing can offer much more. The space of continuous states of QCs is much larger than the space of classical bit states. That is because of the possibility of many particles interacting at the quantum level to form a common wave function, allowing changes in one particle to affect all others instantaneously and in a well-ordered manner. That is akin to massive parallel computing, which can beat classical multicore systems.

Quantum computing operations can mostly be handled according to the standard rules of linear algebra, in particular matrix multiplication. The quantum state is represented by a state vector written in matrix form, and the gates in the quantum circuit (whereby the calculations are executed) are represented as matrices too. Multiplying a state vector by a gate matrix yields another state vector. Recent progress has been made in using quantum algorithms to crack non-linear equations, by using techniques that disguise non-linear systems as linear ones.[ii]
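As a small worked example of this linear algebra (a sketch added for illustration, not code from the article), the Hadamard and CNOT gates can be written as matrices and applied to a two-qubit state vector by ordinary matrix multiplication, reproducing the entangled Bell state sketched earlier.

```python
import numpy as np

# Single-qubit Hadamard gate and two-qubit CNOT gate as matrices.
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Start in |00>: a length-4 state vector with amplitude 1 on the first basis state.
state = np.array([1, 0, 0, 0], dtype=complex)

# Apply H to the first qubit (tensored with identity on the second), then CNOT.
state = np.kron(H, np.eye(2)) @ state
state = CNOT @ state

print(state)  # [0.707..., 0, 0, 0.707...] : the Bell state (|00> + |11>)/sqrt(2)
```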

The possibility of quantum computing was raised by Caltech physicist Richard Feynman in 1981. The person considered by most to be the founder of quantum computing, David Deutsch, first defined a QC in a seminal paper in 1985.[iii]

In 1994, a Bell Labs mathematician, Peter Shor, developed a quantum computing algorithm that can efficiently decompose any integer number into its prime factors.[iv] It has since become known as the Shor algorithm and has great significance for quantum computing. Shor's algorithm was a purely theoretical exercise at the time, but it anticipated that a hypothetical QC could one day solve the hard problems of the type used as the basis for modern cryptography. Shor's algorithm relies on the special properties of a quantum machine. While the most efficient classical factoring algorithm, known as the general number field sieve, has a runtime that grows roughly as an exponential function of a constant × d^(1/3) to factor an integer with d digits, Shor's algorithm can do that with a runtime that is only a polynomial function, namely a constant × d^3. Accordingly, classical computers are limited to factoring integers with only a few hundred digits, which is why using integers in the thousands of digits in cryptography keys is considered to make for practically unbreakable codes. But a QC using the Kitaev version of Shor's algorithm only needs 10d qubits, and will have a runtime roughly equal to d^3.[v]

In summary, the Shor algorithm means that a QC can solve in polynomial time a hard mathematical problem that classical computers can only solve in exponential time. Therefore, Shor's algorithm demonstrates by how much quantum computing can improve processing time over classical computing. While a full-scale QC with the thousands of qubits needed to employ Shor's algorithm in practice to crack codes is not yet available, many players are working towards machines of that size.
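To give a feel for the gap between these two growth rates, the toy calculation below (an added illustration; the constants are arbitrary placeholders, not calibrated to real factoring algorithms) compares an exp(c·d^(1/3)) classical scaling with a c·d^3 quantum scaling as the number of digits d grows.

```python
import math

C_CLASSICAL = 3.0   # placeholder constant for the exponential scaling
C_QUANTUM = 1.0     # placeholder constant for the polynomial scaling

for d in (100, 500, 1000, 2000):                       # digits in the integer to factor
    classical = math.exp(C_CLASSICAL * d ** (1 / 3))   # exponential-in-d^(1/3) shape
    quantum = C_QUANTUM * d ** 3                        # polynomial d^3 shape
    print(f"d={d:>5}: classical ~ {classical:.3e}, quantum ~ {quantum:.3e}")

# The polynomial curve stays manageable while the exponential one explodes,
# which is why thousand-digit keys are safe classically but not against Shor.
```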

Another important early QC algorithm is Grover's algorithm, a search algorithm which finds a particular register in an unordered database. This problem can be visualized as a phonebook with N names arranged in completely random order. In order to find someone's phone number with a probability of 1/2, any classical algorithm (whether deterministic or probabilistic) will need to look at a minimum of N/2 names. But the quantum algorithm needs only on the order of √N steps.[vi] This algorithm can also be adapted for optimization problems.
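The quadratic speedup is easy to quantify; the snippet below (an added illustration, not from the article) compares the N/2 classical lookups with the roughly √N steps Grover's algorithm needs as the phonebook grows.

```python
import math

for N in (1_000, 1_000_000, 1_000_000_000):   # phonebook sizes
    classical = N // 2                        # expected classical lookups
    grover = math.isqrt(N)                    # ~ sqrt(N) Grover iterations
    print(f"N={N:>13,}: classical ~ {classical:>12,}, Grover ~ {grover:>7,}")

# For a billion entries: ~5e8 classical lookups vs ~3.2e4 Grover steps.
```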

Most quantum calculations are performed in what is called a quantum circuit. The quantum circuit is a series of quantum gates that operate on a system of qubits. Each quantum gate has inputs and outputs and operates akin to the hardware logic gates in classical digital computers. Like digital logic gates, the quantum gates are connected sequentially to implement quantum algorithms.

Quantum algorithms are algorithms that run on QCs, and which are structured to use the unique properties of quantum mechanics, such as superposition or quantum entanglement, to solve particular problem statements. Major quantum algorithms include the quantum evolutionary algorithm (QEA), the quantum particle swarm optimization algorithm (QPSO), the quantum annealing algorithm (QAA), the quantum neural network (QNN), the quantum Bayesian network (QBN), the quantum wavelet transform (QWT), and the quantum clustering algorithm.[vii] A comprehensive catalog of quantum algorithms can be found online in the Quantum Algorithm Zoo.[viii]

Quantum software is the umbrella term used to describe the full collection of QC instructions, from hardware-related code, to compilers, to circuits, all algorithms and workflow software.

Quantum annealing is an alternative model to circuit-based algorithms, as it is not built up out of gates. Quantum annealing naturally returns low-energy solutions by utilizing a fundamental law of physics that any system will tend to seek its minimum state. In the case of optimization problems, quantum annealing uses quantum physics to find the minimum energy state of the problem, which equates to the optimal or near-optimal combination of its constituent elements.[ix]

An Ising machine is a non-circuit alternative that works for optimization problems specifically. In the Ising model, the energy from interactions between the spins of every pair of electrons in a collection of atoms is summed. Since the amount of energy depends on whether spins are aligned or not, the total energy of the collection depends on the direction in which each spin in the system points. The general Ising optimization problem is determining in which state the spins should be so that the total energy of the system is minimized. To use the Ising model for optimization requires mapping parameters of the original optimization problem, such as an optimal route for the Traveling Salesman problem, into a representative set of spins, and defining how the spins influence one another.[x]
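A minimal sketch of that mapping idea (my own illustration with an invented coupling matrix, not taken from the article): compute the Ising energy E = -Σ J_ij·s_i·s_j for every spin configuration of a tiny system and pick the one with the lowest energy, which is exactly what an annealer or Ising machine tries to do physically.

```python
from itertools import product
import numpy as np

# Invented coupling matrix J for 4 spins; J[i, j] > 0 favors aligned spins i and j.
J = np.array([[0, 1, -2, 0],
              [0, 0, 1, 1],
              [0, 0, 0, -1],
              [0, 0, 0, 0]], dtype=float)

def ising_energy(spins):
    """Energy of one configuration: E = -sum_ij J_ij * s_i * s_j (upper triangle)."""
    s = np.array(spins, dtype=float)
    return -np.sum(J * np.outer(s, s))

# Brute-force all 2^4 = 16 spin configurations; an annealer searches this physically.
best = min(product([-1, +1], repeat=4), key=ising_energy)
print(best, ising_energy(best))
```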

Hybrid computing typically entails transferring the problem (say, optimization) into a quantum algorithm, of which the first iteration is run on a QC. This provides a very fast answer, but only a rough assessment of the valid total solution space. The refined answer is then found with a powerful classical computer, which only has to examine a subset of the original solution space.[xi]

The Achilles heel of the QC is the loss of coherence, or decoherence, caused by mechanical (vibration), thermal (temperature fluctuations), or electromagnetic disturbance of the subatomic particles used as qubits. Until the technology improves, various workarounds are needed. Commonly, algorithms are designed to reduce the number of gates in an attempt to finish execution before decoherence and other sources of errors can corrupt the results.[xii] This often entails a hybrid computing scheme which moves as much work as possible from the QC to classical computers.

Current guesstimates by experts are that truly useful QCs would need between 1,000 and 100,000 qubits. However, quantum-computing skeptics such as Mikhail Dyakonov, a noted quantum physicist, point out that the enormous number of continuous parameters that would describe the state of a useful QC might also be its Achilles heel. Taking the low end, a 1,000-qubit machine would imply a QC with 2^1,000 parameters describing its state at any moment. That is roughly 10^300, a number greater than the number of subatomic particles in the universe: a useful QC needs to process a set of continuous parameters that is larger than the number of subatomic particles in the observable universe.[xiii] How would error control be done for 10^300 continuous parameters? According to quantum-computing theorists, the threshold theorem proves that it can be done. Their argument is that once the error per qubit per quantum gate is below a certain threshold value, indefinitely long quantum computation becomes possible, at a cost of substantially increasing the number of qubits needed. The extra qubits are needed to handle errors by forming logical qubits using multiple physical qubits. (This is a bit like error correction in current telecom systems, which use extra bits to validate data.) But that greatly increases the number of physical qubits to handle, which, as we have seen, is already more than astronomical. At the very least, this brings into perspective the magnitude of the technological problems that scientists and engineers will have to overcome.

To put the comparative size of the QC error-correction problem in practical terms: for a typical 3-volt CMOS logic circuit used in classical digital computers, a binary 0 would be any voltage measured between 0V and 1V, while a binary 1 would be any voltage measured between 2V and 3V. Thus when, for example, 0.5V of noise is added to the signal for binary 0, the measurement would be 0.5V, which would still correctly indicate a binary value of 0. For this reason, digital computers are very robust to noise. However, for a typical qubit, the difference in energy between a zero and a one is just 10^-24 joules, one ten-trillionth as much energy as an X-ray photon. Error correction is one of the biggest hurdles to overcome in quantum computing, the concern being that it will impose such a huge overhead, in terms of auxiliary calculations, that it will make it very hard to scale QCs.

After Dyakonov published the skeptic's viewpoint two years ago, a vigorous debate followed.[xiv] A typical response to the skeptic's case comes from an industry insider, Richard Versluis, systems architect at QuTech, a Dutch QC collaboration. Versluis acknowledges the engineering challenges of controlling a QC and making sure its state is not affected. However, he states that the challenge is to make sure that the control signals and qubits perform as desired. Major sources of potential errors are quantum rotations that are not perfectly accurate, and decoherence as qubits lose their entanglement and the information they contain. Versluis goes on to define a five-layered QC architecture that he believes will be up to the task. From top to bottom, the layers are 1. Application layer, 2. Classical processing, 3. Digital processing, 4. Analog processing, and 5. Quantum processing. Together the digital-, analog-, and quantum-processing layers comprise the quantum processing unit (QPU). But Versluis also has to acknowledge that quantum error correction could solve the fundamental problem of decoherence only at the expense of 100 to 10,000 error-correcting physical qubits per logical (calculating) qubit. Furthermore, each of these millions of qubits will need to be controlled by continuous analog signals. And the biggest challenge of all is doing the thousands of measurements per second in a way that they do not disturb the quantum information (which must remain unknown until the end of the calculation), while catching and correcting errors. The current paradigm of measuring all qubits with analog signals will not scale up to larger machines, and a major advance in the technology will be required.[xv]

Most experts agree that we will have to live with QCs over the next few years that will have high levels of errors that go uncorrected. There is even an accepted industry term and acronym for such QCs: NISQ (Noisy Intermediate-Scale Quantum) devices. The NISQ era is expected to last for the next five years at least, barring any major breakthroughs that might shorten that timeline.

Once critical technical breakthroughs are made, QC adoption may happen faster than expected due to the prevalence of cloud computing. Making QC services easily accessible over the cloud speeds both adoption and learning. It has the added advantage that it forces hardware makers to focus on building QCs with a high percentage of uptime, so as to ensure continued availability over the cloud.

Most QC makers already offer cloud access to their latest QCs. There are programming environments, namely software development kits (SDKs) that facilitate the building of quantum circuits, available over the cloud for QC programmers to learn how to write the software that unleashes the magic of quantum computing, and to experiment with it. As more functionality is added to the hardware, these SDKs are continually updated.

The implication is that a whole ecosystem is being brought up to speed on how to make the best use of a quantum capability that does not quite exist yet. An analogy would be having had flight simulators to train future pilots while the Wright brothers were still figuring out how to keep their plane in the air for more than a few hundred feet. The upside of this approach is that any real advances in making reliable QCs with capabilities superior to classical computers will be very quickly exploited by real-world applications. This situation is in contrast to most major technological breakthroughs we have seen in the past. For example, it took a generation or two for industrial engineers to learn how to properly use electrical power in the place of steam power in factories. More recently, it took a generation to fully exploit the capabilities of digital computing in business and elsewhere. But in the case of quantum computing, all the knowledge building in anticipation of a successful QC could be rapidly translated into applications by a corps of developers who are all trained up and ready to fly the plane once it is finally built. That is the optimistic perspective.

Quantum circuits are already being developed using quantum programming languages and so-called quantum development kits (QDKs), such as Qiskit by IBM and Google's Cirq, both based on Python, and Q# by Microsoft, based on the C# language. The next step is to develop libraries and workflows for different application domains. Examples of the former are IBM's Aqua and the Q# libraries. Examples of the latter are D-Wave's Ocean development tool kit for hybrid quantum-classical applications and for translating quantum optimization problems into quantum circuits, and Zapata's Orquestra for composing, running and analyzing quantum workflows. On top of the circuits and libraries come the domain-specific application platforms. Orchestrating and integrating classical and quantum workflows to solve real problems with hybrid quantum-classical algorithms is the name of the game for the next few years.[xvi]

Quantum-inspired software is already in operation, because these applications run on classical computers and not on quantum machines. A major example is Fujitsu's Quantum-Inspired Digital Annealer Services.[xvii] Even on a theoretical level, quantum ideas have already been fruitful in several problem areas, where restructuring problems using quantum principles has resulted in improved algorithms, proofs, and the refutation of erroneous old algorithms.[xviii] Quantum-inspired software is closely related to quantum-ready software, which can be run on suitable QCs once they are available.

The industrialization of QCs has entered a critical period. Major countries and leading enterprises in the world are investing huge human and material resources to advance research in quantum computing.

Google perhaps prematurely used the term quantum supremacy in October 2019 when it announced the results of its quantum supremacy experiment in a blog[xix] and an article in Nature.[xx] The experiment used Google's 54-qubit processor, named Sycamore, to perform a contrived benchmark test in 200 seconds that would take the fastest supercomputer 10,000 years to do. But at some point in the future, true quantum supremacy may indeed be achieved.

Quantum supremacy was originally defined by Caltech's John Preskill[xxi] as the point at which the capabilities of a QC exceed those of any available classical computer; the latter is usually understood to be the most advanced supercomputer built on classical architecture. At one point this was estimated to be when a QC with 50 or more qubits could be demonstrated. But some experts say it depends more on how many logical operations (gates) can be implemented in a system of qubits before their coherence decays, at which point errors proliferate and further computation becomes impossible. How the qubits are connected also matters.[xxii]

This led IBM researchers to formulate the concept of quantum volume (QV) in 2017. More QV means a more powerful computer, but QV cannot be increased by increasing only the number of qubits. QV is a hardware-agnostic performance measurement for gate-based QCs that considers a number of elements including the number of qubits, connectivity of the qubits, gate fidelity, cross talk, and circuit compiler efficiency. In late 2020, IonQ announced that it had calculated a QV of 4 million for its 5th-generation QC. Before this announcement, Honeywell's 7-qubit ion-trap QC had the industry's highest published quantum volume of 128, and IBM had the next highest QV of 64 with its 27-qubit superconducting quantum machine.[xxiii] In early March 2021, Honeywell claimed to have regained the lead by achieving a QV of 512 with an updated version of its System Model H1 QC.[xxiv] Alternating announcements like these from the major QC developers are likely to continue for the time being, as each competes for the title of most powerful QC.
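Published QV figures are easier to compare on a log scale, since QV is conventionally reported as a power of two; the short sketch below (an added illustration using the figures quoted above) converts each headline number into its exponent.

```python
import math

# Headline quantum-volume figures quoted in the text above.
quantum_volumes = {
    "IBM 27-qubit superconducting": 64,
    "Honeywell 7-qubit ion trap": 128,
    "Honeywell System Model H1 (2021)": 512,
    "IonQ 5th-generation (claimed)": 4_000_000,
}

for system, qv in quantum_volumes.items():
    # log2(QV) is roughly the size of the largest square circuit the machine
    # can run successfully, which is what makes QV values comparable.
    print(f"{system:<34} QV = {qv:>9,}  ->  log2(QV) ~ {math.log2(qv):.1f}")
```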

Rather than thinking about quantum supremacy as an absolute threshold or milestone, it is wiser to think about so-called quantum supremacy experiments as benchmarking experiments for the new technology, perhaps similar to the way we came to express automobile engine power in measures of horsepower. There is also an intriguing question lingering over the whole concept of quantum supremacy, which is: how could anyone know that a quantum computer is genuinely doing something that is impossible for a classical one to do, rather than that they just haven't yet found a classical algorithm that is clever enough to do the job?[xxv] It may be that the advent of quantum computing will force and inspire new developments in classical computing algorithms, something we are already seeing in the concept of quantum-inspired computing software, which will be discussed further in a later section.

There is a difference between quantum advantage and quantum supremacy. Quantum supremacy is when it can be demonstrated that a QC can do something that cannot be done on a classical computer. Quantum advantage means that a quantum solution provides a real-world advantage over using the classical approach. (It does not imply that a classical computer could not do it at all.)

There is a second meaning one could attach to quantum supremacy, which is which nation will hold the technological advantage in this technology of the future. The current list of the Top 500 (classical) supercomputers[xxvi] provides a good indication of where the hot spots of quantum computing will likely be, since no country or region will want to cede a hard-gained advantage in classical computing. Currently, 43 percent of supercomputers are in China, 23 percent in the United States, 7 percent in Japan, and about 19 percent in Europe (including the United Kingdom but excluding Russia).

In the European Union, the European Commission founded the Quantum Flagship as a ten-year coordinated research initiative which will have at least €1 billion in funding. The long-term vision is the creation of a Quantum Web, defined as quantum computers, simulators and sensors interconnected via quantum networks distributing information and quantum resources such as coherence and entanglement.[xxvii]

The equivalent U.S. initiative is known as the National Quantum Initiative (NQI), and the $1.2 billion of U.S. government funds is going to the National Institute of Standards and Technology (NIST), the National Science Foundation (NSF) Multidisciplinary Centers for Quantum Research and Education, and the Department of Energy Research and National Quantum Information Science Research Centers.[xxviii] NIST partners with the University of Colorado Boulder on quantum computing research through JILA's Quantum Information Science & Technology (QIST).[xxix] NIST, the Laboratory for Physical Sciences (LPS), and the University of Maryland have formed the Joint Quantum Institute (JQI)[xxx] to conduct fundamental quantum research. The Joint Center for Quantum Information and Computer Science (QuICS)[xxxi] was founded in another partnership between NIST and the University of Maryland to specifically advance research in QC science and quantum information theory.

The Chinese government is investing upwards of $10bn in quantum computing, an order of magnitude greater than the respective investments of $1.2bn by the U.S. government and the E.U. The U.K. and Japanese governments are each investing in the order of $300m, with Canada and South Korea investing about $40m each.[xxxii]

China's multi-billion-dollar quantum computing initiative aims to achieve significant breakthroughs by 2030. President Xi has committed billions to establish the Chinese National Laboratory for Quantum Information Sciences.

The implication of the difference in funding with China is that the United States is mostly relying on private investments by its tech giants to remain competitive. Time will tell if that is a wise strategy. It is not as if large tech companies in China are not investing in quantum computing too: Alibaba, Tencent, and Baidu are all known to be heavily investing in the technology. According to some metrics, China has already gained an early advantage by accumulating more quantum computing-related patents than the United States.[xxxiii] In 2019 Google announced that its QC performed a particular computation in 200 seconds that would take today's fastest supercomputers 10,000 years. But in December 2020, Chinese researchers at the University of Science and Technology of China (USTC) claimed that their prototype QC (based on photons) is 10 billion times faster than Google's.[xxxiv]

The Chinese desire to lead the world in quantum computing is not purely motivated by a desire for industrial competitiveness and economic power. Threat assessments[xxxv] point to Chinese quantum research and experiments in defense applications such as:

Using entanglement for secure long-distance military communications, e.g. between satellites and earth stations

Quantum radar that could nullify current U.S. advantages in stealth technology against conventional radars

Quantum submarine detection to ranges of over five kilometers that would limit the operations of U.S. nuclear submarines

Quantum computers are very hard to build. They require intricate manipulation of subatomic particles, and operation in a vacuum environment or at cryogenic temperatures.

The state of quantum computing resembles the early days of the aircraft and automobile industries, when there was a similar proliferation of diverse architectures and exotic designs. Eventually, as quantum technology matures, a convergence can be expected similar to what we have seen in those industries. In fact, the arrival of such a technological convergence would be a good measure of a growing maturity of quantum computing technology.

There are a number of technical criteria[xxxvi] for making a good QC:

Qubits must stay coherent for long enough to allow the computation to be completed in the state of superposition. That requires isolation, because decoherence occurs when qubits interact with the outside world

Qubits must be highly connected. This occurs through entanglement and is needed for operations to act on multiple qubits

High-fidelity operations are needed. As pointed out above, classical digital computers rely on the digital nature of signals for noise resistance. However, since qubits need to precisely represent numbers that are not just zero and one during the computation state, digital noise reduction is not possible and the noise problem is more analogous to that in an old-fashioned analog computer. Since noise cannot be easily prevented and must therefore be mitigated, the focus of current research is on noise-correction techniques

Gate operations must be fast. In practice, this is a trade-off between maintaining coherence and achieving high fidelity

High scalability. It should be obvious that QCs will only be useful when they can be scaled large enough to solve valuable problems

Currently, the two quantum technologies showing the greatest promise and attracting the most interest and investment dollars are superconducting qubits and trapped ions. These and other more nascent or theoretical technologies are presented in Table 1, along with the main proponents of each technology.

Table 1. Qubit Technologies and Main Proponents

Superconducting qubits (called transmons by some) are realized by using a microwave signal to put a resistance-free current into a superposition state. This technology has fast gate times and the advantage of more proven technology: superconducting circuits are based on the well-known complementary metal-oxide semiconductor (CMOS) technology used in digital computers. But superconducting qubits have short coherence times and require more error correction. Superconduction requires cooling to a temperature very close to absolute zero. The technology is considered to be highly scalable. Main proponents: IBM, Google, Rigetti, D-Wave, Alibaba, Intel, NEC, Quantum Circuits, Oxford Quantum Circuits.

Ion trap QCs work by trapping ions in electric fields and holding them in place. The outermost electron orbiting the nucleus is put in different states and used as a qubit. Ion trap qubits have longer coherence times and can operate with minor cooling, but do require a high vacuum. Though the first quantum logic gate was demonstrated in 1995 using trapped atomic ions, at a system level this technology is less mature and requires progress in multiple domains including vacuum, laser, and optical systems, radio frequency and microwave technology, and coherent electronic controllers. Main proponents: IonQ, Honeywell, Alpine Quantum Technologies.

Photonic qubits are photons (light particles) that operate on silicon chip pathways. Such qubits do not require extreme cooling, and silicon chip fabrication techniques are well-established, making this technology highly scalable.
