Quantum computing is all the rage. It seems like hardly a day goes by without some news outlet describing the extraordinary things this technology promises. Most commentators forget, or just gloss over, the fact that people have been working on quantum computing for decades, and without any practical results to show for it.
We've been told that quantum computers could "provide breakthroughs in many disciplines, including materials and drug discovery, the optimization of complex systems, and artificial intelligence." We've been assured that quantum computers will "forever alter our economic, industrial, academic, and societal landscape." We've even been told that "the encryption that protects the world's most sensitive data may soon be broken" by quantum computers. It has gotten to the point where many researchers in various fields of physics feel obliged to justify whatever work they are doing by claiming that it has some relevance to quantum computing.
Meanwhile, government research agencies, academic departments (many of them funded by government agencies), and corporate laboratories are spending billions of dollars a year developing quantum computers. On Wall Street, Morgan Stanley and other financial giants expect quantum computing to mature soon and are keen to figure out how this technology can help them.
It's become something of a self-perpetuating arms race, with many organizations seemingly staying in the race if only to avoid being left behind. Some of the world's top technical talent, at places like Google, IBM, and Microsoft, are working hard, and with lavish resources in state-of-the-art laboratories, to realize their vision of a quantum-computing future.
In light of all this, it's natural to wonder: When will useful quantum computers be constructed? The most optimistic experts estimate it will take 5 to 10 years. More cautious ones predict 20 to 30 years. (Similar predictions have been voiced, by the way, for the last 20 years.) I belong to a tiny minority that answers, "Not in the foreseeable future." Having spent decades conducting research in quantum and condensed-matter physics, I've developed my very pessimistic view. It's based on an understanding of the gargantuan technical challenges that would have to be overcome to ever make quantum computing work.
The idea of quantum computing first appeared nearly 40 years ago, in 1980, when the Russian-born mathematician Yuri Manin, who now works at the Max Planck Institute for Mathematics, in Bonn, first put forward the notion, albeit in a rather vague form. The concept really got on the map, though, the following year, when physicist Richard Feynman, at the California Institute of Technology, independently proposed it.
Realizing that computer simulations of quantum systems become impossible to carry out when the system under scrutiny gets too complicated, Feynman advanced the idea that the computer itself should operate in the quantum mode: "Nature isn't classical, dammit, and if you want to make a simulation of nature, you'd better make it quantum mechanical, and by golly it's a wonderful problem, because it doesn't look so easy," he opined. A few years later, University of Oxford physicist David Deutsch formally described a general-purpose quantum computer, a quantum analogue of the universal Turing machine.
The subject did not attract much attention, though, until 1994, when mathematician Peter Shor (then at Bell Laboratories and now at MIT) proposed an algorithm for an ideal quantum computer that would allow very large numbers to be factored much faster than could be done on a conventional computer. This outstanding theoretical result triggered an explosion of interest in quantum computing. Many thousands of research papers, mostly theoretical, have since been published on the subject, and they continue to come out at an increasing rate.
The basic idea of quantum computing is to store and process information in a way that is very different from what is done in conventional computers, which are based on classical physics. Boiling down the many details, it's fair to say that conventional computers operate by manipulating a large number of tiny transistors working essentially as on-off switches, which change state between cycles of the computer's clock.
The state of the classical computer at the start of any given clock cycle can therefore be described by a long sequence of bits corresponding physically to the states of individual transistors. With N transistors, there are 2^N possible states for the computer to be in. Computation on such a machine fundamentally consists of switching some of its transistors between their "on" and "off" states, according to a prescribed program.
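For concreteness, here's a tiny Python sketch (my illustration, not something from the original article) that enumerates the 2^N configurations of a small classical machine:

```python
from itertools import product

# A machine built from N two-state elements has exactly 2**N configurations.
N = 3
states = list(product("01", repeat=N))
print(len(states))             # 8, i.e., 2**3
print(states[0], states[-1])   # ('0', '0', '0') ('1', '1', '1')
```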
Illustration: Christian Gralingen
In quantum computing, the classical two-state circuit element (the transistor) is replaced by a quantum element called a quantum bit, or qubit. Like the conventional bit, it also has two basic states. Although a variety of physical objects could reasonably serve as quantum bits, the simplest thing to use is the electron's internal angular momentum, or spin, which has the peculiar quantum property of having only two possible projections on any coordinate axis: +1/2 or −1/2 (in units of the Planck constant). For whatever axis you choose, you can denote the two basic quantum states of the electron's spin as ↑ and ↓.
Here's where things get weird. With the quantum bit, those two states aren't the only ones possible. That's because the spin state of an electron is described by a quantum-mechanical wave function. And that function involves two complex numbers, α and β (called quantum amplitudes), which, being complex numbers, have real parts and imaginary parts. Those complex numbers, α and β, each have a certain magnitude, and according to the rules of quantum mechanics, their squared magnitudes must add up to 1.
That's because those two squared magnitudes correspond to the probabilities for the spin of the electron to be in the basic states ↑ and ↓ when you measure it. And because those are the only outcomes possible, the two associated probabilities must add up to 1. For example, if the probability of finding the electron in the ↑ state is 0.6 (60 percent), then the probability of finding it in the ↓ state must be 0.4 (40 percent); nothing else would make sense.
In contrast to a classical bit, which can only be in one of its two basic states, a qubit can be in any of a continuum of possible states, as defined by the values of the quantum amplitudes α and β. This property is often described by the rather mystical and intimidating statement that a qubit can exist simultaneously in both of its ↑ and ↓ states.
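To see the 60/40 example above in the nuts and bolts, here is a minimal numerical sketch in Python (mine, not the author's); the imaginary phase attached to β is an arbitrary choice, made only to show that the amplitudes are genuinely complex:

```python
import numpy as np

# The 60/40 example: two complex amplitudes whose squared magnitudes
# are the measurement probabilities and must sum to 1.
alpha = np.sqrt(0.6)          # amplitude for the up state
beta = np.sqrt(0.4) * 1j      # amplitude for the down state (arbitrary phase)
state = np.array([alpha, beta])

probabilities = np.abs(state) ** 2
print(probabilities)                        # [0.6 0.4]
assert np.isclose(probabilities.sum(), 1.0)
```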
Yes, quantum mechanics often defies intuition. But this concept shouldn't be couched in such perplexing language. Instead, think of a vector positioned in the x-y plane and canted at 45 degrees to the x-axis. Somebody might say that this vector simultaneously points in both the x- and y-directions. That statement is true in some sense, but it's not really a useful description. Describing a qubit as being simultaneously in both ↑ and ↓ states is, in my view, similarly unhelpful. And yet, it's become almost de rigueur for journalists to describe it as such.
In a system with two qubits, there are 2^2, or 4, basic states, which can be written (↑↑), (↑↓), (↓↑), and (↓↓). Naturally enough, the two qubits can be described by a quantum-mechanical wave function that involves four complex numbers. In the general case of N qubits, the state of the system is described by 2^N complex numbers, which are restricted by the condition that their squared magnitudes must all add up to 1.
While a conventional computer with N bits at any given moment must be in one of its 2^N possible states, the state of a quantum computer with N qubits is described by the values of the 2^N quantum amplitudes, which are continuous parameters (ones that can take on any value, not just a 0 or a 1). This is the origin of the supposed power of the quantum computer, but it is also the reason for its great fragility and vulnerability.
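A quick back-of-envelope sketch (my addition, not the article's) shows why tracking those 2^N amplitudes on a classical machine gets out of hand so fast:

```python
# The state of N qubits is a vector of 2**N complex amplitudes. Storing it
# classically (16 bytes per double-precision complex number) quickly becomes
# hopeless:
for n in (10, 30, 50):
    amplitudes = 2 ** n
    print(f"{n} qubits: 2^{n} = {amplitudes:,} amplitudes, "
          f"about {amplitudes * 16:.1e} bytes")
```

At 30 qubits the state vector already fills roughly 16 gigabytes; at 50 qubits it would take about 16 petabytes.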
How is information processed in such a machine? That's done by applying certain kinds of transformations, dubbed "quantum gates," that change these parameters in a precise and controlled manner.
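As a minimal illustration of that textbook picture (this example is mine, not the article's), here is a single gate, the standard Hadamard gate, acting on a one-qubit amplitude vector:

```python
import numpy as np

# A quantum gate is a unitary matrix acting on the vector of amplitudes.
# The Hadamard gate takes the basis state "up" to an equal-magnitude
# combination of "up" and "down".
H = np.array([[1,  1],
              [1, -1]]) / np.sqrt(2)

state = np.array([1.0, 0.0])      # start in the "up" basis state
state = H @ state                 # apply the gate
print(np.abs(state) ** 2)         # [0.5 0.5]: a 50/50 measurement outcome
```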
Experts estimate that the number of qubits needed for a useful quantum computer, one that could compete with your laptop in solving certain kinds of interesting problems, is between 1,000 and 100,000. So the number of continuous parameters describing the state of such a useful quantum computer at any given moment must be at least 2^1,000, which is to say about 10^300. That's a very big number indeed. How big? It is much, much greater than the number of subatomic particles in the observable universe.
To repeat: A useful quantum computer needs to process a set of continuous parameters that is larger than the number of subatomic particles in the observable universe.
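The arithmetic is easy to verify; the figure of roughly 10^80 subatomic particles in the observable universe, used below for scale, is a commonly cited outside estimate rather than a number from this article:

```python
import math

# 2^1,000 expressed as a power of 10, for comparison with the commonly
# cited ~10^80 particles in the observable universe.
exponent = 1000 * math.log10(2)
print(f"2^1,000 is about 10^{exponent:.0f}")   # about 10^301
```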
At this point in a description of a possible future technology, a hardheaded engineer loses interest. But let's continue. In any real-world computer, you have to consider the effects of errors. In a conventional computer, those arise when one or more transistors are switched off when they are supposed to be switched on, or vice versa. This unwanted occurrence can be dealt with using relatively simple error-correction methods, which make use of some level of redundancy built into the hardware.
In contrast, it's absolutely unimaginable how to keep errors under control for the 10^300 continuous parameters that must be processed by a useful quantum computer. Yet quantum-computing theorists have succeeded in convincing the general public that this is feasible. Indeed, they claim that something called the threshold theorem proves it can be done. They point out that once the error per qubit per quantum gate is below a certain value, indefinitely long quantum computation becomes possible, at a cost of substantially increasing the number of qubits needed. With those extra qubits, they argue, you can handle errors by forming logical qubits using multiple physical qubits.
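For readers who want to see the shape of that claim, here is a rough sketch of a common textbook heuristic for code-based error suppression; the constants, the threshold value, and the quadratic qubit cost are illustrative assumptions, not figures from this article:

```python
# Illustrative heuristic only: below a threshold error rate p_th, raising the
# code distance d suppresses the logical error rate exponentially, at a cost
# of roughly d**2 physical qubits per logical qubit.
def logical_error_rate(p, p_th=1e-2, d=11):
    """Rule of thumb: logical error ~ 0.1 * (p / p_th) ** ((d + 1) // 2)."""
    return 0.1 * (p / p_th) ** ((d + 1) // 2)

for p in (5e-3, 1e-3, 1e-4):
    print(f"physical error {p:g} -> logical error ~{logical_error_rate(p):.1e}")
```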
How many physical qubits would be required for each logical qubit? No one really knows, but estimates typically range from about 1,000 to 100,000. So the upshot is that a useful quantum computer now needs a million or more qubits. And the number of continuous parameters defining the state of this hypothetical quantum-computing machine (which was already more than astronomical with 1,000 qubits) now becomes even more ludicrous.
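The multiplication behind that upshot is trivial but worth seeing once (the ranges are the estimates quoted above):

```python
# Back-of-envelope totals implied by the estimates in the text.
logical_qubits = 1_000                      # low end of "useful" estimates
for physical_per_logical in (1_000, 100_000):
    total = logical_qubits * physical_per_logical
    print(f"{physical_per_logical:,} physical per logical -> "
          f"{total:,} physical qubits in all")
# 1,000,000 at the low end; 100,000,000 at the high end
```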
Even without considering these impossibly large numbers, it's sobering that no one has yet figured out how to combine many physical qubits into a smaller number of logical qubits that can compute something useful. And it's not like this hasn't long been a key goal.
In the early 2000s, at the request of the Advanced Research and Development Activity (a funding agency of the U.S. intelligence community that is now part of Intelligence Advanced Research Projects Activity), a team of distinguished experts in quantum information established a road map for quantum computing. It had a goal for 2012 that "requires on the order of 50 physical qubits" and "exercises multiple logical qubits through the full range of operations required for fault-tolerant [quantum computation] in order to perform a simple instance of a relevant quantum algorithm." It's now the end of 2018, and that ability has still not been demonstrated.
Illustration: Christian Gralingen
The huge amount of scholarly literature that's been generated about quantum computing is notably light on experimental studies describing actual hardware. The relatively few experiments that have been reported were extremely difficult to conduct, though, and they command respect and admiration.
The goal of such proof-of-principle experiments is to show the possibility of carrying out basic quantum operations and to demonstrate some elements of the quantum algorithms that have been devised. The number of qubits used for them is below 10, usually from 3 to 5. Apparently, going from 5 qubits to 50 (the goal set by the ARDA Experts Panel for the year 2012) presents experimental difficulties that are hard to overcome. Most probably they are related to the simple fact that 2^5 = 32, while 2^50 = 1,125,899,906,842,624.
By contrast, the theory of quantum computing does not appear to meet any substantial difficulties in dealing with millions of qubits. In studies of error rates, for example, various noise models are being considered. It has been proved (under certain assumptions) that errors generated by "local" noise can be corrected by carefully designed and very ingenious methods, involving, among other tricks, massive parallelism, with many thousands of gates applied simultaneously to different pairs of qubits and many thousands of measurements done simultaneously, too.
A decade and a half ago, ARDA's Experts Panel noted that "it has been established, under certain assumptions, that if a threshold precision per gate operation could be achieved, quantum error correction would allow a quantum computer to compute indefinitely." Here, the key words are "under certain assumptions." That panel of distinguished experts did not, however, address the question of whether these assumptions could ever be satisfied.
I argue that they can't. In the physical world, continuous quantities (be they voltages or the parameters defining quantum-mechanical wave functions) can be neither measured nor manipulated exactly. That is, no continuously variable quantity can be made to have an exact value, including zero. To a mathematician, this might sound absurd, but this is the unquestionable reality of the world we live in, as any engineer knows.
Sure, discrete quantities, like the number of students in a classroom or the number of transistors in the "on" state, can be known exactly. Not so for quantities that vary continuously. And this fact accounts for the great difference between a conventional digital computer and the hypothetical quantum computer.
Indeed, all of the assumptions that theorists make about the preparation of qubits into a given state, the operation of the quantum gates, the reliability of the measurements, and so forth, cannot be fulfilled exactly. They can only be approached with some limited precision. So, the real question is: What precision is required? With what exactitude must, say, the square root of 2 (an irrational number that enters into many of the relevant quantum operations) be experimentally realized? Should it be approximated as 1.41 or as 1.41421356237? Or is even more precision needed? There are no clear answers to these crucial questions.
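Here is one way (my sketch, not the author's) to make that precision question tangible: truncate √2 at the precisions mentioned above and watch how far a measurement probability that should be exactly 0.5 drifts:

```python
import math

# Truncate sqrt(2) at various precisions; the amplitude 1/sqrt(2), which is
# sqrt(2)/2, should square to a measurement probability of exactly 0.5.
for root_two in (1.41, 1.4142, 1.41421356237):
    amplitude = root_two / 2
    error = abs(amplitude ** 2 - 0.5)
    print(f"sqrt(2) ~ {root_two}: probability off by {error:.1e}")
```

Taking √2 as 1.41 leaves the probability off by about 3 parts in 1,000; each extra digit shrinks the error, but in the physical world no experimental knob can be set with unlimited digits.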
While various strategies for building quantum computers are now being explored, an approach that many people consider the most promising, initially undertaken by the Canadian company D-Wave Systems and now being pursued by IBM, Google, Microsoft, and others, is based on using quantum systems of interconnected Josephson junctions cooled to very low temperatures (down to about 10 millikelvins).
The ultimate goal is to create a universal quantum computer, one that can beat conventional computers in factoring large numbers using Shor's algorithm, in performing database searches via a similarly famous quantum algorithm that Lov Grover developed at Bell Laboratories in 1996, and in other specialized applications that are suitable for quantum computers.
On the hardware front, advanced research is under way, with a 49-qubit chip (Intel), a 50-qubit chip (IBM), and a 72-qubit chip (Google) having recently been fabricated and studied. The eventual outcome of this activity is not entirely clear, especially because these companies have not revealed the details of their work.
While I believe that such experimental research is beneficial and may lead to a better understanding of complicated quantum systems, I'm skeptical that these efforts will ever result in a practical quantum computer. Such a computer would have to be able to manipulate, on a microscopic level and with enormous precision, a physical system characterized by an unimaginably huge set of parameters, each of which can take on a continuous range of values. Could we ever learn to control the more than 10^300 continuously variable parameters defining the quantum state of such a system?
My answer is simple. No, never.
I believe that, appearances to the contrary, the quantum computing fervor is nearing its end. That's because a few decades is the maximum lifetime of any big bubble in technology or science. After a certain period, too many unfulfilled promises have been made, and anyone who has been following the topic starts to get annoyed by further announcements of impending breakthroughs. What's more, by that time all the tenured faculty positions in the field are already occupied. The proponents have grown older and less zealous, while the younger generation seeks something completely new and more likely to succeed.
All these problems, as well as a few others I've not mentioned here, raise serious doubts about the future of quantum computing. There is a tremendous gap between the rudimentary but very hard experiments that have been carried out with a few qubits and the extremely developed quantum-computing theory, which relies on manipulating thousands to millions of qubits to calculate anything useful. That gap is not likely to be closed anytime soon.
To my mind, quantum-computing researchers should still heed an admonition that IBM physicist Rolf Landauer made decades ago when the field heated up for the first time. He urged proponents of quantum computing to include in their publications a disclaimer along these lines: "This scheme, like all other schemes for quantum computation, relies on speculative technology, does not in its current form take into account all possible sources of noise, unreliability and manufacturing error, and probably will not work."
Editor's note: A sentence in this article originally stated that concerns over required precision "were never even discussed." This sentence was changed on 30 November 2018 after some readers pointed out to the author instances in the literature that had considered these issues. The amended sentence now reads: "There are no clear answers to these crucial questions."
Mikhail Dyakonov does research in theoretical physics at the Charles Coulomb Laboratory at the University of Montpellier, in France. His name is attached to various physical phenomena, perhaps most famously Dyakonov surface waves.