Is Quantum Computing the Next Big Thing? – Business Insider

Posted: April 25, 2022 at 5:21 pm

Under normal circumstances, the arrival of a breakthrough technology creates its own hype. The first light bulb drew hundreds of gawkers to Menlo Park, New Jersey, in 1879. A century later, crowds gathered again to see artificial-intelligence-powered supercomputers defeat human grandmasters at chess and Go. Social media announced its arrival with hockey-stick growth. When you discover something that actually has the power to transform the world, the world usually takes notice.

But investors in quantum computing, a technology that has the theoretical potential to make a traditional supercomputer look like a slide rule, have been mostly tooting their own horns, promising a revolution that's just over the horizon. The biggest booster is IBM, which has been pushing quantum through YouTube videos and sponsored podcasts. The breaking of the "100-qubit processor barrier" in November, a feat performed by IBM's Eagle chip, may not have rocked the tech world, but you wouldn't know it from the company's online rollout.

"Dreams are time travel to the future," gushed Daro Gil, IBM's director of research, in a video announcing Eagle. "We have definitely traveled into the future," he continued, adding: "This is the real thing."

IBM has a big financial incentive to hype its quantum potential. With Eagle, the company is betting that quantum computing can return it to the first ranks of tech giants, alongside Google, Microsoft, and Amazon, all of which are bankrolling their own quantum efforts. Smaller quantum players are also scrambling to break into the space. IonQ, which went public last year through a special-purpose acquisition company, trades at 1,000 times its annual revenue. D-Wave, which has backing from Goldman Sachs and Jeff Bezos, also plans to go public through a SPAC. One recent report estimates that quantum computers could generate nearly a trillion dollars in annual revenue by 2050, with applications from auto and airplane manufacturing to pharmaceutical development and finance.

But the hype about quantum computing's future glosses over the limitations of its present. For now, quantum computers remain exceedingly slow and buggy to the point of uselessness. Unlike AI and augmented reality, which already enjoy robust pipelines of products heading to market, quantum computing lacks anything close to a working prototype with the power to draw a crowd. Both IBM and IonQ have "road maps" that promise an operational 1,000-qubit processor by the end of next year. But experts agree that even if the companies manage to hit that significant-sounding target (and that's a big if), a versatile quantum computer that can perform a range of practical operations on its own, outside a lab, is still many years away.

"It's going to be a lot of gradual improvement in capabilities," says Celia Merzbacher, who heads up the Quantum Economic Development Consortium. "There's a lot that has to happen to get to something that resembles what we think of today as a computer."

Which leads to the question most of us have about quantum computing: What the hell is it?

For decades, the fundamental unit of computing has been the "bit": either a one or a zero. Charles Babbage's mechanical computers used the position of gears and levers to record bits. On a flash drive, bits are stored as electrical charges in tiny memory cells.

Quantum computing, by contrast, operates on qubits, which can be a one, a zero, or a combination of both, an uncanny, ambiguous state known as superposition. This is possible because subatomic particles defy common sense, appearing and disappearing in ways that continue to surprise and baffle physicists. Photons, to give just one example, form a pattern with light and dark bands when shot through a barrier with two slits. But try using a detector to observe which slit an individual photon passed through, and the pattern disappears.

The seductive promise of the qubit lies in its exponential power. Two regular bits can be used to represent four states (00, 01, 10, 11), but only one of those states at any given time. In theory, two qubits could represent all four states at the same time and then resolve to whichever state is needed to solve a given problem. That means the 127-qubit Eagle has a computing potential that is thousands of millions of millions of times that of a classical supercomputer.
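To make the arithmetic behind that claim concrete, here is a back-of-the-envelope sketch in Python (purely illustrative, not anyone's production code): n bits or qubits correspond to 2^n possible states, and the number for 127 qubits is astronomically large.

```python
# n classical bits hold exactly one of 2**n values at a time; n qubits can,
# in principle, exist in a superposition over all 2**n values at once.
for n in (2, 10, 127):
    states = 2 ** n
    print(f"{n:>3} bits/qubits -> {states:,} possible states")

# For n = 127 the result is roughly 1.7 x 10**38, which is where the enormous
# theoretical capacity attributed to Eagle comes from. Whether any of that
# capacity can be harnessed in practice is exactly what the article questions.
```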

The problem is, it's incredibly difficult to get all those qubits working together. With today's technology, maintaining a state of superposition within even one qubit is a tall order. Subatomic particles are sensitive to tiny changes in their environment. Scientists have tried to stabilize their quantum processors by storing them at extremely cold temperatures, but it hasn't made much difference. So for now, quantum computing depends on a subdiscipline called "quantum error correction," which usually involves running the same code over and over again, through multiple qubits, and using probability to correct for random errors.
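The redundancy-and-probability idea is easier to see in a classical setting. The sketch below is a simplified classical analogy (a repetition code with majority voting), not real quantum error correction: one logical value is copied across several noisy carriers, and the most common reading wins.

```python
import random

def noisy_copy(bit: int, error_rate: float) -> int:
    """Return the bit, flipped with probability `error_rate` (simulated noise)."""
    return bit ^ 1 if random.random() < error_rate else bit

def read_logical_bit(bit: int, copies: int = 7, error_rate: float = 0.1) -> int:
    """Encode one logical bit redundantly and recover it by majority vote."""
    readings = [noisy_copy(bit, error_rate) for _ in range(copies)]
    return 1 if sum(readings) > copies / 2 else 0

# With a 10% error rate per copy, seven redundant copies recover the logical
# bit almost every time. Real quantum error correction needs far more overhead,
# which is why, by most estimates, ~1,000 physical qubits back one logical qubit.
random.seed(0)
trials = 10_000
correct = sum(read_logical_bit(1) == 1 for _ in range(trials))
print(f"recovered correctly in {correct / trials:.1%} of trials")
```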

The need for error correction has led scientists to distinguish between physical qubits, like the ones that make up the Eagle, and more idealized logical qubits, which are sufficiently reliable to program with. By most estimates, it takes 1,000 physical qubits to yield one logical qubit. So even if IBM hits its 1,000-qubit benchmark by next year, it will have succeeded in achieving only the computing capability of a single traditional bit: a computer with a fraction of the power of a video-game console from the 1980s.

Sankar Das Sarma, a theoretical physicist at the University of Maryland who has published widely on quantum computing, believes that the technology is real and has tremendous long-term potential. But he is skeptical about its short-term prospects.

"Claiming to have a thousand or a trillion qubits by some deadline is a meaningless statement unless the properties of those 'qubits' have extremely well-defined technical specifications," he told me. "One can easily have as many qubits as one wants, if they are sufficiently bad from a computational viewpoint."

The need for a new, more powerful computer model is certainly real enough. For decades, as predicted by Moore's law, computer power has been growing at an exponential rate. But that growth has begun to slow, hemmed in by physical reality. In simple terms, we're reaching the limit of how many transistors we can pack onto the chips that power classical computers. And if those transistors can't get smaller, the electrical signals that zip around on the chips can't get any faster. Our computers are still getting smarter and speedier, but those gains are beginning to level off.

But qubits aren't constrained by traditional limits of space and time. They exist in multiple states simultaneously, meaning, at least in theory, that we can deploy vast armies of them to do our computational bidding, if we can figure out how to harness their shifty nature.

As is often the case, two primary applications are driving the new technology: surveillance and finance. As more and more data is protected by dual-key encryption, governments are eager to find a way to crack the code. That requires figuring out the factors of very large semiprime numbers, a problem that would take the most powerful classical computers billions of years. A quantum processor with thousands of logical, error-corrected qubits, by contrast, could conceivably decrypt emails and other communications almost instantly, enabling governments to decode and read messages while they were still in transit. Many countries are said to be storing petabytes of encrypted data that was transmitted by their adversaries, in the hope that quantum computing will one day render it all legible.
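To see why factoring is the bottleneck, here is an illustrative Python sketch (not how real code-breaking works): recovering the two prime factors of a semiprime by trial division takes work that grows with the square root of the number, which is hopeless for the enormous semiprimes behind modern public-key encryption.

```python
import math

def factor_semiprime(n: int) -> tuple[int, int]:
    """Find the two prime factors of a semiprime by brute-force trial division."""
    for candidate in range(2, math.isqrt(n) + 1):
        if n % candidate == 0:
            return candidate, n // candidate
    raise ValueError("n has no nontrivial factors")

# Easy for toy semiprimes...
print(factor_semiprime(3 * 5))            # (3, 5)
print(factor_semiprime(104723 * 104729))  # two six-digit primes, still instant

# ...but the loop runs up to sqrt(n) times. For the ~2,048-bit semiprimes used in
# common public-key encryption, sqrt(n) has over 300 digits, which is why classical
# machines would need, by the article's figure, billions of years.
```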

At the same time, the US is working to build a standard for "post-quantum cryptography" that can survive a qubit attack. "It is not unreasonable to think we'll have total chaos," says Miles Taylor, who helped organize the effort as chief of staff at the Department of Homeland Security. "Someone will have a massive asymmetric advantage. It could be IBM. It could be the Chinese Communist Party." When I asked about timelines, Taylor said he believed we'd see a quantum computer capable of cracking current encryption technologies "within a decade."

Another sector that has been betting heavily on quantum's potential is finance. D-Wave, the firm with backing from Goldman Sachs, is marketing portfolio-optimization services to finance companies, promising higher returns at lower risk. Classical computers have trouble quickly solving what are known as "combinatorial optimization" problems, such as how best to allocate investments in a variety of scenarios. One analyst, for example, reported that classical computers took a month to run a detailed tail-risk simulation on the effects of a low-probability catastrophe on the markets.
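A minimal sketch of why such problems explode combinatorially, in Python with invented numbers: choosing which of n assets to hold is a search over 2^n possible portfolios, so exhaustive search stops being practical very quickly. That is the kind of structure vendors such as D-Wave claim their hardware handles better.

```python
from itertools import combinations

# Hypothetical assets: (name, expected annual return, risk score). Figures invented.
assets = [("A", 0.07, 3), ("B", 0.12, 7), ("C", 0.05, 2),
          ("D", 0.09, 5), ("E", 0.15, 9), ("F", 0.06, 2)]
RISK_BUDGET = 12

best_return, best_mix = 0.0, ()
for k in range(1, len(assets) + 1):
    for mix in combinations(assets, k):        # every possible portfolio
        risk = sum(a[2] for a in mix)
        if risk <= RISK_BUDGET:                # respect the risk budget
            ret = sum(a[1] for a in mix)
            if ret > best_return:
                best_return, best_mix = ret, mix

print("best portfolio:", [a[0] for a in best_mix], f"return={best_return:.2f}")

# Exhaustive search over n assets examines 2**n - 1 portfolios; at n = 60 that is
# already more than a quintillion, which is why combinatorial optimization is hard.
```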

Another real-world application in this category is the so-called traveling-salesman problem, which seeks to calculate the shortest possible route from city to city, an area with obvious applications for delivery logistics and military supply lines. Last year, when the Australian Army used quantum computing to test its systems against known logistics challenges, one military leader cautioned that the technology was still in the "prototype stage" and that quantum computers remained "too small and fragile to give useful solutions."
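The same scaling problem shows up here. A hedged illustration in Python, with an invented distance table: checking every possible tour is fine for a handful of cities but grows factorially, which is why classical solvers lean on approximations and why quantum hardware is being pitched at the problem.

```python
from itertools import permutations

# Invented symmetric distances between a depot and three cities (units arbitrary).
dist = {("Depot", "A"): 10, ("Depot", "B"): 15, ("Depot", "C"): 20,
        ("A", "B"): 35, ("A", "C"): 25, ("B", "C"): 30}

def d(x, y):
    """Look up a distance regardless of the order the pair was stored in."""
    return dist.get((x, y)) or dist[(y, x)]

cities = ["A", "B", "C"]
best_len, best_tour = float("inf"), None
for order in permutations(cities):             # every possible visiting order
    tour = ("Depot",) + order + ("Depot",)
    length = sum(d(tour[i], tour[i + 1]) for i in range(len(tour) - 1))
    if length < best_len:
        best_len, best_tour = length, tour

print("shortest tour:", " -> ".join(best_tour), f"({best_len} units)")

# Brute force checks (n-1)! orderings; at just 20 cities that is about 1.2e17 tours,
# far past what is practical, so real routing relies on heuristics instead.
```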

Even the limited successes attributed to quantum computers aren't always as revolutionary as they seem. Many quantum computers, including D-Wave's portfolio optimizers, are "hybrid" machines that work in tandem with classical computers. The same is true of almost all of the quantum-computing power that is publicly accessible via the cloud. In some cases, it amounts to little more than a sprinkling of quantum dust on problems that are teed up, coded, and transmitted by classical machines. The bit does all the heavy lifting, and the qubit gets the credit.

The field is also plagued by a lack of agreement over basic definitions. In 2019, Google and IBM clashed over whether Google's Sycamore processor had truly achieved "quantum supremacy" by performing a tailor-made computational task in three minutes. Google insisted it would take a classical computer thousands of years to complete the same task. IBM argued it would take only days.

If your business is raising capital for quantum startups, however, such issues are often dismissed as petty details. Quantum computing arose in a culture of initial public offerings that has traditionally been eager to bet big on high-risk, high-reward technology. It often cloaks itself in the accoutrements of established scientific enterprises (subzero chambers, scientists pacing around labs, university partnerships, huge research budgets) without having any hard-earned results to show for itself. Like the qubit itself, the future of quantum computing remains highly theoretical.

Das Sarma, the physicist, compares current quantum efforts to trying to build a smartphone from hundred-year-old vacuum tubes. The basic principles may be in place, but the engineering hasn't had time to catch up. As a result, quantum computing, like its earliest predecessors, could remain in a rudimentary state for a long time to come. "The Egyptian abacus was actually a computer," Das Sarma observes. "But not a particularly good one."

Mattathias Schwartz is a senior correspondent at Insider.
