
Category Archives: Quantum Computing

An Investment in IonQ Is a Sign of Faith in the Scale of Next-Gen Computers – InvestorPlace

Posted: November 9, 2021 at 1:53 pm

Typically, I loathe investing ideas generated from social-media communities, such as "diamond hands" (the concept that you buy shares of your target company irrespective of outside factors). However, when it comes to next-generation computer specialist IonQ (NYSE:IONQ), you might want to consider the wisdom of the masses. IONQ stock offers tremendous upside potential, but you've got to be willing to absorb the risks.


First, let's set the framework for why the company is so groundbreaking. Focusing on the quantum computing sector, two methodologies exist to forward the next generation of digitalized innovations: superconducting qubits and the far more revolutionary trapped ion. One of the basic reasons why IONQ stock has attracted much attention is that, alongside Honeywell (NYSE:HON), IonQ is one of the few companies actively developing trapped-ion computers.

That's not to say that superconducting qubits are out of fashion. Indeed, major tech firms like Alphabet (NASDAQ:GOOG, NASDAQ:GOOGL), IBM (NYSE:IBM) and Rigetti incorporate this methodology, in part because of familiarity. With superconducting qubits, producers can still use standard fabrication technologies. Thus, as Ars Technica contributor John Timmer explains, participants can benefit from near-term semiconductor industry innovations.

However, Timmer also noted that superconducting qubits have issues. As manufactured devices, they are neither perfect nor perfectly identical. As a result, their developers have had to find ways to work around a relatively high error rate and some qubit-to-qubit variability. While these issues have been mitigated, they're very unlikely to ever go away.

And that's where IONQ stock becomes very intriguing. Timmer wrote, "The fundamental unit of a trapped-ion qubit, by contrast, is an atom, and all atoms of a given isotope are functionally equivalent and, quite obviously, don't suffer from manufacturing flaws."

In other words, trapped-ion computers have higher potential while the superconducting variant has more applicability right now.

Before you take a shot with IONQ stock, you're going to want to perform your due diligence. By no stretch of the imagination am I a semiconductor expert. And sending me angry emails to sway my opinion one way or the other won't change the narrative for the underlying company one iota.

Again, you've got to do your own research and base your decision on that analysis.

However, the reason why sentiment for IONQ stock is so strong likely ties into the unlimited potential of the tech firm. Trapped-ion technology is an incredibly novel innovation. True, Timmer notes that through mechanisms like atomic clocks, we've become adept at manufacturing the devices needed to hold ions in traps. Still, trapped-ion computers are a massive leap.

On the other hand, companies like Alphabet's Google are going for the sure thing. With superconducting qubits, you're still dealing with physical contraptions; hence the ability of Google and others to incorporate current-gen fabrication technologies. But theoretically, as the quantum computing industry scales up, those firms playing it safe could find themselves behind the innovation curve.

That's not to say that IONQ stock is an easy play because of its trapped-ion tech. While we're speaking about hypotheticals, it's very possible that this groundbreaking solution turns out to be unsuitable for commercialization. That would render the research poured into the company moot, with IonQ becoming nothing more than a glorified lab queen.

What does give bulls hope, though, is that the potential of trapped ions seems very real. Per Nature.com, their operations are "much less prone to errors and the delicate quantum states of individual ions last longer than those in superconducting qubits, which although small are still made of a very large number of atoms."

Another positive to consider regarding the novel approach is that superconducting qubits tend to interact only with their nearest neighbours, whereas trapped ions can interact with many others, which makes it easier to run some complex calculations.

Technically, it's all compelling stuff, and I do think it's worth consideration for the risk-on portion of your portfolio. No, I wouldn't bet the house on it, but if you've got time to marinate (I'm talking five years or more), IONQ stock deserves a spot high on your watch list.

On the date of publication, Josh Enomoto did not have (either directly or indirectly) any positions in the securities mentioned in this article. The opinions expressed in this article are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.

A former senior business analyst for Sony Electronics, Josh Enomoto has helped broker major contracts with Fortune Global 500 companies. Over the past several years, he has delivered unique, critical insights for the investment markets, as well as various other industries including legal, construction management, and healthcare.

Read the original here:

An Investment in IonQ Is a Sign of Faith in the Scale of Next-Gen Computers - InvestorPlace


Quantum Computing | Rigetti Computing

Posted: November 5, 2021 at 10:31 pm

Complex problems need powerful computing

We make it possible for everyone to think bigger, create faster, and see further. By infusing AI and machine learning, our quantum solutions give you the power to solve the world's most important and pressing problems.

When the computer is operational, five casings (like the white one shown at the top of the image) envelop the machine. These cans nest inside each other and act as thermal shields, keeping everything super cold and vacuum-sealed inside.

These photon-carrying cables deliver signals to and from the chip to drive qubit operations and return the measured results.

Beneath the heat exchangers sits the mixing chamber. Inside, different forms of liquid helium (helium-3 and helium-4) separate and evaporate, diffusing the heat.

These gold plates separate cooling zones. At the bottom, they plunge to one-hundredth of a kelvin, hundreds of times colder than outer space.

The QPU (quantum processing unit) features a gold-plated copper disk with a silicon chip inside that contains the machine's brain.

Go here to see the original:

Quantum Computing | Rigetti Computing


IonQ Is First Quantum Startup to Go Public; Will It be First to Deliver Profits? – HPCwire

Posted: at 10:31 pm

On October 1 of this year, IonQ became the first pure-play quantum computing start-up to go public. At this writing, the stock (NYSE: IONQ) was around $15 and its market capitalization was roughly $2.89 billion. Co-founder and chief scientist Chris Monroe says it was fun to have a few of the company's roughly 100 employees travel to New York to ring the opening bell of the New York Stock Exchange. It will also be interesting to listen to IonQ's first scheduled financial results call (Q3) on November 15.

IonQ is in the big leagues now. Wall Street can be brutal as well as rewarding, although these are certainly early days for IonQ as a public company. Founded in 2015 by Monroe and Duke researcher Jungsang Kim, who is the company's CTO, IonQ now finds itself under a new magnifying glass.

How soon quantum computing will become a practical tool is a matter of debate, although there's growing consensus that it will, in fact, become such a tool. There are several competing flavors (qubit modalities) of quantum computing being pursued. IonQ has bet that trapped-ion technology will be the big winner. So confident is Monroe that he suggests other players with big bets on other approaches (think superconducting, for example) are waking up to ion traps' advantages and are likely to jump into ion trap technology as direct competitors.

In a wide-ranging discussion with HPCwire, Monroe talked about ion technology and IonQ's (roughly) three-step plan to scale up quickly; roadblocks facing other approaches (superconducting and photonic); how an IonQ system with about 1,200 physical qubits and home-grown error correction will be able to tackle some applications; and why IonQ is becoming a software company and why that's a good thing.

In ion trap quantum computing, ions are held in position by electromagnetic forces, where they can be manipulated by laser beams. IonQ uses ytterbium (Yb) atoms. Once the atoms are turned into ions by stripping off one valence electron, IonQ uses a specialized chip called a linear ion trap to hold the ions precisely in 3D space. Literally, they sort of float above the surface. This small trap features around 100 tiny electrodes precisely designed, lithographed, and controlled to produce electromagnetic forces that hold our ions in place, isolated from the environment to minimize environmental noise and decoherence, as described by IonQ.

It turns out ions have naturally longer coherence times and therefore require somewhat less error correction and are suitable for longer operations. This is the starting point for IonQ's advantage. Another plus is that system requirements themselves are less complicated and less intrusive (noise-producing) than those for semiconductor-based, superconducting qubits (think of the need to cram control cables into a dilution refrigerator to control superconducting qubits). That said, all of the quantum computing paradigms are plenty complicated.

For the moment, ion traps using lasers to interact with the qubits are one of the most straightforward approaches. The approach has its own scaling challenge, but Monroe contends modular scaling will solve that problem and leverage ion traps' other strengths.

"Repeatability [in manufacturing superconducting qubits] is wonderful but we don't need atomic-scale deposition, like you hear of with five-nanometer feature sizes on the latest silicon chips," said Monroe. "The atoms themselves are far away from the chips; they're 100 microns, i.e. a tenth of a millimeter away, which is miles atomically speaking, so they don't really see all the little imperfections in the chip. I don't want to say it doesn't matter. We put a lot of care into the design and the fab of these chips. The glass trap has certain features; [for example] it's actually a wonderful material for holding off high voltage compared to silicon."

IonQ started with silicon-based traps and is now moving to evaporated glass traps.

"What is interesting is that we've built the trap to have several zones. This is one of our strategies for scale. Right now, at IonQ, we have exactly one chain of atoms, these are the qubits, and we typically have a template of about 32 qubits. That's as many as we control. You might ask, how come you're not doing 3,200 qubits? The reason is, if you have that many qubits, you had better be able to perform lots and lots of operations, and you need very high-quality operations to get there. Right now, the quality of our operation is approaching 99.9%. That is a part-per-1,000 error," said Monroe.

"This is sort of a back-of-the-envelope calculation, but that would mean that you can do about 1,000 ops. There's an intuition here [that] if you have n qubits, you really want to do about n² ops. The reason is, you want these pairwise operations, and you want to entangle all possible pairs. So if you have 30 qubits, you should be able to get to about 1,000 ops. That's sort of where we are now. The reason we don't have 3,200 yet is that if you have 3,200 qubits, you should be able to do 10 million ops, and that means your noise should be one part in 10⁷. We're not there yet. We have a strategy to get there," said Monroe.
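To make Monroe's back-of-the-envelope arithmetic concrete, here is a minimal sketch (our own illustration, not IonQ's model) of how the required operation count and the tolerable per-operation error scale with qubit count:

```python
# Back-of-the-envelope only: with n qubits you want roughly n^2 pairwise
# operations, so the per-operation error must be roughly 1/n^2 for the
# whole circuit to have a reasonable chance of succeeding.

def ops_and_error_budget(n_qubits: int) -> tuple[int, float]:
    """Return (approximate ops wanted, max tolerable error per op)."""
    ops = n_qubits ** 2        # entangle all pairs: ~n^2 two-qubit operations
    max_error = 1.0 / ops      # keep the total error budget of order one
    return ops, max_error

for n in (32, 100, 3200):
    ops, err = ops_and_error_budget(n)
    print(f"{n:>5} qubits -> ~{ops:,} ops -> per-op error below ~{err:.1e}")

# ~32 qubits   -> ~1,000 ops      -> ~1e-3 error (the ~99.9% fidelity regime today)
# ~100 qubits  -> ~10,000 ops     -> ~1e-4 error
# ~3200 qubits -> ~10,000,000 ops -> ~1e-7 error (one part in 10^7)
```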

While you could put more ions in a trap, controlling them becomes more difficult. "Long chains of ions become soft and squishy. A smaller chain is really stiff [and] much less noisy. So 32 is a good number. 16 might be a good number. 64 is a good number, but it's going to be somewhere probably under 100 ions," said Monroe.

The first part of the strategy for scaling is to have multiple chains on a chip that are separated by a millimeter or so, which prevents crosstalk and permits local operations. "It's sort of like a multi-core classical architecture, like the multi-core Pentium or something like that. This may sound exotic, but we actually physically move the atoms; we bring them together, the multiple chains, to connect them. There are no real wires. This is sort of the first [step] in rolling out a modular scale-up," said Monroe.

In proof-of-concept work, IonQ announced the ability to arbitrarily move four chains of 16 atoms around in a trap, bringing them together and separating them without losing any of the atoms. "It wasn't a surprise we were able to do that," said Monroe. "But it does take some design in laying out the electrodes. It's exactly like surfing, you know; the atoms are actually surfing on an electric field wave, and you have to design and implement that wave. That was the main result there. In 2022, we're going to use that architecture in one of our new systems to actually do quantum computations."

There are two more critical steps in IonQ's plan for scaling. Error correction is one. Clustering the chips together into larger systems is the other. Monroe tackled the latter first.

"Think about modern datacenters, where you have a bunch of computers that are hooked together by optical fibers. That's truly modular, because we can kind of plug and play with optical fibers," said Monroe. He envisions something similar for trapped-ion quantum computers. Frankly, everyone in the quantum computing community is looking at clustering approaches and how to use them effectively to scale smaller systems into larger ones.

"This interface between individual atom qubits and photonic qubits has been done. In fact, my lab at the University of Maryland did this for the first time in 2007. That was 14 years ago. We know how to do this, how to move memory quantum bits of an atom onto a propagating photon, and actually, you do it twice. If you have a chip over here and a chip over here, you bring two fibers together, and they interfere and you detect the photons. That basically makes these two photons entangled. We know how to do that."

"Once we get to that level, then we're sort of in manufacturing mode," said Monroe. "We can stamp out chips. We imagine having rack-mounted chips, probably multicore. Maybe we'll have several hundred atoms on that chip, and a few of the atoms on the chip will be connected to optical conduits, and that allows us to connect to the next rack-mounted system," he said.

The key enabler, said Monroe, is a nonblocking optical switch. "Think of it as an old telephone operator. Let's say they have 100 input ports and 100 output ports, and the operator connects any input to any output. Now, there are a lot of connections, a lot of possibilities there. But these things exist, these automatic operators using mirrors and so forth. They're called n-by-n nonblocking optical switches, and you can reconfigure them," he said.

"What's cool about that is you can imagine having several hundred rack-mounted, multi-core quantum computers, and you feed them into this optical switch, and you can then connect any multi-core chip to any other multi-core chip. The software can tell you exactly how you want to network. That's very powerful as an architecture because we have a so-called full connection there. We won't have to move information to a nearest neighbor and shuttle it around to swap; we can just do it directly, no matter where you are," said Monroe.
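For readers who want a mental model of the n-by-n switch Monroe describes, the sketch below (purely illustrative; the class, port counts, and routing calls are assumptions, not IonQ software) treats the switch as a reconfigurable one-to-one mapping between chip ports that control software can rewrite between jobs:

```python
# Illustrative model only (not IonQ software): an n-by-n non-blocking optical
# switch viewed as a reconfigurable one-to-one mapping between photonic ports,
# so any multi-core chip can be linked to any other and re-linked later.

class OpticalSwitch:
    def __init__(self, n_ports: int):
        self.n_ports = n_ports
        self.routes: dict[int, int] = {}        # input port -> output port

    def connect(self, a: int, b: int) -> None:
        """Route port a to port b if both are currently free."""
        if a in self.routes or b in self.routes.values():
            raise ValueError("port already in use")
        self.routes[a] = b

    def reconfigure(self, new_routes: dict[int, int]) -> None:
        """Tear down every connection and apply a new routing in one step."""
        self.routes = {}
        for a, b in new_routes.items():
            self.connect(a, b)

# Three hypothetical rack-mounted chips on ports 0, 1 and 2: first pair chip 0
# with chip 1, then rewire so chip 0 exchanges photons with chip 2 instead.
switch = OpticalSwitch(n_ports=100)
switch.reconfigure({0: 1})
switch.reconfigure({0: 2})
print(switch.routes)                            # {0: 2}
```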

The third leg is error correction, which without question is a daunting challenge throughout quantum computing. The relative unreliability of qubits means you need many redundant physical qubits (estimates vary widely on how many) to have a single reliable logical qubit. Ions are among the better-behaved qubits. For starters, all the ions are literally identical and not subject to manufacturing defects. A slight downside is that ion qubit switching speeds are slower than in other modalities, which some observers say may hamper efficient error correction.

Said Monroe, "The nice thing about trapped-ion qubits is their errors are already pretty good natively. Passively, without any fancy stuff, we can get to three or four nines[i] before we run into problems."

What are those problems? "I don't want to say they're fundamental, but there are brick walls that require a totally different architecture to get around," said Monroe. "But we don't need to get better than three or four nines because of error correction. This is sort of a software encoding. The price you pay for error correction, just like in classical error correction encoding, is you need a lot more bits to redundantly encode. The same is true in quantum. Unfortunately, with quantum there are many more ways you can have an error."

Just how many physical qubits are needed for a logical qubit is something of an open question.

"It depends what you mean by logical qubit. There's a difference in philosophy in the way we're going forward compared to many other platforms. Some people have this idea of fault-tolerant quantum computing, which means that you can compute infinitely long if you want. It's a beautiful theoretical result. If you encode in a certain way, with enough overhead, you can actually run gates as long as you want. But to get to that level, the overhead is something like 100,000 to one, [and] in some cases a million to one, but that logical qubit is perfect, and you get to go as far as you want [in terms of number of gate operations]," he said.

IonQ is taking a different tack, one that leverages software more than hardware thanks to ions' stability and the less noisy overall support system (the ion trap). He likens improving qubit quality to buying a nine in the commonly used five-nines vernacular of reliability. Five nines (99.999 percent) is used to describe availability or, put another way, time between shutdowns because of error.

"We're going to gradually leak in error correction only as needed. So we're going to buy a nine with an overhead of about 16 physical qubits to one logical qubit. With another overhead of 32 to one, we can buy another nine. By then we will have five nines and several hundred logical qubits. This is where things are going to get interesting, because then we can do algorithms that you can't simulate classically, [such] as some of these financial models we're doing now. This is optimizing some function, but it's doing better than the classical result. That's where we think we will be at that point," he said.
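A rough way to picture "buying a nine": each layer of encoding multiplies the physical-to-logical qubit overhead and cuts the logical error rate by roughly a factor of ten. The sketch below compounds the 16:1 and 32:1 figures Monroe mentions; treating them as multiplicative layers is our own simplifying assumption, not IonQ's published error-correction design.

```python
import math

# Illustration of "buying a nine": start near three nines of native fidelity,
# then each layer of encoding costs extra physical qubits per logical qubit
# and gains roughly one more nine. The 16:1 and 32:1 ratios come from the
# interview; compounding them multiplicatively is our own simplification.

error = 1e-3                     # ~99.9% native gate fidelity ("three nines")
physical_per_logical = 1

for overhead in (16, 32):        # two successive layers of encoding
    physical_per_logical *= overhead
    error /= 10                  # assume each layer buys one more nine
    nines = -math.log10(error)
    print(f"{physical_per_logical:>4} physical qubits per logical qubit -> "
          f"~{nines:.0f} nines ({(1 - error) * 100:.3f}% fidelity)")

#   16 physical per logical -> ~4 nines (99.990% fidelity)
#  512 physical per logical -> ~5 nines (99.999% fidelity)
```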

Monroe didn't go into detail about exactly how IonQ does this, but he emphasized that software is the big driver now at IonQ. "Our whole approach at IonQ is to throw everything up to software as much as we can. That's because we have these perfectly replicable atomic qubits, and we don't have manufacturing errors, we don't have to worry about a yield or anything like that; everything is a control problem."

So how big a system do you need to run practical applications?

"That's a really good question, because I can safely say we don't exactly know the answer to that. What we do know is that if you get to about 100 qubits, maybe 72, or something like that, and these qubits are good enough, meaning that you can do tens of thousands of ops. Remember, with 100 qubits you want to do about 10,000 ops to do something you can't simulate classically. This is where you might deploy some machine learning techniques that you would never be able to do classically. That's probably where the lowest-hanging fruit are," said Monroe.

"Now for us to get to 100 [good] qubits and say 50,000 ops, that requires about 1,000 physical qubits, maybe 1,500 physical qubits. We're looking at 1,200 physical qubits, and this might be 16 cores with 64 ions in each core before we have to go to photonic connections. But the photonic connection is the key because [it's] where you start to have a truly modular data center. You can stamp these things out. At that point, we're just going to be making these things like crazy, and wiring them together. I think we'll be able to do interesting things before we get to that stage, and it will be important if we can show some kind of value (application results/progress) and that we have the recipe for scaling indefinitely; that's a big deal," he said.

It is probably going too far to say that Monroe believes scaling up IonQ's quantum computer is now just a straightforward engineering task, but it sometimes sounds that way. The biggest technical challenges, he suggests, are largely solved. Presumably, IonQ will successfully demonstrate its modular architecture in 2022. He said competing approaches (superconducting and all-photonics, for example) won't be able to scale. They are stuck, he said.

"I think they will see atomic systems as being less exotic than they once thought. I mean, we think of computers as built from silicon and as solid state. For better or worse, you have companies that forgot that they're supposed to build computers, not silicon or superconductors. I think we're going to see a lot more fierce competition on our own turf," said Monroe. There are ion trap rivals; Honeywell is one such rival (Honeywell has announced plans to merge with Cambridge Quantum), said Monroe.

His view of the long term is interesting. As science and hardware issues are solved, software will become the driver. IonQ already has a substantial software team. The company uses machine learning now to program its control system, elements such as the laser pulses and connectivity. "We're going to be a software company in the long haul, and I'm pretty happy with that," said Monroe.

IonQ has already integrated with the three big cloud providers' (AWS, Google, Microsoft) quantum offerings, embraced the growing ecosystem of software and tools providers, and has APIs for use with a variety of tools. Monroe, like many in the quantum community, is optimistic but not especially precise about when practical applications will appear. Sometime in the next three years is a good guess, he suggests. As for which application area will be first, it may not matter, in the sense that he thinks as soon as one domain shows benefit (e.g. finance or ML) other domains will rush in.

These are heady times at IonQ, as they are throughout quantum computing. Stay tuned.

[i] He likens improving qubit quality to buying a nine in the commonly used five-nines vernacular of reliability. Five nines (99.999 percent) is used to describe availability or, put another way, time between shutdowns because of error.

Read more from the original source:

IonQ Is First Quantum Startup to Go Public; Will It be First to Deliver Profits? - HPCwire


Quantum computers: Eight ways quantum computing is going to change the world – ZDNet

Posted: at 10:31 pm

From simulating new and more efficient materials to predicting how the stock market will change with greater precision, the ramifications of quantum computing for businesses are potentially huge.

The world's biggest companies are now launching quantum computing programs, and governments are pouring money into quantum research. For systems that have yet to prove useful, quantum computers are certainly garnering lots of attention.


The reason is that quantum computers, although still far from having reached maturity, are expected to eventually usher in a whole new era of computing -- one in which the hardware is no longer a constraint when resolving complex problems, meaning that some calculations that would take years or even centuries for classical systems to complete could be achieved in minutes.

From simulating new and more efficient materials to predicting how the stock market will change with greater precision, the ramifications for businesses are potentially huge. Here are eight quantum use cases that leading organisations are exploring right now, which could radically change the game across entire industries.

The discovery of new drugs relies in part on a field of science known as molecular simulation, which consists of modelling the way that particles interact inside a molecule to try and create a configuration that's capable of fighting off a given disease.

Those interactions are incredibly complex and can assume many different shapes and forms, meaning that accurate prediction of the way that a molecule will behave based on its structure requires huge amounts of calculation.

Doing this manually is impossible, and the size of the problem is also too large for today's classical computers to take on. In fact, it's expected that modelling a molecule with only 70 atoms would take a classical computer up to 13 billion years.

This is why discovering new drugs takes so long: scientists mostly adopt a trial-and-error approach, in which they test thousands of molecules against a target disease in the hope that a successful match will eventually be found.

Quantum computers, however, have the potential to one day resolve the molecular simulation problem in minutes. The systems are designed to be able to carry out many calculations at the same time, meaning that they could seamlessly simulate all of the most complex interactions between particles that make up molecules, enabling scientists to rapidly identify candidates for successful drugs.

This would mean that life-saving drugs, which currently take an average 10 years to reach the market, could be designed faster -- and much more cost-efficiently.

Pharmaceutical companies are paying attention: earlier this year, healthcare giant Roche announced a partnership with Cambridge Quantum Computing (CQC) to support efforts in research tackling Alzheimer's disease.

And smaller companies are also taking an interest in the technology. Synthetic biology start-up Menten AI, for example, has partnered with quantum annealing company D-Wave to explore how quantum algorithms could help design new proteins that could eventually be used as therapeutic drugs.

From powering cars to storing renewable energy, batteries are already supporting the transition to a greener economy, and their role is only set to grow. But they are far from perfect: their capacity is still limited, and so is their charging speed, which means that they are not always a suitable option.

One solution consists of searching for new materials with better properties to build batteries. This is another molecular simulation problem -- this time modelling the behaviour of molecules that could be potential candidates for new battery materials.


Similar to drug design, therefore, battery design is another data-heavy job that's better suited to a quantum computer than a classical device.

This is why German car manufacturer Daimler has now partnered with IBM to assess how quantum computers could help simulate the behaviour of sulphur molecules in different environments, with the end goal of building lithium-sulphur batteries that are better-performing, longer-lasting and less expensive than today's lithium-ion ones.

Despite the vast amounts of compute power available from today's cutting-edge supercomputers, weather forecasts -- particularly longer-range ones -- can still be disappointingly inaccurate. This is because there are countless ways that a weather event might manifest itself, and classical devices are incapable of ingesting all of the data required for a precise prediction.

On the other hand, just as quantum computers could simulate all of the particle interactions going on within a molecule at the same time to predict its behaviour, so could they model how innumerable environmental factors all come together to create a major storm, a hurricane or a heatwave.


And because quantum computers would be able to analyse virtually all of the relevant data at once, they are likely to generate predictions that are much more accurate than current weather forecasts. This isn't only good for planning your next outdoor event: it could also help governments better prepare for natural disasters, as well as support climate-change research.

Research in this field is quieter, but partnerships are emerging to take a closer look at the potential of quantum computers. Last year, for instance, the European Centre for Medium-Range Weather Forecasts (ECMWF) launched a partnership with IT company Atos that included access to Atos's quantum computing simulator, in a bid to explore how quantum computing may impact weather and climate prediction in the future.

JP Morgan, Goldman Sachs and Wells Fargo are all actively investigating the potential of quantum computers to improve the efficiency of banking operations -- a use case often put forward as one that could come with big financial rewards.

There are several ways that the technology could support the activities of banks, but one that's already showing promise is the application of quantum computing to a procedure known as Monte Carlo simulation.


The Monte Carlo operation consists of pricing financial assets based on how the price of related assets changes over time, meaning that it's necessary to account for the risk inherent in different options, stocks, currencies and commodities. The procedure essentially boils down to predicting how the market will evolve -- an exercise that becomes more accurate with larger amounts of relevant data.

Quantum computers' unprecedented computation abilities could speed up Monte Carlo calculations by up to 1,000 times, according to research carried out by Goldman Sachs together with quantum computing company QC Ware. In even more promising news, Goldman Sachs' quantum engineers have now tweaked their algorithms to be able to run the Monte Carlo simulation on quantum hardware that could be available in as little as five years' time.
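To see what kind of workload is being accelerated here, the toy pricer below sketches a classical Monte Carlo estimate of a European call option under geometric Brownian motion. The model and every parameter value are illustrative assumptions, not taken from the Goldman Sachs / QC Ware research:

```python
import math
import random

# Toy classical Monte Carlo pricer for a European call option under geometric
# Brownian motion -- purely illustrative, not a production pricing model.

def monte_carlo_call_price(spot, strike, rate, vol, maturity, n_paths):
    """Average the discounted payoff over many simulated terminal prices."""
    total_payoff = 0.0
    for _ in range(n_paths):
        z = random.gauss(0.0, 1.0)                      # one random market scenario
        s_t = spot * math.exp((rate - 0.5 * vol ** 2) * maturity
                              + vol * math.sqrt(maturity) * z)
        total_payoff += max(s_t - strike, 0.0)
    return math.exp(-rate * maturity) * total_payoff / n_paths

# Accuracy improves only as 1/sqrt(n_paths), which is why classical Monte Carlo
# needs so many samples -- and why even a quadratic quantum speedup is valuable.
print(monte_carlo_call_price(spot=100, strike=105, rate=0.01,
                             vol=0.2, maturity=1.0, n_paths=100_000))
```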

For decades, researchers have tried to teach classical computers how to associate meaning with words to try and make sense of entire sentences. This is a huge challenge given the nature of language, which functions as an interactive network: rather than being the 'sum' of the meaning of each individual word, a sentence often has to be interpreted as a whole. And that's before even trying to account for sarcasm, humour or connotation.

As a result, even state-of-the-art natural language processing (NLP) classical algorithms can still struggle to understand the meaning of basic sentences. But researchers are investigating whether quantum computers might be better suited to representing language as a network -- and, therefore, to processing it in a more intuitive way.

The field is known as quantum natural language processing (QNLP), and is a key focus of Cambridge Quantum Computing (CQC). The company has already experimentally shown that sentences can be parameterised on quantum circuits, where word meanings can be embedded according to the grammatical structure of the sentence. More recently, CQC released lambeq, a software toolkit for QNLP that can convert sentences into a quantum circuit.

A salesman is given a list of cities they need to visit, as well as the distance between each city, and has to come up with the route that will save the most travel time and cost the least money. As simple as it sounds, the 'travelling salesman problem' is one that many companies are faced with when trying to optimise their supply chains or delivery routes.

With every new city that is added to the salesman list, the number of possible routes multiplies. And at the scale of a multinational corporation, which is likely to be dealing with hundreds of destinations, a few thousand fleets and strict deadlines, the problem becomes much too large for a classical computer to resolve in any reasonable time.
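A brute-force sketch makes the combinatorial explosion concrete: with n cities there are (n - 1)! candidate tours from a fixed start, so exhaustive search works for a handful of cities and becomes hopeless long before the scale described above. The distance matrix below is invented purely for illustration:

```python
import math
from itertools import permutations

# Brute-force travelling salesman: try every ordering of the remaining cities.
# Feasible for toy inputs only, because the number of tours grows factorially.

def shortest_tour(dist):
    """Find the cheapest tour that starts and ends at city 0."""
    n = len(dist)
    best_tour, best_len = None, math.inf
    for perm in permutations(range(1, n)):
        tour = (0,) + perm
        length = sum(dist[tour[i]][tour[i + 1]] for i in range(n - 1))
        length += dist[tour[-1]][tour[0]]        # return to the starting city
        if length < best_len:
            best_tour, best_len = tour, length
    return best_tour, best_len

dist = [[0, 2, 9, 10],
        [2, 0, 6, 4],
        [9, 6, 0, 3],
        [10, 4, 3, 0]]
print(shortest_tour(dist))       # 4 cities: only 3! = 6 tours to check
print(math.factorial(19))        # 20 cities: ~1.2e17 tours -- already hopeless
```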

Energy giant ExxonMobil, for example, has been trying to optimise the daily routing of merchant ships crossing the oceans -- that is, more than 50,000 ships carrying up to 200,000 containers each, to move goods with a total value of $14 trillion.


Some classical algorithms exist already to tackle the challenge. But given the huge number of possible routes to explore, the models inevitably have to resort to simplifications and approximations. ExxonMobil, therefore, teamed up with IBM to find out if quantum algorithms could do a better job.

Quantum computers' ability to take on several calculations at once means that they could run through all of the different routes in tandem, allowing them to discover the most optimal solution much faster than a classical computer, which would have to evaluate each option sequentially.

ExxonMobil's results seem promising: simulations suggest that IBM's quantum algorithms could provide better results than classical algorithms once the hardware has improved.

Optimising the timing of traffic signals in cities, so that they can adapt to the number of vehicles waiting or the time of day, could go a long way towards smoothing the flow of vehicles and avoiding congestion at busy intersections.

This is another problem that classical computers find hard: the more variables there are, the more possibilities have to be computed by the system before the best solution is found. But as with the travelling salesman problem, quantum computers could assess different scenarios at the same time, reaching the most optimal outcome a lot more rapidly.

Microsoft has been working on this use case together with Toyota Tsusho and quantum computing startup Jij. The researchers have begun developing quantum-inspired algorithms in a simulated city environment, with the goal of reducing congestion. According to the experiment's latest results, the approach could bring down traffic waiting times by up to 20%.

Modern cryptography relies on keys that are generated by algorithms to encode data, meaning that only parties granted access to the key have the means to decrypt the message. The risk, therefore, is two-fold: hackers can either intercept the cryptography key to decipher the data, or they can use powerful computers to try and predict the key that has been generated by the algorithm.

This is because classical security algorithms are deterministic: a given input will always produce the same output, which means that with the right amount of compute power, a hacker can predict the result.

This approach requires extremely powerful computers, and isn't considered a near-term risk for cryptography. But hardware is improving, and security researchers are increasingly warning that more secure cryptography keys will be needed at some point in the future.

One way to strengthen the keys, therefore, is to make them entirely random and illogical -- in other words, impossible to guess mathematically.

And as it turns out, randomness is a fundamental part of quantum behaviour: the particles that make up a quantum processor, for instance, behave in completely unpredictable ways. This behaviour can, therefore, be used to determine cryptography keys that are impossible to reverse-engineer, even with the most powerful supercomputer.
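The toy snippet below illustrates the contrast being drawn: a seeded pseudo-random generator reproduces exactly the same "key material" every time, whereas entropy drawn from outside the algorithm does not. It is a simplified illustration of the principle, not a real key-generation scheme and not how Nu Quantum's hardware works:

```python
import random
import secrets

# Toy illustration of determinism, not a real key-generation scheme: a seeded
# pseudo-random generator reproduces exactly the same bytes every time, so an
# attacker who can guess the seed can regenerate the "key". Quantum random
# number generators aim to remove that predictable seed entirely.

def pseudo_key(seed: int, n_bytes: int = 16) -> bytes:
    rng = random.Random(seed)                     # same seed -> same output, always
    return bytes(rng.randrange(256) for _ in range(n_bytes))

print(pseudo_key(1234) == pseudo_key(1234))       # True: fully reproducible
print(secrets.token_bytes(16).hex())              # OS entropy: not derivable from a seed
```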

Random number generation is an application of quantum computing that is already nearing commercialisation. UK-based startup Nu Quantum, for example, is finalizing a system that can measure the behavior of quantum particles to generate streams of random numbers that can then be used to build stronger cryptography keys.

See the rest here:

Quantum computers: Eight ways quantum computing is going to change the world - ZDNet


An Early Investor In Twitch Explains Why He’s Betting Big On Quantum Computing – Forbes

Posted: at 10:31 pm

David Cowan had already been an investor at Bessemer Venture Partners for 20 years when he came across an upstart company that was rapidly building an audience around a novel idea: watching other people play video games. The company was called Twitch. Shortly thereafter, Cowan and Bessemer led a $15 million Series B investment in the business. Less than two years later, Amazon came calling with an acquisition offer Twitch and Cowan couldn't refuse. It was, in many ways, the dream scenario for a venture capitalist.

But it wasn't long before Cowan began to have regrets.

An early backer of Twitch, David Cowan is now investing in the transformative potential of quantum computers like this one.

"I invested in the company at like a $65 million pre-money (valuation)," Cowan says today. "And then 18 months later I had the opportunity to sell it for a billion dollars. And I thought, 'Hurray.' And that was a big mistake. Because, you know, only two years later, the company was clearly worth $10 billion."

"I'd say the biggest lesson of that was that I had to recalibrate my expectations for what successful companies can do."

These days, Cowan spends his days investing in areas like space technology, cybersecurity and sustainable agriculture, sectors you might describe as deep tech or frontier tech. I spoke to him over Zoom this week about one particular investment that's been making headlines this month. And by the sounds of it, underestimating this company's potential is not going to be a problem.

Still a partner at Bessemer, Cowan is now also an investor in, and a board member at, Rigetti Computing, a quantum computing company that agreed to go public in early October by merging with a SPAC at a $1.5 billion valuation. That's up from $129 million when Bessemer took its stake in the company last year.

I've tried before to explain quantum computing, and you can certainly find other explanations elsewhere, so I won't go into too much detail here. Suffice it to say that quantum computers are a new kind of machine that exploits the inherent strangeness of very small particles to perform immensely complicated calculations, with the potential to be trillions of times more powerful than current supercomputers.

If the industry fulfills that potential, Cowan believes the consequences will be incredible.

"I mean, simply put, curing cancer," he said.

Perhaps the most exciting applications of quantum computing are in medicine. There are trillions of atoms in each cell and trillions of cells in the human body, all interacting with each other in an unceasing biological dance. Current supercomputers are seriously powerful machines, but unspooling that kind of choreography is beyond their reach.

It's also beyond the reach of modern quantum computers. The technology for these machines is still in its adolescence. Theoretically, though, a quantum computer could map the way molecules and data points interact in previously unimaginable ways. And doctors and researchers could use those maps to find new therapies and cures.

The potential is equally vast in a wide range of other industries.

"It's not going to change how you get your scoop of ice cream from the local store," Cowan said. "But anything that requires machine learning or optimization, or certainly anything that requires an understanding of physics (like biology, chemistry, materials), anything that involves simulation, like designing airplanes or cars, anything that uses heavy computation, which of course is lots and lots of interesting industries, all of those will get a huge boost."

Different companies are trying to build quantum computers in different ways. Rigetti's technology is based on superconducting qubits, qubits being the quantum computing analog of the bits in a traditional computer. In Cowan's view, Rigetti is engaged in a three-way race for supremacy in the superconductor space. You might have heard of its two rivals: Google and IBM.

But whats that old saying about the size of the dog in the fight?

David Cowan has been at Bessemer since 1992.

Why do I like Rigetti? Well, two reasons. One is that I can't buy a big piece of Google or IBM, Cowan said with a grin. But the second thing is that I've seen in many industries that, as formidable as the major tech companies are, a committed dedicated startup will usually out-innovate the tech giants. And so even though Google and IBM have more money and more people, I still believe that Rigetti is going to way outpace them."

Rigetti will bring in $458 million in proceeds from its SPAC merger to help fund its ongoing R&D and bring its quantum computing technology to market. Wall Street heavyweights T. Rowe Price and Franklin Templeton are both taking part in a $100 million PIPE investment to support the deal. So too is In-Q-Tel, the venture arm of the Central Intelligence Agency. And so too are Bessemer and Cowan, another sign of his belief in Rigetti's long-term potential.

"I'm a buyer, not a seller," Cowan said. "This has the opportunity to become one of the massive tech companies on the planet. I mean, this is no less important than the transistor for the 20th century in terms of computation."

It will be a while before we find out one way or the other. Quantum computers aren't going to fully replace modern supercomputers any time soon. The technology is still developing. A lot could change for Rigetti over the next decade. One thing is certain, though: this time around, Cowan isn't going to have any regrets about cashing out early.

Who knows when, who knows how much money it'll take. It's a risky venture, Cowan said. But for this one, the payoff is worth it.

Go here to see the original:

An Early Investor In Twitch Explains Why He's Betting Big On Quantum Computing - Forbes


Innovative Chip Resolves Quantum Headache Paves Road to Supercomputer of the Future – SciTechDaily

Posted: at 10:31 pm

Size comparison of qubits: the illustration shows the size difference between spin qubits and superconducting qubits. Credit: University of Copenhagen

Quantum physicists at the University of Copenhagen are reporting an international achievement for Denmark in the field of quantum technology. By simultaneously operating multiple spin qubits on the same quantum chip, they surmounted a key obstacle on the road to the supercomputer of the future. The result bodes well for the use of semiconductor materials as a platform for solid-state quantum computers.

One of the engineering headaches in the global marathon towards a large functional quantum computer is the control of many basic memory devices (qubits) simultaneously. This is because the control of one qubit is typically negatively affected by simultaneous control pulses applied to another qubit. Now, a pair of young quantum physicists at the University of Copenhagen's Niels Bohr Institute, PhD student (now postdoc) Federico Fedele, 29, and Asst. Prof. Anasua Chatterjee, 32, working in the group of Assoc. Prof. Ferdinand Kuemmeth, have managed to overcome this obstacle.

The brain of the quantum computer that scientists are attempting to build will consist of many arrays of qubits, similar to the bits on smartphone microchips. They will make up the machine's memory.

The famous difference is that while an ordinary bit can either store data in the state of a 1 or 0, a qubit can reside in both states simultaneously, a property known as quantum superposition, which makes quantum computing exponentially more powerful.

Global qubit research is based on various technologies. While Google and IBM have come far with quantum processors based on superconductor technology, the UCPH research group is betting on semiconductor qubits known as spin qubits.

Broadly speaking, they consist of electron spins trapped in semiconducting nanostructures called quantum dots, such that individual spin states can be controlled and entangled with each other, explains Federico Fedele.

Spin qubits have the advantage of maintaining their quantum states for a long time. This potentially allows them to perform faster and more flawless computations than other platform types. And they are so minuscule that far more of them can be squeezed onto a chip than with other qubit approaches. The more qubits, the greater a computer's processing power. The UCPH team has extended the state of the art by fabricating and operating four qubits in a 2×2 array on a single chip.

Thus far, the greatest focus of quantum technology has been on producing better and better qubits. Now it's about getting them to communicate with each other, explains Anasua Chatterjee:

"Now that we have some pretty good qubits, the name of the game is connecting them in circuits which can operate numerous qubits, while also being complex enough to be able to correct quantum calculation errors. Thus far, research in spin qubits has gotten to the point where circuits contain arrays of 2×2 or 3×3 qubits. The problem is that their qubits are only dealt with one at a time."

Pictured: Federico Fedele, Anasua Chatterjee, and Ferdinand Kuemmeth. Credit: University of Copenhagen

It is here that the young quantum physicists' quantum circuit, made from the semiconducting substance gallium arsenide and no larger than a bacterium, makes all the difference:

"The new and truly significant thing about our chip is that we can simultaneously operate and measure all qubits. This has never been demonstrated before with spin qubits, nor with many other types of qubits," says Chatterjee, who is one of two lead authors of the study, which has recently been published in the journal PRX Quantum.

The four spin qubits in the chip are made of the semiconducting material gallium arsenide. Situated between the four qubits is a larger quantum dot that connects the four qubits to each other, and which the researchers can use to tune all of the qubits simultaneously.

Being able to operate and measure simultaneously is essential for performing quantum calculations. Indeed, if you have to measure qubits at the end of a calculation (that is, stop the system to get a result), the fragile quantum states collapse. Thus, it is crucial that measurement is synchronous, so that the quantum states of all qubits are shut down simultaneously. If qubits are measured one by one, the slightest ambient noise can alter the quantum information in a system.

The realization of the new circuit is a milestone on the long road to a semiconducting quantum computer.

"To get more powerful quantum processors, we have to not only increase the number of qubits, but also the number of simultaneous operations, which is exactly what we did," states Professor Kuemmeth, who directed the research.

At the moment, one of the main challenges is that the chip's 48 control electrodes need to be tuned manually, and kept tuned continuously despite environmental drift, which is a tedious task for a human. That's why his research team is now looking into how optimization algorithms and machine learning could be used to automate tuning. To allow fabrication of even larger qubit arrays, the researchers have begun working with industrial partners to fabricate the next generation of quantum chips. Overall, the synergistic efforts from computer science, microelectronics engineering, and quantum physics may then lead spin qubits to the next milestones.

Reference: "Simultaneous Operations in a Two-Dimensional Array of Singlet-Triplet Qubits" by Federico Fedele, Anasua Chatterjee, Saeed Fallahi, Geoffrey C. Gardner, Michael J. Manfra and Ferdinand Kuemmeth, 8 October 2021, PRX Quantum. DOI: 10.1103/PRXQuantum.2.040306

Go here to see the original:

Innovative Chip Resolves Quantum Headache Paves Road to Supercomputer of the Future - SciTechDaily


Is This the Right Time for a Cryptography Risk Assessment? – Security Boulevard

Posted: at 10:31 pm

If you're having trouble getting a handle on your cryptographic instances, you're not alone. According to Ponemon Institute's most recent Global Encryption Trends Study, "Discovering where sensitive data resides is the number one challenge."[i] And it's no surprise given the surge in cryptographic use cases spawned from modern IT practices such as DevOps, machine identity, cloud, and multi-cloud environments.

Discussions at the DHS (Department of Homeland Security) and NIST (National Institute of Standards and Technology) are urgently raising awareness among public and private organizations of the need to find tools and methods that will give them visibility into their cryptographic instances in order to be able to monitor them.

Many information technology (IT) and operational technology (OT) systems are dependent on public-key cryptography, but many organizations have no inventory of where that cryptography is used. This makes it difficult to determine where and with what priority post-quantum algorithms will need to replace the current public-key systems. Tools are urgently needed to facilitate the discovery of where and how public-key cryptography is being used in existing technology infrastructures.[1] This concern was raised by NIST in a recent report on adopting and using post-quantum algorithms.

DHS recently partnered with NIST to create a roadmap designed to reduce the risks that are expected with advancements in technology, particularly quantum computing. The roadmap provides a guide for chief information officers on how to mitigate risks, advising them to: stay on top of changing standards, inventory and prioritize systems and datasets, audit vulnerabilities, and use the gathered information for transition planning. In the statement, Homeland Security Secretary Alejandro N. Mayorkas advised, "Now is the time for organizations to assess and mitigate their related risk exposure. As we continue responding to urgent cyber challenges, we must also stay ahead of the curve by focusing on strategic, long-term goals."

The roadmap ostensibly advises organizations to embark on what industry analyst Gartner refers to as a Cryptographic Center of Excellence (CryptoCoE), which is a group within an organization that takes ownership of an enterprise-wide strategy for crypto and PKI: discovering, inventorying, monitoring, and executing.

By organizing the people, protocols, processes, and technology needed to prepare for quantum resilience, CIOs are laying the foundation for a strong crypto strategy and building a CryptoCoE within their organization to enforce governance and compliance and bring crypto agility.

Crypto agility describes a way of implementing cryptography that shouldn't be limited to preparations for post-quantum computing. Crypto agility means that cryptographic updates can be made without causing business disruption, ensuring that algorithm replacement is relatively straightforward and can happen without changing the function of an application. This means being prepared to easily transition to new requirements as they are updated by standards groups and regulatory bodies. Requirements and regulations change in order to keep up with a threat climate that is always in motion, necessitating stronger algorithms and longer key lengths.
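One way to picture crypto agility in application code is to keep the algorithm behind a stable interface and treat the choice of cipher suite as configuration, so a future post-quantum suite can be swapped in without touching application logic. The sketch below is a hypothetical pattern with placeholder names, not Entrust's implementation and not a recommendation of specific algorithms:

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical pattern (placeholder names throughout): the application calls a
# stable interface, and the cipher suite behind it is configuration, so moving
# to a post-quantum suite later means changing a registry entry, not rewriting
# the application.

@dataclass
class CipherSuite:
    name: str
    encrypt: Callable[[bytes, bytes], bytes]   # (key, plaintext) -> ciphertext
    decrypt: Callable[[bytes, bytes], bytes]

REGISTRY: dict[str, CipherSuite] = {}

def register(suite: CipherSuite) -> None:
    REGISTRY[suite.name] = suite

def encrypt_record(data: bytes, key: bytes, suite_name: str) -> bytes:
    """Application code never hardcodes an algorithm; it names a configured suite."""
    return REGISTRY[suite_name].encrypt(key, data)

# Toy cipher purely so the example runs -- NOT secure, NOT a recommendation.
def xor_bytes(key: bytes, data: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

register(CipherSuite("toy-xor", encrypt=xor_bytes, decrypt=xor_bytes))
ACTIVE_SUITE = "toy-xor"          # later: a post-quantum or hybrid suite name
print(encrypt_record(b"hello", key=b"k", suite_name=ACTIVE_SUITE))
```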

Another driver for having an accurate picture of your cryptographic inventory is to know what certificates are in use throughout the organization, if they are in compliance, and when they expire. Certificate expiry causes outages that make business applications unavailable. Outages can be costly, cause potential breach of service-level agreements, and damage brand reputation.

The sooner an organization can gain visibility into all of its cryptographic instances, which means going behind the endpoints to uncover SSH keys, crypto libraries, and hardcoded cryptography hidden inside of hosts and applications, the better prepared it will be to avoid data breaches and maintain compliance as new key lengths and algorithms are required to defend an organization from known threats. If you're wondering whether or not it's time to perform an enterprise-wide cryptography risk assessment, the time is now.

Other Resources:

DHS releases roadmap to post-quantum cryptography


Getting Ready for Post-Quantum Cryptography: Exploring Challenges Associated with Adopting and Using Post-Quantum Cryptographic Algorithms

NIST 4-28-2021, https://nvlpubs.nist.gov/nistpubs/CSWP/NIST.CSWP.04282021.pdf

[1] The National Institute of Standards and Technology (NIST), https://nvlpubs.nist.gov/nistpubs/CSWP/NIST.CSWP.04282021.pdf

[i] 2021 Global Encryption Trends Study, Ponemon Institute


*** This is a Security Bloggers Network syndicated blog from Entrust Blog authored by Diana Gruhn. Read the original post at: https://www.entrust.com/blog/2021/11/is-this-the-right-time-for-a-cryptography-risk-assessment/

Continued here:

Is This the Right Time for a Cryptography Risk Assessment? - Security Boulevard


Quantum Xchange Joins the Hudson Institute’s Quantum Alliance Initiative – PRNewswire

Posted: at 10:31 pm

BETHESDA, Md., Nov. 3, 2021 /PRNewswire/ -- Quantum Xchange, delivering the future of encryption with its leading-edge key distribution platform, today announced its membership with the Hudson Institute's Quantum Alliance Initiative (QAI), a consortium of companies, institutions, and universities whose mission is to raise awareness and develop policies that promote the critical importance of U.S. leadership in quantum technology, while simultaneously working to ensure that the nation's commercial businesses, government agencies, and digital infrastructure will be safe from a future quantum computer cyberattack by 2025.

The arrival of quantum computers is expected to break popular encryption methods, e.g., Public Key Encryption (PKE), widely used to protect nearly every aspect of digital life. Earlier this month, the U.S. Department of Homeland Security released guidance to help organizations prepare for the largest cryptographic transition in the history of computing with Secretary Mayorkas stating, "We must prepare now to protect the confidentiality of data that already exists today and remains sensitive in the future." Despite these early warnings, most U.S. businesses and federal agencies have taken a lax position, waiting for NIST to publish its post-quantum cryptography (PQC) standard before any action is taken.

"Government and business leaders don't fully recognize the urgency of the quantum threat or magnitude of the multi-year crypto migration problem it will require after NIST publishes the PQC standard," said Eddy Zervigon, CEO of Quantum Xchange. "As a quantum security trailblazer, with an enterprise-ready solution, we believe it's our duty to help raise awareness and arm cybersecurity professionals, and lawmakers, with the information needed to become stewards of change within their organizations conveying to leadership and the public the severity and immediacy of the quantum security threat. We are pleased to be a member of QAI and to advance this common agenda."

Quantum Xchange's radically reimagined approach to data encryption addresses the weaknesses of legacy encryption systems and the quantum threat at once. Using the company's groundbreaking out-of-band symmetric key delivery technology, Phio Trusted Xchange, leading businesses and government agencies can simply and affordably future-proof the security of their data and communications networks, overcome the vulnerabilities of present-day encryption techniques, and better protect against known and future attacks.

"Hudson's Quantum Alliance Initiative aims to transform how we think about quantum, the science and technology that will dominate the world's economies, security, and prospects for freedom," said QAI Director Arthur Herman. "Having Quantum Xchange as a member is a welcome addition to the international coalition we are building, to make sure America is quantum ready for the 21st century."

About Quantum Xchange

Quantum Xchange gives commercial enterprises and government agencies the ultimate solution for protecting data in motion today and in the quantum future. Its award-winning out-of-band symmetric key distribution system, Phio Trusted Xchange (TX), is uniquely capable of making existing encryption environments quantum safe and supports both post-quantum crypto (PQC) and Quantum Key Distribution (QKD). Only by decoupling key generation and delivery from data transmissions can organizations achieve true crypto agility and quantum readiness with no interruptions to underlying infrastructure or business operations. To learn more about future-proofing your data from whatever threat awaits, visit QuantumXC.com or follow us on Twitter @Quantum_Xchange #BeQuantumSafe.

SOURCE Quantum Xchange

See original here:

Quantum Xchange Joins the Hudson Institute's Quantum Alliance Initiative - PRNewswire

Posted in Quantum Computing | Comments Off on Quantum Xchange Joins the Hudson Institute’s Quantum Alliance Initiative – PRNewswire

AWS Announces Opening of the AWS Center for Quantum Computing – HPCwire

Posted: October 30, 2021 at 2:31 pm

Oct. 28, 2021 What if by harnessing the properties of quantum mechanics we could model and simulate the behavior of matter at its most fundamental level, down to how molecules interact? The machine that would make that possible would be transformative, changing what we know about science and how we probe nature for answers.

Quantum computers have the potential to be this machine: The scientific community has known for some time now that certain computational tasks can be solved more efficiently when qubits (quantum bits) are used to perform the calculations, and that quantum computers promise to solve some problems that are currently beyond the reach of classical computers. But many unknowns remain: How should we build such a machine so that it can handle big problems, useful problems of practical importance? How can we scale it to thousands and millions of qubits while maintaining precise control over fragile quantum states and protecting them from their environment? And what customer problems should we design it to tackle first? These are some of the big questions that motivate us at the AWS Center for Quantum Computing.

The Home of AWS Quantum Technologies

In this post I am excited to announce the opening of the new home of the AWS Center for Quantum Computing, a state-of-the-art facility in Pasadena, California, where we are embarking on a journey to build a fault-tolerant quantum computer. This new building is dedicated to our quantum computing efforts, and includes office space to house our quantum research teams, and laboratories comprising the scientific equipment and specialized tools for designing and running quantum devices. Here our team of hardware engineers, quantum theorists, and software developers work side by side to tackle the many challenges of building better quantum computers. Our new facility includes everything we need to push the boundaries of quantum R&D, from making, testing, and operating quantum processors, to innovating the processes for controlling quantum computers and scaling the technologies needed to support bigger quantum devices, like cryogenic cooling systems and wiring.

From Research to Reality

A bold goal like building a fault-tolerant quantum computer naturally means that there will be significant scientific and engineering challenges along the way, and supporting fundamental research and making a commitment to the scientific community working on these problems is essential for accelerating progress. Our Center is located on the Caltech campus, which enables us to interact with students and faculty from leading research groups in physics and engineering just a few buildings away. We chose to partner with Caltech in part due to the university's rich history of contributions to computing, both classical and quantum, from pioneers like Richard Feynman, whose vision 40 years ago can be credited with kick-starting the field of quantum computing, to the current technical leads of the AWS Center for Quantum Computing: Oskar Painter (John G. Braun Professor of Applied Physics, Head of Quantum Hardware) and Fernando Brandao (Bren Professor of Theoretical Physics, Head of Quantum Algorithms). Through this partnership we're also supporting the next generation of quantum scientists, by providing scholarships and training opportunities for students and young faculty members.

But our connections to the research community don't end here. Our relationships with a diverse group of researchers help us stay at the cutting edge of quantum information science research. For example, several experts in quantum-related fields are contributing to our efforts as Amazon Scholars and Amazon Visiting Academics, including Liang Jiang (University of Chicago), Alexey Gorshkov (University of Maryland), John Preskill (Caltech), Gil Refael (Caltech), Amir Safavi-Naeini (Stanford), Dave Schuster (University of Chicago), and James Whitfield (Dartmouth). These experts help us innovate and overcome technical challenges even as they continue to teach and conduct research at their universities. I believe such collaborations at this early stage of the field will be critical to fully understanding the potential applications and societal impact of quantum technologies.

Building a Better Qubit

There are many ways to physically realize a quantum computer: quantum information can, for example, be encoded in particles found in nature, such as photons or atoms, but at the AWS Center for Quantum Computing we are focusing on superconducting qubits, electrical circuit elements constructed from superconducting materials. We chose this approach partly because the ability to manufacture these qubits using well-understood microelectronic fabrication techniques makes it possible to make many qubits in a repeatable way, and gives us more control as we start scaling up the number of qubits. There is more to building a useful quantum computer than increasing the number of qubits, however. Another important metric is the computer's clock speed, or the time required to perform quantum gate operations. Faster clock speeds mean solving problems faster, and here again superconducting qubits have an edge over other modalities, as they provide very fast quantum gates.
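
To make the clock-speed point concrete, here is a minimal back-of-the-envelope sketch. The gate times are assumed, order-of-magnitude figures for illustration only, not measured values for any specific device, and real circuits run many gates in parallel.

```python
# Back-of-the-envelope runtime comparison under assumed, order-of-magnitude
# two-qubit gate times; real devices vary widely and gates run partly in parallel.
GATE_TIME_S = {
    "superconducting": 50e-9,   # assumed ~tens of nanoseconds per gate
    "trapped_ion": 100e-6,      # assumed ~tens to hundreds of microseconds per gate
}

def circuit_runtime_seconds(num_gates: int, modality: str) -> float:
    """Naive serial estimate: total time = gate count * single-gate duration."""
    return num_gates * GATE_TIME_S[modality]

if __name__ == "__main__":
    for modality in GATE_TIME_S:
        t = circuit_runtime_seconds(1_000_000, modality)
        print(f"{modality}: ~{t:.3f} s for one million sequential gates")
```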

The ultimate measure of the quality of our qubits will be the error rate, or how accurately we can perform quantum gates. Quantum devices available today are noisy and, as a result, limited in the size of circuits that they can handle (a few thousand gates is the best we can hope for with Noisy Intermediate-Scale Quantum (NISQ) devices). This in turn severely limits their computational power. There are two ways that we are approaching making better qubits at the AWS Center for Quantum Computing: the first is by improving error rates at the physical level, for example by investing in material improvements that reduce noise. The second is through innovative qubit architectures, including using Quantum Error Correction (QEC) to reduce quantum gate errors by redundantly encoding information into a protected qubit, called a logical qubit. This allows for the detection and correction of gate errors, and for the implementation of gate operations on the encoded qubits in a fault-tolerant way.
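
The idea of redundant encoding can be illustrated with the classical analogue of the simplest error-correcting code, the three-bit repetition code. This is only a sketch of the principle under that simplification (real QEC, such as the surface code, protects quantum superpositions via syndrome measurements rather than copying a bit), and the helper names are illustrative.

```python
import random

def encode(bit: int) -> list:
    """Redundantly encode one logical bit into three physical bits."""
    return [bit, bit, bit]

def noisy_channel(bits: list, p_flip: float) -> list:
    """Flip each physical bit independently with probability p_flip."""
    return [b ^ (random.random() < p_flip) for b in bits]

def decode(bits: list) -> int:
    """Majority vote: recovers the logical bit whenever at most one bit flipped."""
    return int(sum(bits) >= 2)

def logical_error_rate(p_flip: float, trials: int = 100_000) -> float:
    errors = 0
    for _ in range(trials):
        sent = random.randint(0, 1)
        received = decode(noisy_channel(encode(sent), p_flip))
        errors += received != sent
    return errors / trials

if __name__ == "__main__":
    for p in (0.01, 0.05, 0.10):
        print(f"physical error {p:.2f} -> logical error {logical_error_rate(p):.4f}")
```

The point of the sketch is that, as long as the physical error rate is small, the encoded (logical) error rate comes out lower than the physical one, which is the behavior a logical qubit is meant to deliver.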

Innovating Error Correction

Typical QEC requires a large number of physical qubits to encode every qubit of logical information. At the AWS Center for Quantum Computing, we have been researching ways to reduce this overhead through the use of qubit architectures that allow us to implement error correction more efficiently in quantum hardware. In particular, we are optimistic about approaches that make use of linear harmonic oscillators, such as Gottesman-Kitaev-Preskill (GKP) qubits and Schrödinger cat qubits, and recently proposed a theoretical design for a fault-tolerant quantum computer based on a hardware-efficient architecture leveraging the latter.

One thing that differentiates this approach is that we take advantage of a technique called error-biasing. There are two types of errors that can affect quantum computation: bit-flips (flips between the 0 and 1 states due to noise) and phase-flips (a flip of the sign of the relative phase between the 0 and 1 components of a superposition). In error-biasing, we use physical qubits that allow us to suppress bit-flips exponentially, while only increasing phase-flips linearly. We then combine this error-biasing with an outer repetition code, consisting of a linear chain of cat qubits, to detect and correct the remaining phase-flip errors. The result is a fault-tolerant logical qubit that has a lower error rate for storing and manipulating the encoded quantum information. Not having to correct for bit-flip errors is the reason this architecture is hardware efficient and shows tremendous potential for scaling.
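
A toy Monte Carlo model of the biased-noise idea: assume bit-flips are suppressed so strongly that they can be ignored, and estimate how the remaining phase-flip error of a length-d repetition chain falls as the chain gets longer. This is a simplified classical stand-in under stated assumptions, not the actual cat-qubit architecture, and the numbers are illustrative.

```python
import random

def logical_phase_error_rate(p_phase: float, distance: int, trials: int = 50_000) -> float:
    """Toy model: bit-flips are assumed negligible (the 'bias'), and a
    length-`distance` repetition chain corrects phase-flips by majority vote.
    The logical qubit fails when more than half of the chain phase-flips."""
    failures = 0
    for _ in range(trials):
        flips = sum(random.random() < p_phase for _ in range(distance))
        failures += flips > distance // 2
    return failures / trials

if __name__ == "__main__":
    p = 0.05
    for d in (3, 5, 7, 9):
        print(f"distance {d}: logical phase-flip rate ~ {logical_phase_error_rate(p, d):.5f}")
```

Because only one error type has to be corrected, the overhead is a single linear chain per logical qubit rather than a full two-dimensional code, which is what "hardware efficient" refers to in the paragraph above.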

Building the Future for Our Customers

The journey to an error-corrected quantum computer starts with a few logical qubits. A key milestone for our team and the quantum computing field will be demonstrating the breakeven point with a logical qubit, where the accuracy of the logical qubit surpasses the accuracy of the physical qubits that constitute its building blocks. Our ultimate goal is to deliver an error-corrected quantum computer that can perform reliable computations not just beyond what any classical computing technology is capable of, but at the scale needed to solve customer problems of practical importance.

Why set such an ambitious goal? The quantum algorithms with the most potential for significant impact, for example in industries like manufacturing or pharmaceuticals, can't be run by simply expanding today's quantum technologies. Pursuing breakthrough innovations rather than incremental improvements always takes longer, but I believe a bold approach that fundamentally reconsiders what makes a good qubit is the best way to deliver the ultimate computational tool: a machine that can execute algorithms requiring hundreds of thousands to billions of quantum gate operations on each qubit with at most one error over the total number of gates, a level of accuracy needed to solve the most complex computational problems that have societal and commercial value.
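
The accuracy target in the previous paragraph translates into a simple error budget: if an algorithm uses N gates per qubit and roughly one error in total can be tolerated, the per-gate logical error rate must be on the order of 1/N. A minimal sketch of that arithmetic, with illustrative gate counts:

```python
def per_gate_error_budget(total_gates: int, tolerated_errors: float = 1.0) -> float:
    """If roughly one error is tolerable over the whole computation,
    the per-gate logical error rate must be about 1 / total_gates."""
    return tolerated_errors / total_gates

if __name__ == "__main__":
    for gates in (100_000, 100_000_000, 1_000_000_000):
        print(f"{gates:.0e} gates -> per-gate error budget ~ {per_gate_error_budget(gates):.1e}")
```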

In talking to our AWS quantum customers over the last couple of years, I've found that those who are most excited about the potential for quantum are also realistic about the challenges of realizing the full potential of this technology, and are eager to collaborate with us to make it a reality even as they build up their own internal expertise in quantum. At the AWS Center for Quantum Computing, we have assembled a fantastic team that is committed to this exciting journey toward fault-tolerant quantum computing. Stay tuned, and join us.

Source: Nadia Carlsten, AWS

Follow this link:

AWS Announces Opening of the AWS Center for Quantum Computing - HPCwire

Posted in Quantum Computing | Comments Off on AWS Announces Opening of the AWS Center for Quantum Computing – HPCwire

Understanding Quantum Primacy And How We Got There – Science 2.0

Posted: at 2:27 pm

A quantum computer is a remarkable device. While its applications are still limited for now, we know that it can be faster than the fastest computers we currently have access to. As Scientific American reminds us, quantum primacy (also known as quantum supremacy) is the point at which a quantum machine outstrips a classical computer. Computers have helped advance civilization and increased our ability to process data many times over. Even so, there are some problems that not even they can solve. The more answers we find, the more questions we have. Quantum computing was built to multiply computing power by tapping into what we know about quantum states. While a traditional computer is limited by bits, quantum computers aren't, allowing them to perform calculations many times faster. Or so it's assumed. There's still much debate as to whether we've actually reached quantum primacy. Are quantum computers faster than regular computers, or aren't they?

When we compare different classical computers, we use the clock speed of their processors to figure out how "fast" they are. For example, a 3 GHz processor can perform three billion compute cycles per second. With quantum computers, it's a little more challenging to determine how fast the processing happens. Quantum computers use a system known as quantum bits, or qubits. Classical computers contain binary bits that can each be either a 1 or a 0. Qubits, by comparison, can occupy any superposition of the 0 and 1 states. The physics behind this deals with a quantum particle's spin and the fact that quantum states only collapse to a definite value when they are measured. Because of these things, it can be challenging to put a value on a quantum computer's processing speed. Still, it's assumed that they will be many times faster than traditional computers using the same resources.
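
A small NumPy sketch of what "superposition" means in state-vector terms, and why simulating many qubits classically becomes expensive. The gate and state used here are standard textbook objects, not anything specific to the devices discussed in this article.

```python
import numpy as np

# A classical bit is 0 or 1; a qubit is a normalized 2-component complex vector,
# and describing n qubits classically takes 2**n complex amplitudes.
ket0 = np.array([1.0, 0.0], dtype=complex)

# The Hadamard gate puts |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
psi = H @ ket0

# Measurement probabilities are the squared magnitudes of the amplitudes;
# "collapse" means a single measurement yields 0 or 1 with these probabilities.
probs = np.abs(psi) ** 2
print("P(0), P(1):", probs)                          # -> [0.5 0.5]
print("amplitudes needed for 20 qubits:", 2 ** 20)   # classical cost grows exponentially
```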

So why would we say that we've reached quantum primacy while some experts are still skeptical? The heart of the matter lies in how we run tests to verify the difference in processing speed between a quantum computer and a traditional one. In practice, when comparing a conventional system to a quantum computer, the go-to method is using sampling problems. Sampling problems are computational problems whose solutions are random instances drawn from a given probability distribution. However, from a computing standpoint, the question arises whether the best possible classical algorithm was used for comparison. Traditional computing places a lot of stock in the efficiency of algorithms, and if the comparison does not use the best known classical method for generating the same samples, experts argue that the quantum computer has an unfair advantage.
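
For intuition, here is a minimal sketch of a sampling problem at a scale a laptop can handle: drawing bitstrings from the output distribution of a stand-in random state. At the dozens-of-qubits scale used in primacy experiments, storing the 2^n amplitudes this way is infeasible, which is exactly what makes the task a candidate for demonstrating primacy. The helper names are illustrative, and the random state is only a crude proxy for a real random circuit.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_state(num_qubits: int) -> np.ndarray:
    """Stand-in for the output state of a random quantum circuit: a random
    normalized vector over 2**num_qubits amplitudes (illustrative only)."""
    dim = 2 ** num_qubits
    amps = rng.normal(size=dim) + 1j * rng.normal(size=dim)
    return amps / np.linalg.norm(amps)

def sample_bitstrings(state: np.ndarray, shots: int) -> list:
    """The sampling problem: draw bitstrings according to the Born-rule
    probabilities |amplitude|**2 of the state."""
    probs = np.abs(state) ** 2
    outcomes = rng.choice(len(state), size=shots, p=probs)
    width = int(np.log2(len(state)))
    return [format(int(o), f"0{width}b") for o in outcomes]

if __name__ == "__main__":
    psi = random_state(num_qubits=4)         # 16 amplitudes: easy for a laptop
    print(sample_bitstrings(psi, shots=10))  # at ~50+ qubits this brute force is infeasible
```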

In a paper published in Physical Review Letters, a team from the University of Science and Technology of China sought to challenge the limitations of such testing. The team set up a case where they would use a superconducting system to demonstrate that a quantum computer could deliver accurate results in situations where classical computing cannot simulate anything similar. The non-specialist may find this description challenging to break down, but the easiest way to think about it is that the research team increased the number of sequential calculations within the system to the point where no traditional computer could keep pace, regardless of its physical clock speed. In this sense, raw processing power seems to be the metric by which the quantum computer is establishing its dominance.

The superconducting experiment takes a quantum processor and tasks it with a sampling problem. The problem is to produce random instances of measurement outcomes, read out from the qubits, as experimental data. The logic behind this experiment is that this should be almost impossible for a traditional computer but feasible for a quantum machine. The only way that true quantum primacy can be established is by using large-scale sampling problems. In this experiment, the number of circuits used was large enough to guarantee a massive sample size but small enough to still be feasible to implement. However, the superconductor test was only one of two tests the team used to prove quantum primacy. The methodology still has some detractors, but it's a step in the right direction.

The second test was known as the photonic experiment, seeking to solve the problem of boson sampling. The methodology for boson sampling requires a lot of processing for a traditional computing system, but theoretically, it should be a simple process for a quantum machine. While ideal boson sampling has a clean mathematical formulation, it's difficult to realize experimentally. As a result, generalizations need to be made, resulting in "Gaussian" boson sampling. While Gaussian boson sampling is experimentally viable, its results are much more challenging to use as proof of quantum primacy. However, given the assumptions made about classical systems and the time scale they would be expected to need, the quantum computer managed to deliver results in a fraction of the calculated time. This result suggests that quantum machines are, indeed, "faster" than traditional computers.
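
The classical hardness of ideal boson sampling comes from the fact that output probabilities are proportional to squared magnitudes of matrix permanents, which are believed to be exponentially costly to compute; Gaussian boson sampling replaces the permanent with a related quantity called the hafnian. Below is a brute-force permanent for a small matrix, just to show how quickly the cost grows. The matrix here is random and purely illustrative, not data from the experiment.

```python
import itertools
import numpy as np

def permanent(M: np.ndarray) -> complex:
    """Brute-force matrix permanent: a sum over all n! permutations, which is
    why (ideal) boson sampling is believed hard to simulate classically."""
    n = M.shape[0]
    return sum(
        np.prod([M[i, sigma[i]] for i in range(n)])
        for sigma in itertools.permutations(range(n))
    )

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Illustrative random submatrix standing in for part of an interferometer unitary.
    A = rng.normal(size=(6, 6)) + 1j * rng.normal(size=(6, 6))
    # In ideal boson sampling, the probability of a given output photon pattern is
    # proportional to |permanent of such a submatrix| squared.
    print(abs(permanent(A)) ** 2)
```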

Quantum computers are still a way off from being commercially viable machines. However, their power and their ability to deal with complex problems make them a natural fit for questions in other areas of physics that need this type of computational system. When quantum primacy is mentioned, the issue that constantly comes up is whether classical computers can spoof results that "seem" right enough. These complex problems decrease the chance of spoofing, establishing a baseline that can be used to determine whether quantum computers are as fast as we assume they are. Can these samplers be used to solve complex physics problems? The possibility of using these systems to solve computationally difficult problems is one of the biggest questions of our time, and the initial results suggest that they can. On the other hand, researchers claim that, to date, there aren't any real, meaningful questions that the system can be used to test, leaving interpretation open. These two experiments are nonetheless a great leap forward in putting quantum sampling to practical use in the field.

More:

Understanding Quantum Primacy And How We Got There - Science 2.0

Posted in Quantum Computing | Comments Off on Understanding Quantum Primacy And How We Got There – Science 2.0
