The Prometheus League
Breaking News and Updates
- Abolition Of Work
- AI
- Alt-right
- Alternative Medicine
- Antifa
- Artificial General Intelligence
- Artificial Intelligence
- Artificial Super Intelligence
- Ascension
- Astronomy
- Atheism
- Atheist
- Atlas Shrugged
- Automation
- Ayn Rand
- Bahamas
- Bankruptcy
- Basic Income Guarantee
- Big Tech
- Bitcoin
- Black Lives Matter
- Blackjack
- Boca Chica Texas
- Brexit
- Caribbean
- Casino
- Casino Affiliate
- CBD Oil
- Censorship
- Cf
- Chess Engines
- Childfree
- Cloning
- Cloud Computing
- Conscious Evolution
- Corona Virus
- Cosmic Heaven
- Covid-19
- Cryonics
- Cryptocurrency
- Cyberpunk
- Darwinism
- Democrat
- Designer Babies
- DNA
- Donald Trump
- Eczema
- Elon Musk
- Entheogens
- Ethical Egoism
- Eugenic Concepts
- Eugenics
- Euthanasia
- Evolution
- Extropian
- Extropianism
- Extropy
- Fake News
- Federalism
- Federalist
- Fifth Amendment
- Financial Independence
- First Amendment
- Fiscal Freedom
- Food Supplements
- Fourth Amendment
- Free Speech
- Freedom
- Freedom of Speech
- Futurism
- Futurist
- Gambling
- Gene Medicine
- Genetic Engineering
- Genome
- Germ Warfare
- Golden Rule
- Government Oppression
- Hedonism
- High Seas
- History
- Hubble Telescope
- Human Genetic Engineering
- Human Genetics
- Human Immortality
- Human Longevity
- Illuminati
- Immortality
- Immortality Medicine
- Intentional Communities
- Jacinda Ardern
- Jitsi
- Jordan Peterson
- Las Vegas
- Liberal
- Libertarian
- Libertarianism
- Liberty
- Life Extension
- Macau
- Marie Byrd Land
- Mars
- Mars Colonization
- Mars Colony
- Memetics
- Micronations
- Mind Uploading
- Minerva Reefs
- Modern Satanism
- Moon Colonization
- Nanotech
- National Vanguard
- NATO
- Neo-eugenics
- Neurohacking
- Neurotechnology
- New Utopia
- New Zealand
- Nihilism
- Nootropics
- NSA
- Oceania
- Offshore
- Olympics
- Online Casino
- Online Gambling
- Pantheism
- Personal Empowerment
- Poker
- Political Correctness
- Politically Incorrect
- Polygamy
- Populism
- Post Human
- Post Humanism
- Posthuman
- Posthumanism
- Private Islands
- Progress
- Proud Boys
- Psoriasis
- Psychedelics
- Putin
- Quantum Computing
- Quantum Physics
- Rationalism
- Republican
- Resource Based Economy
- Robotics
- Rockall
- Ron Paul
- Roulette
- Russia
- Sealand
- Seasteading
- Second Amendment
- Seychelles
- Singularitarianism
- Singularity
- Socio-economic Collapse
- Space Exploration
- Space Station
- Space Travel
- Spacex
- Sports Betting
- Sportsbook
- Superintelligence
- Survivalism
- Talmud
- Technology
- Teilhard de Chardin
- Terraforming Mars
- The Singularity
- TMS
- Tor Browser
- Trance
- Transhuman
- Transhuman News
- Transhumanism
- Transhumanist
- Transtopian
- Transtopianism
- Ukraine
- Uncategorized
- Vaping
- Victimless Crimes
- Virtual Reality
- Wage Slavery
- War On Drugs
- Waveland
- WW3
- Yahoo
- Zeitgeist Movement
- Prometheism
- Forbidden Fruit
- The Evolutionary Perspective
Category Archives: Quantum Computing
Is Quantum Computing the Next Big Thing? – Business Insider
Posted: April 25, 2022 at 5:21 pm
Under normal circumstances, the arrival of a breakthrough technology creates its own hype. The first light bulb drew hundreds of gawkers to Menlo Park, New Jersey, in 1879. A century later, crowds gathered again to see artificial-intelligence-powered supercomputers defeat human grandmasters at chess and Go. Social media announced its arrival with hockey-stick growth. When you discover something that actually has the power to transform the world, the world usually takes notice.
But investors in quantum computing, a technology that has the theoretical potential to make a traditional supercomputer look like a slide rule, have been mostly tooting their own horns, promising a revolution that's just over the horizon. The biggest booster is IBM, which has been pushing quantum through YouTube videos and sponsored podcasts. The breaking of the "100-qubit processor barrier" in November, a feat performed by IBM's Eagle chip, may not have rocked the tech world, but you wouldn't know it from the company's online rollout.
"Dreams are time travel to the future," gushed Daro Gil, IBM's director of research, in a video announcing Eagle. "We have definitely traveled into the future," he continued, adding: "This is the real thing."
IBM has a big financial incentive to hype its quantum potential. With Eagle, the company is betting that quantum computing can return it to the first ranks of tech giants, alongside Google, Microsoft, and Amazon, all of which are bankrolling their own quantum efforts. Smaller quantum players are also scrambling to break into the space. IonQ, which went public last year through a special-purpose acquisition company, trades at 1,000 times its annual revenue. D-Wave, which has backing from Goldman Sachs and Jeff Bezos, also plans to go public through a SPAC. One recent report estimates that quantum computers could generate nearly a trillion dollars in annual revenue by 2050, with applications from auto and airplane manufacturing to pharmaceutical development and finance.
But the hype about quantum computing's future glosses over the limitations of its present. For now, quantum computers remain exceedingly slow and buggy to the point of uselessness. Unlike AI and augmented reality, which already enjoy robust pipelines of products heading to market, quantum computing lacks anything close to a working prototype with the power to draw a crowd. Both IBM and IonQ have "road maps" that promise an operational 1,000-qubit processor by the end of next year. But experts agree that even if the companies manage to hit that significant-sounding target (and that's a big if), a versatile quantum computer that can perform a range of practical operations on its own, outside a lab, is still many years away.
"It's going to be a lot of gradual improvement in capabilities," says Celia Merzbacher, who heads up the Quantum Economic Development Consortium. "There's a lot that has to happen to get to something that resembles what we think of today as a computer."
Which leads to the question most of us have about quantum computing: What the hell is it?
For decades, the fundamental unit of computing has been the "bit": either a one or a zero. Charles Babbage's mechanical computers used the position of gears and levers to record bits. On a flash drive, bits are stored as electrical charges in tiny memory cells.
Quantum computing, by contrast, operates on qubits, which can be a one, a zero, or a combination of both, an uncanny, ambiguous state known as superposition. This is possible because subatomic particles defy common sense, appearing and disappearing in ways that continue to surprise and baffle physicists. Photons, to give just one example, form a pattern with light and dark bands when shot through a barrier with two slits. But try using a detector to observe which slit an individual photon passed through, and the pattern disappears.
The seductive promise of the qubit lies in its exponential power. Two regular bits can be used to represent four states (00, 01, 10, 11), but only one of those states at any given time. In theory, two qubits could represent all four states at the same time and then resolve to whichever state is needed to solve a given problem. That means the 127-qubit Eagle has a computing potential, on the order of 2^127 simultaneous states, vastly beyond that of any classical supercomputer.
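A back-of-the-envelope way to see this exponential bookkeeping is to build the state vector directly. The short Python sketch below is our illustration (not IBM's code): it constructs a uniform two-qubit superposition with NumPy and shows why the same arithmetic becomes hopeless at 127 qubits.

```python
import numpy as np

# One qubit in equal superposition: |+> = (|0> + |1>) / sqrt(2)
plus = np.array([1.0, 1.0]) / np.sqrt(2)

# Two qubits: the joint state is the tensor (Kronecker) product,
# giving 2**2 = 4 complex amplitudes that all coexist at once.
state = np.kron(plus, plus)
print(state)                       # [0.5 0.5 0.5 0.5] -- all four states
print(np.sum(np.abs(state) ** 2))  # measurement probabilities sum to 1.0

# Tracking IBM's 127-qubit Eagle the same way would take 2**127
# amplitudes -- about 1.7e38 numbers, far beyond any classical machine.
print(2 ** 127)
```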
The problem is, it's incredibly difficult to get all those qubits working together. With today's technology, maintaining a state of superposition within even one qubit is a tall order. Subatomic particles are sensitive to tiny changes in their environment. Scientists stabilize their quantum processors by storing them at extremely cold temperatures, but even that hasn't eliminated the errors. So for now, quantum computing depends on a subdiscipline called "quantum error correction," which usually involves running the same code over and over again, through multiple qubits, and using probability to correct for random errors.
The need for error correction has led scientists to distinguish between physical qubits, like the ones that make up the Eagle, and more idealized logical qubits, which are sufficiently reliable to program with. By most estimates, it takes 1,000 physical qubits to yield one logical qubit. So even if IBM hits its 1,000-qubit benchmark by next year, it will have succeeded in achieving only the computing capability of a single traditional bit: a computer with a fraction of the power of a video-game console from the 1980s.
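The 1,000-to-1 overhead becomes intuitive with a toy model. The following sketch is a deliberate simplification (real schemes such as surface codes are far more elaborate): each physical copy of a bit flips with probability p, and a majority vote recovers the logical value, so redundancy buys reliability at a steep cost in qubit count.

```python
import random

def logical_error_rate(p: float, n_copies: int, trials: int = 100_000) -> float:
    """Estimate how often a majority vote over n noisy copies gets it wrong."""
    errors = 0
    for _ in range(trials):
        flips = sum(random.random() < p for _ in range(n_copies))
        if flips > n_copies // 2:  # a majority of the copies were corrupted
            errors += 1
    return errors / trials

p = 0.05  # illustrative per-copy (physical) error rate
for n in (1, 3, 15, 101):
    print(f"{n:4d} copies -> logical error rate ~{logical_error_rate(p, n):.5f}")
# The rate trends toward zero as n grows: more physical redundancy,
# fewer logical errors -- the trade-off behind the 1,000:1 estimates.
```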
Sankar Das Sarma, a theoretical physicist at the University of Maryland who has published widely on quantum computing, believes that the technology is real and has tremendous long-term potential. But he is skeptical about its short-term prospects.
"Claiming to have a thousand or a trillion qubits by some deadline is a meaningless statement unless the properties of those 'qubits' have extremely well-defined technical specifications," he told me. "One can easily have as many qubits as one wants, if they are sufficiently bad from a computational viewpoint."
The need for a new, more powerful computer model is certainly real enough. For decades, as predicted by Moore's law, computer power has been growing at an exponential rate. But that growth has begun to slow, hemmed in by physical reality. In simple terms, we're reaching the limit of how many transistors we can pack onto the chips that power classical computers. And if those transistors can't get smaller, the electrical signals that zip around on the chips can't get any faster. Our computers are still getting smarter and speedier, but those gains are beginning to level off.
But qubits aren't constrained by traditional limits of space and time. They exist in multiple states simultaneously, meaning, at least in theory, that we can deploy vast armies of them to do our computational bidding, if we can figure out how to harness their shifty nature.
As is often the case, two primary applications are driving the new technology: surveillance and finance. As more and more data is protected by dual-key encryption, governments are eager to find a way to crack the code. That requires figuring out the factors of very large semiprime numbers, a problem that would take the most powerful classical computers billions of years. A quantum processor with thousands of logical, error-corrected qubits, by contrast, could conceivably decrypt emails and other communications almost instantly, enabling governments to decode and read messages while they were still in transit. Many countries are said to be storing petabytes of encrypted data that was transmitted by their adversaries, in the hope that quantum computing will one day render it all legible.
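To ground the "billions of years" figure, consider classical factoring at its most naive. The sketch below is illustrative only: trial division costs on the order of sqrt(n) operations, which is astronomical for the 600-plus-digit semiprimes behind modern encryption, whereas Shor's algorithm on a large fault-tolerant quantum computer would factor them in polynomial time.

```python
def trial_division(n: int) -> tuple[int, int]:
    """Factor an odd semiprime by brute force -- roughly sqrt(n) divisions."""
    f = 3
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 2
    raise ValueError("no nontrivial odd factor found")

print(trial_division(2021))  # (43, 47) -- instant for toy numbers

# A 2048-bit RSA modulus has about 617 decimal digits, so sqrt(n) is
# roughly 10**308 candidate divisors. Even at 10**12 trials per second,
# that is on the order of 10**288 years of work.
```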
At the same time, the US is working to build a standard for "post-quantum cryptography" that can survive a qubit attack. "It is not unreasonable to think we'll have total chaos," says Miles Taylor, who helped organize the effort as chief of staff at the Department of Homeland Security. "Someone will have a massive asymmetric advantage. It could be IBM. It could be the Chinese Communist Party." When I asked about timelines, Taylor said he believed we'd see a quantum computer capable of cracking current encryption technologies "within a decade."
Another sector that has been betting heavily on quantum's potential is finance. D-Wave, the firm with backing from Goldman Sachs, is marketing portfolio-optimization services to finance companies, promising higher returns at lower risk. Classical computers have trouble quickly solving what are known as "combinatorial optimization" problems, such as how best to allocate investments in a variety of scenarios. One analyst, for example, reported that classical computers took a month to run a detailed tail-risk simulation on the effects of a low-probability catastrophe on the markets.
Another real-world application in this category is the so-called traveling-salesman problem, which seeks to calculate the shortest possible route from city to city, an area with obvious applications for delivery logistics and military supply lines. Last year, when the Australian Army used quantum computing to test its systems against known logistics challenges, one military leader cautioned that the technology was still in the "prototype stage" and that quantum computers remained "too small and fragile to give useful solutions."
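The route-counting arithmetic shows why such problems choke classical machines. Below is a minimal exhaustive traveling-salesman search, our illustration with made-up coordinates (production solvers use heuristics, and quantum annealers attack the problem differently): the number of round trips grows factorially with the number of cities.

```python
from itertools import permutations
from math import dist, factorial

def shortest_route(cities: dict[str, tuple[float, float]]):
    """Check every ordering of cities -- (n-1)! routes for n cities."""
    names = list(cities)
    start, rest = names[0], names[1:]
    best = (float("inf"), ())
    for perm in permutations(rest):
        route = (start, *perm, start)  # round trip back to the start
        length = sum(dist(cities[a], cities[b]) for a, b in zip(route, route[1:]))
        best = min(best, (length, route))
    return best

cities = {"A": (0, 0), "B": (1, 5), "C": (4, 1), "D": (6, 6), "E": (3, 3)}
print(shortest_route(cities))  # 24 routes checked: trivial
print(factorial(19))           # routes for a 20-city tour: ~1.2e17
```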
Even the limited successes attributed to quantum computers aren't always as revolutionary as they seem. Many quantum computers, including D-Wave's portfolio optimizers, are "hybrid" machines that work in tandem with classical computers. The same is true of almost all of the quantum-computing power that is publicly accessible via the cloud. In some cases, it amounts to little more than a sprinkling of quantum dust on problems that are teed up, coded, and transmitted by classical machines. The bit does all the heavy lifting, and the qubit gets the credit.
The field is also plagued by a lack of agreement over basic definitions. In 2019, Google and IBM clashed over whether Google's Sycamore processor had truly achieved "quantum supremacy" by performing a tailor-made computational task in three minutes. Google insisted it would take a classical computer thousands of years to complete the same task. IBM argued it would take only days.
If your business is raising capital for quantum startups, however, such issues are often dismissed as petty details. Quantum computing arose in a culture of initial public offerings that has traditionally been eager to bet big on high-risk, high-reward technology. It often cloaks itself in the accoutrements of established scientific enterprises (subzero chambers, scientists pacing around labs, university partnerships, huge research budgets) without having any hard-earned results to show for itself. Like the qubit itself, the future of quantum computing remains highly theoretical.
Das Sarma, the physicist, compares current quantum efforts to trying to build a smartphone from hundred-year-old vacuum tubes. The basic principles may be in place, but the engineering hasn't had time to catch up. As a result, quantum computing, like its earliest predecessors, could remain in a rudimentary state for a long time to come. "The Egyptian abacus was actually a computer," Das Sarma observes. "But not a particularly good one."
Mattathias Schwartz is a senior correspondent at Insider.
PsiQuantum’s Path to 1 Million Qubits by the Middle of the Decade – HPCwire
Posted: at 5:21 pm
PsiQuantum, founded in 2016 by four researchers with roots at Bristol University, Stanford University, and York University, is one of a few quantum computing startups that has kept a moderately low PR profile. (That's if you disregard the roughly $700 million in funding it has attracted.) The main reason is that PsiQuantum has eschewed the clamorous public chase for NISQ (noisy intermediate-scale quantum) computers and set out to develop a million-qubit system the company says will deliver big gains on big problems as soon as it arrives.
When will that be?
PsiQuantum says it will have all the manufacturing processes in place by the middle of the decade, and it's working closely with GlobalFoundries (GF) to turn its vision into reality. The generous size of its funding suggests many think it will succeed. PsiQuantum is betting on a photonics-based approach called fusion-based quantum computing (paper) that relies mostly on well-understood optical technology but requires extremely precise manufacturing tolerances to scale up. It also relies on managing individual photons, something that has proven difficult for others.
Here's the company's basic contention:
Success in quantum computing will require large, fault-tolerant systems, and the current preoccupation with NISQ computers is an interesting but ultimately mistaken path. The most effective and fastest route to practical quantum computing will require leveraging (and innovating on) existing semiconductor manufacturing processes and networking thousands of quantum chips together to reach the million-qubit system threshold that's widely regarded as necessary to run game-changing applications in chemistry, banking, and other sectors.
It's not that incrementalism is bad. In fact, it's necessary. But it's not well served when focused on delivering NISQ systems, argues Peter Shadbolt, one of PsiQuantum's founders and its current chief scientific officer.
"Conventional supercomputers are already really good. You've got to do some kind of step change; you can't increment your way [forward], and especially you can't increment with five qubits, 10 qubits, 20 qubits, 50 qubits to a million. That is not a good strategy. But it's also not true to say that we're planning to leap from zero to a million," said Shadbolt. "We have a whole chain of incrementally larger and larger systems that we're building along the way. Those allow us to validate the control electronics, the systems integration, the cryogenics, the networking, etc. But we're not spending time and energy trying to dress those up as something that they're not. We're not having to take those things and try to desperately extract computational value from something that doesn't have any computational value. We're able to use those intermediate systems for our own learnings and for our own development."
That's a much different approach from that of the majority of quantum computing hopefuls. Shadbolt suggests the broad message about the need to push beyond NISQ dogma is starting to take hold.
"There is a change that is happening now, which is that people are starting to program for error-corrected quantum computers, as opposed to programming for NISQ computers. That's a welcome change, and that's happening across the whole space. If you're programming for NISQ computers, you very rapidly get deeply entangled (if you'll forgive the pun) with the hardware. You start looking under the hood, and you start trying to find shortcuts to deal with the fact that you have so few gates at your disposal. So, programming NISQ computers is a fascinating, intellectually stimulating activity, I've done it myself, but it rapidly becomes sort of siloed and you have to pick a winner," said Shadbolt.
"With fault tolerance, once you start to accept that you're going to need error correction, then you can start programming in a fault-tolerant gate set which is hardware agnostic, and it's much more straightforward to deal with. There are also some surprising characteristics, which mean that the optimizations that you make to algorithms in a fault-tolerant regime are, in many cases, the diametric opposite of the optimizations that you would make in the NISQ regime. It really takes a different approach, but it's very welcome that the whole industry is moving in that direction and spending less time on these kinds of myopic, narrow efforts," he said.
That sounds a bit harsh. PsiQuantum is no doubt benefitting from the manifold efforts by the young quantum computing ecosystem to tout advances and build traction by promoting NISQ use cases. There's an old business axiom that says a little hype is often a necessary lubricant to accelerate development of young industries; quantum computing certainly has its share. A bigger question is: will PsiQuantum beat rivals to the end-game? IBM has laid out a detailed roadmap and said 2023 is when it will start delivering quantum advantage, using a 1,000-qubit system, with plans for eventual million-qubit systems. Intel has trumpeted its CMOS strength to scale up manufacturing its quantum dot qubits. D-Wave has been selling its quantum annealing systems to commercial and government customers for years.
It's really not yet clear which of the qubit technologies (semiconductor-based superconducting, trapped ions, neutral atoms, photonics, or something else) will prevail, and for which applications. What's not ambiguous is PsiQuantum's go-big-or-go-home strategy. Its photonics approach, argues the company, has distinct advantages in manufacturability and scalability, operating environment (less frigid), ease of networking, and error correction. Shadbolt recently talked with HPCwire about the company's approach, technology and progress.
What is fusion-based quantum computing?
Broadly, PsiQuantum uses a form of linear optical quantum computing in which individual photons are used as qubits. Over the past year and a half, the previously stealthy PsiQuantum has issued several papers describing the approach while keeping many details close to the vest (papers listed at end of article). The computation flow is to generate single photons and entangle them. PsiQuantum uses dual rail entangling/encoding for photons. The entangled photons are the qubits and are grouped into what PsiQuantum calls resource states, a group of qubits if you will. Fusion measurements (more below) act as gates. Shadbolt says the operations can be mapped to a standard gate-set to achieve universal, error-corrected, quantum computing.
On-chip components carry out the process. It all sounds quite exotic, in part because it differs from more widely used matter-based qubit technologies. The process is laid out in PsiQuantum's paper "Fusion-based quantum computation," issued about a year ago.
Digging into the details is best done by reading the papers, and the company has archived videos exploring its approach on its website, including a good, brief summation by Mercedes Gimeno-Segovia, vice president of quantum architecture at PsiQuantum.
Shadbolt also briefly described fusion-based quantum computation (FBQC).
"Once you've got single photons, you need to build what we refer to as seed states. Those are pretty small entangled states and can be constructed, again, using linear optics. So, you take some single photons and send them into an interferometer, and together with single-photon detection, you can probabilistically generate small entangled states. You can then multiplex those, and basically the task is to get as fast as possible to a large enough, complex enough, appropriately structured resource state which is ready to then be acted upon by a fusion network. That's it. You want to kill the photon as fast as possible. You don't want photons living for a long time if you can avoid it. That's pretty much it," said Shadbolt.
"The fusion operators are the smallest, simplest piece of the machine. The multiplexed single-photon sources are the biggest, most expensive piece. Everything in the middle is kind of the secret sauce of our architecture; some of that we've put out in that paper and you can see kind of how that works," he said. (At the risk of overkill, another brief description of the system from PsiQuantum is presented at the end of the article.)
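The multiplexing Shadbolt describes can be reasoned about with elementary probability. The numbers below are our illustration, not PsiQuantum's actual specifications: if each heralded source succeeds with probability p on a given attempt, banking N sources behind a switch makes at least one success nearly certain.

```python
def multiplexed_success(p_single: float, n_sources: int) -> float:
    """P(at least one of n independent probabilistic sources fires)."""
    return 1 - (1 - p_single) ** n_sources

p = 0.05  # illustrative per-attempt success probability
for n in (1, 16, 64, 256):
    print(f"{n:4d} parallel sources -> success probability {multiplexed_success(p, n):.4f}")
# 0.0500, 0.5599, 0.9625, ~1.0000: routing whichever source succeeded
# turns many unreliable sources into one near-deterministic one, at the
# cost of component count -- hence the focus on high-volume fabrication.
```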
One important FBQC advantage, says PsiQuantum, is that the shallow depth of optical circuits make error correction easier. The small entangled states fueling the computation are referred to as resource states. Importantly, their size is independent of code distance used or the computation being performed. This allows them to be generated by a constant number of operations. Since the resource states will be immediately measured after they are created, the total depth of operations is also constant. As a result, errors in the resource states are bounded, which is important for fault-tolerance.
PsiQuantum's FBQC design also differs in several respects from the more familiar MBQC (measurement-based quantum computing) paradigm.
Another advantage is the operating environment.
"Nothing about photons themselves requires cryogenic operation. You can do very high-fidelity manipulation and generation of qubits at room temperature, and in fact, you can even detect single photons at room temperature just fine. The efficiency of room-temperature single-photon detectors is not good enough for fault tolerance. These room-temperature detectors are based on pretty complex semiconductor devices, avalanche photodiodes, and there's no physical reason why you couldn't push those to the necessary efficiency, but it looks really difficult [and] people have been trying for a very long time," said Shadbolt.
"We use a superconducting single-photon detector, which can achieve the necessary efficiencies without a ton of development. It's worth noting those detectors run in the ballpark of 4 Kelvin, so liquid-helium temperature, which is still very cold, but it's nowhere near as cold as the milli-Kelvin temperatures required for superconducting qubits or some of the competing technologies," said Shadbolt.
This has important implications for control-circuit placement, as well as for the reduced power needed to maintain the 4-kelvin environment.
There's a lot to absorb here, and it's best done directly from the papers. PsiQuantum, like many other quantum start-ups, was founded by researchers who were already digging into the quantum computing space, and they've shown that PsiQuantum's FBQC flavor of linear optical quantum computing will work. While at Bristol, Shadbolt was involved in the first demonstration of running a Variational Quantum Eigensolver (VQE) on a photonic chip.
The biggest challenges for PsiQuantum, he suggests, are developing manufacturing techniques and system architecture around well-known optical technology. The company argues having a Tier-1 fab partner such as GlobalFoundries is decisive.
"You can go into infinite detail on the architecture and how all the bits and pieces go together. But the point of optical quantum computing is that the network of components is pretty complicated (all sorts of modules and structures and multiplexing strategies, and resource-state generation schemes and interferometers, and so on) but they're all just made out of beam splitters, and switches, and single-photon sources and detectors. It's kind of like in a conventional CPU: you can go in with a microscope and examine the structure of the cache and the ALU and whatever, but underneath it's all just transistors. It's the same kind of story here. The limiting factor in our development is the semiconductor process enablement. The thesis has always been that if you tried to build a quantum computer anywhere other than a high-volume semiconductor manufacturing line, your quantum computer isn't going to work," he said.
"Any quantum computer needs millions of qubits. Millions of qubits don't fit on a single chip. So you're talking about heaps of chips, probably billions of components realistically, and they all need to work and they all need to work better than the state of the art. That brings us to the progress, which is, again, rearranging those various components into ever more efficient and complex networks, in pretty close analogy with CPU architecture. It's a very key part of our IP, but it's not rate limiting, and it's not terribly expensive to change the network of components on the chip once we've got the manufacturing process. We're continuously moving the needle on that architecture development, and we've improved these architectures in terms of their tolerance to loss by more than 150x, [actually] well beyond that. We've reduced the size of the machine, purely through architectural improvements, by many, many orders of magnitude."
"The big, expensive, slow pieces of the development are in being able to build high-quality components at GlobalFoundries in New York. What we've already done there is to put single-photon sources and superconducting-nanowire single-photon detectors into that manufacturing process engine. We can build wafers, 300-millimeter wafers, with tens of thousands of components on the wafer, including a full silicon photonics PDK (process design kit), and also a very high-performing single-photon detector. That's real progress that brings us closer to being able to build a quantum computer, because that lets us build millions to billions of components."
Shadbolt says real systems will quickly follow development of the manufacturing process. PsiQuantum, like everyone in the quantum computing community, is collaborating closely with potential users. Roughly a week ago, it issued a joint paper with Mercedes-Benz discussing quantum computer simulation of Li-ion chemistry. If the PsiQuantum-GlobalFoundries process is ready around 2025, can a million-qubit system (100 logical qubits) be far behind?
Shadbolt would only say that things will happen quickly once the process has been fully developed. He noted there are three ways to make money with a quantum computer: sell machines, sell time, and sell solutions that come from the machine. "I think we're exploring all of the above," he said.
"Our customers, which is a growing list at this point (pharmaceutical companies, car companies, materials companies, big banks), are coming to us to understand what a quantum computer can do for them. To understand that, what we are doing, principally, is fault-tolerant resource counting," said Shadbolt. "So that means we're taking the algorithm, or taking the problem the customer has, working with their technical teams to look under the hood and understand the technical requirements of solving that problem. We are turning that into the quantum algorithms and subroutines that are appropriate. We're compiling that for the fault-tolerant gate set that will run on top of that fusion network, which, by the way, is a completely vanilla textbook fault-tolerant gate set."
Stay tuned.
PsiQuantum Papers
Fusion-based quantum computation, https://arxiv.org/abs/2101.09310
Creation of Entangled Photonic States Using Linear Optics, https://arxiv.org/abs/2106.13825
Interleaving: Modular architectures for fault-tolerant photonic quantum computing, https://arxiv.org/abs/2103.08612
Description of PsiQuantum's Fusion-Based System from the Interleaving Paper
Useful fault-tolerant quantum computers require very large numbers of physical qubits. Quantum computers are often designed as arrays of static qubits executing gates and measurements. Photonic qubits require a different approach. In photonic fusion-based quantum computing (FBQC), the main hardware components are resource-state generators (RSGs) and fusion devices connected via waveguides and switches. RSGs produce small entangled states of a few photonic qubits, whereas fusion devices perform entangling measurements between different resource states, thereby executing computations. In addition, low-loss photonic delays such as optical fiber can be used as fixed-time quantum memories simultaneously storing thousands of photonic qubits.
Here, we present a modular architecture for FBQC in which these components are combined to form interleaving modules consisting of one RSG with its associated fusion devices and a few fiber delays. Exploiting the multiplicative power of delays, each module can add thousands of physical qubits to the computational Hilbert space. Networks of modules are universal fault-tolerant quantum computers, which we demonstrate using surface codes and lattice surgery as a guiding example. Our numerical analysis shows that in a network of modules containing 1-km-long fiber delays, each RSG can generate four logical distance-35 surface-code qubits while tolerating photon loss rates above 2% in addition to the fiber-delay loss. We illustrate how the combination of interleaving with further uses of non-local fiber connections can reduce the cost of logical operations and facilitate the implementation of unconventional geometries such as periodic boundaries or stellated surface codes. Interleaving applies beyond purely optical architectures, and can also turn many small disconnected matter-qubit devices with transduction to photons into a large-scale quantum computer.
Earth Day 2022: Quantum Computing has the Key to Protect Environment! – Analytics Insight
Posted: at 5:21 pm
Can quantum computing hold the key to sustainable development?
Quantum computing has been gaining popularity as quantum mechanics is put to work in real machines, and it promises to outperform conventional computers on highly complex problems. It may also hold a key to protecting the environment. So let's celebrate Earth Day 2022 by digging into the ways quantum computing could support sustainable development.
Earth Day 2022 is celebrated across the world to raise awareness of environmental issues and to generate ideas for reducing carbon footprints and energy consumption. Quantum computing could become an efficient and effective tool in that push for sustainable development.
Quantum computers differ from classical supercomputers, which rely on thousands of GPU and CPU cores. They instead perform calculations with qubits, simulating problems that human beings and classical computers cannot solve within any reasonable period of time.
Now, with 21st-century advances in the technology, quantum computing could power sustainable development in concrete ways, from capturing carbon to fighting climate change and global warming.
Quantum computing can simulate large, complicated molecules, which could lead to new catalysts for capturing carbon from the atmosphere. Room-temperature superconductors, another goal of quantum research, hold the key to recovering the roughly 10% of energy production that is lost in transmission. Similar advances could improve processes for feeding a growing population and yield more efficient batteries.
Quantum computing is thus set to address global challenges, raise awareness, and generate solutions in support of sustainable development goals. Better climate models built with quantum computers could provide in-depth insights into how human activities are drastically affecting the environment and creating barriers to sustainable development.
Quantum computers with around 200 qubits could help find a better catalyst for industrial processes that consume 3-5% of the world's gas production and 1-2% of its annual energy. They could also help design catalysts for capturing carbon from the air, potentially cutting the associated carbon emissions by 80-90%. In this way, quantum computing could help restrain the rapid rise in global temperatures.
That being said, let's celebrate Earth Day 2022 with quantum computing helping the world recycle carbon dioxide and reduce harmful emissions.
Quantum Computing Market, Growth, Share, Size, Segmentations, Industry Trends, Demand and Forecasts 2022 to 2028 – Digital Journal
Posted: at 5:21 pm
Quantum Computing Market 2022-2028
A New Market Study, Titled Quantum Computing Market Upcoming Trends, Growth Drivers and Challenges has been featured on fusionmarketresearch.
Description
This global study of the Quantum Computing Market offers an overview of the existing market trends, drivers, restrictions, and metrics, and also offers a viewpoint on important segments. The report also tracks demand growth forecasts for the market's products and services. The study approach includes a detailed segmental review. A regional study of the global Quantum Computing industry is also carried out across North America, Latin America, Asia-Pacific, Europe, and the Near East & Africa. The report mentions growth parameters in the regional markets along with major players dominating regional growth.
Request Free Sample Report @https://www.fusionmarketresearch.com/sample_request/2022-2030-Report-on-Global-Quantum-Computing-Market-2022/85766
This research covers the COVID-19 impacts on the upstream, midstream and downstream industries. Moreover, this research provides an in-depth market evaluation by highlighting information on various aspects covering market dynamics like drivers, barriers, opportunities, threats, and industry news & trends. In the end, this report also provides in-depth analysis and professional advice on how to face the post-COVID-19 period.
The research methodology used to estimate and forecast this market begins by capturing the revenues of the key players and their shares in the market. Various secondary sources such as press releases, annual reports, non-profit organizations, industry associations, governmental agencies and customs data, have been used to identify and collect information useful for this extensive commercial study of the market. Calculations based on this led to the overall market size. After arriving at the overall market size, the total market has been split into several segments and subsegments, which have then been verified through primary research by conducting extensive interviews with industry experts such as CEOs, VPs, directors, and executives. The data triangulation and market breakdown procedures have been employed to complete the overall market engineering process and arrive at the exact statistics for all segments and subsegments.
Leading players of Quantum Computing include:
- D-Wave Solutions
- IBM
- Google
- Microsoft
- Rigetti Computing
- Intel
- Origin Quantum Computing Technology
- Anyon Systems Inc.
- Cambridge Quantum Computing
- Airbus Group
- Nokia Bell Labs
- Alibaba Group Holding
- Toshiba

Quantum Computing Market split by Type:
- Hardware
- Software
- Cloud Services

Quantum Computing Market split by Application:
- Space and Defense
- Government
- Automotive
- Manufacturing & Logistics
- Banking and Finance

Quantum Computing Market split by Sales Channel:
- Direct Channel
- Distribution Channel

Market segment by Region/Country:
- North America (United States, Canada and Mexico)
- Europe (Germany, UK, France, Italy, Russia and Spain etc.)
- Asia-Pacific (China, Japan, Korea, India, Australia and Southeast Asia etc.)
- South America (Brazil, Argentina and Colombia etc.)
- Middle East & Africa (South Africa, UAE and Saudi Arabia etc.)
Ask Queries @https://www.fusionmarketresearch.com/enquiry.php/2022-2030-Report-on-Global-Quantum-Computing-Market-2022/85766
Table of Contents
Chapter 1 Quantum Computing Market Overview
1.1 Quantum Computing Definition
1.2 Global Quantum Computing Market Size Status and Outlook (2016-2030)
1.3 Global Quantum Computing Market Size Comparison by Region (2016-2030)
1.4 Global Quantum Computing Market Size Comparison by Type (2016-2030)
1.5 Global Quantum Computing Market Size Comparison by Application (2016-2030)
1.6 Global Quantum Computing Market Size Comparison by Sales Channel (2016-2030)
1.7 Quantum Computing Market Dynamics (COVID-19 Impacts)
1.7.1 Market Drivers/Opportunities
1.7.2 Market Challenges/Risks
1.7.3 Market News (Mergers/Acquisitions/Expansion)
1.7.4 COVID-19 Impacts
1.7.5 Post-Strategies of COVID-19

Chapter 2 Quantum Computing Market Segment Analysis by Player
2.1 Global Quantum Computing Sales and Market Share by Player (2019-2021)
2.2 Global Quantum Computing Revenue and Market Share by Player (2019-2021)
2.3 Global Quantum Computing Average Price by Player (2019-2021)
2.4 Players Competition Situation & Trends
2.5 Conclusion of Segment by Player

Chapter 3 Quantum Computing Market Segment Analysis by Type
3.1 Global Quantum Computing Market by Type
3.1.1 Hardware
3.1.2 Software
3.1.3 Cloud Services
3.2 Global Quantum Computing Sales and Market Share by Type (2016-2021)
3.3 Global Quantum Computing Revenue and Market Share by Type (2016-2021)
3.4 Global Quantum Computing Average Price by Type (2016-2021)
3.5 Leading Players of Quantum Computing by Type in 2021
3.6 Conclusion of Segment by Type

Chapter 4 Quantum Computing Market Segment Analysis by Application
4.1 Global Quantum Computing Market by Application
4.1.1 Space and Defense
4.1.2 Government
4.1.3 Automotive
4.1.4 Manufacturing & Logistics
4.1.5 Banking and Finance
4.2 Global Quantum Computing Revenue and Market Share by Application (2016-2021)
4.3 Leading Consumers of Quantum Computing by Application in 2021
4.4 Conclusion of Segment by Application

Chapter 5 Quantum Computing Market Segment Analysis by Sales Channel
5.1 Global Quantum Computing Market by Sales Channel
5.1.1 Direct Channel
5.1.2 Distribution Channel
5.2 Global Quantum Computing Revenue and Market Share by Sales Channel (2016-2021)
5.3 Leading Distributors/Dealers of Quantum Computing by Sales Channel in 2021
5.4 Conclusion of Segment by Sales Channel
Continue
ABOUT US:
Fusion Market Research is one of the largest collections of market research reports from numerous publishers. We have a team of industry specialists providing unbiased insights on reports to best meet the requirements of our clients. We offer a comprehensive collection of competitive market research reports from a number of global leaders across industry segments.
CONTACT US
[emailprotected]
Phone: + (210) 775-2636 (USA), + (91) 853 060 7487
Crypto research by Bank of Canada confirmed what we knew all along – CryptoSlate
Posted: at 5:21 pm
Officials at the Bank of Canada (BoC) are turning their attention to crypto research in a big way.
In line with the general stance held by central banks, the BoC has, in the past, voiced concerns about cryptocurrency. For example, two years ago, the BoC published a paper highlighting the risks faced by cryptocurrency users.
Two further recent studies conducted by the BoC, one on awareness and ownership, and the other, a quantum computing simulation, yielded predictable results.
In a report titled "Bitcoin Awareness, Ownership and Use: 2016-20," BoC researchers conducted a study of 12,487 participants over five years.
They found that awareness and ownership of Bitcoin have increased since 2016. Analyzing the results further, researchers noted that awareness had stabilized from 2018 to 2020. The same pattern and trend were also noted concerning Bitcoin ownership.
Cross-referencing their findings with participants' demographic profiles, researchers said that with increased awareness and ownership in the latter period of the study, Bitcoin ownership concentrated among young, educated men with high household income and low financial literacy.
Researchers also found that more than 20% of Bitcoin owners showed low Bitcoin knowledge, failing to answer any of the three knowledge questions. From that, they deduced that many owners were hit-and-run investors.
Given that investment was the most common reason owners cited for owning Bitcoin, we see that many owners may be trying to profit from cryptocurrencies without fully understanding the technology.
It concluded that ownership and use of Bitcoin and other cryptocurrencies remain low. But it's still important to monitor trends in this sector to allow for informed policy-making decisions.
In collaboration with Multiverse Computing, a quantum computing company, the BoC ran simulation models on complex financial problems, including a simulation of crypto adoption.
Sam Mugel, the CTO at Multiverse Computing, said such a simulation is not possible using classical computers. The number of possible configurations, even with just a ten-person network, is mind-bogglingly large.
Quantum computing leverages quantum theory to run complex calculations and problems as well as assess probabilities. The computers can process exponentially more data than traditional computing models.
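Simple counting supports Mugel's "mind-bogglingly large" remark. One hedged reading (the BoC model's exact state space isn't described here): if each pair of participants in a ten-person network either interacts or doesn't, the possible configurations already run to tens of trillions.

```python
from math import comb

n_people = 10
n_pairs = comb(n_people, 2)  # 45 possible pairwise links
print(n_pairs)
print(2 ** n_pairs)          # 2**45 = 35,184,372,088,832 configurations

# Give each participant extra states (adopts crypto or not, banks or
# not, ...) and the space multiplies further -- the combinatorial
# blow-up that motivates trying quantum hardware on such simulations.
```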
The results suggested that for some industries, crypto would operate alongside traditional banking as a payment mechanism. But the degree to which individual financial institutions adopt digital assets depends on how these institutions respond to crypto adoption and the economic costs of crypto trading.
Meanwhile, the Bank of England, through its Prudential Regulation Authority (PRA) arm, which is tasked with managing financial sector risk, said it was expanding its workforce by recruiting an additional 100 staff members.
PRA Deputy Governor and Head Sam Woods said the drive was necessary to handle new policy responsibilities, including risks associated with crypto assets.
China May Have Just Taken the Lead in the Quantum Computing Race – Defense One
Posted: April 15, 2022 at 1:10 pm
China may have taken the lead in the race to practical quantum computing with a recent announcement that it has shattered a record for solving a complex problem.
In 2019, Google reported that its 53-qubit Sycamore processor had completed in 3.3 minutes a task that would have taken a traditional supercomputer at least 2.5 days. Last October, China's 66-qubit Zuchongzhi 2 quantum processor reportedly completed the same task 1 million times faster. That processor was developed by a team of researchers from the Chinese Academy of Sciences Center for Excellence in Quantum Information and Quantum Physics, in conjunction with the Shanghai Institute of Technical Physics and the Shanghai Institute of Microsystem and Information Technology.
Traditional supercomputers like those of the U.S. military and the People's Liberation Army's 56th Research Institute are used to conduct complex simulations for equipment design, process images and signals to spot targets and points of interest, and analyze oceans of data to understand hidden trends and connections. But some tasks remain time- and resource-intensive, for even the tiniest computing bits require time to flip between 1 and 0.
Superconducting quantum computers can bypass physical limits by creating a superposition of the 1 and 0 values. Essentially, standard computing bits must be either a 1 or a 0. But at extremely low temperatures, the physical properties of matter undergo significant changes. Superconducting quantum computers take advantage of these changes to create qubits (quantum bits), which are not limited by the processing hurdles that traditional computers face. Qubits can be both 1 and 0 simultaneously. This promises to speed up computing immensely, enabling assaults on hitherto uncrackable problems like decrypting currently unbreakable codes, pushing AI and machine learning to new heights, and designing entirely new materials, chemicals, and medicines.
The world's scientific and military powers are spending billions of dollars in the race to turn this promise into reality. China has notched several notable advancements in recent years. In 2020, the University of Science and Technology of China, home of leading Chinese quantum computing scholar Pan Jianwei, conducted the first space-based quantum communications, using the Micius satellite to create an ultra-secure data link between two ground stations separated by more than 1,000 miles.
In October, a Chinese team reported that its light-based Jiuzhang 2 processor could complete a task in one millisecond that a conventional computer would require 30 trillion years to finish. This breakthrough marked a new top speed for a quantum processor whose qubits are light-based, not superconducting. The quantum states needed for superconducting computers to function are delicate, can be unstable, and are prone to causing large numbers of errors. However, light-based quantum computers also have their drawbacks, as it is difficult to increase the number of photons in this type of quantum computer, due to their delicate state. It remains to be seen which method will be more prevalent.
These achievements stem from Beijing's emphasis on quantum computing research. China is reportedly investing $10 billion in the field, and says it increased national R&D spending by 7 percent last year. By contrast, the U.S. government devoted $1.2 billion to quantum computing research in 2018 under a new national strategy. Last year, the Senate passed a bill to create a Directorate of Technology and Innovation at the National Science Foundation, and add $29 billion for research into quantum computing and artificial intelligence from 2022 to 2026, but it awaits reconciliation with a similar bill passed by the House last month.
Chinese researchers, firms, and agencies now hold more patents in quantum tech than does the United States (although U.S. companies have more in the specific field of quantum computing), amid allegations that these advancements benefit from stolen U.S. work. A year ago, the Commerce Department blacklisted seven supercomputing entities for their association with the People's Liberation Army. Further, there is evidence that the Chinese government has been stealing encrypted U.S. government and commercial data, warehousing it against the day when quantum computers can break today's encryption.
We are still a few years away from seeing a real advent of quantum computing. Currently, most quantum computers are able to coherently operate with around 50 qubits. To realize quantum computing's full potential in codebreaking, for example, would require qubit counts in the thousands. But progress is being made. IBM reportedly produced a 127-qubit superconducting quantum computer in November, intends to unveil a 400-qubit processor this year, and aims to produce a 1,000-qubit processor in 2023.
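That 50-qubit mark is also roughly where classically simulating a quantum processor hits a memory wall, a back-of-the-envelope way (our estimate, not a benchmark) to see the scale involved:

```python
n_qubits = 50
amplitudes = 2 ** n_qubits       # full state vector of a 50-qubit machine
bytes_needed = amplitudes * 16   # one complex128 amplitude = 16 bytes
print(bytes_needed)              # ~1.8e16 bytes, i.e., 16 pebibytes --
                                 # more RAM than today's top supercomputers
```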
Given the enormous strategic potential of quantum computing in a wide variety of fields, this competition is set to only grow more intense in the near future. Whether the U.S. can keep pace remains to be seen.
Thomas Corbett is a research analyst with BluePath Labs. His areas of focus include Chinese foreign relations, emerging technology, and international economics.
P.W. Singer is a strategist at New America and the author of multiple books on technology and security, including Wired for War, Ghost Fleet, Burn-In, and LikeWar: The Weaponization of Social Media.
After the IPO: IonQ takes on highly charged quantum computing challenge – VentureBeat
Posted: at 1:10 pm
Trapped-ion quantum computer manufacturer IonQ is on a roll. Recently, the company said its IonQ Aria system hit the 20 algorithmic qubit level, a measure said to reflect the actual utility of a quantum computer's qubits in real-world settings. The company also made IonQ Aria available on Microsoft's Azure Quantum platform for what it describes as an extended beta program.
Moreover, IonQ reported its first quarter as a publicly traded company. It reportedly gained $2.1 million in revenue in 2021 and expects revenue for 2022 to be between $10.2 million and $10.7 million. For quantum computing, these are still early days, when players seek big partners to test out concepts.
A net loss of $106.2 million for 2021 underscores the challenges ahead for IonQ, as well as for other multi-state quantum computing players that hope to someday surpass conventional binary computers. Early application targets for such machines include cryptography, financial modeling, electric-vehicle battery chemistry and logistics.
By some measures, IonQ was late to the quantum computing race in 2019, when it first announced access to its platform via cloud partnerships with Microsoft and Amazon Web Services. An appearance on Google Cloud marketplace followed, thus making a Big 3 cloud hat-trick, one that other quantum players can also assert.
But, if IonQ was later to the quantum computing race, it was early to the quantum computing IPO.
Last year, IonQ claimed standing as the world's first public pure-play quantum computing company. The IPO transpired as part of a SPAC, or special-purpose acquisition company, which has come to be seen as an easier mechanism for companies to enter the public markets.
The SPAC path is not without controversy, as companies taking this route have seen their shares slide after less-than-splashy intros. That doesn't bother Peter Chapman, CEO of IonQ. The company grossed $636 million in a SPAC-borne IPO that will go toward the long-awaited commercialization of quantum hardware, Chapman told VentureBeat.
"I no longer have to think about raising money, and we are no longer subject to market whims or external affairs, which seems, with [war in] Ukraine and everything else going on, like a really good decision," he said.
The IPO funding also gives IonQ staff a clear gauge of their stock options' worth, he said, adding that this is important in the quantum talent war that pits IonQ against some of the biggest tech companies in the world, many of which use superconducting circuits rather than ion trapping.
Clearly, raising large sums from VCs or public markets is a to-do item for quantum computing hardware makers like IonQ. The company arose out of academic labs at the University of Maryland that were originally propelled by a research partnership in quantum science with the National Institute of Standards and Technology (NIST).
Now, it must move lab prototypes into production, which is where much of the money raised will be spent as quantum computers seek to go commercial, Chapman indicated.
"We knew that within roughly 18 months from IPO, we were going to be gearing up for manufacturing, and that was going to require a lot more money. And so being able to run faster was also a huge piece of what we wanted to be able to do," Chapman said.
Moving to larger-scale production is a hurdle for all quantum players. Ion-trapping advocates may claim some edge there, in that parts of their base technology employ methods that have long been used in atomic clocks.
"With atomic clocks, you take ions and suspend them in a vacuum, levitate them above the surface using an RF field, and you isolate them perfectly. They're very stable and they're extremely accurate," Chapman said, touching on a factor that leads ion-trapping advocates to claim qubits with better coherence (that is, the ability to retain information) than competing methods.
Chapman notes that important atomic clock components have undergone miniaturization over the years and versions now appear as compact modules in navigational satellites. That augurs the kind of miniaturization that would help move the quantum computer out of the lab and into data centers. Of course, there are other hurdles ahead.
For IonQ, another bow to manufacturability is seen in the company's recent move from ytterbium ions to barium ions. This is said to create qubits of much higher fidelity.
In February, IonQ announced a public-private partnership with Pacific Northwest National Lab (PNNL) to build a sustainable source of barium qubits to power its IonQ Aria systems.
Chapman said the ions of barium qubits are controlled primarily with visible light, rather than the ultraviolet light that ytterbium set-ups require. Such UV light can be damaging to hardware components, so visible light has benefits over UV light.
More important, according to Chapman, is the fact that much commercial silicon photonics technology works in the visible spectrum. Using the same technology found in a range of existing commercial products is useful as quantum computing looks to miniaturize and boost reliability.
Along with IonQ's partnerships with cloud players comes a series of partnerships with industry movers such as Hyundai Motor (for electric-battery chemistry modeling), GE Research (for risk management) and Fidelity's Center for Applied Technology (for quantum machine learning for finance). More such deals can be expected as IonQ's quantum computing efforts ramp up and roll out.
Link:
After the IPO: IonQ takes on highly charged quantum computing challenge - VentureBeat
Posted in Quantum Computing
Comments Off on After the IPO: IonQ takes on highly charged quantum computing challenge – VentureBeat
10 Universities Unleashing the Best Quantum Computing Research – Analytics Insight
Posted: at 1:10 pm
Advancements in quantum computing have made it a popular career choice in 2022
Quantum computing progressed slowly for a long period of time, but recently the technology has been buzzing with advanced innovations that are changing the modern tech industry. It has become a game-changer in fields like cryptography, chemistry, material science, agriculture, and pharmaceuticals. As technology advances, the problems posed by global crises become even more complex. During the Covid pandemic, quantum research produced several creative tools and innovations that boosted researchers' confidence in the field. With quantum computing emerging as one of the trendiest technologies in the industry, several universities and colleges are encouraging quantum research initiatives and programs for their students. These universities offer the faculty, laboratories, and tools that can help students develop their own creations. In this article, we have listed the top quantum computing universities that provide world-class infrastructure for tech aspirants to excel in the quantum computing domain through quantum research and other initiatives.
The University of Waterloo offers quantum computing courses and advanced research programs for its students and has published over 1,500 research papers since its inception. The institute combines academic excellence with entrepreneurial innovation to bring out the best of what technology and intellect have to offer.
The university was the first to work on pure-state NMR quantum computing, which was demonstrated jointly at Oxford and the University of York. The university's quantum research department is among the best, arranging research initiatives that aim to harness the vast potential of quantum tech. The faculty aims to produce pioneers who will innovate for the benefit of society.
Harvard says the Harvard Quantum Initiative in Science and Engineering involves a community of researchers with an intense interest in advancing the science and engineering of quantum systems and applications. The group of quantum researchers at Harvard is working to drive the second quantum revolution and accelerate advances in this domain.
MIT is known as a research giant, with branches extending to artificial intelligence and quantum computing. The university's strength in theoretical physics is now being leveraged in quantum information and computing: MIT researchers explore quantum algorithms and complexity, quantum information theory, measurement, control, and the connections between them.
The Berkeley Center for Quantum Computation and Information includes researchers from the domains of engineering, chemistry, and the physical sciences. They work on central problems in quantum devices, cryptography, quantum information theory, and algorithms in preparation for practical quantum computers.
The Joint Quantum Institute brings together quantum researchers from the National Institute of Standards and Technology and the University of Maryland's Department of Physics. Each institution contributes to major theoretical and experimental research programs focused on the control and transmission of quantum systems.
The University of Sydney focuses on challenging problems in quantum computing and on applying those insights to construct new technologies. Its scientific research initiatives maintain deep industrial and entrepreneurial engagement.
The Chicago Quantum Exchange has a distinct focus on industry collaboration and on advancing scholarship in the engineering and science of quantum information and computing. Its goal is to advance research into quantum information and computing technologies and to drive the development of new applications.
The university's division of quantum physics and information specializes in quantum optics. The division lists fiber-based quantum communication, quantum memory and quantum repeaters, and other distinct paradigms among its primary research foundations.
Researchers at the University of Innsbruck's Quantum Information and Computation department study models for quantum information processing and fundamental aspects of quantum information theory. The focus of their research is the theory of measurement-based quantum computation, which is expected to yield a new and more thorough understanding of multi-body entanglement as a resource.
See original here:
10 Universities Unleashing the Best Quantum Computing Research - Analytics Insight
Posted in Quantum Computing
Comments Off on 10 Universities Unleashing the Best Quantum Computing Research – Analytics Insight
QpiAI and QuantrolOx sign a MoU to jointly develop India’s first 25-qubit quantum computing testbed and offerings for the European and Indian markets…
Posted: at 1:10 pm
BENGALURU, India & MILPITAS, Calif.--(BUSINESS WIRE)--QpiAI, a leader in quantum computing and AI, today announced an MoU with the Finland-based Oxford University spinout QuantrolOx. The two companies will work together to provide complete quantum solutions to customers in Europe, India and Asia.
Dr Nagendra Nagaraja, CEO and founder of QpiAI, said: "India and Finland have had a close working relationship in telecom technologies. Both countries have now worked out a detailed plan for the establishment of the Indo-Finnish Virtual Network Center on Quantum Computing. The MoU between QpiAI and QuantrolOx will further add to this momentum by providing advanced quantum computing testbeds for Indian and Finnish quantum companies and research labs to test and develop their quantum technologies.
"QpiAI is also working on semiconductor-based spin qubit technology, which is highly scalable and uses the mature semiconductor fabrication process. QpiAI has software and platform products in the market. It will transition its software platforms, which currently run on GPUs and CPUs, to its 3-chip hybrid classical-quantum compute solution, providing 20x-100x performance improvements in optimization workloads.
"There is a massive market opportunity in India, Asia and Europe for quantum computing technologies, including quantum processors, quantum communication, quantum cloud computing and superconductors. Central to these opportunities is a scalable quantum computing technology that can automatically tune qubits and characterize quantum processors. While QpiAI is developing hardware (including control boards, an RF and microwave-based quantum control chip and quantum processors) and software (application platforms and quantum algorithms), QuantrolOx is working on automated tuning and characterisation software."
Vishal Chatrath, CEO of QuantrolOx, and Prof. Andrew Briggs, executive chairman of QuantrolOx and professor at the University of Oxford, are joining the advisory board of QpiAI to enable strategic alignment between the two companies. The QpiAI advisory board also includes industry stalwarts like Dr Madhusudan Atre, former India head and managing director of AMD, Applied Materials, LSI and Lucent Microelectronics, and an advisor to startups and incubators. The QpiAI board also includes Dr Navakant Bhat, former chair of CeNSE and currently Dean of IISc.
"With the Indian government's commitment of $1 billion to the Indian quantum industry, we see India as a major market. We are excited to provide QuantrolOx technology to India's first 25-qubit quantum computing testbed. This is just the beginning of our partnership with QpiAI to develop joint products and solutions for the global marketplace," added Vishal Chatrath, CEO and co-founder of QuantrolOx.
"Ultimately, the challenge going forward for the industry to scale will be to create a supply chain in which specialist companies can focus on their core strengths. Such ecosystems will lead innovation and drive down costs so that in the next 5-10 years the world will have not tens of quantum computers but thousands, and possibly even tens of thousands. I am excited at the prospect of a new quantum ecosystem developing between Oxford, Finland and India," added Prof Andrew Briggs, executive chair of QuantrolOx.
"Having quantum technology development in India will create next-generation high-technology jobs in cutting-edge research and technology development. It also builds an ecosystem for leading interdisciplinary R&D. I am very glad to see QpiAI forge a partnership with QuantrolOx and lay the foundation for an India-Finland partnership in the area of quantum technology. We are glad that we will be working with Dr Andrew and Vishal to make quantum computing commercially available across India, Europe and Southeast Asia for industrial sectors," Dr Atre said.
"When we first discussed quantum hardware in India in 2020, before the pandemic, Dr Nagendra was suggesting a 20-qubit setup in Bangalore by 2024. Now, with this partnership, it looks like we will have multiple 25-qubit testbeds right here in Bangalore by the end of 2022. This should enable a thriving quantum ecosystem. The current 25 qubits are based on superconductors; the eventual 2048 qubits based on CMOS spin qubits will be very exciting, as 2048 logical qubits can enable a lot of commercial applications. We would like to use all our expertise in CMOS fabrication to make this technology breakthrough happen. That would be a major technology breakthrough from India, and we would like to see Dr Nagendra and team achieve it as soon as possible, in close collaboration with quantum ecosystems including QuantrolOx, Oxford and the IISc community. QpiAI has an excellent team, and building commercial-grade quantum computers right here in Bangalore is very exciting. It is great to see collaboration between India and Finland to form this thriving quantum ecosystem. The association of QpiAI with Dr Andrew and Vishal is a major step forward in achieving this goal," added Dr Navakant.
This partnership will create revenue-generation opportunities for both companies: QpiAI's QpiAISense hardware platform for controlling qubits is ready to ship, and QuantrolOx will develop control software on it for both superconducting and semiconductor-based spin qubits.
QpiAI is building its own quantum computing lab to house cryogenic electronics and is in the process of acquiring land for India's first private quantum computing lab facility. The QpiAI quantum lab will be part of the bigger Qpi Technology Quantum Park, which houses labs and manufacturing facilities for its subsidiaries: a superconductor-based single-photon detector and single-photon source, plus HTS tapes and HTS cables, for SuperQ; a solid-state battery prototyping facility for Qpivolta, which uses quantum and AI technologies; labs for Qpivolta-ET's energy-transition experiments in material discovery and carbon capture; a silicon photonics testing lab for Qpisemi's AI20P AI processors; and a lab-scale model quantum data center designed by Qpicloud. Qpicloud, which is incubated in the DSCI (Data Security Council of India) NCoE (National Center of Excellence) for cybersecurity, is also working on quantum security for data centers and cloud computing; its lab will likewise be housed in the Qpi Technology Quantum Park.
QpiAI is expanding in Finland to enable partnerships with European quantum ecosystems. It already has a US subsidiary, QpiAI Inc., and also intends to open an office in Japan for customer support and after-sales support for Japanese customers.
With these partnerships and its global presence, QpiAI, a revenue-generating and profitable quantum compute and AI company that vertically integrates AI and quantum compute and has customers across the world, is scaling its business to the next level to become a major global player in AI and quantum compute.
About QpiAI
QpiAI (https://www.qpiai.tech) is a world leader in AI and quantum computing. QpiAI is integrating quantum computing and AI vertically to offer solutions in areas like manufacturing, industrial, transportation, finance, pharma and materials. It has various software platforms and products, including QpiAI-pro, QpiAI-explorer, QpiAI-logistics, QpiAIopt, QpiAIsim and QpiAIML. It is building a complete hardware stack based on a 3-chip solution: Trion (a universal optimizer chip), Bumblebee (a scalable cryogenic control chip) and a scalable spin-qubit-based QPU (quantum processing unit) that can scale to 2048 logical qubits. QpiAI is currently ready with room-temperature control electronics based on its QpiAISense hardware platform. QpiAI is a subsidiary of Qpi Technology (https://www.qpitech.holdings).
About QuantrolOx
QuantrolOx (https://quantrolox.com/), an Anglo-Finnish spinout from the University of Oxford, is building automated, machine-learning-based control software for quantum technologies to tune, stabilise, and optimise qubits. QuantrolOx's software is technology-agnostic and applicable to all types of quantum technologies. Initially, the company is targeting solid-state qubits, where the team has already demonstrated substantial practical benefits.
Original post:
Posted in Quantum Computing
Comments Off on QpiAI and QuantrolOx sign a MoU to jointly develop India’s first 25-qubit quantum computing testbed and offerings for the European and Indian markets…
A Look at Quantum Resistant Encryption & Why It’s Critical to Future Cybersecurity – Hashed Out by The SSL Store
Posted: at 1:10 pm
Quantum resistant cryptography will be a key part of cybersecurity in the future. Here's what to know about how to protect your data when hackers are armed with quantum computers.
Quantum computing is a contentious topic that people tend to either love or hate depending on where they're seated. On one hand, it represents an incredible opportunity in terms of data processing speeds and capabilities. On the other, it's a means through which to destroy the cryptographic algorithms we now rely on to keep sensitive data secure online. This is where something known as quantum resistant encryption comes into play.
But what is quantum resistant encryption? This article explores the history of quantum computing in cryptography, why it's a threat to modern online security, and what organizations can do to prepare to implement quantum safe cryptography within their IT environments.
Let's hash it out.
In a nutshell, quantum resistant encryption refers to a set of algorithms that are anticipated to remain secure once quantum computing moves out of the lab and into the real world. (They will replace the public key cryptography algorithms currently used by billions of people around the world every day.)
By the way, when people use any of the following terms, they're typically talking about the same thing (in most cases):
All of the public key encryption algorithms we currently rely on today are expected to be broken once researchers succeed in building a large enough quantum computer. Once that happens, quantum resistant encryption will need to be used everywhere (both by normal [i.e., classical] and quantum computers) so that attackers with quantum computers can't break the encryption to steal data.
Quantum computers are fundamentally different from the computers we use today. These devices use specialized hardware components that bring quantum physics into the equation and allow them to perform certain calculations exponentially faster than even the fastest supercomputer we currently have. (We'll speak to that more later in the article.)
Current public key cryptographic algorithms rely on complex mathematics (for example, the RSA encryption algorithm relies on factoring prime numbers, while Diffie-Hellman and elliptic curve cryptography, or ECC, rely on the discrete logarithm problem) to securely transmit data. This means that every time you buy an item on Amazon, your browser communicates with Amazon's web server via a mathematically derived secure communication channel based on one of these mathematical approaches.
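To make that factoring dependence concrete, here is a minimal sketch of the textbook RSA math in Python. The primes are toy-sized and chosen purely for illustration; real deployments use 2048-bit or larger moduli and padding schemes this sketch omits entirely:

# Toy illustration of the RSA math described above (textbook RSA,
# insecure toy parameters; for intuition only, never for real use).
p, q = 61, 53                  # two secret primes
n = p * q                      # public modulus: 3233
phi = (p - 1) * (q - 1)        # Euler's totient of n
e = 17                         # public exponent, coprime to phi
d = pow(e, -1, phi)            # private exponent (Python 3.8+ modular inverse)

message = 42
ciphertext = pow(message, e, n)    # encrypt with the public key (e, n)
recovered = pow(ciphertext, d, n)  # decrypt with the private key (d, n)
assert recovered == message

# Security rests entirely on n being hard to factor: anyone who
# recovers p and q can recompute phi and d exactly as above.

The whole scheme stands or falls on how hard it is to recover p and q from n, and that is precisely the problem a quantum computer running Shor's algorithm attacks (more on that below).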
The problem is that some quantum computers will be able to solve these mathematical problems so quickly that hackers could break modern public key encryption within minutes (basically rendering the encryption that public key algorithms provide useless).
According to the National Security Agency (NSA), quantum resistant cryptography should be resistant to cryptanalytic attacks from both classical and quantum computers. With this in mind, these algorithms can be used both before and after quantum computers are put to use in real-world applications. They're designed with quantum computing threats in mind, but they're not limited to being used only after a cryptographically relevant quantum computer (CRQC) is created.
Currently, encryption over insecure channels (e.g., the internet) relies on something known as public key cryptography. The idea behind traditional public key algorithms is that two parties (i.e., your website's server and the customer who wants to connect to it) can communicate securely using two separate but related keys: a public key that encrypts data and a private key that decrypts it. They use these keys to exchange secret information that they can use to create a secure, symmetrically encrypted communication channel. (Why symmetric encryption? Because it's faster and less resource-intensive than public key encryption.)
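As a rough illustration of that exchange-then-encrypt pattern, the sketch below uses the third-party Python `cryptography` package, with X25519 standing in for the key agreement and AES-GCM for the symmetric channel. Note that the key-agreement step shown here is exactly the part a CRQC would break, and the part PQC algorithms aim to replace:

# Sketch of the pattern described above, using the third-party
# `cryptography` package (pip install cryptography).
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
import os

server_key = X25519PrivateKey.generate()   # server's key pair
client_key = X25519PrivateKey.generate()   # client's key pair

# Each side combines its own private key with the other's public key
# and arrives at the same shared secret.
shared = client_key.exchange(server_key.public_key())
assert shared == server_key.exchange(client_key.public_key())

# Derive a symmetric session key from the shared secret.
session_key = HKDF(algorithm=hashes.SHA256(), length=32,
                   salt=None, info=b"demo session").derive(shared)

# All further traffic uses fast symmetric encryption.
nonce = os.urandom(12)
box = AESGCM(session_key)
ciphertext = box.encrypt(nonce, b"order details", None)
assert box.decrypt(nonce, ciphertext, None) == b"order details"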
Unlike modern algorithms, quantum resistant encryption algorithms will replace existing public key specifications with ones that are thought to be quantum resistant. Again, this is because the modern digital signature and key establishment algorithms we rely on in public key encryption now will no longer be secure when CRQCs become a thing.
NIST says that quantum resistant algorithms typically fall into one of three main camps:
There is a fourth category that some reference: stateful hash-based signatures. But according to NIST's PQC FAQs page:
It is expected that NIST will only approve a stateful hash-based signature standard for use in a limited range of signature applications, such as code signing, where most implementations will be able to securely deal with the requirement to keep state.
We can't give you a specific answer here because, well, nothing has really been decided yet. The National Institute of Standards and Technology (NIST) has been engaged in a large-scale cryptographic competition of sorts for the past several years. The competition is an opportunity for mathematicians, researchers, cryptographers, educators and scientists to submit algorithms for consideration as future federal standards.
The standards body announced its selection of seven candidates and eight alternate algorithm candidates from the third round of submissions. However, no final decisions have been made regarding which algorithm(s) will be standardized.
To better understand quantum resistant encryption and why it's needed, you first need to understand quantum computers and their anticipated impact on cybersecurity. The idea behind quantum computing is that these devices use quantum mechanics to approach problem solving (the general goal of all modern computers) in a whole new way and at exponentially faster speeds.
According to research from Mavroeidis, Vishi, Zych, and Jøsang at the University of Oslo, Norway, there are two types of quantum computers:
At a basic level, the computers we use today (classical computers) communicate data using specific combinations of 1s and 0s (binary numbers called bits). All modern computers play by these same rules. For example, if I type the word Howdy! the computer uses this combination of bits to communicate the precise combination of keys I press: 01001000 01101111 01110111 01100100 01111001 00100001.
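You can reproduce that exact bit pattern in a couple of lines of Python:

# Reproduce the bit pattern above: each character of "Howdy!" as 8 bits.
text = "Howdy!"
bits = " ".join(f"{byte:08b}" for byte in text.encode("ascii"))
print(bits)  # 01001000 01101111 01110111 01100100 01111001 00100001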
Quantum computers, on the other hand, operate on a new playing field using a different set of rules. Instead of traditional bits (1s or 0s), they rely on quantum bits, or qubits for short. In a nutshell, instead of looking at either 1s or 0s, quantum computers view data as existing in multiple states, meaning that it can be both 1s and 0s simultaneously (this is known as a superposition). They also use two other quantum properties (entanglement and interference) to connect separate data elements and eliminate irrelevant guesses to solve problems more quickly.
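For intuition, here is a tiny numpy sketch of the standard textbook math for a single qubit: a Hadamard gate puts a |0> state into an equal superposition, and the squared amplitudes give the measurement probabilities. This merely simulates the math on a classical machine; it is not a quantum device:

# Minimal single-qubit state-vector sketch (textbook math, not hardware).
import numpy as np

ket0 = np.array([1.0, 0.0])                    # classical-like |0> state
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate

state = H @ ket0                # equal superposition of |0> and |1>
probs = np.abs(state) ** 2      # Born rule: measurement probabilities
print(state)                    # approx. [0.7071 0.7071]
print(probs)                    # [0.5 0.5], a 50/50 coin until measured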
Of course, not all qubits are the same. Microsoft recently announced that its Azure Quantum program has unlocked the first step toward developing a new type of qubit called a topological qubit. The goal is to resolve the scaling-related issues that other quantum computers face and to eventually help lead to the creation of a quantum computer capable of employing one million or more qubits. (Check out the linked article for more information on Microsoft's demonstration.)
We're not going to get into all of the technical aspects of the other quantum properties we mentioned here, either. If you want to learn more about superposition, entanglement and interference, check out this video that explains these concepts in a few different ways:
The takeaway we want you to have is that, on one hand, some quantum computers are poised to solve problems beyond what modern supercomputers can do, faster and more efficiently. They also have the potential for capabilities we haven't even thought of yet. On the other hand, some quantum computers are anticipated to be no better than classical computers for some types of tasks. Predicting the full impact of quantum computers, though, is easier said than done.
Our understanding of quantum computing is largely theoretical; so far, quantum computers can only be used in laboratories due to the machines' massive resource and cooling requirements. Quantum chips have to be kept super cold (at -273 degrees Celsius, or nearly absolute zero) to operate, and they can only operate for very short bursts. But the concern that cybersecurity and industry leaders have is that as quantum computers eventually become more mainstream, they'll make existing public key encryption algorithms (namely RSA, or Rivest-Shamir-Adleman) essentially useless.
This concern is due to a concept known as Shor's algorithm. The basic worry about this algorithm, first demonstrated in 1994 by its creator (mathematician Peter Shor), is that a powerful enough quantum computer would be able to crack modern public key algorithms pretty much instantly. How would it do this? By calculating the factors of enormous numbers (i.e., the math that operates at the very heart of modern public key encryption) at rates no modern device could manage.
When you try to crack asymmetric encryption (say, RSA) using a classical computer, you're essentially trying to guess the factors of those mega-sized integers. As you can imagine, this takes a really long time using a regular computer. But with quantum properties like superposition, entanglement and interference coming into play, the time required to make those guesses (or the need to guess some of the numbers at all) shrinks to basically nothing. For example, while it would take upwards of millions of years for traditional computers to figure out the prime factors of 2,000+ bit numbers, a quantum computer could complete the same task within minutes.
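To see why key sizes matter here, consider a sketch of the brute-force classical attack: trial division cracks the toy modulus from the earlier RSA example instantly, but its running time grows roughly exponentially with the bit length of n, while Shor's algorithm would run in polynomial time:

# Trial division: the brute-force classical attack on an RSA modulus.
# Fine for toy numbers, hopeless for 2048-bit ones; the loop takes
# on the order of sqrt(n) steps, i.e. roughly 2^1024 for a 2048-bit n.
def factor(n):
    f = 3
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 2
    raise ValueError("no odd factor found (n is prime or even)")

print(factor(3233))  # (53, 61): the toy modulus falls instantly
# Shor's algorithm replaces this search with quantum period finding,
# cutting the cost from exponential to polynomial in the bit length.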
While this enhanced speed will be great for creating positive solutions to problems (such as coming up with revolutionary new treatments or cures for medical conditions), it also poses a problem if these devices fall into the wrong hands.
Now, we're not telling you all of this to scare you. The truth is that the threats quantum computing represents aren't new concepts, nor do they represent threats to your business and customers right now. The concept of quantum computing, with all of its benefits and dangers, has been around for decades and isn't expected to come to fruition just yet.
Here's an overview of the history of quantum computing and how the development of quantum resistant cryptography plays a key role in it:
Here are links relating to some of the points on the timeline above:
So, how long is all of this expected to take? The answer depends on who you ask and in what context:
As you've probably seen, change tends to be relatively slow in the cryptographic world. Let's think about it another way. When TLS 1.2 was developed, TLS versions 1.1 and 1.0 were outmoded, but they're still in use on the web and haven't gone away completely. (We're at 14 years and counting since TLS 1.2 was initially released, and we now have TLS 1.3, which came out in 2018!)
As we touched on earlier, NIST is working on finalizing the selection of the final algorithms that will become standardized. Once the final PQC algorithms are selected, the next move will be to publish PQC standards as Federal Information Processing Standards (FIPS) and move on to implementations and deployments. Once this occurs, the Cryptographic Algorithm Validation Program (CAVP) will provide certifications for approved implementations of these approved PQC algorithms.
We bring this all up now because we're drawing closer to a future in which quantum computers are anticipated to become mainstream. It won't happen today, tomorrow, or likely even five years from now. But when it does, organizations will need to be able to support and use the quantum resistant encryption algorithms necessary to help keep data secure in this super-powered computer processing world to come. And things are changing now to prepare for that inevitable future.
On Jan. 19, 2022, the White House released a memorandum specifying that agencies have 180 days to identify any instances of encryption not in compliance with NSA-approved Quantum-Resistant Algorithms or CNSA [] and must report the following to the National Manager:
What does all of this mean at the level of your organization or company? In reality, not much right now for everyday businesses. But let's be realistic here: it's virtually impossible to be compliant with rules that haven't yet been implemented. It's kind of like playing a new sport (say, soccer) when you don't yet know the rules or how to play. Sure, you can go through the motions and move the ball down the field. But if you don't know how you're supposed to do it or which goal to aim for, there's no telling whether you're doing it right or moving in the right direction.
The National Institute of Standards and Technology (NIST) was anticipating the release of its PQC Round 3 Report by the end of March or early April 2022. (There's also been talk of announcing a fourth round of study.) Now, in all fairness, we've only just started the month of April. But considering that agencies are expected to be compliant with quantum-resistant algorithms by basically July 2022, and the algorithms themselves haven't officially been decided upon, well, that sure makes things a lot more difficult for organizations that have to be compliant.
However, once NIST decides which algorithm(s) will become the standard, it's up to businesses and organizations to ensure that they're not using or relying upon any algorithms that may have been deprecated. The standards body is expected to have draft PQC standards available for public comment before the end of 2023 and aims to have a finalized standard ready the following year.
You'll find that many experts sit in one of two camps when it comes to the topic of quantum computing and quantum resistant cryptography. On one end of the spectrum, the first camp (aptly named Panicville in the illustration above) essentially operates under the assumption that the end is near, and that cybersecurity as we know it is about to come crashing down around us at any moment.
The second camp, which we've named Chillville in the above graphic, takes a very different approach. The perspective here is typically that quantum computing is still a long way off, that it's too impractical for real-world applications, or that it's something we likely won't have to deal with for years to come, so there's no point in worrying about it now.
Needless to say, neither of these approaches is particularly healthy or beneficial to the security of your organization and its data. Thankfully, though, other experts tend to fall somewhere in the middle; let's call it Preparationville. The prevailing mindset of experts who sit within this space between the two main camps is that:
Here at Hashed Out, we definitely fall more in the middle of the spectrum; we're not panicking about the changes to come but are strongly encouraging customers to start preparing now to the best of their abilities. The NSA shares on its Post-Quantum Cybersecurity Resources site that it doesn't know when, or even if, a system capable of cracking public key encryption will make its debut. However, it does make clear that preparing for an eventual transition to post-quantum cryptographic standards is a must for data security in the future.
Better to be safe than sorry, right?
Great. So, you're being told to prepare, but it's hard to prepare for something when you don't really know what tools you'll have at your disposal. It's like preparing for a disaster as a homeowner: you might not know when something bad will happen, but you're going to take steps to mitigate potential impacts as much as possible.
The same concept applies to preparing for quantum cryptography. While you may not know which algorithms specifically will be standardized, or exactly when quantum resistant cryptography will need to be implemented, you know it's likely going to happen and that you should take steps now to prepare for it.
We get it: there's definitely a strong case of "you don't know what you don't know" going on here. However, you can stay ahead of the curve as much as possible by taking the time to research and plan your strategy now. Part of this planning should include:
We can't overstate the importance of this task, as it's something you should already be doing anyhow. Auditing your organization's cryptographic systems, IT infrastructure and applications is crucial for a multitude of reasons. Furthermore, it can aid you in developing your PQC plan and deciding what gets upgraded and when.
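As a hedged starting point for that audit, the sketch below uses the third-party Python `cryptography` package to report the public key algorithm and size of each certificate in a directory; the directory path and file layout are placeholder assumptions, not a prescribed setup:

# Starting point for a cryptographic inventory: report the public key
# algorithm and size of each certificate, so quantum-vulnerable RSA/EC
# keys can be tracked for later migration. Paths are placeholders.
from pathlib import Path
from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import rsa, ec

def describe(pem_path):
    cert = x509.load_pem_x509_certificate(pem_path.read_bytes())
    key = cert.public_key()
    if isinstance(key, rsa.RSAPublicKey):
        algo = f"RSA-{key.key_size}"       # breakable by Shor's algorithm
    elif isinstance(key, ec.EllipticCurvePublicKey):
        algo = f"EC-{key.curve.name}"      # also breakable by Shor's algorithm
    else:
        algo = type(key).__name__
    return f"{pem_path.name}: {algo}, expires {cert.not_valid_after}"

for pem in Path("/etc/ssl/local-inventory").glob("*.pem"):  # placeholder dir
    print(describe(pem))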
If your organization is running on older servers and other related infrastructure, you'll likely need to upgrade before quantum cryptography makes its debut. One option to consider is servers with redundant distributed databases that use PQC digital signature algorithms and are connected via quantum key distribution (QKD) links. (QKD is a concept that's been around since the 80s and involves using quantum mechanics to distribute keys between communicating parties in traditional symmetric-algorithm-protected connections.) The idea is that this may help protect against quantum attacks and aid in recovery from successful attacks.
What about hardware security modules? Is your organization using one in-house? Is it relying on a third-party system? Ensure that whatever HSM you're using has a roadmap to support quantum safe encryption.
We understand your hesitation and dread: updating your existing infrastructure is a massive undertaking. It involves major investments of money, time, and personnel-related resources. But this is why it's crucial to start planning for and implementing these upgrades now. If you roll out the upgrades to your systems over time, you won't have to blow your entire capital budget in a single year or two, or risk rushing the implementation (which can lead to mistakes) because you decided to wait until crap hits the fan.
Essentially, you're carefully preparing for the impending storm ahead of time (as much as you can). This way, your organization will be less likely to get caught in the downpour others will get swept away in.
The NSA also offers the Commercial National Security Algorithm Suite (CNSA Suite), which is a set of algorithms that the Committee on National Security Systems Policy 15 (CNSSP-15) has identified for protecting classified information (listed in alphabetical order):
Broken cryptosystems are the ugly companion of all the advancements that quantum computing has to offer. This is why major certificate authorities like DigiCert and Sectigo are working now to help prepare for a PQC world on their ends by creating PQC certificate authorities (CAs) and certificates.
DigiCert, which plays a key role in multiple PQC projects, offers a PQC Toolkit to Secure Site Pro customers. This toolkit offers hybrid RSA/PQC certificates, which pair PQC algorithms with classical ones. The goal here is for these certificates to work on both legacy systems (to offer backwards compatibility) and quantum systems once quantum computers finally roll out.
DigiCert estimates that it would take a traditional computer a few quadrillion years to break modern 2048-bit encryption. But considering that we don't know exactly when quantum devices will come charging onto the scene, it's a good idea to start preparing now for when they do. This is why the CA has also created a resource that breaks down the Post Quantum Cryptography Maturity Model. You can use it to figure out how well prepared your organization is (or isn't) for what's to come.
Sectigo's senior vice president of product management, Lindsay Kent, spoke during one of the company's Identity-First Summit 2022 presentations on certificate lifecycle management. Kent said that the certificate authority expects to have quantum safe security in place by 2026. The plan includes providing customers with a Quantum Safe Toolkit that aims to help companies:
The goal for both CAs is to help companies use these certificates to facilitate quantum safe application-based authentication (instead of network-based authentication) and secure communications via TLS sessions. It's also to ensure that organizations can have certificates in place that support both PQC algorithms and the traditional algorithms we have now.
Wait, doesn't offering backwards compatibility mean that users on classical devices will still be connecting via protocols relying on insecure algorithms once quantum computers become mainstream? Yes. But if you want to continue providing services to customers using legacy systems, that's going to continue until they eventually make the change.
An important part of the planning we talked about earlier is taking the time to review and update your organization's existing internal security procedures and related documentation. Among the things you'll want to consider are the quantum resistant secure access controls and authentication measures you'll need to implement. As you've probably guessed, your existing controls won't cut it in a PQC world, so everything will need to be updated to be quantum resistant once NIST publishes its standards.
As we talked about earlier, the widespread use of quantum computing (and, therefore, the deployment of quantum resistant cryptography) is still on the horizon, likely at least a good decade or so away. But that's why now is the time to prepare for PQC and help your business stay ahead of the curve. You don't want to be one of the organizations caught unprepared when quantum computers make their mainstream debut.
More here:
Posted in Quantum Computing
Comments Off on A Look at Quantum Resistant Encryption & Why It’s Critical to Future Cybersecurity – Hashed Out by The SSL Store