The Prometheus League
Breaking News and Updates
Category Archives: Quantum Computing
The keys to QML patent success – IAM
Posted: June 30, 2022 at 9:05 pm
In this co-published article, Haseltine Lake Kempner's Laura Compton takes a practical look at how to formulate claims and draft applications for quantum machine learning inventions in view of the EPO's patent eligibility requirements.
In quantum machine learning (QML), classical machine learning algorithms, or expensive subroutines of them, are typically adapted to run on a quantum computing device. QML utilises quantum resources to improve the execution time and/or the performance of classical machine learning algorithms.
Aspects of QML that may be patentable include the utilisation of a quantum computing device to execute more efficiently all or part of a classical machine learning algorithm (for example, using a quantum computer to calculate classical distances more efficiently for nearest neighbour, kernel and clustering methods), or to execute a model itself (for example, reformulating a stochastic model as a quantum system). Other related aspects include the reformulation of an optimisation problem such that it may be solved using a quantum computing device.
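One of the subroutines mentioned above, estimating how close two encoded data points are, is commonly described via the swap test, which measures the squared overlap |&lt;a|b&gt;|² between two states. The following NumPy statevector sketch is our own illustration, simplified to single-qubit encodings, and is not drawn from the article:

```python
import numpy as np

def swap_test_p0(a, b):
    """Probability of reading the ancilla as 0 after a swap test on
    single-qubit states |a> and |b>.  Theory: 1/2 + |<a|b>|^2 / 2."""
    a = np.asarray(a, dtype=complex); a = a / np.linalg.norm(a)
    b = np.asarray(b, dtype=complex); b = b / np.linalg.norm(b)
    # Register layout: ancilla (qubit 0) x |a> (qubit 1) x |b> (qubit 2)
    state = np.kron(np.array([1.0, 0.0]), np.kron(a, b))

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    I4 = np.eye(4)
    H_anc = np.kron(H, I4)                    # Hadamard on the ancilla only

    # Controlled-SWAP: exchange qubits 1 and 2 when the ancilla is |1>
    SWAP = np.eye(4)[[0, 2, 1, 3]]
    CSWAP = np.block([[I4, np.zeros((4, 4))],
                      [np.zeros((4, 4)), SWAP]])

    state = H_anc @ CSWAP @ H_anc @ state
    # Amplitudes 0..3 are the ancilla = 0 half of the register
    return float(np.sum(np.abs(state[:4]) ** 2))
```

On a quantum device this probability would be estimated from repeated measurement shots; in this classical simulation we read it off the statevector directly. Identical states give 1.0, orthogonal states 0.5.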
Another aspect of QML that may be patentable includes improvements to existing QML algorithms or models (for example, an improvement that reduces the depth of the quantum circuit required to execute the algorithm or model, and/or uses gates that are less complex, and/or avoids repetition of certain subroutines of the algorithm). Some improvements may be specific to the problem being solved itself (for example, modifying the operations applied to a quantum computing device such that a more limited space of potential solutions to an optimisation problem is then searched over by the device).
Inventions relating to these aspects will be considered patentable subject matter at the EPO when the quantum computing device is an integral part of the invention.
For such inventions, the independent claims are likely to make some reference to the quantum computing device and the manner in which the algorithm has been adapted to be implemented on it. The dependent claims, if not the independent claim itself, should:
In view of the EPO's technicality requirements, it is recommended to include a dependent claim that specifies how the output of the quantum computing device, and/or the output of the machine learning model, is then used in some technical process.
Where the invention relates to more general QML methods, or improvements to such methods (which could be applied to a wide range of problems across a wide range of fields), it is also recommended to provide, in the dependent claims or the description, a number of different use cases that demonstrate how the invention can be applied to different practical problems.
Quantum computing generally, as well as QML, is a rapidly evolving and complex field. As such, drafting applications which meet the EPO's sufficiency and clarity requirements can be a challenge. Therefore, when drafting patent specifications, it is best practice to include a full mathematical description of the quantum implementation of the algorithm or model, alongside how each operation being applied to the qubits relates to the algorithm or model being implemented (for example, describing how a series of operations applied to the qubits are representative of an objective function that is to be minimised).
For inventions which relate to improving existing QML algorithms or models, a detailed description of how the changes to the quantum circuit enable the improvement to be realised should be included. As with any rapidly evolving field where there is a lack of universally accepted terminology, for applications relating to quantum computing generally, the terms used in the claims of the application should be defined in the description.
Finally, experimental data can be particularly useful for demonstrating an improvement in speed or accuracy over the prior art, and for supporting inventive step arguments in later prosecution. It is also worth considering setting up the technical problem the invention solves in terms of why classical processes suffer from disadvantages that make them commercially or technically non-viable (for example, being too slow for real-time deployment).
To summarise, the points above can assist in drafting QML applications suitable for submission to the EPO and give the applicant the best possible chance of obtaining a commercially useful patent.
Laura Compton is a patent attorney in the Bristol offices of Haseltine Lake Kempner
Previous articles by Haseltine Lake Kempner authors in this series can be accessed here:
How to secure AI patents in Europe
Drafting AI patent applications for success at the EPO – eligibility and claim formulation
Drafting AI patent applications for success at the EPO – drafting the full specification
Technology trends – why patent your hidden AI?
Google and Samsung top the list of applicants for AI-related patents at the EPO
The EPO and UKIPO approaches to AI and patentable subject matter
How revised EPO guidelines affect treatment of AI inventions
Monetising data, machine learning's most valuable asset
Posted in Quantum Computing
Comments Off on The keys to QML patent success – IAM
EU fears falling behind in race to control key technologies – Science Business
Posted: at 9:05 pm
In a future-gazing report, the European Commission has warned that control over technology is an increasingly crucial geopolitical battleground, and that the EU is losing the investment race in quantum computing, 5G, artificial intelligence and biotechnology.
The communication, released on 29 June, concludes that "the EU's currently limited capacity in some horizontal technologies weakens its position."
It cites figures from the consultancy McKinsey showing an investment gap with the US, and in many cases China.
"Half of all quantum computing companies are in the US, 40% are in China, and none are in the EU," warns the Commission's report, which focuses on how Europe will steer through a digital and environmental transformation of its society and economy.
In artificial intelligence, the US attracts 40% of investment funding. Asia, including China, has a 32% share, but Europe lags with just 12%.
On 5G, the next generation telecoms network, China attracts 60% of investment, far ahead of Europe's 11% share. And US investments into biotechnology dwarf those made in Europe.
Unsurprisingly, the report urges big increases in spending on R&D.
"The EU will need to leverage additional private and public long-term investments in [...] R&I across critical technologies and sectors, uptake and synergies between technologies, human capital, and infrastructures," it says.
It doesn't specifically mention research funding programmes as a solution, but instead suggests deepening EU banking and capital market integration to allow more private investment.
There are also indications of which specific technologies the Commission sees as key to greening the economy. Most notably, the report mentions nuclear small modular reactors as an important part of sustainably offsetting the increasing power demands of the digital sector.
These are mini nuclear reactors that can be manufactured in a factory rather than assembled onsite. So far they are unproven, but a US-based company has plans to build one in Romania, with US government financial backing.
The Commission also moots the idea of electric aircraft connecting small regional airports throughout the EU.
Digital tech can help make Europe greener, through smarter control over power grids and transport systems, the report says. But the digital sector is also expected to be an increasingly hungry consumer of energy, powering everything from consumer computers to data centres and minting cryptocurrencies. ICT is thought to be responsible for 5-9% of global electricity use.
Reflecting a shift in Brussels towards the prioritisation of scientific links with fellow democracies, the Commission's report recommends "a proactive research and innovation agenda with like-minded partners."
Quantum Error Correction: Time to Make It Work – IEEE Spectrum
Posted: June 29, 2022 at 12:27 am
Dates chiseled into an ancient tombstone have more in common with the data in your phone or laptop than you may realize. They both involve conventional, classical information, carried by hardware that is relatively immune to errors. The situation inside a quantum computer is far different: The information itself has its own idiosyncratic properties, and compared with standard digital microelectronics, state-of-the-art quantum-computer hardware is more than a billion trillion times as likely to suffer a fault. This tremendous susceptibility to errors is the single biggest problem holding back quantum computing from realizing its great promise.
Fortunately, an approach known as quantum error correction (QEC) can remedy this problem, at least in principle. A mature body of theory built up over the past quarter century now provides a solid theoretical foundation, and experimentalists have demonstrated dozens of proof-of-principle examples of QEC. But these experiments still have not reached the level of quality and sophistication needed to reduce the overall error rate in a system.
The two of us, along with many other researchers involved in quantum computing, are trying to move definitively beyond these preliminary demos of QEC so that it can be employed to build useful, large-scale quantum computers. But before describing how we think such error correction can be made practical, we need to first review what makes a quantum computer tick.
"Information is physical." This was the mantra of the distinguished IBM researcher Rolf Landauer. Abstract though it may seem, information always involves a physical representation, and the physics matters.
Conventional digital information consists of bits, zeros and ones, which can be represented by classical states of matter, that is, states well described by classical physics. Quantum information, by contrast, involves qubits (quantum bits), whose properties follow the peculiar rules of quantum mechanics.
A classical bit has only two possible values: 0 or 1. A qubit, however, can occupy a superposition of these two information states, taking on characteristics of both. Polarized light provides intuitive examples of superpositions. You could use horizontally polarized light to represent 0 and vertically polarized light to represent 1, but light can also be polarized on an angle and then has both horizontal and vertical components at once. Indeed, one way to represent a qubit is by the polarization of a single photon of light.
These ideas generalize to groups of n bits or qubits: n bits can represent any one of 2^n possible values at any moment, while n qubits can include components corresponding to all 2^n classical states simultaneously in superposition. These superpositions provide a vast range of possible states for a quantum computer to work with, albeit with limitations on how they can be manipulated and accessed. Superposition of information is a central resource used in quantum processing and, along with other quantum rules, enables powerful new ways to compute.
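As a concrete illustration of that counting (our sketch, not from the article), a statevector simulation shows how n qubits carry 2^n amplitudes at once:

```python
import numpy as np

def uniform_superposition(n):
    """Statevector of n qubits after a Hadamard gate on each one:
    an equal superposition over all 2**n classical bitstrings."""
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    ket0 = np.array([1.0, 0.0])
    state, Hn = ket0, H
    for _ in range(n - 1):
        state = np.kron(state, ket0)   # build |00...0>, length 2**n
        Hn = np.kron(Hn, H)            # H applied to every qubit
    return Hn @ state

s = uniform_superposition(3)           # 8 amplitudes, each 1/sqrt(8)
```

Three qubits already hold eight simultaneous components; fifty would hold about 10^15, which is why the statevector description itself becomes classically intractable.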
Researchers are experimenting with many different physical systems to hold and process quantum information, including light, trapped atoms and ions, and solid-state devices based on semiconductors or superconductors. For the purpose of realizing qubits, all these systems follow the same underlying mathematical rules of quantum physics, and all of them are highly sensitive to environmental fluctuations that introduce errors. By contrast, the transistors that handle classical information in modern digital electronics can reliably perform a billion operations per second for decades with a vanishingly small chance of a hardware fault.
Of particular concern is the fact that qubit states can roam over a continuous range of superpositions. Polarized light again provides a good analogy: The angle of linear polarization can take any value from 0 to 180 degrees.
Pictorially, a qubit's state can be thought of as an arrow pointing to a location on the surface of a sphere. Known as a Bloch sphere, its north and south poles represent the binary states 0 and 1, respectively, and all other locations on its surface represent possible quantum superpositions of those two states. Noise causes the Bloch arrow to drift around the sphere over time. A conventional computer represents 0 and 1 with physical quantities, such as capacitor voltages, that can be locked near the correct values to suppress this kind of continuous wandering and unwanted bit flips. There is no comparable way to lock the qubit's arrow to its correct location on the Bloch sphere.
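The arrow picture corresponds to the expectation values of the three Pauli operators. A short sketch (our illustration) mapping a qubit state to its Bloch-sphere coordinates:

```python
import numpy as np

# Pauli operators; the Bloch arrow of |psi> is the vector (<X>, <Y>, <Z>)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def bloch_vector(psi):
    """Map a (not necessarily normalised) qubit state to Bloch coordinates."""
    psi = np.asarray(psi, dtype=complex)
    psi = psi / np.linalg.norm(psi)
    return np.real([psi.conj() @ P @ psi for P in (X, Y, Z)])

north = bloch_vector([1, 0])      # |0>  -> (0, 0, 1), the north pole
plus = bloch_vector([1, 1])       # |+>  -> (1, 0, 0), on the equator
```

Noise perturbing the state moves this vector continuously over the sphere's surface, which is exactly the drift the article describes.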
Early in the 1990s, Landauer and others argued that this difficulty presented a fundamental obstacle to building useful quantum computers. The issue is known as scalability: Although a simple quantum processor performing a few operations on a handful of qubits might be possible, could you scale up the technology to systems that could run lengthy computations on large arrays of qubits? A type of classical computation called analog computing also uses continuous quantities and is suitable for some tasks, but the problem of continuous errors prevents the complexity of such systems from being scaled up. Continuous errors with qubits seemed to doom quantum computers to the same fate.
We now know better. Theoreticians have successfully adapted the theory of error correction for classical digital data to quantum settings. QEC makes scalable quantum processing possible in a way that is impossible for analog computers. To get a sense of how it works, it's worthwhile to review how error correction is performed in classical settings.
Simple schemes can deal with errors in classical information. For instance, in the 19th century, ships routinely carried clocks for determining the ship's longitude during voyages. A good clock that could keep track of the time in Greenwich, in combination with the sun's position in the sky, provided the necessary data. A mistimed clock could lead to dangerous navigational errors, though, so ships often carried at least three of them. Two clocks reading different times could detect when one was at fault, but three were needed to identify which timepiece was faulty and correct it through a majority vote.
The use of multiple clocks is an example of a repetition code: Information is redundantly encoded in multiple physical devices such that a disturbance in one can be identified and corrected.
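The three-clock scheme can be sketched in a few lines (our illustration):

```python
from collections import Counter

def majority_repair(readings):
    """Three-clock repetition scheme: a single faulty reading is outvoted
    and 'corrected' back to the consensus value."""
    consensus, votes = Counter(readings).most_common(1)[0]
    if votes < 2:
        raise ValueError("no majority - more than one clock may be faulty")
    return consensus, [consensus] * len(readings)

time, repaired = majority_repair(["12:04", "12:04", "11:57"])
```

Note the failure mode: if two of the three clocks drift, the vote confidently picks the wrong time, which foreshadows the threshold discussion later in the article.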
As you might expect, quantum mechanics adds some major complications when dealing with errors. Two problems in particular might seem to dash any hopes of using a quantum repetition code. The first problem is that measurements fundamentally disturb quantum systems. So if you encoded information on three qubits, for instance, observing them directly to check for errors would ruin them. Like Schrödinger's cat when its box is opened, their quantum states would be irrevocably changed, spoiling the very quantum features your computer was intended to exploit.
The second issue is a fundamental result in quantum mechanics called the no-cloning theorem, which tells us it is impossible to make a perfect copy of an unknown quantum state. If you know the exact superposition state of your qubit, there is no problem producing any number of other qubits in the same state. But once a computation is running and you no longer know what state a qubit has evolved to, you cannot manufacture faithful copies of that qubit except by duplicating the entire process up to that point.
Fortunately, you can sidestep both of these obstacles. We'll first describe how to evade the measurement problem using the example of a classical three-bit repetition code. You don't actually need to know the state of every individual code bit to identify which one, if any, has flipped. Instead, you ask two questions: "Are bits 1 and 2 the same?" and "Are bits 2 and 3 the same?" These are called parity-check questions because two identical bits are said to have even parity, and two unequal bits have odd parity.
The two answers to those questions identify which single bit has flipped, and you can then counterflip that bit to correct the error. You can do all this without ever determining what value each code bit holds. A similar strategy works to correct errors in a quantum system.
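In code, the classical version of that strategy looks like this (our sketch); the correction decision depends only on the two parity answers, never on any individual bit's value on its own:

```python
def decode_repetition(bits):
    """Correct a single flip in a 3-bit repetition code.  The two parity
    answers form the 'syndrome', which uniquely identifies the flipped bit."""
    s1 = bits[0] ^ bits[1]          # are bits 1 and 2 the same?  (0 = yes)
    s2 = bits[1] ^ bits[2]          # are bits 2 and 3 the same?
    flip = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}[(s1, s2)]
    corrected = list(bits)
    if flip is not None:
        corrected[flip] ^= 1        # counterflip the identified bit
    return corrected, (s1, s2)
```

The four possible syndromes map one-to-one onto "no error" plus the three single-bit flips, which is why two yes/no questions suffice.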
Learning the values of the parity checks still requires quantum measurement, but importantly, it does not reveal the underlying quantum information. Additional qubits can be used as disposable resources to obtain the parity values without revealing (and thus without disturbing) the encoded information itself.
What about no-cloning? It turns out it is possible to take a qubit whose state is unknown and encode that hidden state in a superposition across multiple qubits in a way that does not clone the original information. This process allows you to record what amounts to a single logical qubit of information across three physical qubits, and you can perform parity checks and corrective steps to protect the logical qubit against noise.
Quantum errors consist of more than just bit-flip errors, though, making this simple three-qubit repetition code unsuitable for protecting against all possible quantum errors. True QEC requires something more. That came in the mid-1990s when Peter Shor (then at AT&T Bell Laboratories, in Murray Hill, N.J.) described an elegant scheme to encode one logical qubit into nine physical qubits by embedding a repetition code inside another code. Shors scheme protects against an arbitrary quantum error on any one of the physical qubits.
Since then, the QEC community has developed many improved encoding schemes, which use fewer physical qubits per logical qubit (the most compact use five) or enjoy other performance enhancements. Today, the workhorse of large-scale proposals for error correction in quantum computers is called the surface code, developed in the late 1990s by borrowing exotic mathematics from topology and high-energy physics.
It is convenient to think of a quantum computer as being made up of logical qubits and logical gates that sit atop an underlying foundation of physical devices. These physical devices are subject to noise, which creates physical errors that accumulate over time. Periodically, generalized parity measurements (called syndrome measurements) identify the physical errors, and corrections remove them before they cause damage at the logical level.
A quantum computation with QEC then consists of cycles of gates acting on qubits, syndrome measurements, error inference, and corrections. In terms more familiar to engineers, QEC is a form of feedback stabilization that uses indirect measurements to gain just the information needed to correct errors.
QEC is not foolproof, of course. The three-bit repetition code, for example, fails if more than one bit has been flipped. What's more, the resources and mechanisms that create the encoded quantum states and perform the syndrome measurements are themselves prone to errors. How, then, can a quantum computer perform QEC when all these processes are themselves faulty?
Remarkably, the error-correction cycle can be designed to tolerate errors and faults that occur at every stage, whether in the physical qubits, the physical gates, or even in the very measurements used to infer the existence of errors! Called a fault-tolerant architecture, such a design permits, in principle, error-robust quantum processing even when all the component parts are unreliable.
A long quantum computation will require many cycles of quantum error correction (QEC). Each cycle would consist of gates acting on encoded qubits (performing the computation), followed by syndrome measurements from which errors can be inferred, and corrections. The effectiveness of this QEC feedback loop can be greatly enhanced by including quantum-control techniques to stabilize and optimize each of these processes.
Even in a fault-tolerant architecture, the additional complexity introduces new avenues for failure. The effect of errors is therefore reduced at the logical level only if the underlying physical error rate is not too high. The maximum physical error rate that a specific fault-tolerant architecture can reliably handle is known as its break-even error threshold. If error rates are lower than this threshold, the QEC process tends to suppress errors over the entire cycle. But if error rates exceed the threshold, the added machinery just makes things worse overall.
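The threshold behaviour can be made concrete with the three-bit repetition code (a toy model of ours; real fault-tolerant schemes such as the surface code have far lower thresholds and richer failure modes). Majority-vote decoding fails only when two or more of the bits flip:

```python
def logical_error_rate(p):
    """Failure probability of majority-vote decoding for the 3-bit
    repetition code: two or three independent flips of probability p."""
    return 3 * p**2 * (1 - p) + p**3

# Below this toy code's threshold (p = 0.5) the encoding suppresses errors;
# above it, the extra redundancy makes things strictly worse.
assert logical_error_rate(0.01) < 0.01     # encoding helps
assert logical_error_rate(0.6) > 0.6       # encoding hurts
```

The quadratic leading term is the key point: well below threshold, each step down in physical error rate buys a much larger step down in logical error rate, which is why beating the threshold by a wide margin matters so much.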
The theory of fault-tolerant QEC is foundational to every effort to build useful quantum computers because it paves the way to building systems of any size. If QEC is implemented effectively on hardware exceeding certain performance requirements, the effect of errors can be reduced to arbitrarily low levels, enabling the execution of arbitrarily long computations.
At this point, you may be wondering how QEC has evaded the problem of continuous errors, which is fatal for scaling up analog computers. The answer lies in the nature of quantum measurements.
In a typical quantum measurement of a superposition, only a few discrete outcomes are possible, and the physical state changes to match the result that the measurement finds. With the parity-check measurements, this change helps.
Imagine you have a code block of three physical qubits, and one of these qubit states has wandered a little from its ideal state. If you perform a parity measurement, just two results are possible: Most often, the measurement will report the parity state that corresponds to no error, and after the measurement, all three qubits will be in the correct state, whatever it is. Occasionally the measurement will instead indicate the odd parity state, which means an errant qubit is now fully flipped. If so, you can flip that qubit back to restore the desired encoded logical state.
In other words, performing QEC transforms small, continuous errors into infrequent but discrete errors, similar to the errors that arise in digital computers.
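A minimal simulation of that discretization (our illustration): encode |0_L&gt; = |00&gt;, let one qubit drift by a small rotation, and measure the ZZ parity check. The outcome is always one of two discrete events:

```python
import numpy as np

def parity_outcome(theta, rng):
    """Encode |0_L> = |00>, let qubit 0 drift by a small X-rotation of
    angle theta, then measure the ZZ parity check.  The continuous drift
    collapses to one of two discrete events."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    state = np.array([c, 0.0, -1j * s, 0.0])   # amplitudes on |00>,|01>,|10>,|11>
    p_even = abs(state[0])**2 + abs(state[3])**2
    if rng.random() < p_even:
        return "no error"      # projected exactly back onto |00>
    return "full flip"         # projected onto |10>: a discrete, correctable flip

rng = np.random.default_rng(0)
outcomes = [parity_outcome(0.2, rng) for _ in range(10_000)]
# sin^2(0.1) ~ 1%: the flip is rare, but it is never a partial error.
```

However small the drift angle, the post-measurement state is either exactly correct or a full flip that the decoder knows how to undo.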
Researchers have now demonstrated many of the principles of QEC in the laboratory, from the basics of the repetition code through to complex encodings, logical operations on code words, and repeated cycles of measurement and correction. Current estimates of the break-even threshold for quantum hardware place it at about 1 error in 1,000 operations. This level of performance hasn't yet been achieved across all the constituent parts of a QEC scheme, but researchers are getting ever closer, achieving multiqubit logic with rates of fewer than about 5 errors per 1,000 operations. Even so, passing that critical milestone will be the beginning of the story, not the end.
On a system with a physical error rate just below the threshold, QEC would require enormous redundancy to push the logical rate down very far. It becomes much less challenging with a physical rate further below the threshold. So just crossing the error threshold is not sufficient; we need to beat it by a wide margin. How can that be done?
If we take a step back, we can see that the challenge of dealing with errors in quantum computers is one of stabilizing a dynamic system against external disturbances. Although the mathematical rules differ for the quantum system, this is a familiar problem in the discipline of control engineering. And just as control theory can help engineers build robots capable of righting themselves when they stumble, quantum-control engineering can suggest the best ways to implement abstract QEC codes on real physical hardware. Quantum control can minimize the effects of noise and make QEC practical.
In essence, quantum control involves optimizing how you implement all the physical processes used in QEC, from individual logic operations to the way measurements are performed. For example, in a system based on superconducting qubits, a qubit is flipped by irradiating it with a microwave pulse. One approach uses a simple type of pulse to move the qubit's state from one pole of the Bloch sphere, along the Greenwich meridian, to precisely the other pole. Errors arise if the pulse is distorted by noise. It turns out that a more complicated pulse, one that takes the qubit on a well-chosen meandering route from pole to pole, can result in less error in the qubit's final state under the same noise conditions, even when the new pulse is imperfectly implemented.
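A classic example of such a "meandering route" is the BB1 composite pulse from the quantum-control literature (our illustration; the article does not specify which pulse family its authors use). The sketch below compares a plain pi-pulse and BB1 under a 5 percent amplitude miscalibration:

```python
import numpy as np

def rot(alpha, phi):
    """Single-qubit rotation by angle alpha about the Bloch-sphere axis
    (cos phi, sin phi, 0) - i.e. a resonant pulse with phase phi."""
    c, s = np.cos(alpha / 2), np.sin(alpha / 2)
    return np.array([[c, -1j * s * np.exp(-1j * phi)],
                     [-1j * s * np.exp(1j * phi), c]])

def infidelity(U, theta=np.pi):
    """1 minus the overlap with the ideal theta-pulse applied to |0>."""
    ideal = rot(theta, 0.0) @ np.array([1.0, 0.0])
    actual = U @ np.array([1.0, 0.0])
    return 1.0 - abs(ideal.conj() @ actual) ** 2

def bb1(theta, eps):
    """Wimperis BB1: the plain pulse plus pi, 2pi, pi correction pulses
    whose phases are chosen to cancel amplitude (over-rotation) error."""
    phi = np.arccos(-theta / (4 * np.pi))
    U = np.eye(2)
    for alpha, ph in [(theta, 0.0), (np.pi, phi), (2 * np.pi, 3 * phi), (np.pi, phi)]:
        U = rot((1 + eps) * alpha, ph) @ U    # every pulse over-rotates by eps
    return U

eps = 0.05                                    # 5% amplitude miscalibration
simple = infidelity(rot((1 + eps) * np.pi, 0.0))
robust = infidelity(bb1(np.pi, eps))
# The composite pulse is orders of magnitude less sensitive to the same noise.
```

The longer BB1 sequence executes exactly the intended rotation when the amplitude is perfect, yet its final-state error under miscalibration is drastically smaller than the plain pulse's, which is the essence of the "well-chosen meandering route."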
One facet of quantum-control engineering involves careful analysis and design of the best pulses for such tasks in a particular imperfect instance of a given system. It is a form of open-loop (measurement-free) control, which complements the closed-loop feedback control used in QEC.
This kind of open-loop control can also change the statistics of the physical-layer errors to better comport with the assumptions of QEC. For example, QEC performance is limited by the worst-case error within a logical block, and individual devices can vary a lot. Reducing that variability is very beneficial. In an experiment our team performed using IBM's publicly accessible machines, we showed that careful pulse optimization reduced the difference between the best-case and worst-case error in a small group of qubits by more than a factor of 10.
Some error processes arise only while carrying out complex algorithms. For instance, crosstalk errors occur on qubits only when their neighbors are being manipulated. Our team has shown that embedding quantum-control techniques into an algorithm can improve its overall success by orders of magnitude. This technique makes QEC protocols much more likely to correctly identify an error in a physical qubit.
For 25 years, QEC researchers have largely focused on mathematical strategies for encoding qubits and efficiently detecting errors in the encoded sets. Only recently have investigators begun to address the thorny question of how best to implement the full QEC feedback loop in real hardware. And while many areas of QEC technology are ripe for improvement, there is also growing awareness in the community that radical new approaches might be possible by marrying QEC and control theory. One way or another, this approach will turn quantum computing into a reality, and you can carve that in stone.
This article appears in the July 2022 print issue as "Quantum Error Correction at the Threshold."
QC Ware Announces Q2B22 Tokyo To Be Held July 13-14 – HPCwire
Posted: at 12:27 am
PALO ALTO, Calif., June 28, 2022 – QC Ware, a leading quantum software and services company, today announced the inaugural Q2B22 Tokyo Practical Quantum Computing, to be held exclusively in person at The Westin Tokyo in Japan on July 13-14, 2022. Q2B is the world's largest gathering of the quantum computing community, focusing solely on quantum computing applications and driving the discourse on quantum advantage and commercialization. Registration and other information on Q2B22 Tokyo is available at http://q2b.jp.
Q2B22 Tokyo will feature top academics, industry end users, government representatives, and quantum computing vendors from around the world.
"Japan has led the way with ground-breaking research on quantum computing," said Matt Johnson, CEO of QC Ware. "In addition, the ecosystem includes some of Japan's largest enterprises, forward-thinking government organizations, and a thriving venture-backed startup community. I'm excited to be able to connect the Japanese and international quantum computing ecosystems at this unique event."
QC Ware has been operating in Japan since 2019 and recently opened up an office in Tokyo.
Q2B22 Tokyo will be co-hosted by QunaSys, a leading Japanese developer of innovative algorithms focused on accelerating the applicability of quantum technology in chemistry, and will be sponsored by IBM Quantum.
"Japan's technology ecosystem is actively advancing quantum computing. QunaSys is a key player in boosting technology adoption, driving business, government, and academia collaboration to enable the quantum chemistry ecosystem. We are pleased to work with QC Ware and co-host Q2B22 Tokyo, bringing Q2B to Japan," said Tennin Yan, CEO of QunaSys.
"IBM Quantum has strategically invested in Japan to accelerate an ecosystem of world-class academic, private sector and government partners, including installation of the IBM Quantum System One at the University of Tokyo, and the co-development of the Quantum Innovation Initiative Consortium (QIIC)," said Aparna Prabhakar, Vice President, Partners and Alliances, IBM Quantum. "We are excited to work with QC Ware and QunaSys to bring experts from a wide variety of quantum computing fields to Q2B22 Tokyo."
Q2B22 Tokyo will feature keynotes from top academics such as:
Other keynotes include:
Japanese and international end-users discussing active quantum initiatives, such as: Automotive:
Materials and Chemistry:
Finance and more:
In addition to IBM Quantum, Q2B22 Tokyo is sponsored by D-Wave Systems, Keysight Technologies, NVIDIA, Quantinuum Ltd., Quantum Machines, and Strangeworks, Inc. Other sponsors include:
Q2B has been run by QC Ware since 2017, with the annual flagship event held in Northern California's Silicon Valley. Q2B Silicon Valley is currently scheduled for December 6-8 at the Santa Clara Convention Center.
About QC Ware
QC Ware is a quantum software and services company focused on ensuring enterprises are prepared for the emerging quantum computing disruption. QC Ware specializes in the development of applications for near-term quantum computing hardware with a team composed of some of the industry's foremost experts in quantum computing. Its growing network of customers includes AFRL, Aisin Group, Airbus, BMW Group, Covestro, Equinor, Goldman Sachs, Itau Unibanco, and Total. QC Ware Forge, the company's flagship quantum computing cloud service, is built for data scientists with no quantum computing background. It provides unique, performant, turnkey quantum computing algorithms. QC Ware is headquartered in Palo Alto, California, and supports its European customers through its subsidiary in Paris and its Asian customers from a Tokyo office. QC Ware also organizes Q2B, the largest annual gathering of the international quantum computing community.
Source: QC Ware
Go here to see the original:
QC Ware Announces Q2B22 Tokyo To Be Held July 13-14 - HPCwire
Quantum computing will revolutionize every large industry – CTech
Posted: at 12:27 am
Israel's Team8 venture group officially opened this year's Cyber Week with an event that took place in Tel Aviv on Sunday. The event, which included international guests and cybersecurity professionals, showcased the country and its industry as a cybersecurity powerhouse befitting the "Startup Nation" reputation.
Opening remarks were made by Niv Sultan, star of Apple TV's Tehran, who also moderated the event. She then welcomed Gili Drob-Heinstein, Executive Director at the Blavatnik Interdisciplinary Cyber Research Center (ICRC) at Tel Aviv University, and Nadav Zafrir, Co-founder of Team8 and Managing Partner of Team8 Platform, to the stage.
"I would like to thank the 100 CSOs who came to stay with us," Zafrir said on stage. Guests from around the world had flown into Israel and spent time connecting with one another ahead of the official start of Cyber Week on Monday. Team8 was also celebrating its 8th year as a VC, highlighting the work it has done in the cybersecurity arena.
The stage was then filled with Admiral Mike Rogers and Nir Minerbi, Co-founder and CEO of Classiq, who together discussed "The Quantum Opportunity" in computing. "Classical computers are great, but for some of the most complex challenges humanity is facing, they are not suitable," said Minerbi. "Quantum computing will revolutionize every large industry."
Classiq develops software for quantum algorithms. Founded in 2020, it has raised a total of $51 million and is funded by Team8 among other VC players in the space. Admiral Mike Rogers is the former director of the NSA, the American intelligence agency, and is an Operating Partner at Team8.
"We are in a race," Rogers told the large crowd. "This is a technology believed to have advantages for our daily lives and national security. I told both presidents I worked under why they should invest billions into quantum," he said, citing the ability to examine multiple qubits simultaneously and thus speed up the processing of information. According to Rogers, governments have already publicly announced $29 billion of funding to help develop quantum computing.
Final remarks were made by Renee Wynn, former CIO at NASA, who discussed the potential of cyber in space. "Space may be the final frontier, and if we do not do anything else than what we are doing now, it will be chaos 100 miles above your head," she warned. On stage, she spoke to the audience about the threats in space and how satellites could be hijacked for nefarious reasons.
"Cybersecurity and satellites are so important," she concluded. "Let's bring the space teams together with the cybersecurity teams and help save lives."
After the remarks, the stage was then transformed to host the evenings entertainment. Israeli-American puppet band Red Band performed a variety of songs and was then joined by Marina Maximilian, an Israeli singer-songwriter and actress, who shared the stage with the colorful puppets.
The event was sponsored by Meitar, Deloitte, LeumiTech, Valley, Palo Alto, FinSec Innovation Lab, and SentinelOne. It marked the beginning of Cyber Week, a three-day conference hosted by Tel Aviv University that will welcome a variety of cybersecurity professionals for workshops, networking opportunities, and panel discussions. It is understood that this year will have 9,000 attendees, 400 speakers, and host people from 80 different countries.
Red Band performing 'Seven Nation Army'.
(Photo: James Spiro)
Original post:
Quantum computing will revolutionize every large industry - CTech
IonQ and GE Research Demonstrate High Potential of Quantum Computing for Risk Aggregation – Business Wire
Posted: at 12:27 am
COLLEGE PARK, Md.--(BUSINESS WIRE)--IonQ (NYSE: IONQ), an industry leader in quantum computing, today announced promising early results with its partner, GE Research, to explore the benefits of quantum computing for modeling multi-variable distributions in risk management.
Leveraging a Quantum Circuit Born Machine-based framework on standardized, historical indexes, IonQ and GE Research, the central innovation hub for the General Electric Company (NYSE: GE), were able to effectively train quantum circuits to learn correlations among three and four indexes. The prediction derived from the quantum framework outperformed those of classical modeling approaches in some cases, confirming that quantum copulas can potentially lead to smarter data-driven analysis and decision-making across commercial applications. A blog post further explaining the research methodology and results is available here.
"Together with GE Research, IonQ is pushing the boundaries of what is currently possible to achieve with quantum computing," said Peter Chapman, CEO and President, IonQ. "While classical techniques face inefficiencies when multiple variables have to be modeled together with high precision, our joint effort has identified a new training strategy that may optimize quantum computing results even as systems scale. Tested on our industry-leading IonQ Aria system, we're excited to apply these new methodologies when tackling real-world scenarios that were once deemed too complex to solve."
While classical techniques that form copulas using mathematical approximations are a great way to build multi-variate risk models, they face limitations when scaling. IonQ and GE Research successfully trained quantum copula models with up to four variables on IonQ's trapped-ion systems, using data from four representative stock indexes whose histories are easily accessible and span varied market environments.
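The classical baseline mentioned here, a copula fitted to historical index returns, can be sketched in a few lines. The following is a minimal Gaussian-copula illustration only: the index data, correlation values, and sample sizes are hypothetical placeholders, since the announcement does not disclose which indexes or model parameters were used.

```python
import numpy as np
from scipy import stats

# Hypothetical daily returns for four indexes (rows = days, cols = indexes).
rng = np.random.default_rng(0)
returns = rng.multivariate_normal(
    mean=[0, 0, 0, 0],
    cov=[[1.0, 0.6, 0.3, 0.2],
         [0.6, 1.0, 0.4, 0.3],
         [0.3, 0.4, 1.0, 0.5],
         [0.2, 0.3, 0.5, 1.0]],
    size=1000)

# 1. Map each marginal to uniforms via ranks, then to standard normals.
u = stats.rankdata(returns, axis=0) / (len(returns) + 1)
z = stats.norm.ppf(u)

# 2. The copula's dependence structure is the correlation of the z scores.
corr = np.corrcoef(z, rowvar=False)

# 3. Sample new joint scenarios that preserve that dependence structure.
samples = stats.norm.cdf(rng.multivariate_normal(np.zeros(4), corr, size=10))
print(samples.shape)  # 10 joint scenarios across the 4 indexes
```

The quantum approach replaces step 3's sampler with a trained quantum circuit (a Quantum Circuit Born Machine); the classical sketch above is only the kind of approximation it is being compared against.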
By studying the historical dependence structure among the returns of the four indexes during this timeframe, the research group trained its model to understand the underlying dynamics. Additionally, the newly presented methodology includes optimization techniques that potentially allow models to scale by mitigating the local minima and vanishing gradient problems common in quantum machine learning practices. Such improvements demonstrate a promising way to perform multi-variable analysis faster and more accurately, which GE researchers hope will lead to new and better ways to assess risk in major manufacturing processes such as product design, factory operations, and supply chain management.
"As we have seen from recent global supply chain volatility, the world needs more effective methods and tools to manage risks where conditions can be so highly variable and interconnected to one another," said David Vernooy, a Senior Executive and Digital Technologies Leader at GE Research. "The early results we achieved in the financial use case with IonQ show the high potential of quantum computing to better understand and reduce the risks associated with these types of highly variable scenarios."
Today's results follow IonQ's recent announcement of the company's new IonQ Forte quantum computing system. The system features novel, cutting-edge optics technology that enables increased accuracy and further enhances IonQ's industry-leading system performance. Partnerships with the likes of GE Research and Hyundai Motors illustrate the growing interest in IonQ's systems and feed into the continued success seen in Q1 2022.
About IonQ
IonQ, Inc. is a leader in quantum computing, with a proven track record of innovation and deployment. IonQ's current generation quantum computer, IonQ Forte, is the latest in a line of cutting-edge systems, including IonQ Aria, a system that boasts industry-leading 20 algorithmic qubits. Along with record performance, IonQ has defined what it believes is the best path forward to scale. IonQ is the only company with its quantum systems available through the cloud on Amazon Braket, Microsoft Azure, and Google Cloud, as well as through direct API access. IonQ was founded in 2015 by Christopher Monroe and Jungsang Kim based on 25 years of pioneering research. To learn more, visit http://www.ionq.com.
IonQ Forward-Looking Statements
This press release contains certain forward-looking statements within the meaning of Section 27A of the Securities Act of 1933, as amended, and Section 21E of the Securities Exchange Act of 1934, as amended. Some of the forward-looking statements can be identified by the use of forward-looking words. Statements that are not historical in nature, including the words "anticipate," "expect," "suggests," "plan," "believe," "intend," "estimates," "targets," "projects," "should," "could," "would," "may," "will," "forecast" and other similar expressions are intended to identify forward-looking statements. These statements include those related to IonQ's ability to further develop and advance its quantum computers and achieve scale; IonQ's ability to optimize quantum computing results even as systems scale; the expected launch of IonQ Forte for access by select developers, partners, and researchers in 2022 with broader customer access expected in 2023; IonQ's market opportunity and anticipated growth; and the commercial benefits to customers of using quantum computing solutions. Forward-looking statements are predictions, projections and other statements about future events that are based on current expectations and assumptions and, as a result, are subject to risks and uncertainties. Many factors could cause actual future events to differ materially from the forward-looking statements in this press release, including but not limited to: market adoption of quantum computing solutions and IonQ's products, services and solutions; the ability of IonQ to protect its intellectual property; changes in the competitive industries in which IonQ operates; changes in laws and regulations affecting IonQ's business; IonQ's ability to implement its business plans, forecasts and other expectations, and identify and realize additional partnerships and opportunities; and the risk of downturns in the market and the technology industry including, but not limited to, as a result of the COVID-19 pandemic.
The foregoing list of factors is not exhaustive. You should carefully consider the foregoing factors and the other risks and uncertainties described in the "Risk Factors" section of IonQ's Quarterly Report on Form 10-Q for the quarter ended March 31, 2022 and other documents filed by IonQ from time to time with the Securities and Exchange Commission. These filings identify and address other important risks and uncertainties that could cause actual events and results to differ materially from those contained in the forward-looking statements. Forward-looking statements speak only as of the date they are made. Readers are cautioned not to put undue reliance on forward-looking statements, and IonQ assumes no obligation and does not intend to update or revise these forward-looking statements, whether as a result of new information, future events, or otherwise. IonQ does not give any assurance that it will achieve its expectations.
Read more from the original source:
IonQ and GE Research Demonstrate High Potential of Quantum Computing for Risk Aggregation - Business Wire
Alan Turing’s Everlasting Contributions to Computing, AI and Cryptography – NIST
Posted: at 12:27 am
An enigma machine on display outside the Alan Turing Institute entrance inside the British Library, London.
Credit: Shutterstock/William Barton
Suppose someone asked you to devise the most powerful computer possible. Alan Turing, whose reputation as a central figure in computer science and artificial intelligence has only grown since his untimely death in 1954, applied his genius to problems such as this one in an age before computers as we know them existed. His theoretical work on this problem and others remains a foundation of computing, AI and modern cryptographic standards, including those NIST recommends.
The road from devising the most powerful computer possible to cryptographic standards has a few twists and turns, as does Turing's brief life.
Alan Turing
Credit: National Portrait Gallery, London
In Turing's time, mathematicians debated whether it was possible to build a single, all-purpose machine that could solve all problems that are computable. For example, we can compute a car's most energy-efficient route to a destination, and (in principle) the most likely way in which a string of amino acids will fold into a three-dimensional protein. Another example of a computable problem, important to modern encryption, is whether or not bigger numbers can be expressed as the product of two smaller numbers. For example, 6 can be expressed as the product of 2 and 3, but 7 cannot be factored into smaller integers and is therefore a prime number.
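The factoring example is itself a computable problem, and a few lines of Python make it concrete. This is a simple trial-division sketch for illustration, not a method any real cryptosystem would rely on:

```python
def smallest_factor(n):
    """Return the smallest factor d with 1 < d < n, or None if n is prime."""
    d = 2
    while d * d <= n:   # a composite n always has a factor no larger than sqrt(n)
        if n % d == 0:
            return d
        d += 1
    return None         # no smaller factor exists: n is prime

print(smallest_factor(6))  # 2, since 6 = 2 * 3
print(smallest_factor(7))  # None: 7 is prime
```

Trial division is fine for small numbers, but its cost grows with the size of the input, which is exactly the kind of question complexity theory (discussed below) makes precise.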
Some prominent mathematicians proposed elaborate designs for universal computers that would operate by following very complicated mathematical rules. It seemed overwhelmingly difficult to build such machines. It took the genius of Turing to show that a very simple machine could in fact compute all that is computable.
His hypothetical device is now known as a Turing machine. The centerpiece of the machine is a strip of tape, divided into individual boxes. Each box contains a symbol (such as A, C, T, G, the letters of the genetic code) or a blank space. The strip of tape is analogous to today's hard drives that store bits of data. Initially, the string of symbols on the tape corresponds to the input, containing the data for the problem to be solved. The string also serves as the memory of the computer. The Turing machine writes onto the tape data that it needs to access later in the computation.
Credit: NIST
The device reads an individual symbol on the tape and follows instructions on whether to change the symbol or leave it alone before moving to another symbol. The instructions depend on the current state of the machine. For example, if the machine needs to decide whether the tape contains the text string "TC," it can scan the tape in the forward direction while switching among the states "previous letter was T" and "previous letter was not T." If while in state "previous letter was T" it reads a C, it goes to a state "found it" and halts. If it encounters the blank symbol at the end of the input, it goes to the state "did not find it" and halts. Nowadays we would recognize the set of instructions as the machine's program.
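The "TC"-search machine just described can be rendered as a short program. This sketch uses the state names from the example above; representing the tape as a Python string, and compressing the head's cell-by-cell movement into a single forward scan, are simplifications of the formal model:

```python
def find_tc(tape):
    """Scan a tape (a string over A, C, G, T) for the substring 'TC'.

    States mirror the example: 'prev_T', 'prev_not_T', plus the two
    halting states 'found it' and 'did not find it'.
    """
    state = "prev_not_T"
    for symbol in tape + " ":          # " " plays the blank end-of-input symbol
        if symbol == " ":
            return "did not find it"   # reached the blank: halt without a match
        if state == "prev_T" and symbol == "C":
            return "found it"          # a T was followed by a C: halt
        state = "prev_T" if symbol == "T" else "prev_not_T"

print(find_tc("GATCA"))  # found it
print(find_tc("CATAG"))  # did not find it
```

Each loop iteration is one machine step: read a symbol, consult the current state, and either halt or transition to the next state.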
It took some time, but eventually it became clear to everyone that Turing was right: The Turing machine could indeed compute all that seemed computable. No number of additions or extensions to this machine could extend its computing capability.
To understand what can be computed, it is helpful to identify what cannot be computed. In a previous life as a university professor I had to teach programming a few times. Students often encounter the following problem: "My program has been running for a long time; is it stuck?" This is called the Halting Problem, and students often wondered why we simply couldn't detect infinite loops without actually getting stuck in them. It turns out a program to do this is an impossibility. Turing showed that there does not exist a machine that detects whether or not another machine halts. From this seminal result followed many other impossibility results. For example, logicians and philosophers had to abandon the dream of an automated way of detecting whether an assertion (such as whether there are infinitely many prime numbers) is true or false, as that is uncomputable. If you could do this, then you could solve the Halting Problem simply by asking whether the statement "this machine halts" is true or false.
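Turing's impossibility argument can be caricatured in a few lines of Python: assume some halting decider exists, then build a program that does the opposite of whatever the decider predicts about it. This is an illustration of the diagonal argument only; `halts` here is a stand-in, since no correct decider can exist:

```python
def paradox(halts):
    """Given a purported halting decider `halts`, build the program that
    defeats it: loop forever exactly when `halts` predicts we halt."""
    def trouble():
        if halts(trouble):
            while True:        # the decider said "halts", so loop forever
                pass
        # the decider said "loops forever", so halt immediately
    return trouble

# Whatever verdict a candidate decider gives about `trouble`, it is wrong.
trouble = paradox(lambda program: False)  # this decider claims: "loops forever"
trouble()                                 # ...yet the call returns immediately
```

Since `trouble` contradicts any decider it is built from, the assumed decider cannot exist, which is the core of Turing's proof.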
Turing went on to make fundamental contributions to AI, theoretical biology and cryptography. His involvement with this last subject brought him honor and fame during World War II, when he played a very important role in adapting and extending cryptanalytic techniques invented by Polish mathematicians. This work broke the German Enigma machine encryption, making a significant contribution to the war effort.
Turing was gay. After the war, in 1952, the British government convicted him for having sex with a man. He stayed out of jail only by submitting to what is now called chemical castration. He died in 1954 at age 41 by cyanide poisoning, which was initially ruled a suicide but may have been an accident according to subsequent analysis. More than 50 years would pass before the British government apologized and pardoned him (after years of campaigning by scientists around the world). Today, the highest honor in computer sciences is called the Turing Award.
Turing's computability work provided the foundation for modern complexity theory. This theory tries to answer the question, "Among those problems that can be solved by a computer, which ones can be solved efficiently?" Here, "efficiently" means not in billions of years but in milliseconds, seconds, hours or days, depending on the computational problem.
For example, much of the cryptography that currently safeguards our data and communications relies on the belief that certain problems, such as decomposing an integer number into its prime factors, cannot be solved before the Sun turns into a red giant and consumes the Earth (currently forecast for 4 billion to 5 billion years). NIST is responsible for cryptographic standards that are used throughout the world. We could not do this work without complexity theory.
Technology sometimes throws us a curve, such as the discovery that if a sufficiently big and reliable quantum computer is built it would be able to factor integers, thus breaking some of our cryptography. In this situation, NIST scientists must rely on the world's experts (many of them in-house) in order to update our standards. There are deep reasons to believe that quantum computers will not be able to break the cryptography that NIST is about to roll out. Among these reasons is that Turing's machine can simulate quantum computers. This implies that complexity theory gives us limits on what a powerful quantum computer can do.
But that is a topic for another day. For now, we can celebrate how Turing provided the keys to much of today's computing technology and even gave us hints on how to solve looming technological problems.
More here:
Alan Turing's Everlasting Contributions to Computing, AI and Cryptography - NIST
Multiverse Computing Named a 2022 Gartner Cool Vendor in Quantum Computing – Business Wire
Posted: June 24, 2022 at 9:35 pm
SAN SEBASTIÁN, Spain--(BUSINESS WIRE)--Multiverse Computing, a global leader in delivering value-based quantum computing solutions in finance and beyond, today announced it has been named a Gartner 2022 Cool Vendor in Quantum Computing.
Gartner states, "This report is designed to highlight interesting, new and innovative vendors, products and services."
In the key findings of its Cool Vendors in Quantum Computing report, Gartner noted, "Innovation in quantum systems technologies continues to ramp up with significant improvements in devising, controlling and scaling quantum systems that offer the promise of increased resiliency and scalability of usable qubits."
The report further noted that advances in quantum software technologies and services enable integration of quantum solutions exploration in the financial services industry.
Multiverse's Singularity, its flagship product for the financial industry, provides quantum solutions for investment portfolio optimization and other finance applications through a simple and intuitive Microsoft Excel frontend. Singularity is designed to enable financial professionals to access the power of quantum computing without requiring previous expertise or knowledge. A video of Multiverse's Singularity can be seen here.
"We are honored to be recognized as a 2022 Gartner Cool Vendor," said Enrique Lizaso Olmos, CEO of Multiverse Computing. "Being recognized by the knowledgeable and independent analysts at Gartner validates our company mission to deliver real-world business value from quantum computing to clients as early as possible in this nascent industry."
The Gartner report can be found here.
Gartner Disclaimer
GARTNER and COOL VENDORS are registered trademarks and service marks of Gartner, Inc. and/or its affiliates in the U.S. and internationally and are used herein with permission. All rights reserved. Gartner does not endorse any vendor, product or service depicted in our research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner's research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.
About Multiverse Computing
Multiverse Computing is a leading quantum software company that applies quantum and quantum-inspired solutions to tackle complex problems in finance to deliver value today and enable a more resilient and prosperous economy. The company's expertise in quantum control and computational methods as well as finance means it can secure maximum results from current quantum devices. Its flagship product, Singularity, allows financial professionals to leverage quantum computing with common software tools. The company is targeting additional verticals as well, including mobility, energy, the life sciences and Industry 4.0.
Read the original:
Multiverse Computing Named a 2022 Gartner Cool Vendor in Quantum Computing - Business Wire
AWS sent edgy appliance to the ISS and it worked just like all the other computers up there – The Register
Posted: at 9:35 pm
Amazon Web Services has proudly revealed that the first completely private expedition to the International Space Station carried one of its Snowcone storage appliances, and that the device worked as advertised.
The Snowcone is a rugged shoebox-sized unit packed full of disk drives (specifically, 14 terabytes of solid-state disk), a pair of vCPUs, and 4GB of RAM. The latter two components mean the Snowcone can run either EC2 instances or apps written with AWS's Greengrass IoT product. In either case, the idea is that you take a Snowcone into out-of-the-way places where connectivity is limited, collect data in situ, and do some pre-processing on location. Once you return to a location where bandwidth is plentiful, it's assumed you'll upload the contents of a Snowcone into AWS and do real work on it there.
AWS sent this Snowcone aloft with the crewed Axiom Space mission to the ISS in April 2022. The four astronauts conducted a variety of experiments during their 17-day rotation, which stored data on the Snowcone.
AWS hardened the device to ensure it could survive the trip. Axiom and AWS were able to communicate with the device, which worked as intended and processed data it stored. The cloud colossus has hailed this achievement as proving that processing data on Snowcones can work even in edge locations as extreme as the ISS.
Which is true and yay and all. But let's not forget that the ISS houses myriad computers and has done for years. Running a computer up there does require a combination of rocket science and computer science, but humanity has already well and truly proven it can put them both to work on the space station.
Even for computers that are far more modest than an AWS Snowcone, such as the Raspberry Pi.
The Raspberry Pi Foundation and the European Space Agency have sent several Astro Pi machines to the ISS. Just like the Snowcone, those units were prepared especially for the rigors of space travel and were used to run multiple workloads.
The Pi guys even revealed an updated design last year, and this week reported the two units sent aloft in late 2021 have now run 17,168 programs written by young people from 26 countries.
The Register leaves the decision about which is the more impressive and/or inspiring achievement to you.
Continued here:
AWS sent edgy appliance to the ISS and it worked just like all the other computers up there - The Register
IDC Perspective on Integration of Quantum Computing and HPC – HPCwire
Posted: June 22, 2022 at 11:32 am
The insatiable need to compress time to insight from massive and complex datasets is fueling the demand for quantum computing integration into high performance computing (HPC) environments. Such an integration would allow enterprises to accelerate and optimize current HPC applications and processes by simulating and emulating them on today's noisy intermediate-scale quantum (NISQ) computers.
Currently, enterprises are reliant on the advantages that can be achieved using only classical accelerator technology such as GPUs and FPGAs. However, HPC systems are limited in their ability to process and analyze large amounts of data needed to execute multiple workflows, even with the added compute power of classical accelerators. Using quantum computing technologies, not only will enterprises be able to accelerate current HPC processes, but they will also be empowered to solve intractable industry problems beyond the scope of the most advanced classical compute systems.
Today, quantum computing systems are still in early development and far from commercial maturity. Quantum computing hardware vendors are challenged in their ability to stabilize and scale the large number of qubits needed to solve complex problems and to correct the errors caused by decoherence. As a result, NISQ machines cannot provide a means for enterprises to realize a quantum advantage, defined by IDC as being able to solve a problem that has actual value to a business, humanity, or otherwise.
Despite these challenges, enterprises are investing in quantum initiatives to identify use cases and develop algorithms so that they are quantum ready when a fault-tolerant universal machine is realized. As a result, government entities such as China, Germany, and the US; IT industry leaders such as IBM, Google, Microsoft, and Amazon Web Services (AWS); and private investors are escalating funding for quantum computing to push this technology to new levels of maturity.
IDC expects investments in the quantum computing market will reach nearly $16.4 billion by the end of 2027. IDC believes that these investments will lead to waves of technology innovation and breakthroughs that will allow organizations to apply quantum computing to a diverse and expanding group of use cases that involve the analysis of huge amounts of diverse datasets, exponentially large numbers of variables, and an inexhaustible number of possible outcomes.
The ability to address large-scale use cases using quantum computing is possible due to the qubit's unique superpositioning and entanglement properties. Quantum and classical computers store and compute data based on a series of 0s and 1s. In classical computing, this is done using a bit. Bits are only capable of holding the values of 0 or 1. Bits cannot hold the value of 0 and 1 simultaneously. Qubits do have this capability. This property is referred to as superposition. Through qubit entanglement, a pair of qubits is connected or linked. Change in the state of one qubit results in a simultaneous, predictable change in the other qubit. Combined, the quantum properties of superpositioning and entanglement provide qubits the ability to process more data faster, cheaper, and better (more accurately or precisely) than a classical computer. As a result, enterprises can use quantum computing systems to explore new and unique use cases which can accelerate current business processes and workloads.
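Both properties can be seen in a tiny statevector simulation: a Hadamard gate puts one qubit in superposition, and a CNOT entangles it with a second, producing the Bell state (|00⟩ + |11⟩)/√2. This is a minimal NumPy sketch, not tied to any vendor's SDK:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard: creates superposition
CNOT = np.array([[1, 0, 0, 0],                  # flips qubit 2 when qubit 1 is |1>
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.kron(np.array([1, 0]), np.array([1, 0]))  # start in |00>
state = np.kron(H, np.eye(2)) @ state                # superpose the first qubit
state = CNOT @ state                                 # entangle the pair

# Only |00> and |11> have nonzero probability: measuring one qubit now
# determines the other, which is the entanglement correlation described above.
print(np.round(np.abs(state) ** 2, 3))  # [0.5 0.  0.  0.5]
```

Note that this classical simulation needs a vector of 2^n amplitudes for n qubits, which is precisely why large quantum computations become intractable to emulate.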
The list of use cases is growing at a rapid pace. Included in this list are performance-intensive compute (PIC) specific use cases that address newly defined problems, refine solutions generated and iterated in the PIC environment, simulate quantum algorithms, and more. Energized by this innovative technology, many enterprises don't want to delay the commencement of their quantum journey. Approximately 8 out of 10 enterprises that are currently investing, or planning to invest, in quantum computing expect to integrate quantum computing technologies as a hybrid model to enhance their current PIC capabilities. Because of this trend, IDC anticipates that several performance-intensive computing workloads will initially be turbocharged by quantum computing-based accelerators. Yet in the long term, many of these workloads will eventually cross the computing paradigm and become quantum only.
Quantum and classical hardware vendors are working to develop quantum and quantum-inspired computing systems dedicated to solving HPC problems. For example, using a co-design approach, quantum start-up IQM is mapping quantum applications and algorithms directly to the quantum processor to develop an application-specific superconducting computer. The result is a quantum system optimized to run particular applications such as HPC workloads. In collaboration with Atos, quantum hardware start-up Pasqal is working to incorporate its neutral-atom quantum processors into HPC environments. NVIDIA's cuQuantum Appliance and cuQuantum software development kit provide enterprises the quantum simulation hardware and developer tools needed to integrate and run quantum simulations in HPC environments.
At a more global level, the European High Performance Computing Joint Undertaking (EuroHPC JU) announced its funding for the High-Performance Computer and Quantum Simulator (HPCQS) hybrid project. According to the EuroHPC JU, the goal of the project is to prepare Europe for the post-exascale era by integrating two 100+ qubit quantum simulators into two supercomputers and developing the quantum computing platform, both of which will be accessible via the cloud.
Due to the demand for hybrid quantum-HPC systems, other classical and quantum hardware and software vendors have announced that they, too, are working to develop hybrid quantum-HPC solutions. For example, compute infrastructure vendor HPE is extending its R&D focus into quantum computing by specializing in the co-development of quantum accelerators. Because quantum software vendor Zapata foresees quantum computing, HPC, and machine learning converging, the company is creating the Orquestra Universal Scheduler to manage task execution on HPC clusters and current HPC resources.
Yet recent results from an IDC survey indicate that approximately 15% of enterprises are still deterred from adopting quantum computing. For quantum computing to take off, a quantum computing workforce made up of quantum scientists, physicists, engineers, developers, and operators needs to evolve. This gap should not, however, keep enterprises from beginning their quantum computing journeys. Instead, hesitant adopters should take advantage of the development and consulting services offered by quantum hardware and software vendors, as well as by IT consultants that specialize in quantum computing technologies. The choice is clear: become quantum ready or be left behind. IDC projects that worldwide customer spending on quantum computing will grow to $8.6 billion in 2027.
Authors
Heather West, Ph.D., Senior Research Analyst, Infrastructure Systems, Platforms and Technologies Group, IDC
Ashish Nadkarni, Group Vice President, Infrastructure Systems, Platforms and Technologies Group, IDC
Sample of IDC Reports
Worldwide Quantum Computing Forecast, 2021-2025: Imminent Disruption for the Next Decade
IDC's Worldwide Quantum Computing Taxonomy, 2022
Emerging Trends in End-User Adoption of Quantum Computing-as-a-Service Solutions
2021 Worldwide Quantum Technologies Use Case Report
Here is the original post:
IDC Perspective on Integration of Quantum Computing and HPC - HPCwire