
Category Archives: Quantum Computing

Reaching absolute zero for quantum computing now much quicker thanks to breakthrough refrigerator design – Livescience.com

Posted: May 29, 2024 at 2:06 am

A breakthrough cooling technology could help invigorate quantum computing and slash costly preparation time in key scientific experiments by weeks.

Scientists often need to generate temperatures close to absolute zero for quantum computing and astronomy, among other uses. Known as the "Big Chill," such temperatures keep the most sensitive electrical instruments free from interference such as temperature changes. However, the refrigerators used to achieve these temperatures are extremely costly and inefficient.

Now, scientists with the National Institute of Standards and Technology (NIST), a U.S. government agency, have built a new prototype refrigerator that they claim can achieve the Big Chill much more quickly and efficiently.

The researchers published the details of their new machine on April 23 in the journal Nature Communications. They claimed using it could save 27 million watts of power and reduce global energy costs by an estimated $30 million per year.

Conventional household fridges work through a process of evaporation and condensation, per Live Science. A refrigerant liquid is pushed through a special low-pressure pipe called an "evaporator coil."

As it evaporates, it absorbs heat to cool the inside of the fridge; it then passes through a compressor, which raises its pressure and temperature, and the heat is radiated away through coils at the back of the fridge as the refrigerant condenses back into a liquid.



To achieve the required temperatures, scientists have used pulse tube refrigerators (PTRs) for more than 40 years. PTRs use helium gas in a similar process but with far better heat absorption and no moving parts at the cold end.

While effective, PTRs consume huge amounts of energy, are expensive to run, and take a long time to cool down. The NIST researchers, however, discovered that PTRs are needlessly inefficient and can be greatly improved to reduce cooling times and lower overall costs.

In the study, the scientists said PTRs "suffer from major inefficiencies," such as being optimized "for performance only at their base temperature," usually around 4 kelvin. This means that while cooling down, PTRs run at greatly inefficient levels, they added.

The team found that by adjusting the design of the PTR between the compressor and the refrigerator, the helium could be used more efficiently. While cooling down, some of the gas is normally diverted into a relief valve rather than being pushed around the circuit as intended.

Their proposed redesign includes a valve that contracts as the temperature drops to prevent any helium from being wasted in this way. As a result, the NIST team's modified PTR achieved the Big Chill 1.7 to 3.5 times faster, the scientists said in their paper.

"In smaller experiments for prototyping quantum circuits where cooldown times are presently comparable to characterization times, dynamic acoustic optimization can substantially increase measurement throughput," the researchers wrote.

The researchers said in their study that the new method could shave at least a week off experiments at the Cryogenic Underground Observatory for Rare Events (CUORE), a facility in Italy that's used to look for rare events such as a currently theoretical form of radioactive decay. These facilities must keep background noise as low as possible to obtain accurate results.

Quantum computers need a similar level of isolation. They use quantum bits, or qubits. Conventional computers store information in bits, which encode data as a value of either 1 or 0, and perform calculations in sequence; qubits, thanks to the laws of quantum mechanics, can occupy a superposition of 1 and 0 and can be used to process calculations in parallel. Qubits, however, are incredibly sensitive and need to be isolated from as much background noise as possible, including the tiny fluctuations of thermal energy.

The researchers said that even more efficient cooling methods could theoretically be achieved in the near future, which could lead to faster innovation in the quantum computing space.

The team also said their technology could alternatively be used to achieve extremely cold temperatures in the same time but at a much lower cost, which could benefit the cryogenics industry by cutting costs for non-time-sensitive experiments and industrial applications. The scientists are currently working with an industrial partner to release their improved PTR commercially.

Original post:

Reaching absolute zero for quantum computing now much quicker thanks to breakthrough refrigerator design - Livescience.com


D-Wave Quantum Set to Join Russell 3000 Index – HPCwire

Posted: at 2:06 am

PALO ALTO, Calif., May 28, 2024 - D-Wave Quantum Inc., a leader in quantum computing systems, software, and services and the world's first commercial supplier of quantum computers, today announced it is set to join the broad-market Russell 3000 Index at the conclusion of the 2024 Russell US Indexes annual Reconstitution, effective at the open of US equity markets on Monday, July 1st, 2024, according to a preliminary list of additions posted on Friday, May 24th, 2024.

The annual Russell US Indexes Reconstitution captures the 4,000 largest US stocks as of Tuesday, April 30th, 2024, ranking them by total market capitalization. Membership in the US all-cap Russell 3000 Index, which remains in place for one year, means automatic inclusion in the large-cap Russell 1000 Index or small-cap Russell 2000 Index, as well as the appropriate growth and value style indexes. FTSE Russell, a prominent global index provider, determines membership for its Russell indexes primarily by objective market-capitalization rankings and style attributes.

"It's an honor for D-Wave to join the Russell 3000 Index, an important benchmark for the US stock market," said Dr. Alan Baratz, CEO of D-Wave. "This recognition reflects D-Wave's leadership in ushering in the era of commercial quantum computing and will greatly increase visibility among the global investor community for the innovative quantum solutions we're bringing to market."

Russell indexes are widely used by investment managers and institutional investors for index funds and as benchmarks for active investment strategies. As of the end of December 2023, about $10.5 trillion in assets were benchmarked against FTSE Russell's US indexes.

"Russell indexes, now in their 40th year, continue to evolve to reflect the dynamic US economy. Annual rebalancing plays a vital role in establishing accurate benchmarks, ensuring they correctly mirror their designated market segments and remain unbiased in terms of size and style," said Fiona Bassett, CEO of FTSE Russell, an LSEG Business.

For more information on the Russell 3000 Index and the Russell indexes Reconstitution, go to the Russell Reconstitution section on the FTSE Russell website.

About D-Wave Quantum Inc.

D-Wave is a leader in the development and delivery of quantum computing systems, software, and services, and is the world's first commercial supplier of quantum computers, and the only company building both annealing quantum computers and gate-model quantum computers. Our mission is to unlock the power of quantum computing today to benefit business and society. We do this by delivering customer value with practical quantum applications for problems as diverse as logistics, artificial intelligence, materials sciences, drug discovery, scheduling, cybersecurity, fault detection, and financial modeling. D-Wave's technology has been used by some of the world's most advanced organizations, including Mastercard, Deloitte, Davidson Technologies, ArcelorMittal, Siemens Healthineers, Unisys, NEC Corporation, Pattison Food Group Ltd., DENSO, Lockheed Martin, Forschungszentrum Jülich, University of Southern California, and Los Alamos National Laboratory.

Source: D-Wave

See the original post here:

D-Wave Quantum Set to Join Russell 3000 Index - HPCwire


3 Quantum Computing Stocks to Buy Be Millionaire-Makers: May – InvestorPlace

Posted: at 2:06 am


Don't miss out on this exceptional chance to invest in quantum computing stocks that could be millionaire-makers while their valuations remain low. These innovative tech companies are developing cutting-edge quantum computing systems with the potential to generate massive returns for investors who get in early.

The quantum computing stocks featured below are poised to commercialize their technology across multiple industries. Quantum computing promises to transform various sectors of our world, from financial services to medical research. It may also enable groundbreaking advances and discoveries that aren't possible with traditional classical computing.

The three quantum computing stocks to buy outlined in this article represent the best opportunities investors have to compound their wealth to seven figures. We've only just started to see the potential of this industry and understand the implications of this new tech.

So, here are three quantum computing stocks for investors who want to earn a potential seven-figure sum.


Hewlett Packard Enterprise (NYSE:HPE) focuses on IT and quantum computing through its Intelligent Edge segment. The company has demonstrated significant achievements in quantum computing research.

HPE's Intelligent Edge segment provides solutions that bring computation closer to the data source. Integrating quantum computing capabilities with Intelligent Edge technologies can offer unique advantages, such as real-time data processing and enhanced decision-making capabilities at the network's edge.

Most recently, the Intelligent Edge segment reported revenue of $902 million, an increase of 9% year-over-year. This segment continues to grow, driven by strong demand for edge computing solutions. The company also achieved an EPS of $0.48, which surpassed the consensus estimate of $0.45. This compares to an EPS of $0.63 in the same quarter of the previous year.

HPE is a well-known brand akin to a more modern version of IBM (NYSE:IBM). It could be a good pick for those who like to stay with the blue-chip options while also having the potential to mint new millionaires.

IonQ (NYSE:IONQ) is a leader in developing trapped-ion quantum computers and making significant strides in the field. The company collaborates with major cloud platforms.

IonQ's primary technology involves trapped-ion quantum computers, which utilize ions trapped in electromagnetic fields as qubits. This technology is known for its high-fidelity operations and stability.

Recently, IonQ achieved a milestone of 35 algorithmic qubits with its IonQ Forte system, a year ahead of schedule. This achievement allows the system to handle more sophisticated and more extensive quantum circuits. IonQ's growth and technological advancements have been recognized in various industry lists, such as Fast Company's 2023 Next Big Things in Tech List and Deloitte's 2023 Technology Fast 500 List.

With a market cap of just $1.79 billion, it remains a small-cap quantum computing stock that could hold significant upside potential for investors. Its developments so far have been promising, and it could prove to be a company that will make early investors rich.

Pure-play quantum computing company Rigetti Computing (NASDAQ:RGTI) is known for its vertically integrated approach. This includes designing and manufacturing quantum processors.

Rigetti has achieved a significant milestone with its 128-qubit chip, which promises to advance quantum computing capabilities and enable new applications. This development is a key part of Rigettis roadmap to scale up quantum systems and improve performance metrics.

Also, in Q1 2024, Rigetti reported a 99.3% median 2-qubit gate fidelity on its 9-qubit Ankaa-class processor. This high level of fidelity is crucial for reliable quantum computations and positions Rigetti well against competitors.

The market cap of RGTI is a fraction of IONQ's, at just under $200 million at the time of writing. Its progress is similarly impressive, so it could hold significant upside and potentially mint a new generation of millionaires with a large enough investment.

On the date of publication, Matthew Farley did not have (either directly or indirectly) any positions in the securities mentioned in this article. The opinions expressed are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.

Matthew started writing coverage of the financial markets during the crypto boom of 2017 and was also a team member of several fintech startups. He then started writing about Australian and U.S. equities for various publications. His work has appeared in MarketBeat, FXStreet, Cryptoslate, Seeking Alpha, and the New Scientist magazine, among others.

See more here:

3 Quantum Computing Stocks to Buy Be Millionaire-Makers: May - InvestorPlace


Amazon taps Finland’s IQM for its first EU quantum computing service – TNW

Posted: at 2:06 am

IQM Garnet, a 20-qubit quantum processing unit (QPU), is now available via Amazon Web Services (AWS), making it the first quantum computer accessible via the AWS cloud in the European Union.

Finnish quantum hardware startup IQM is based outside Helsinki. AWS already has collaborations in place with IonQ, Oxford Quantum Circuits, QuEra, and Rigetti for its quantum cloud service, known as Braket, but this will be the first AWS quantum processor hosted within the EU.

This also means that it is the first time Amazon's quantum services will be accessible to end users in its AWS Europe (Stockholm) Region. It is also the first time IQM's quantum computers will be available in an on-demand structure via the cloud, and with AWS pay-as-you-go pricing.
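For developers, access follows the usual Braket workflow. The snippet below is a minimal sketch in Python using the Amazon Braket SDK; the device ARN shown for IQM Garnet is an assumption and should be checked in the Braket console, and running on real hardware incurs per-task and per-shot charges.

```python
# Minimal sketch: submitting a two-qubit Bell circuit to IQM Garnet via Amazon Braket.
# Assumes the Braket SDK is installed and AWS credentials are configured.
from braket.aws import AwsDevice
from braket.circuits import Circuit

# Device ARN is an assumption; look up the exact identifier in the Braket console.
device = AwsDevice("arn:aws:braket:eu-north-1::device/qpu/iqm/Garnet")

bell = Circuit().h(0).cnot(0, 1)      # Hadamard on qubit 0, then CNOT 0 -> 1

task = device.run(bell, shots=1000)   # billed per task and per shot (pay-as-you-go)
print(task.result().measurement_counts)
```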

"We are very honoured to be part of the Amazon network and work together with a global tech company," Jan Goetz, co-CEO and co-founder at IQM, told TNW. "For IQM, this is a great opportunity to scale our offering globally and collaborate with leading end-users around the world."



Goetz further added that the joint offering was a great step forward for cloud quantum computing, and would enable cloud users to test novel types of algorithms and use-cases to develop their business.

As most of our readers will probably know, today's noisy and error-prone quantum computers cannot really do all that much yet. However, the technology is currently advancing incredibly fast. Learning to work with it will not happen overnight.

As such, a whole business model has sprung up around getting organisations and corporations quantum-ready, so that they won't be caught off guard when quantum utility arrives. Today's smaller qubit systems are also training grounds for software developers, many of whom are working on solving the issue of error correction. "In the context of cloud, IQM Garnet is mostly used by quantum algorithm engineers to develop IP around quantum compilers, algorithms, and error correction schemes," Max Haeberlein, Head of Cloud at IQM, told TNW. "IQM Garnet offers a highly homogeneous layout and has cutting-edge fidelities, allowing users to effectively expand algorithms to the full size of the chip."

At the same time, Haeberlein said, the company offers IQM Garnet at affordable rates, which is especially important for the growing quantum algorithm startup scene.

IQM, founded in 2018, is Europe's leading quantum hardware developer in superconducting qubits. At the beginning of next year, the company plans to add a high-fidelity IQM Radiance 54-qubit quantum computer to its portfolio.

This, according to Haeberlein, will enable users to extend quantum algorithms beyond the point where they can still be classically emulated by supercomputers. "In 2026, we release IQM Radiance with 150 qubits, where we will see the first commercial algorithm applications of scale in the domains of finance, automotive, life sciences, and chemicals," he adds.

See the rest here:

Amazon taps Finland's IQM for its first EU quantum computing service - TNW


Here Come the Qubits? What You Should Know About the Onset of Quantum Computing – IPWatchdog.com

Posted: at 2:06 am

Nearly 5,000 patents were granted in [quantum computing] in 2022, approximately 1% more than in 2021. By January 2024, the United States had authorized and issued an aggregate of nearly 16,000 patents in the area of quantum technology (37% of the global total).

While artificial intelligence (AI) may occupy all the limelight in the media, stock markets, and corporations large and small, not to mention among political figures, futurists and modernists know that the mainstreaming of quantum computing will enable the next real technology paradigm shift.

From its beginnings in the speculative musings of physicist Paul Benioff in 1980 to the groundbreaking algorithms of mathematician Peter Shor in 1994, quantum computing was a transformative discovery. However, it was not until Google's establishment of a quantum hardware lab in 2014 that the theoretical promises began to materialize into practical applications. This marked the onset of a new era, where quantum experimentation became increasingly accessible, with IBM democratizing access to prototype processors and Google achieving quantum advantage over classical supercomputers in 2019.

What is quantum computing?

It is a technology for performing computations much faster than classical computing by using quantum-mechanical phenomena. Indeed, quantum computing can theoretically provide exponential performance improvements for some applications and potentially enable completely new territories of computing. It has applications beyond computing, including communications and sensing.

How does quantum computing work?

While digital computers store and process information using bits, which can be either 0 or 1, quantum computers use qubits (quantum bits) that differ from these traditional bits. A qubit can be realized by a quantum object such as an electron or a photon, and unlike traditional bits, it can also exist in superposition states, be subjected to incompatible measurements (or interference), and even be entangled with other quantum bits, rendering qubits much more powerful.
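In standard quantum-information notation (a worked illustration added here for clarity, not from the original article), a single qubit in superposition and a pair of entangled qubits can be written as:

```latex
% A qubit in superposition: measurement yields 0 with probability |\alpha|^2 and 1 with |\beta|^2.
|\psi\rangle = \alpha|0\rangle + \beta|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1
% Two entangled qubits (a Bell state): neither qubit has a definite value on its own,
% but their measurement outcomes are perfectly correlated.
|\Phi^+\rangle = \frac{1}{\sqrt{2}}\left(|00\rangle + |11\rangle\right)
```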

What has delayed the obsolescence of traditional computers and blocked the dominance of quantum computers?

To build a quantum computer or other quantum information technologies, we need to produce quantum objects that can act as qubits and be harnessed and controlled in physical systems. Therein lies the challenge, but scientists are quietly making progress.

While the theoretical potential of quantum computing was identified decades ago, it has only begun to be realized in recent years. An accelerating, high-stakes arms race is afoot in the private and public sectors to build quantum processors and circuits capable of solving exponentially complex problems, and a growing number of working systems are in progress. Quantum computing will likely lead to a paradigm shift as it unlocks advancements in several scientific fields.

What has the government done about it?

The United States adopted the National Quantum Initiative Act in December 2018, giving the country its first plan for advancing quantum technology and quantum computing. The National Quantum Initiative, or NQI, provided an umbrella under which government agencies could develop and operate programs for improving the climate for quantum science and technology in the U.S., coordinated by the National Quantum Coordination Office, or NQCO. Agencies include the National Institute of Standards and Technology or NIST, the National Science Foundation or NSF, and the Department of Energy or DOE. These agencies have combined to establish the Quantum Economic Development Consortium, or QED-C, a consortium of industrial, academic, and governmental entities. Five years later, Congress and the President adopted a rare bipartisan bill to reauthorize the NQIA to further accelerate quantum research and development for the economic and national security of the United States, with needed funding and support.

Most recently, on April 10, 2024, United States Senator Marsha Blackburn (R-TN) and Representative Elise Stefanik (R-NY) introduced the Defense Quantum Acceleration Act, which would, among other provisions, establish a quantum advisor and a new center of excellence. Quantum computing technology has become a strategic priority within national defense initiatives. For example, quantum-encrypted information cannot be secretly intercepted, because attempting to measure a quantum property changes it. Similarly, in the domain of navigation, while global positioning systems or GPS can be spoofed, quantum sensors can securely relay information about location. Quantum computers also have the capability of processing certain kinds of information vastly faster than traditional computers.

It's still early days, but the quantum realm is heating up and rapidly evolving. While they currently face challenges such as size limitations, maintenance complexities, and error susceptibility compared to classical computers, experts envision a near-term future where quantum computing outperforms classical computing for specific tasks.

What is the potential impact of quantum technology on the U.S. economy?

Digital computers have been pivotal in information processing, but quantum computers offer a paradigm shift. With the capacity to tackle intricate statistical problems beyond current computational boundaries, quantum computing is a game changer. McKinsey projects it to contribute nearly $2.0 trillion in value by 2035. The industries most likely to see the earliest economic impact from quantum computing include automotive, chemicals, financial services, and life sciences.

A McKinsey study published in April 2024 also delves into various facets of the investment landscape within the Quantum Technology (Q.T.) sector:

Technological advancements in quantum computing have accelerated in recent years, enabling solutions to exceedingly complex problems beyond the capabilities of today's most powerful classical computers. Such advancements could revolutionize various sectors, such as the chemicals, life sciences, finance and mobility sectors. Healthcare is poised to be revolutionized, with quantum computers presenting new frontiers for personalized medicine, allowing for more accurate diagnostics and targeted treatment options. In life sciences, quantum computing could accelerate drug discovery, enable personalized medicine through genomic targeting, and revolutionize pharmaceutical research and development. In financial services, it could optimize portfolio management and risk assessment, potentially creating $622 billion in value.

Agricultural advancements enabled by quantum computing could enhance crop optimization and resource efficiency, addressing food security and climate concerns. In the automotive sector, quantum computing offers avenues for optimizing R&D, supply chain management, and production processes, reducing costs, and enhancing efficiency. Similarly, quantum computing holds promise in revolutionizing chemical catalyst design, facilitating sustainable production processes, and mitigating environmental impacts.

Where is intellectual property being created in quantum technology? Nearly 5,000 patents were granted in the area in 2022, the last period for which data is available, approximately 1% more than in 2021. By January 2024, the United States had authorized and issued an aggregate of nearly 16,000 patents in the area of quantum technology (37% of the global total), Japan had over 8,600 (~20%), Germany just over 7,000, and China almost 7,000, with France closely behind. More notable perhaps are the numbers of patent applications filed globally, with the United States and China neck-and-neck at 30,099 and 28,593 as of January 2024. Strangely, and it's worth thinking about why, granted patents decreased for the global top 10 players in 2021 and 2022.

The European Union has the highest number and concentration of Q.T. talent, per OECD data through 2021, with 113,000 graduates in QT-relevant fields, followed by India at 91,000, China at 64,000, and the United States at 55,000. The number of universities with Q.T. programs increased 8.3% to 195, while those offering master's degrees in Q.T. increased by 10% to 55.

What are the legal considerations implicated by commercial quantum technology?

Despite the endless possibilities, legal considerations loom with the rise of commercial quantum computing. To embrace the potential changes brought by quantum computing, legal experts must grasp its foundational principles, capabilities, and ramifications so they can maneuver through regulatory landscapes, safeguard intellectual property rights, and resolve disputes.

Cybersecurity: Data is protected by cryptographic algorithms. With exponentially higher computing power, the beginning of commercial quantum computing will require quantum cryptography that cannot be hacked. From when quantum computing becomes available to hackers until quantum cryptography can achieve ubiquity, how will we keep our networks and data safe from cyber-criminals? Can quantum-resistant cryptography protect against this obvious risk?

Privacy: Commercial enterprises will need to adopt procurement policies and implement security protocols that enable compliance with the General Data Protection Regulation (GDPR) in Europe, the China Data Protection Act, and similar legislation in the United States, such as the California Consumer Privacy Act and its progeny. Companies that form the nucleus of our infrastructure for telecommunications, energy, water, waste, health, banking, and other essential services will need extra protection. The consequences of failure are immeasurable. How will we protect the terabytes of additional personal information that quantum computers can collect, transmit, store, analyze, monetize, and use? Existing regulations do not contemplate the gargantuan amount of personal data that will be collected, and new, sensible policies will need to be contemplated and created before the technology exists.

Competition: In the first, second, and third industrial revolutions, we saw first-movers acquire dominant market positions. The public responded by developing legislation to allow the government to break up private enterprises. How will we protect the marketplace from being dominated by a first mover in commercial quantum computing to ensure that healthy competition continues to exist?

Blockchains and smart contracts: The proliferation of quantum computing capabilities should enable greater use of distributed ledgers or blockchains to automate supply chains and commercial and financial transactions. How will they be enabled and protected? Who will be responsible if they are compromised or lost?

Cloud computing: The cloud will be disrupted. Conventional, slower computers will become obsolete when quantum computers enter the data center. Who will have access to quantum cloud computing, and when? The quantum divide could replace the digital divide.

Artificial intelligence: What will happen if quantum computing enables quantum computers to use A.I. to make decisions about people and their lives? Who will be responsible if the computer makes an error, discriminates on some algorithmic bias (e.g., profiling), or makes decisions against sound public policies?

Legal system: Quantum computing will profoundly disrupt the legal system, as it brings large-scale efficiencies and speed to processes, surpassing the capabilities of human intelligence, including that of the very best lawyers. Eventually, as quantum computing is miniaturized and placed on handheld devices, we approach singularity and a paradigm shift so profound that our entire legal system may be turned on its head.

Quantum computing embodies a future with possibilities akin to the pioneering spirit of space exploration. While classical computers retain prominence for many tasks, quantum computing offers unparalleled potential to tackle complex problems on an unprecedented scale, heralding a new era of innovation and discovery that fills us with hope and optimism. However, to fully capitalize on the potential of this tremendous technology, these kinds of legal concerns must be effectively addressed.


More here:

Here Come the Qubits? What You Should Know About the Onset of Quantum Computing - IPWatchdog.com


The power of Quantum Computing – The Cryptonomist

Posted: at 2:06 am

One of the exponential technologies that is not yet getting its fair share of love from the general public and media is quantum computing. In the past few years, I had the privilege of spending time discussing it with people from CERN and Fermilab, but my conversation with Scott Crowder, Vice President, IBM Quantum Adoption and Business Development, had the right mix of theory and real-life examples, which will make anyone understand the potential of this field of research and its business applications. AI will keep its hype for a good while, as we see from its pervasive presence in every corner of the internet. Quantum can be the next big thing. This is our dialogue.

Who are you and what do you do for a living?

My name is Scott Crowder, and I run IBM's Quantum efforts to boost its adoption, together with our partners and industry clients. Our goal is to build a useful Quantum computing infrastructure and to help the world make a Quantum-safe transition, in the next ten years or so. I am an engineer by training and had worked on semiconductors in the past, before taking on the role of CTO for IBM Systems. With Quantum, it's the first time we have a "use first" attitude, where we try things with partners, and we teach and learn with our clients, before we scale up projects. It's interesting and it's fun.

What are the three killer use cases for Quantum, for what we know now?

Firstly, simulating nature, like materials science (new materials) or chemistry, for example better battery chemistry, to mention something that is very hot right now. We do physics simulations or try to understand how complex proteins would behave. These are operations that require more computing power than today's computers can deliver.

Secondly, we try to find patterns in complex data. For example, a classification of a piece of data as fraud or not. If there is some structure in the data before us, Quantum computing is much better than classical computers at giving meaning to it and even picking up things like false positives. This is extremely useful, if we want to make sense of the world.

Lastly, I would say, portfolio optimization, finding efficiencies, and distribution optimization. There are direct and huge applications here, for multiple industries. Think of the mobility or logistics markets, for example. This third use case is slightly farther out from us, in terms of time to market, when compared to the first two.

Where are we really, when it comes to Quantum adoption in the real world?

To simplify it: Quantum is better at doing what it does best, namely simulations. For sure, to do it at scale, larger systems are needed. So, we are looking at 2030 and beyond. What we are doing now is, let's say, algorithmic explorations. We work with a mix of partners: heavy industry conglomerates, banking, pharma, transportation, and startups. And, obviously, universities and research institutions.

Big Tech is also into Quantum, even though the talk of the town is AI. Intel, Microsoft, Google, AWS: all have investments and programs in Quantum, with different approaches to it.

What is the future business model of Quantum? How are you going to sell it?

It's hard to say right now. We must make some assumptions. It's probably going to continue to be, in the medium term, a cloud service, where partners have access to the Quantum capabilities we have built, via API calls, and they can interact with our experts, who help with the prototyping and the training. Basically, it's going to be the same as a standard cloud business model. There will be ad hoc projects for sure, where the stakes are high, and we can unlock tremendous economic value. In a way, the approach is more like how we weave CPUs and GPUs into a compute fabric, and not via a single application per se, like a ChatGPT for Quantum.

What would you say is the number one risk associated with Quantum?

Cybersecurity is for sure the number one risk. Future, more powerful Quantum computers will at some point crack the current asymmetric cryptography, which protects public and private information (mobile data, payments, medical records, etc.). The math for that already exists. There are Quantum-safe cryptography solutions, but a full ecosystem of security providers and coding will need to change, to account for the Quantum shift, and to make sure we have a Quantum-safe era.
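(An illustrative aside, not part of the interview: the classical half of that math is easy to sketch. The toy Python example below shows how, once the order r of a number a modulo N is known, the factors of N fall out of two gcd computations; Shor's algorithm uses a quantum computer only for the hard step of finding r when N is cryptographically large.)

```python
from math import gcd

# Toy example with N = 15 and a = 7: the classical post-processing step of Shor's algorithm.
N, a = 15, 7
r = next(k for k in range(1, N) if pow(a, k, N) == 1)   # order of a mod N; here r = 4
assert r % 2 == 0 and pow(a, r // 2, N) != N - 1          # conditions under which the trick works
print(gcd(pow(a, r // 2) - 1, N), gcd(pow(a, r // 2) + 1, N))   # prints 3 5, the factors of 15
```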

Where can we find you and learn more about Quantum?

A simple search for anything related to IBM Quantum will do. I am also active on social media, like LinkedIn. IBM writes a lot of articles on Quantum. We need to talk about it publicly, and have people understand this is real, and it has great potential to bring tremendous value to society and business, across all industries. You may think this is science fiction, as it's going to hit us in the next decade, but it is a new way of approaching complex problems. It could help other applications and use cases as well, like AI, and this is why it's the right moment to talk Quantum.

View original post here:

The power of Quantum Computing - The Cryptonomist


ISC 2024 A Few Quantum Gems and Slides from a Packed QC Agenda – HPCwire

Posted: at 2:06 am

If you were looking for quantum computing content, ISC 2024 was a good place to be last week; there were around 20 quantum computing-related sessions. QC even earned a slide in Kathy Yelick's opening keynote, Beyond Exascale. Many of the quantum sessions (and, of course, others) were video-recorded, and ISC has now made them freely accessible.

Not all were recorded; for example, what sounded like a tantalizing BOF panel, Toward Hardware Agnostic Standards in Hybrid HPC/Quantum Computing, featuring Bill Gropp (NCSA, University of Illinois), Philippe Deniel (Commissariat à l'Energie Atomique (CEA)), Mitsuhisa Sato (RIKEN), Travis Humble (ORNL), Venkatesh Kannan (Ireland's High Performance Centre), and Kristel Michielsen (Jülich Supercomputing Centre). Was sorry to miss that.

Regardless, there's a wealth of material online, and it's worth looking through the ISC 2024 inventory for subjects, speakers, and companies of interest (registration may be required). Compiled below are a few QC soundbites from ISC.

Yelick, vice chancellor for research at the University of California, covered a lot of ground in her keynote examining the tension and opportunities emerging from the clash of traditional FP64 HPC and mixed-precision AI and how the commercial supply line of advanced chips is changing. Quantum computing earned a much smaller slice.

"I really just have this one slide about quantum. There's been some really exciting progress, if you have been following this, in things like error correction over the last year, with really significant improvements in terms of the ability to build error-corrected quantum systems. On the other hand, I would say we don't yet have an integrated circuit kind of transistor model yet, right. We've got a bunch of transistors, [i.e.] we've got a whole bunch of different kinds of qubits that you can build, [and] there's still some debate [over them].

"In fact, one of the latest big error correction results was actually not for the superconducting qubits, which is what a lot of the early startups were in, but for the AMO (atomic, molecular, optical) physics. So this is really looking at the fact that we're not yet at a place where we can rely on this for the next generation of computing, which is not to say that we should be ignoring it. I'm really interested to see how [quantum computing evolves and] also thinking about how much classical computing we're going to need with quantum, because that's also going to be a big challenge with quantum. [It's] very exciting, but it's not replacing the general purpose kind of computing that we do for science and engineering."

Not sure if that's a glass half-full or half-empty perspective. Actually, many of the remaining sessions tackled the questions she posed, including the best way to implement hybrid HPC-quantum systems, error correction and error mitigation, and the jostling among competing qubit types.

It was easy to sympathize (sort of) with speakers presenting at the Quantum Computing Status of Technologies session, moderated by Valeria Bartsch of Fraunhofer CFL. The speakers came from companies developing different qubit modalities and, naturally, at least a small portion of their brief talks touted their company's technology.

She asked, "Here's another [submitted question]. What is the most promising quantum computing technology that your company is not developing yourself? I love that one. And everybody has to answer it now. You can think for a few seconds."

Very broadly speaking, neutral atom, trapped ion, and superconducting are perhaps the most advanced qubit modalities currently, and each speaker presented a bit of background on their company's technology and progress. Trapped ions boast long coherence times but somewhat slower switching speeds. Superconducting qubits are fast, and perhaps easier to scale, but error-prone. Neutral atoms also have long coherence times but have so far been mostly used for analog computing, though efforts are moving quickly to implement gate-based computing. To Hayes' point, Majorana (topological) qubits would be inherently resistant to error.

Not officially part of the ISC program, Hyperion delivered its mid-year HPC market update online just before the conference. The full HPCwire coverage is here, and Hyperion said it planned to make its recorded presentation and slides available on its website. Chief Quantum Analyst Bob Sorensen provided a brief QC snapshot during the update, predicting the worldwide QC market will surpass $1 billion in 2025.

Sorensen noted, "So this is a quick chart (above) that just shows the combination of the last four estimates that we made. You can see, starting in 2019, all the way up to this 2023 estimate that reaches that $1.5 billion in 2026 I talked about earlier. Now my concern here is always that it's dangerous to project out too far. So we do tend to limit the forecast to these kinds of short ranges, simply because a nascent sector like quantum, which has so much potential but at the same time has some significant technical hurdles to overcome, means that there can be an inflection point, most likely though in the upward direction."

He also pointed out that a new use case, a new breakthrough in modality or algorithms, or any kind of significant driver that brings more interest in, and performance to, quantum can significantly change the trajectory here on the upside.

Sorensen said, "Just to give you a sense of how these vendors that we spoke to looked at algorithms, we see the big three are still the big three: mod-sim, optimization, and AI, with some interest in cybersecurity aspects, post-quantum encryption kinds of research and such, as well as Monte Carlo processes taking advantage of quantum's ability to generate provable random numbers to support the Monte Carlo processing."

"Interesting here is that we're seeing a lot more 'other' (17%). This is the first time we've seen that. We think it is [not so much] about new algorithms, but perhaps hybrid mod-sim optimization or machine learning that feeds into the optimization process. So we think we're seeing more hybrid applications emerging as people take a look at the algorithms and decide what solves the use case that they have in hand," he said.

Satoshi Matsuoka, director of the RIKEN Center for Computational Science, provided a quick overview of Fugaku's plans for incorporating quantum computing, as well as touching on the status of the ABCI-Q project. He has, of course, been instrumental in both systems. Both efforts emphasize creating a hybrid HPC-AI-quantum infrastructure.

The ABCI-Q infrastructure (slide below) will comprise a variety of quantum-inspired and actual quantum hardware. Fujitsu will supply the former systems. Currently, quantum computers based on neutral atoms, superconducting qubits, and photonics are planned. Matsuoka noted this is well-funded, at a few hundred million dollars, with much of the work geared toward industry.

Rollout of the integrated quantum-HPC hybrid infrastructure at Fugaku is aimed at the 2024/25 timeframe. It's also an ambitious effort.

About the Fugaku effort, Matsuoka said, "[This] project is funded by a different ministry, in which we have several real quantum computers, IBM's Heron (superconducting QPU), a Quantinuum system (trapped-ion qubits), and quantum simulators. So real quantum computers and simulators to be coupled with Fugaku."

"The objective of the project [is to] come up with a comprehensive software stack, such that when the real quantum computers that are more useful come online, then we can move the entire infrastructure, along with any of those quantum computers and their successors, to be deployed to solve real problems. This will be one of the largest hybrid supercomputers."

The aggressive quantum-HPC integration sounds a lot like what's going on in Europe. (See HPCwire coverage, Europe's Race towards Quantum-HPC Integration and Quantum Advantage.)

The topic of benchmarking also came up during Q&A at one session. A single metric such as the Top500 is generally not preferred. But what then, even now during the so-called NISQ (noisy intermediate-scale quantum) computing era?

One questioner said, "Let's say interesting algorithms and problems. Is there anything like, and I'm not talking about a top 500 list for quantum computers, like an algorithm where we can compare systems? For example, Shor's algorithm. So who did it, and what is the best performance or the largest numbers you were able to factorize?"

Hayes (Quantinuum) said, "So we haven't attempted to run Shor's algorithm, and interesting implementations of Shor's algorithm are going to require fault tolerance to factor a number that a classical computer can't. But you know, that doesn't mean it can't be a nice benchmark to see which company can factor the largest one. I did show some data on the quantum Fourier transform. That's a primitive in Shor's algorithm. I would say that that'd be a great candidate for benchmarking the progress and fault tolerance.

"More interesting benchmarks for the NISQ era are things like quantum volume, and there are some other ones that can be standardized, and you can make fair comparisons. So we try to do that. You know, they're not widely or universally adopted, but there are organizations out there trying to standardize them. It's difficult getting everybody marching in the same direction."

Corcoles (IBM) added, "I think benchmarking in quantum has an entire community around it, and they have been working on it for more than a decade. I read your question as focusing on application-oriented benchmarks versus system-oriented benchmarks. There are layers of subtlety there as well. If we think about Shor's algorithm, for example, there were recent works last year suggesting there's more than one way to run Shor's. Depending on the architecture, you might choose one or another way.

"An architecture that is faster might choose to run many circuits in parallel that can capture Shor's algorithm and then do some post-processing, or an architecture that might take more time might just want to run one single circuit and, with high probability, measure the right answer. You could compare run times, but there's probably going to be differences that add to the uncertainty of what technology you will use, meaning that there might be a regime of factoring where you might want to choose one aspect or another, but then it depends on your particular physical implementation," he said.

Macri (QuEra) said, "My point is we're not yet at the point where we can really [compare systems]. You know, we don't want to compete directly with our technologies. I would say that, especially for what concerns applications, we need to adopt a collaborative approach. So for example, there are certain areas where these benchmarks that you mentioned are not really applicable. One of them is quantum simulation, and we have seen really a lot of fantastic results from our technology, as well as from ion traps and superconducting qubits.

"It doesn't really make sense to compare the basic features of the technologies so that, you know, we can a priori identify what is the specific application or the result that you want to achieve. I would say let's focus on advancing the technology. We already know that there are certain types of devices that outperform others for specific applications. And then we will decide these perhaps at a later stage. But I agree, for very complex tasks, such as the quantum Fourier transform, or perhaps Shor's algorithm, I think, to be honest, it's still too preliminary [for effective system comparisons]."
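For readers unfamiliar with the quantum Fourier transform that Hayes mentioned as a benchmarking primitive, the short NumPy sketch below (added here for illustration; it was not part of the conference discussion) builds the QFT as a dense unitary matrix and checks its unitarity. Real benchmarks run the transform as a circuit on hardware; this is just the mathematical object those circuits implement.

```python
import numpy as np

def qft_matrix(n_qubits: int) -> np.ndarray:
    """Dense matrix of the quantum Fourier transform on n_qubits (illustrative only)."""
    dim = 2 ** n_qubits
    omega = np.exp(2j * np.pi / dim)                    # primitive dim-th root of unity
    j, k = np.meshgrid(np.arange(dim), np.arange(dim))
    return omega ** (j * k) / np.sqrt(dim)

U = qft_matrix(3)
print(np.allclose(U @ U.conj().T, np.eye(8)))           # True: the QFT is unitary
```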

As noted, this was a break-out year for quantum at ISC, which has long had quantum sessions, but not as many. Europe's aggressive funding, procurements, and HPC-quantum integration efforts make it clear it does not intend to be left behind in the quantum computing land rush, with, hopefully, a gold rush to follow.

Stay tuned.

Read more from the original source:

ISC 2024 A Few Quantum Gems and Slides from a Packed QC Agenda - HPCwire


Major First: Quantum Information Produced, Stored, And Retrieved – ScienceAlert

Posted: April 20, 2024 at 9:20 am

The potential of quantum computing is immense, but the distances over which entangled particles can reliably carry information remain a massive hurdle. The tiniest of disturbances can make a scrambled mess of their relationship.

To circumvent the problem, quantum computing researchers have found ways to stabilize long lengths of optical fiber or have used satellites to preserve signals through the near-vacuum of space.

Yet there's more to a quantum-based network than transmission. Scientists have struggled to crack their long-sought goal of developing a system of interconnected units, or 'repeaters', that can also store and retrieve quantum information, much like classical computers do, to extend the network's reach.

Now, a team of researchers have created a system of atomic processing nodes that can contain the critical states created by a quantum dot at wavelengths compatible with existing telecommunications infrastructure.

It requires two devices: one to produce and potentially entangle photons, and another 'memory' component that can store and retrieve the all-important quantum states within those photons on demand without disturbing them.

"Interfacing two key devices together is a crucial step forward in allowing quantum networking, and we are really excited to be the first team to have been able to demonstrate this," says quantum optics physicist and lead author Sarah Thomas, from the Imperial College London (ICL).

Made partly in Germany and assembled at ICL, the newly proposed system places a semiconductor quantum dot capable of emitting a single photon at a time in a cloud of hot rubidium atoms, serving as quantum memory. A laser turns the memory component 'on' and 'off', allowing the photons' states to be stored and released from the rubidium cloud on demand.

The distances over which this particular system could transmit quantum memories haven't been tested; it's just a proof-of-concept prototype in a basement lab, one based on photons that aren't even entangled. But the feat could lay a solid foundation for the quantum internet, better than relying on entangled photons alone.

"This first-of-its-kind demonstration of on-demand recall of quantum dot light from an atomic memory is the first crucial step toward hybrid quantum light-matter interfaces for scalable quantum networks," the team writes in their published paper.

Researchers in quantum computing have been trying to link up photon light sources and processing nodes that store quantum data for some time, without much success.

"This includes us, having tried this experiment twice before with different memory and quantum dot devices, going back more than five years, which just shows how hard it is to do," says study co-author Patrick Ledingham, an experimental quantum physicist from the University of Southampton in the UK.

Part of the problem was that the photon-emitting quantum dots and atomic 'memory' nodes used so far were tuned to different wavelengths; their bandwidths incompatible with each other.

In 2020, a team from China tried chilling rubidium atoms to lure them into the same entangled state as the photons, but those photons then had to be converted to a suitable frequency for transmitting them along optic fibers which can create noise, destabilizing the system.

The memory system designed by Thomas and colleagues has a bandwidth wide enough to interface with the wavelengths emitted by the quantum dot and low enough noise so as not to disturb entangled photons.

While the feat is significant, the researchers are still working to improve their prototype. To create quantum network-ready devices, they want to try extending storage times, increasing the overlap between the quantum dots and atomic nodes, and shrinking the size of the system. They also need to test their system with entangled photons.

For now, it remains a tenuous thread, but one day we could see this technology or something like it covering the world in a web of delicate yet stable quantum networks.

The study has been published in Science Advances.

Continued here:

Major First: Quantum Information Produced, Stored, And Retrieved - ScienceAlert


Quantum Cloud Computing Secured in New Breakthrough at Oxford – TechRepublic

Posted: at 9:20 am

Businesses are one step closer to quantum cloud computing, thanks to a breakthrough made in its security and privacy by scientists at Oxford University.

The researchers used an approach dubbed blind quantum computing to connect two quantum computing entities (Figure A); this simulates the situation where an employee at home or in an office remotely connects to a quantum server via the cloud. With this method, the quantum server provider does not need to know any details of the computation for it to be carried out, keeping the user's proprietary work secure. The user can also easily verify the authenticity of their result, confirming it is neither erroneous nor corrupted.

Figure A

Ensuring the security and privacy of quantum computations is one of the most significant roadblocks that has held the powerful technology back so far, so this work could lead to it finally entering the mainstream.

Despite only being tested on a small scale, the researchers say their experiment has the potential to be scaled up to large quantum computations. Plug-in devices could be developed that safeguard a workers data while they access quantum cloud computing services.

Professor David Lucas, the co-head of the Oxford University Physics research team, said in a press release: "We have shown for the first time that quantum computing in the cloud can be accessed in a scalable, practical way which will also give people complete security and privacy of data, plus the ability to verify its authenticity."

Classical computers process information as binary bits represented as 1s and 0s, but quantum computers do so using quantum bits, or qubits. Qubits exist as both a 1 and a 0 at the same time, with a probability of being one or the other that is determined by their quantum state. This property enables quantum computers to tackle certain calculations much faster than classical computers, as they can explore many possibilities simultaneously.
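That probabilistic behaviour can be illustrated in a few lines of NumPy (a sketch added for clarity, not from the original article): the amplitudes of a qubit's state determine its measurement statistics via the Born rule.

```python
import numpy as np

rng = np.random.default_rng(7)

state = np.array([1, 1]) / np.sqrt(2)     # equal superposition of |0> and |1> (e.g. after a Hadamard gate)
probs = np.abs(state) ** 2                # Born rule: probabilities of measuring 0 or 1

samples = rng.choice([0, 1], size=10_000, p=probs)   # each measurement collapses to a single bit
print(probs, np.bincount(samples) / samples.size)    # both close to [0.5, 0.5]
```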

Quantum cloud computing is where quantum resources are provided to users remotely over the internet; this allows anyone to utilise quantum computing without the need for specialised hardware or expertise.


With typical quantum cloud computing, the user must divulge the problem they are trying to solve to the cloud provider; this is because the provider's infrastructure needs to understand the specifics of the problem so it can allocate the appropriate resources and execution parameters. Naturally, in the case of proprietary work, this presents a security concern.

This security risk is minimised with the blind quantum computing method because the user remotely controls the quantum processor of the server themselves during a computation. The information required to keep the data secure, like the input, output and algorithmic details, only needs to be known by the client, because the server does not make any decisions with it.

"Never in history have the issues surrounding privacy of data and code been more urgently debated than in the present era of cloud computing and artificial intelligence," said Professor Lucas in the press release.

"As quantum computers become more capable, people will seek to use them with complete security and privacy over networks, and our new results mark a step change in capability in this respect."

Quantum computing is vastly more powerful than conventional computing, and could revolutionise how we work if it is successfully scaled out of the research phase. Examples include solving supply chain problems, optimising routes and securing communications.

In February, the U.K. government announced a £45 million ($57 million) investment into quantum computing; the money goes toward finding practical uses for quantum computing and creating a quantum-enabled economy by 2033. In March, quantum computing was singled out in the Ministerial Declaration, with G7 countries agreeing to work together to promote the development of quantum technologies and foster collaboration between academia and industry. Just this month, the U.K.'s second commercial quantum computer came online.

Due to the extensive power and refrigeration requirements, very few quantum computers are currently commercially available. However, several leading cloud providers do offer so-called quantum-as-a-service to corporate clients and researchers. Google's Cirq, for example, is an open-source quantum computing platform, while Amazon Braket allows users to test their algorithms on a local quantum simulator. IBM, Microsoft and Alibaba also have quantum-as-a-service offerings.


But before quantum computing can be scaled up and used for business applications, it is imperative to ensure it can be achieved while safeguarding the privacy and security of customer data. This is what the Oxford University researchers hoped to achieve in their new study, published in Physical Review Letters.

Dr. Peter Drmota, study lead, told TechRepublic in an email: "Strong security guarantees will lower the barrier to using powerful quantum cloud computing services, once available, to speed up the development of new technologies, such as batteries and drugs, and for applications that involve highly confidential data, such as private medical information, intellectual property, and defence. Those applications exist also without added security, but would be less likely to be used as widely."

"Quantum computing has the potential to drastically improve machine learning. This would supercharge the development of better and more adapted artificial intelligence, which we are already seeing impacting businesses across all sectors.

"It is conceivable that quantum computing will have an impact on our lives in the next five to ten years, but it is difficult to forecast the exact nature of the innovations to come. I expect a continuous adaptation process as users start to learn how to use this new tool and how to apply it to their jobs, similar to how AI is slowly becoming more relevant in the mainstream workplace right now.

"Our research is currently driven by quite general assumptions, but as businesses start to explore the potential of quantum computing for them, more specific requirements will emerge and drive research into new directions."

Blind quantum cloud computing requires connecting a client computer that can detect photons, or particles of light, to a quantum computing server with a fibre optic cable (Figure B). The server generates single photons, which are sent through the fibre network and received by the client.

Figure B

The client then measures the polarisation, or orientation, of the photons, which tells it how to remotely manipulate the server in a way that will produce the desired computation. This can be done without the server needing access to any information about the computation, making it secure.
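
The angle-hiding idea behind this can be mimicked classically. The toy sketch below is loosely modelled on measurement-based blind quantum computing protocols such as Broadbent-Fitzsimons-Kashefi: the client publishes a measurement instruction that looks uniformly random to the server while secretly encoding its true computation angle. It is an illustration under assumed conventions, not the Oxford group's code, and every name in it is invented.

```python
import math
import random

# In measurement-based blind quantum computing, the server holds qubits that the
# client prepared with a secret rotation theta. The client asks the server to
# measure at delta = phi + theta + r*pi, where phi is the true computation angle
# and r is a secret random bit. Without theta and r, delta reveals nothing.

ANGLES = [k * math.pi / 4 for k in range(8)]  # a typical discrete angle set

def client_instruction(phi: float, theta: float, r: int) -> float:
    """Angle the client publicly sends to the server."""
    return (phi + theta + r * math.pi) % (2 * math.pi)

def client_decode(server_outcome: int, r: int) -> int:
    """Undo the client's secret bit flip to recover the true outcome."""
    return server_outcome ^ r

# Example round: the true angle phi never leaves the client.
phi = math.pi / 4                  # secret computation angle
theta = random.choice(ANGLES)      # secret pre-rotation chosen by the client
r = random.randint(0, 1)           # secret bit flip
delta = client_instruction(phi, theta, r)
print(f"server only ever sees delta = {delta:.3f} rad")
```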

To provide further assurance that the results of the computation are neither erroneous nor tampered with, additional tests can be undertaken. While tampering would not harm the security of the data in a blind quantum computation, it could still corrupt the result and leave the client unaware.

"The laws of quantum mechanics don't allow copying of information, and any attempt to observe the state of the memory by the server or an eavesdropper would corrupt the computation," Dr. Drmota explained to TechRepublic in an email. "In that case, the user would notice that the server isn't operating faithfully, using a feature called verification, and abort using their service if there are any doubts."

Since the server is blind to the computation (i.e., it is not able to distinguish different computations), the client can evaluate the reliability of the server by running simple tests whose results can be easily checked.

These tests can be interleaved with the actual computation until there is enough evidence that the server is operating correctly and the results of the actual computation can be trusted to be correct. This way, honest errors as well as malicious attempts to tamper with the computation can be detected by the client.
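
The interleaving of test rounds can be mocked up classically as well. The sketch below is a toy illustration under assumed numbers (trap fraction, round count, abort rule), not the protocol from the paper: "trap" rounds with a known expected outcome are mixed at random among real rounds, and the client aborts if any trap fails.

```python
import random

def run_round(kind: str, server_is_honest: bool) -> int:
    """Pretend to run one round on the server and return its reported outcome."""
    if kind == "trap":
        # Traps are constructed so the honest outcome is always 0; a cheating or
        # faulty server cannot tell traps from real rounds and answers at random.
        return 0 if server_is_honest else random.randint(0, 1)
    return random.randint(0, 1)                 # placeholder "real" result

def verified_computation(n_rounds: int, trap_fraction: float, honest: bool):
    failures = 0
    results = []
    for _ in range(n_rounds):
        if random.random() < trap_fraction:     # server cannot tell which is which
            if run_round("trap", honest) != 0:
                failures += 1
        else:
            results.append(run_round("compute", honest))
    if failures > 0:
        return None, failures                   # abort: do not trust the results
    return results, failures

results, failures = verified_computation(200, 0.5, honest=False)
print("aborted" if results is None else "accepted", "failed traps:", failures)
```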

The researchers found the computations their method produced could be verified robustly and reliably, according to the paper. This means the client can trust the results have not been tampered with. The approach is also scalable, as "the number of quantum elements being manipulated for performing calculations can be increased without increasing the number of physical qubits in the server and without modifications to the client hardware," the scientists wrote.

Dr. Drmota said in the press release, "Using blind quantum computing, clients can access remote quantum computers to process confidential data with secret algorithms and even verify the results are correct, without revealing any useful information. Realising this concept is a big step forward in both quantum computing and keeping our information safe online."

The research was funded by the UK Quantum Computing and Simulation Hub, a collaboration of 17 universities supported by commercial and government organisations. It is one of four quantum technology hubs in the UK National Quantum Technologies Programme.

See original here:

Quantum Cloud Computing Secured in New Breakthrough at Oxford - TechRepublic

Posted in Quantum Computing | Comments Off on Quantum Cloud Computing Secured in New Breakthrough at Oxford – TechRepublic

Quantum Computing Could be the Next Revolution – Fair Observer

Posted: at 9:20 am

Every few decades, the world witnesses technological revolutions that profoundly change our lives. This happened when we first invented computers, when we created the Internet and most recently when artificial intelligence (AI) emerged.

Today, experts frequently speculate that the next revolution will involve technologies grounded in the principles of quantum mechanics. One such technology is quantum computing. Harnessing the unique properties of quantum mechanics, quantum computers promise to achieve superior computational power, solving certain tasks that are beyond the reach of classical computers.

Quantum computers can potentially transform many sectors, from defense and finance to education, logistics and medicine. However, we are currently in a quantum age reminiscent of the pre-silicon era of classical computers. Back then, state-of-the-art computers like ENIAC ran on vacuum tubes, which were large, clunky, and required a lot of power. During the 1950s, experts investigated various platforms to develop the most efficient and effective computing systems. This journey eventually led to the widespread adoption of silicon semiconductors, which we still use today.

Similarly, today's quantum quest involves evaluating different potential platforms to produce what the industry commonly calls a fault-tolerant quantum computer: a quantum computer that can perform reliable operations despite the presence of errors in its hardware.

Tech giants, including Google and IBM, are adapting superconductors, materials that have zero resistance to electrical current, to build their quantum computers, claiming that they might be able to build a reasonably large quantum computer by 2030. Other companies and startups dedicated to quantum computing, such as QuEra, PsiQuantum and Alice & Bob, are experimenting with other platforms and occasionally even declaring that they might be able to build one before 2030.

Until the so-called fault-tolerant quantum computer is built, the industry needs to go through an era commonly referred to as the Noisy Intermediate-Scale Quantum (NISQ) era. NISQ quantum devices contain a few hundred quantum bits (qubits) and are typically prone to errors due to various quantum phenomena.

NISQ devices serve as early prototypes of fault-tolerant quantum computers and showcase their potential. However, they are not expected to clearly demonstrate practical advantages, such as solving large-scale optimization problems or simulating sufficiently complex chemical molecules.

Researchers attribute the difficulty of building such devices to the significant amount of errors (or noise) NISQ devices suffer from. This is not surprising, though. The basic computational units of quantum computers, the qubits, are highly sensitive quantum particles easily influenced by their environment. This is why one way to build a quantum computer is to cool these machines to near zero kelvin, a temperature colder than outer space. This reduces the interaction between the qubits and the surrounding environment, thus producing less noise.

Another approach is to accept that such levels of noise are inevitable and instead focus on mitigating, suppressing or correcting any errors produced by such noise. This constitutes a substantial area of research that must advance significantly if we are to facilitate the construction of fault-tolerant quantum computers.

As the construction of quantum devices progresses, research advances rapidly to explore potential applications, not just for future fault-tolerant computers, but also possibly for todays NISQ devices. Recent advances show promising results in specialized applications, such as optimization, artificial intelligence and simulation.

Many speculate that the first practical quantum advantage may appear in the field of optimization. Theoretical demonstrations have shown that quantum computers should be capable of solving certain optimization problems more efficiently than classical computers. Performing optimization tasks efficiently could have a profound impact on a broad range of problems. This is especially the case where the search for an optimal solution would otherwise require an astronomical number of trials.

Examples of such optimization problems are almost countless and can be found in major sectors such as finance (portfolio optimization and credit risk analysis), logistics (route optimization and supply chain optimization) and aviation (flight gate optimization and flight path optimization).

AI is another field in which experts anticipate quantum computers will make significant advances. By leveraging quantum phenomena such as superposition, entanglement and interference, which have no counterparts in classical computing, quantum computers may offer advantages in training and optimizing machine learning models.

However, we still do not have concrete evidence supporting such claimed advantages, as this would necessitate larger quantum devices than exist today. That said, early indications of these potential advantages are rapidly emerging within the research community.

Simulating quantum systems was the original application that motivated the idea of building quantum computers. Efficient simulations would likely have a drastic impact on many essential applications, such as materials science (finding new materials with superior properties, for example for better batteries) and drug discovery (developing new drugs by more accurately simulating quantum interactions between molecules).

Unfortunately, with the current NISQ devices, only simple molecules can be simulated. More complex molecules will need to wait for the advent of large fault-tolerant computers.

There is uncertainty surrounding the timeline and applications of quantum computers, but we should remember that the killer application for classical computers was not even remotely envisioned by their inventors. A killer application is the single application that contributed the most to the widespread use of a certain technology. For classical computers, the killer application, surprisingly, turned out to be spreadsheets.

For quantum computers, speculation often centers around simulation and optimization being the potential killer applications of this technology, but a definite winner is still far from certain. In fact, the quantum killer application may be something entirely unknown to us at this time and it may even arise from completely uncharted territories.

[Will Sherriff edited this piece.]

The views expressed in this article are the author's own and do not necessarily reflect Fair Observer's editorial policy.

Excerpt from:

Quantum Computing Could be the Next Revolution - Fair Observer

Posted in Quantum Computing | Comments Off on Quantum Computing Could be the Next Revolution – Fair Observer
