Cleveland Clinic Selected as Founding Partner in Greater Washington, D.C., Quantum Computing Hub – Cleveland Clinic Newsroom

Cleveland Clinic has been selected as a founding partner and the leading healthcare system in a new initiative meant to spur collaboration and innovation in the quantum computing industry.

Based in Greater Washington, D.C., Connected DMV and a cross-sector coalition of partners are developing the new Life Sciences and Healthcare Quantum Innovation Hub to prepare the industry for the burgeoning quantum era and align with key national and global efforts in life sciences and quantum technologies.

The U.S. Department of Commerce's Economic Development Administration (EDA) has awarded more than $600,000 to Connected DMV for development of the Hub. This will include the formation of a collaboration of at least 25 organizations specializing in quantum end uses and technology development.

Cleveland Clinic was invited to join the Hub because of its work in advancing medical research through quantum computing. As the lead healthcare system in the coalition, Cleveland Clinic will help define quantum's role in the future of healthcare and educate other health systems on its possibilities.

"We believe quantum computing holds great promise for accelerating the pace of scientific discovery," said Lara Jehi, M.D., M.H.C.D.S., Cleveland Clinic's Chief Research Information Officer. "As an academic medical center, research, innovation and education are an integral part of Cleveland Clinic's mission. Quantum, AI and other emerging technologies have the potential to revolutionize medicine, and we look forward to working with partners across healthcare and life sciences to solve complex medical problems and change the course of diseases like cancer, heart conditions and neurodegenerative disorders."

Last year, Cleveland Clinic announced a 10-year partnership with IBM to establish the Discovery Accelerator, a joint center focused on easing traditional bottlenecks in medical research through innovative technologies such as quantum computing, hybrid cloud and artificial intelligence. The partnership leverages Cleveland Clinic's medical expertise with the technology expertise of IBM, including its leadership in quantum technology, which recently resulted in the Breakthrough Prize in Fundamental Physics for quantum information science. The Discovery Accelerator will allow Cleveland Clinic to contribute to Connected DMV's Hub by advancing the pace of discovery with the first private-sector, on-premises Quantum System One being installed on Cleveland Clinic's main campus.

"Innovation is always iterative, and requires sustained collaboration between research, development and technology, and the industries that will benefit from the value generated," said George Thomas, Chief Innovation Officer of Connected DMV and lead of its Potomac Quantum Innovation Center initiative. "Quantum has the potential to have a substantive impact on our society in the near future, and the Life Sciences and Healthcare Quantum Innovation Hub will serve as the foundation for sustained focus and investment to accelerate and scale our path into the era of quantum."

The Hub will be part of Connected DMV's Potomac Quantum Innovation Center initiative, which aims to: accelerate quantum investment, research and development; develop an equitable and scalable talent pipeline; and scale collaboration between the public sector, academia, industry, community, and investors to accelerate the value of quantum. The Quantum Innovation Hubs are a part of this initiative, focused on accelerating quantum investment, research and development in key industry sectors.

Read more from the original source:
Cleveland Clinic Selected as Founding Partner in Greater Washington, D.C., Quantum Computing Hub - Cleveland Clinic Newsroom

What’s next for quantum computing | MIT Technology Review

For years, quantum computing's news cycle was dominated by headlines about record-setting systems. Researchers at Google and IBM have had spats over who achieved what, and whether it was worth the effort. But the time for arguing over who's got the biggest processor seems to have passed: firms are heads-down, preparing for life in the real world. Suddenly, everyone is behaving like grown-ups.

As if to emphasize how much researchers want to get off the hype train, IBM is expected to announce a processor in 2023 that bucks the trend of putting ever more quantum bits, or qubits, into play. Qubits, the processing units of quantum computers, can be built from a variety of technologies, including superconducting circuitry, trapped ions, and photons, the quantum particles of light.

IBM has long pursued superconducting qubits, and over the years the company has been making steady progress in increasing the number it can pack on a chip. In 2021, for example, IBM unveiled one with a record-breaking 127 of them. In November, it debuted its 433-qubit Osprey processor, and the company aims to release a 1,121-qubit processor called Condor in 2023.

But this year IBM is also expected to debut its Heron processor, which will have just 133 qubits. It might look like a backwards step, but, as the company is keen to point out, Heron's qubits will be of the highest quality. And, crucially, each chip will be able to connect directly to other Heron processors, heralding a shift from single quantum computing chips toward modular quantum computers built from multiple processors connected together, a move that is expected to help quantum computers scale up significantly.

Heron is a signal of larger shifts in the quantum computing industry. Thanks to some recent breakthroughs, aggressive roadmapping, and high levels of funding, we may see general-purpose quantum computers earlier than many would have anticipated just a few years ago, some experts suggest. "Overall, things are certainly progressing at a rapid pace," says Michele Mosca, deputy director of the Institute for Quantum Computing at the University of Waterloo.

Here are a few areas where experts expect to see progress.

IBM's Heron project is just a first step into the world of modular quantum computing. The chips will be connected with conventional electronics, so they will not be able to maintain the quantumness of information as it moves from processor to processor. But the hope is that such chips, ultimately linked together with quantum-friendly fiber-optic or microwave connections, will open the path toward distributed, large-scale quantum computers with as many as a million connected qubits. That may be how many are needed to run useful, error-corrected quantum algorithms. "We need technologies that scale both in size and in cost, so modularity is key," says Jerry Chow, director at IBM Quantum Hardware System Development.

See the rest here:
What's next for quantum computing | MIT Technology Review

UMN-led team receives $1.4M Keck Foundation grant to study possible breakthrough in quantum computing – UMN News

A University of Minnesota Twin Cities-led team received a $1.4 million award from the W. M. Keck Foundation to study a new process that combines quantum physics and biochemistry. If successful, the research could lead to a major breakthrough in the quantum computing field.

The project is one of two proposals the University of Minnesota submits each year to the Keck Foundation and is the first grant of its kind the University has received in 20 years.

Quantum computers have the potential to solve very complex problems at an unprecedented rate. They have applications in fields like cryptography, information security and supply chain optimization, and could one day assist in the discovery of new materials and drugs.

One of the biggest challenges for scientists is that the information stored in quantum bits (the building blocks of quantum computers) is often short-lived. Early-stage prototype quantum computers do exist, but they lose the information they store so quickly that solving big problems of practical relevance is currently unachievable.

One approach researchers have studied to make quantum devices more stable is combining semiconductors and superconductors to obtain robust states called Majorana modes, but this approach has been challenging and so far inconclusive, since it requires very high-purity semiconductors. U of M School of Physics and Astronomy Associate Professor Vlad Pribiag, who is leading the project, has come up with a new idea that could yield stable Majorana quantum structures.

Pribiag's proposed method leverages recent advances in DNA nanoassembly, combined with magnetic nanoparticles and superconductors, to detect Majoranas, theoretical particles that could be a key element for protecting quantum information and creating stable quantum devices.

"This is a radically new way to think about quantum devices," Pribiag said. "When I heard about this technique of DNA nanoassembly, I thought it fit right into this problem I had been working on about Majoranas and quantum devices. It's really a paradigm shift in the field, and it has tremendous potential for finding a way to protect quantum information so that we can build more advanced quantum machines to do these complex operations."

The project, entitled "Topological Quantum Architectures Through DNA Programmable Molecular Lithography," will span three years. Pribiag is collaborating with Columbia University Professor Oleg Gang, whose lab will handle the DNA nanoassembly part of the work.

About the W. M. Keck Foundation
Based in Los Angeles, the W. M. Keck Foundation was established in 1954 by the late W. M. Keck, founder of the Superior Oil Company. The Foundation's grant making is focused primarily on pioneering efforts in the areas of medical research and science and engineering. The Foundation also supports undergraduate education and maintains a Southern California Grant Program that provides support for the Los Angeles community, with a special emphasis on children and youth. For more information, visit the Keck Foundation website.

About the College of Science and Engineering
The University of Minnesota College of Science and Engineering brings together the University's programs in engineering, physical sciences, mathematics and computer science into one college. The college is ranked among the top academic programs in the country and includes 12 academic departments offering a wide range of degree programs at the baccalaureate, master's, and doctoral levels. Learn more at cse.umn.edu.

See more here:
UMN-led team receives $1.4M Keck Foundation grant to study possible breakthrough in quantum computing - UMN News

Leading in a changing world with Hybrid Cloud and AI, underpinned by security – Times of India

We live in a time of Digital Darwinism, where technology and society are evolving faster than businesses can adapt, and those who are digitally fit will survive and thrive. The pandemic has taught us that technology is undoubtedly the sutradhar, a unifying force of the digital economy, and will continue to propel India's digital mission. Technology is ingrained in every aspect of our lives, including how we work, live, build, connect, and transact. Exponential technologies like hybrid cloud and AI have taken center stage in the contactless world and are all set to scale through this techade. Harnessing the power of evolving technologies such as hybrid cloud, AI, and game-changing quantum computing will be crucial for India to lead the world through this digital revolution.

Hybrid cloud and AI are inherently dynamic and can expose organisations to cyber-attacks if data security isn't taken seriously. The Cost of a Data Breach Report shows that India's data breach costs increased to ₹176 million in 2022, a 25% increase from 2020. Businesses must note that a constantly evolving and increasingly complex tech ecosystem that is not resilient to evolving threats will compromise the integrity of, and trust in, next-generation technologies. For India to grow in the techade, it is critical to understand the dynamics of digitisation and its opportunities and challenges. This is particularly true for growing cybersecurity risk; it is the need of the hour for businesses to strengthen and continuously improve their security posture by establishing a zero-trust cyber security environment.

Hybrid Cloud: a critical strategy for our times

In a fast-evolving digital world, every enterprise's de facto infrastructure has become hybrid. The pandemic significantly accelerated digital transformation in all businesses, and cloud adoption helped in this journey. This urgent need for action prompted organisations to tactically assemble their current cloud estates quickly through a mix of public, private, and on-premises assets that may or may not work together efficiently. In fact, the average enterprise is expected to have 10 clouds by 2023, up from 8 in 2020. SaaS applications have also exploded, moving many standard business processes to the cloud. Without architectural guard rails, implementation pressures lead to corner-cutting, making the IT landscape more complex and costly, less secure, and less likely to deliver operational agility and better business outcomes. 71% of executives cited data integration across the cloud estate as an enterprise problem.

Hybrid Cloud is a mix of cloud environments that includes public, private, and on-premises infrastructure. This mix of environments often falls short of financial and operational expectations without a level of integration across them, with seamless interoperability and portability of applications and data across environments. Done right, a Hybrid Cloud Platform, open by design and without vendor lock-in, provides a fabric for orchestration, management, and application/data portability across these environments. This platform is increasingly relevant in a world where enterprises are building edge computing capabilities to monetize the opportunities available through the launch of 5G and 6G technology.

AI and Automation are foundational

With over two quintillion bytes of data created daily, much of it unstructured data that our computers have not been able to interpret before, Artificial Intelligence (AI) continues to gain relevance by augmenting human capabilities with analytics and insights to make informed decisions.

In India, data is generated everywhere. A village compounder, without Internet access, generates data when he requests the next batch of polio vaccine, even in a remote village. Likewise, the subsidy for medical procedures generates data about patients and their pre-existing conditions. Automation powered by AI will also help mitigate supply chain disruptions by making business processes and workflows more agile and efficient.

Through multi-lingual support, an NLP chatbot or conversational AI can bridge the language gap between Indias English-speaking and non-English-speaking populations. By using artificial intelligence to recognise handwriting across different languages in India, we can simplify data entry and create a clean data lake. From crop planning to precision farming, AI has the potential to empower farmers to deploy tailored interventions, whereas in the social sphere, AI will improve the quality of life for citizens, including hyper-personalisation in financial services and retail.

Evolution to Quantum

Advances in traditional classical computing, plus advances in AI, are driving the most important revolution in computing: the emergence of quantum computing. Enterprises will evolve from analysing data to discovering new ways to solve problems. When combined with open integration, AI, and hyper-automation, this will ultimately lead to new business models. As we have seen with the pandemic, disruption and uncertainties will make business models more sensitive to and dependent on new technologies. Quantum computing offers the potential to expand the scope and complexity of the business problems we can solve. Quantum computing, in combination with existing advanced technologies, will dramatically impact how science and business evolve by accelerating the discovery of solutions to big global challenges. The integration of quantum computing, AI, and classical computing into hybrid multicloud workflows will drive the most significant computing revolution in 60 years.

Quantum computing will not replace classical computing; it will extend and complement it by affording enterprises the opportunity to solve complex problems that test the limits of classical computing. Examples of use cases that enterprises have started to experiment with include untangling operational disruption for airlines (IROPS), enhancing contextual personalised services for customers, optimising airline network/shipping logistics planning globally, and healthcare solutions related to genomics, single-cell transcriptomics, and population health.

Cyber Security: the next strategic imperative

As next-generation technologies such as Hybrid Cloud, AI, IoT and Blockchain continue to evolve and become pervasive in a hyper-digitised world, cyber security becomes critical and is fast becoming a Board-level issue in all enterprises, irrespective of size.

Cyber security is now a strategic imperative for all enterprises: they must strengthen their digital defence by establishing a zero-trust cyber security environment, one that must be continuously and frequently evaluated, particularly as bad actors continue to get smarter and bolder as technology evolves. In addition, enterprises and governments need the capability to detect and respond to threats at scale. With the explosion of data and devices in the digital economy, there is an exponential increase in the number of devices connecting to the internet and the volume of data being generated. With the evolution of 5G, the attack surface will increase further due to the growing number of devices on the network.

With the emergence of advanced technologies, cyber threats will become more severe. For example, once quantum computers are scalable enough, they will be able to break the major cybersecurity protocols used today. Bad actors are stealing encrypted data today in order to break into it in the future. Enterprises must consider upgrading their systems to quantum-safe cryptography as a priority to protect against these attacks.

Views expressed above are the author's own.


Read the original:
Leading in a changing world with Hybrid Cloud and AI, underpinned by security - Times of India

Watch: How Abu Dhabi is ushering in a new era of computing with state-of-the-art quantum lab – Gulf News

Abu Dhabi: At the heart of Abu Dhabi's science research hub in Masdar, a new era of computing is taking shape. With massive investments towards becoming a leader in the field, Abu Dhabi could well revolutionise quantum computing when a newly developed foundry starts churning out quantum chips this summer.

With the world of computing still undecided on which platform works best to enable, and then scale up, quantum computing, chips manufactured at the laboratory will allow important experiments into the possibilities of various materials and configurations.

Quantum foundry

The laboratory is part of the Quantum Research Centre, one of a number of research interests at the Technology Innovation Institute (TII), which focuses on applied research and is part of the over-arching Advanced Technology Research Council in Abu Dhabi.

"TII Quantum Foundry will be the first quantum device fabrication facility in the UAE. At the moment, it is still under construction. We are installing the last of the tools needed to manufacture superconducting quantum chips. We are hoping that it will be ready soon, and hopefully by then, we can start manufacturing the first quantum chips in the UAE," Alvaro Orgaz, lead for quantum computing control at the TII's Quantum Research Centre, told Gulf News.

"The design of quantum chips is an area of active research at the moment. We are also interested in this. So, we will manufacture our chips and install them into our quantum refrigerators, then test them and improve on each iteration of the chip," he explained.

What is quantum computing?

Classical computers process information in bits: tiny on/off switches encoded as zeroes and ones. In contrast, quantum computing uses qubits as the fundamental unit of information.

Unlike classical bits, qubits can take advantage of a quantum mechanical effect called superposition, where they exist as 1 and 0 at the same time. One qubit cannot always be described independently of the state of the others either, a phenomenon called entanglement. The capacity of a quantum computer increases exponentially with the number of qubits. "The efficient usage of quantum entanglement drastically enhances the capacity of a quantum computer to be able to deal with challenging problems," explained Professor Dr José Ignacio Latorre, chief researcher at the Quantum Research Centre.
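To give a rough sense of why capacity grows exponentially: an n-qubit register is described by 2**n complex amplitudes, and every gate acts on all of them at once. The sketch below (an illustrative classical simulation, not part of any system described in this article) builds an entangled Bell state from |00> using a Hadamard gate and a CNOT.

```python
# Illustrative classical simulation: an n-qubit state is a vector of 2**n
# complex amplitudes, which is why capacity scales exponentially with qubits.
import math

def apply_hadamard(state, target):
    """Apply a Hadamard gate to the `target` qubit of a state vector."""
    h = 1 / math.sqrt(2)
    new = [0j] * len(state)
    for i, amp in enumerate(state):
        if amp == 0:
            continue
        flipped = i ^ (1 << target)
        if (i >> target) & 1 == 0:
            new[i] += h * amp        # |0> -> (|0> + |1>)/sqrt(2)
            new[flipped] += h * amp
        else:
            new[flipped] += h * amp  # |1> -> (|0> - |1>)/sqrt(2)
            new[i] -= h * amp
    return new

def apply_cnot(state, control, target):
    """Flip the `target` qubit wherever `control` is 1 (swaps amplitudes)."""
    new = list(state)
    for i in range(len(state)):
        if (i >> control) & 1:
            new[i] = state[i ^ (1 << target)]
    return new

state = [1 + 0j, 0j, 0j, 0j]                    # two qubits in |00>
state = apply_hadamard(state, target=0)          # superposition of |00> and |01>
state = apply_cnot(state, control=0, target=1)   # entangled Bell state
print([round(abs(a) ** 2, 3) for a in state])    # [0.5, 0.0, 0.0, 0.5]
```

The final probabilities are split between |00> and |11> only: measuring one qubit fixes the other, which is the entanglement Latorre describes. Doubling the qubit count squares the size of the state vector a classical machine must track.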

Why quantum computing?

When quantum computers were first proposed in the 1980s and 1990s, the aim was to help computing for certain complex systems such as molecules that cannot be accurately depicted with classical algorithms.

Quantum effects translate well to complex computations in some fields like pharmaceuticals and material sciences, as well as optimisation processes that are important in aviation, oil and gas, the energy sector and the financial sector. "In a classical computer, you can have one configuration of zeroes and ones or another. But in a quantum system, you can have many configurations of zeroes and ones processed simultaneously in a superposition state. This is the fundamental reason why quantum computers can solve some complex computational tasks more efficiently than classical computers," said Dr Leandro Aolita, executive director of quantum algorithms at the Quantum Research Centre.

Complementing classical computing

On a basic level, this means that quantum computers will not replace classical computers; they will complement them.

"There are some computational problems in which quantum computers will offer no speed-up. There are only some problems where they will be superior. So, you would not use a quantum computer, which is designed for high-performance computing, to write an email," the researcher explained. This is why, in addition to research, the TII is also working with industry partners to see which computational problems may translate well to quantum computing, and the speed-up this may provide, once the computers are mature enough to process them.

Quantum effect fragility

At this stage, the simplest quantum computer is already operational at the QRC laboratory in Masdar City. This includes two superconducting qubit chips mounted in refrigerators at the laboratory, even though quantum systems can be created on a number of different platforms.

"Here, the superconducting qubit chip is in a cooler that takes the system down to around 10 millikelvin, which is even cooler than the temperature of outer space. You have to isolate the system from the thermal environment, but you also need to be able to insert cables to control and read the qubits. This is the most difficult challenge from an engineering and a technological perspective, especially when you scale up to a million qubits, because quantum effects are so fragile. No one knows the exact geometric configurations to minimise the thermal fluctuations and the noise, [and this is one of the things that testing will look into once we manufacture different iterations of the quantum chip]," Dr Aolita explained.

Qubit quality

The quality of the qubit is also very important, which boils down to the manufacture of a chip with superconducting current that displays quantum effects. The chips at TII are barely 2x10 millimetres in size, and at their centre is a tiny circuit known as the Josephson junction that enables the control of quantum elements.

"It is also not just a matter of how many qubits you have, as the quality of the qubits matters. So, you need to have particles that preserve their quantum superposition, you need to be able to control them, have them interact the way you want, and read their state, but you also have to isolate them from the noise of the environment," he said.

Optimistic timeline

Despite these massive challenges to perfect a minute chip, Dr Aolita was also quite hopeful about the work being accomplished at TII, including discussions with industry about the possible applications of quantum computing.

"I think we could see some useful quantum advantages in terms of classical computing power in three to five years," he said. "[Right now], we have ideas, theories, preliminary experiments and even some prototypes. Quantum computers even exist, but they are small and still not able to outperform classical supercomputers. But this was the case with classical computing too. In the 1940s and 1950s, a computer was like an entire gym or vault. Then the transistor arrived, which revolutionised the field and miniaturised computers to much smaller regions of space that were also faster. Something similar could happen here, and it really is a matter of finding which kind of qubit to use, and this could ease the process a lot. My prediction for a timeline is optimistic, but not exaggerated," the researcher added.

Science research

Apart from the technological breakthroughs, the QRC's efforts are likely to also improve Abu Dhabi's status as a hub for science and research.

"The UAE has a long tradition of adopting technologies and incorporating technologies bought from abroad. This is now [different in] that the government is putting a serious stake in creating and producing this technology, and this creates a multiplicative effect in that young people get more enthusiastic about scientific careers. This creates more demand for universities to start new careers in physics, engineering, computer science, mathematics. This [will essentially have] a long-term, multiplicative effect on development, independent of the concrete goal or technical result of the project on the scientific environment in the country," Dr Aolita added.

The QRC team currently includes 45 people, but this will grow to 60 by the end of 2022, and perhaps to 80 people in 2023. "We also want to prioritise hiring the top talent from across the world," Dr Aolita added.

See the original post:
Watch: How Abu Dhabi is ushering in a new era of computing with state-of-the-art quantum lab - Gulf News

Top 5 Quantum Computing Crypto Tokens to Watch in 2022 – The VR Soldier

With the crypto market continuing to trade sideways with mounting bearish pressure, niche categories for crypto tokens remain highly popular as traders and investors prowl for underrated and undervalued projects for long-term investments. Some popular crypto token types include Metaverse tokens, Web3 coins, and dApp tokens on various ecosystems like Tron, Elrond, Ethereum, Polkadot, etc.

Cryptocurrencies have the power to revolutionize finance by cutting out intermediaries. By bringing their exceptional capability to the design process, quantum computers and supercomputers have the potential to revolutionize the way medicines and materials are created.

But here's the issue: if quantum computing develops more quickly than efforts to future-proof digital money, the blockchain accounting system that underpins cryptocurrencies may be susceptible to sophisticated hacks and fake transactions.

On the other hand, some new cryptocurrencies claim to be quantum-secure and quantum-resistant, meaning they can withstand known quantum computer attacks. We will look at some of these cryptocurrency tokens at the top of their game.

Note: This list is ordered by market capitalization, from lowest to highest.

Mochimo (MCM), a brand-new cryptocurrency developed by an international team and released on June 25th, 2018, is resistant to threats from quantum computers.

Mochimo uses WOTS+ quantum-resistant security, approved by the EU-funded PQCRYPTO research organization, and a one-time addressing feature to secure privacy when you want it.

According to the website, the Mochimo blockchain remains small while substantially increasing TX speed using ChainCrunch, a proprietary algorithm. Using a compressed portion of the historical blockchain available on every node in the decentralized network, anyone can set up a full working node in minutes.

Industry experts in computer networking, artificial intelligence, telecommunications, cryptography, and software engineering make up Mochimo's key contributors.

Some top cryptocurrency exchanges for trading Mochimo $MCM are currently CITEX, FINEXBOX, and VinDAX.

The goal of HyperCash (HC), originally known as Hcash, is to make value transfers possible between various blockchains. It supports DAO governance, quantum resistance, and zero-knowledge proofs.

It's a decentralized and open-source cross-platform cryptocurrency designed to facilitate the exchange of information between blockchains and non-blockchain networks.

It's also a highly secure network featuring quantum-resistant signature technology.

The HCASH network has two chains running laterally, each serving different functions within the ecosystem.

Hcash's governance is based on a hybrid PoW/PoS consensus methodology and a blockchain/DAG network.

If you want to know where to buy HyperCash at the current rate, check out these exchanges: OKX, MEXC, KuCoin, Huobi Global, Gate.io, and Hoo. HyperCash is up 3.87% in the last 24 hours.

Nexus (NXS) is a community-driven initiative with the shared goal of creating a society characterized by progressive and ethical principles, advanced technology, and universal access to connection on a free and open basis.

Since September 23rd, 2014, Nexus has been created through mining alone, without an ICO or premine. Nexus uses post-quantum signature schemes (FALCON) and automated key management functions through its Signature Chains technology.

This technology eliminates key management issues (wallet.dats) by allowing users to access their accounts with the familiarity of a username, password, and PIN.

Nexus is also developing a number of other technologies.

All the tech mentioned above is connected through a multi-dimensional chaining structure. Nexus is bringing this possibility to life with an end-to-end decentralized platform designed to empower every human being with technology to reclaim their digital identity.

Some top cryptocurrency exchanges for trading $NXS are Binance, Pionex, Bittrex, and CoinDCX.

The Quantum Resistant Ledger (QRL) is a fully quantum-resistant blockchain network using PQCRYPTO-recommended, IETF-standardized cryptography.

The QRL utilizes the hash-based eXtended Merkle Signature Scheme (XMSS) instead of ECDSA, which is reportedly vulnerable to quantum attacks and is found in many other blockchain projects.
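For intuition about how hash-based schemes differ from ECDSA, here is a minimal sketch (not QRL's actual code) of a Lamport one-time signature, the simplest ancestor of the WOTS+/XMSS family: security rests only on the one-wayness of a hash function, which is not known to fall to quantum algorithms, whereas ECDSA is broken by Shor's algorithm. All names below are illustrative.

```python
# Illustrative sketch of a Lamport one-time signature, NOT QRL's implementation.
# Security rests only on the hash function, which is why hash-based schemes
# are regarded as quantum-resistant, unlike ECDSA.
import hashlib
import os

def H(data):
    return hashlib.sha256(data).digest()

def keygen(bits=256):
    # One pair of random secrets per bit of the message digest.
    sk = [(os.urandom(32), os.urandom(32)) for _ in range(bits)]
    pk = [(H(a), H(b)) for a, b in sk]  # publish only the hashes
    return sk, pk

def digest_bit(digest, i):
    return (digest[i // 8] >> (7 - i % 8)) & 1

def sign(msg, sk):
    # Reveal one secret per digest bit: the left one for 0, the right for 1.
    d = H(msg)
    return [sk[i][digest_bit(d, i)] for i in range(len(sk))]

def verify(msg, sig, pk):
    d = H(msg)
    return all(H(sig[i]) == pk[i][digest_bit(d, i)] for i in range(len(pk)))

sk, pk = keygen()
sig = sign(b"transfer 5 QRL", sk)
print(verify(b"transfer 5 QRL", sig, pk))   # True
print(verify(b"transfer 9 QRL", sig, pk))   # False: the digests differ
```

Each Lamport key pair may sign only one message, since signing reveals half the secrets. XMSS addresses this by organizing many one-time keys under a Merkle tree, which is the scheme QRL builds on.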

According to the project, a set of applications and a development environment that enable users to simply build blockchain applications on its provably quantum-resistant network enhance the security of its platform.

Combining on-chain lattice key storage with its robust ephemeral messaging layer for internode communication provides a first-of-its-kind post-quantum secure message layer for ultra-secure digital communications.

The platform also has a full suite of products designed with the end user in mind, from hardware wallet integrations to mobile applications.

If you want to know where to buy $QRL, check out the CoinTiger exchange.

Launched in 2016, IOTA (MIOTA) is a distributed ledger. However, it differs significantly from a blockchain in that it isn't one. Instead, it uses a system of nodes called the Tangle, its patented technology, to confirm transactions.

There are no fees since there is no blockchain, no mining, i.e., no miners. When congestion worsens, costs soar on many conventional networks, but IOTA seeks to offer limitless capacity at a low price.

The platform's foundation claims it provides much faster speeds than traditional blockchains and has the perfect footprint for the ever-expanding Internet of Things ecosystem.

The objective of IOTA is to establish itself as the default platform for carrying out IoT device transactions.

In summary:

According to the team behind the project, their distributed ledger may provide everyone access to digital identities, lead to auto insurance policies based on actual usage, open the door to cutting-edge smart cities, facilitate frictionless international trade, and establish the legitimacy of goods.

Some top cryptocurrency exchanges for trading $MIOTA are Binance, OKX, Bybit, Bitget, and BingX.

Disclosure: This is not trading or investment advice. Always do your research before buying any Quantum Computing token or investing in any cryptocurrency.



Original post:
Top 5 Quantum Computing Crypto Tokens to Watch in 2022 - The VR Soldier

Rigetti and Riverlane Receive a 500 Thousand ($613K USD) Grant to Work on Error Correction – Quantum Computing Report


Rigetti Computing and Riverlane have received this grant from Innovate UK, the UK's national innovation agency, to study syndrome extraction on superconducting quantum computers. This would be a critical step toward providing error correction for the qubits in a fault-tolerant quantum computer. Since a qubit cannot be measured directly without collapsing its state, quantum error correction circuits have to be made more complex than their classical counterparts. A common method is to include additional qubits in the circuit, called ancilla (or auxiliary) qubits, that can be entangled with the data qubits and subsequently measured to form a syndrome pattern. This syndrome pattern can indicate whether there is an error within the data qubits and which qubits are affected. Additional gates can then be applied to the data qubits, based upon the syndrome, to fix the data qubits and correct the errors. The beauty of this approach is that while the ancilla qubits are measured, the data qubits are not, so they remain in their quantum state and don't collapse. This research from Rigetti and Riverlane will explore ways of implementing this error correction process while minimizing any additional errors that could result from the syndrome extraction process itself. For more about this grant and research project, you can view a press release located here.

June 27, 2022



What is quantum computing? – TechTarget

Quantum computing is an area of study focused on the development of computer-based technologies centered around the principles of quantum theory. Quantum theory explains the nature and behavior of energy and matter on the quantum (atomic and subatomic) level. Quantum computing uses a combination of bits to perform specific computational tasks, all at a much higher efficiency than their classical counterparts. The development of quantum computers marks a leap forward in computing capability, with massive performance gains for specific use cases. For example, quantum computing excels at tasks such as simulations.

The quantum computer gains much of its processing power through the ability of bits to be in multiple states at one time. They can perform tasks using a combination of 1s, 0s, and both a 1 and 0 simultaneously. Current research centers in quantum computing include MIT, IBM, Oxford University, and the Los Alamos National Laboratory. In addition, developers have begun gaining access to quantum computers through cloud services.

Quantum computing began with finding its essential elements. In 1981, Paul Benioff at Argonne National Labs came up with the idea of a computer that operated with quantum mechanical principles. It is generally accepted that David Deutsch of Oxford University provided the critical idea behind quantum computing research. In 1984, he began to wonder about the possibility of designing a computer that was based exclusively on quantum rules, publishing a breakthrough paper a few months later.

Quantum Theory

Quantum theory's development began in 1900 with a presentation by Max Planck. The presentation was to the German Physical Society, in which Planck introduced the idea that energy and matter exists in individual units. Further developments by a number of scientists over the following thirty years led to the modern understanding of quantum theory.

The Essential Elements of Quantum Theory:

Further Developments of Quantum Theory

Niels Bohr proposed the Copenhagen interpretation of quantum theory. This interpretation asserts that a particle is whatever it is measured to be, but that it cannot be assumed to have specific properties, or even to exist, until it is measured. This relates to a principle called superposition, which claims that while we do not know what the state of a given object is, it is actually in all possible states simultaneously -- as long as we don't look to check.

To illustrate this theory, we can use the famous analogy of Schrodinger's Cat. First, we have a living cat and place it in a lead box. At this stage, there is no question that the cat is alive. Then throw in a vial of cyanide and seal the box. We do not know if the cat is alive or if it has broken the cyanide capsule and died. Since we do not know, the cat is both alive and dead, according to quantum law -- in a superposition of states. It is only when we break open the box and see what condition the cat is in that the superposition is lost, and the cat must be either alive or dead.

The principle that, in some way, one particle can exist in numerous states opens up profound implications for computing.

A Comparison of Classical and Quantum Computing

Classical computing relies on principles expressed by Boolean algebra, usually operating with a 3- or 7-mode logic gate principle. Data must be processed in an exclusive binary state at any point in time: either 0 (off/false) or 1 (on/true). These values are binary digits, or bits. The millions of transistors and capacitors at the heart of computers can only be in one state at any point. In addition, there is still a limit as to how quickly these devices can be made to switch states. As we progress to smaller and faster circuits, we begin to reach the physical limits of materials and the threshold for classical laws of physics to apply.

The quantum computer operates with a two-mode logic gate: XOR and a mode called QO1 (the ability to change 0 into a superposition of 0 and 1). In a quantum computer, a number of elemental particles such as electrons or photons can be used. Each particle is given a charge, or polarization, acting as a representation of 0 and/or 1. Each particle is called a quantum bit, or qubit. The nature and behavior of these particles form the basis of quantum computing and quantum supremacy. The two most relevant aspects of quantum physics are the principles of superposition and entanglement.

Superposition

Think of a qubit as an electron in a magnetic field. The electron's spin may be either in alignment with the field, which is known as a spin-up state, or opposite to the field, which is known as a spin-down state. Changing the electron's spin from one state to another is achieved by using a pulse of energy, such as from a laser. If only half a unit of laser energy is used, and the particle is isolated from all external influences, the particle then enters a superposition of states, behaving as if it were in both states simultaneously.

Each qubit utilized could take a superposition of both 0 and 1, meaning the number of states a quantum computer could represent is 2^n, where n is the number of qubits used. A quantum computer comprising 500 qubits would have the potential to handle 2^500 states in a single step. For reference, 2^500 is vastly more than the number of atoms in the known universe. These particles all interact with each other via quantum entanglement.
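As a quick sanity check on those numbers, a couple of lines of Python confirm how fast 2^n outgrows physical scales (the 10^80 atom count below is the commonly cited rough estimate, not a figure from the article):

```python
# The state space of n qubits has 2**n basis states.
n = 500
states = 2 ** n
atoms_in_universe = 10 ** 80   # commonly cited rough estimate

print(len(str(states)))        # 2**500 has 151 decimal digits
assert states > atoms_in_universe
```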

In comparison to classical computing, quantum computing counts as true parallel processing. Classical computers today still only truly do one thing at a time; in classical computing, parallel processing simply means using two or more processors.

Entanglement

Particles (such as qubits) that have interacted at some point retain a type of connection and can be entangled with each other in pairs, in a process known as correlation. Knowing the spin state of one entangled particle -- up or down -- gives away the spin of the other, which is in the opposite direction. In addition, due to superposition, the measured particle has no single spin direction before being measured. The spin state of the particle being measured is determined at the time of measurement and communicated to the correlated particle, which simultaneously assumes the opposite spin direction. The reason why is not yet explained.

Quantum entanglement allows qubits that are separated by large distances to interact with each other instantaneously (not limited to the speed of light). No matter how great the distance between the correlated particles, they will remain entangled as long as they are isolated.

Taken together, quantum superposition and entanglement create an enormously enhanced computing power. Where a 2-bit register in an ordinary computer can store only one of four binary configurations (00, 01, 10, or 11) at any given time, a 2-qubit register in a quantum computer can store all four numbers simultaneously. This is because each qubit represents two values. If more qubits are added, the increased capacity is expanded exponentially.
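The bookkeeping behind that claim can be sketched in a few lines of Python: a 2-qubit register is described by four amplitudes, one per basis state, all present at once. This is a toy classical representation for illustration, not a quantum simulation:

```python
# A 2-qubit register holds amplitudes for all 2**2 basis states
# (00, 01, 10, 11) simultaneously in superposition.
import itertools
import math

n = 2
amp = 1 / math.sqrt(2 ** n)   # uniform superposition over all basis states
state = {"".join(bits): amp for bits in itertools.product("01", repeat=n)}

assert len(state) == 4        # all four configurations stored at once
# Squared amplitudes are probabilities, so they must sum to 1.
assert abs(sum(a * a for a in state.values()) - 1.0) < 1e-12
```

Adding one more qubit doubles the number of amplitudes, which is the exponential growth in capacity the paragraph describes.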

Quantum Programming

Quantum computing offers an ability to write programs in a completely new way. For example, a quantum computer could incorporate a programming sequence that would be along the lines of "take all the superpositions of all the prior computations." This would permit extremely fast ways of solving certain mathematical problems, such as factorization of large numbers.

The first quantum computing program appeared in 1994, when Peter Shor developed a quantum algorithm that could efficiently factor large numbers.

The Problems - And Some Solutions

The benefits of quantum computing are promising, but there are still huge obstacles to overcome. Some problems with quantum computing are:

There are many problems to overcome, such as how to handle security and quantum cryptography. Long-term quantum information storage has also been a problem in the past. However, breakthroughs in the last 15 years, and in the recent past, have made some form of quantum computing practical. There is still much debate as to whether this is less than a decade away or a hundred years into the future. However, the potential that this technology offers is attracting tremendous interest from both the government and the private sector. Military applications include the ability to break encryption keys via brute-force searches, while civilian applications range from DNA modeling to complex material science analysis.


The Spooky Quantum Phenomenon You’ve Never Heard Of – Quanta Magazine

Perhaps the most famously weird feature of quantum mechanics is nonlocality: Measure one particle in an entangled pair whose partner is miles away, and the measurement seems to rip through the intervening space to instantaneously affect its partner. This spooky action at a distance (as Albert Einstein called it) has been the main focus of tests of quantum theory.

Nonlocality is spectacular. I mean, it's like magic, said Adán Cabello, a physicist at the University of Seville in Spain.

But Cabello and others are interested in investigating a lesser-known but equally magical aspect of quantum mechanics: contextuality. Contextuality says that properties of particles, such as their position or polarization, exist only within the context of a measurement. Instead of thinking of particles' properties as having fixed values, consider them more like words in language, whose meanings can change depending on the context: Time flies like an arrow. Fruit flies like bananas.

Although contextuality has lived in nonlocality's shadow for over 50 years, quantum physicists now consider it more of a hallmark feature of quantum systems than nonlocality is. A single particle, for instance, is a quantum system in which you cannot even think about nonlocality, since the particle is only in one location, said Bárbara Amaral, a physicist at the University of São Paulo in Brazil. So [contextuality] is more general in some sense, and I think this is important to really understand the power of quantum systems and to go deeper into why quantum theory is the way it is.

Researchers have also found tantalizing links between contextuality and problems that quantum computers can efficiently solve that ordinary computers cannot; investigating these links could help guide researchers in developing new quantum computing approaches and algorithms.

And with renewed theoretical interest comes a renewed experimental effort to prove that our world is indeed contextual. In February, Cabello, in collaboration with Kihwan Kim at Tsinghua University in Beijing, China, published a paper in which they claimed to have performed the first loophole-free experimental test of contextuality.

The Northern Irish physicist John Stewart Bell is widely credited with showing that quantum systems can be nonlocal. By comparing the outcomes of measurements of two entangled particles, he showed with his eponymous theorem of 1965 that the high degree of correlations between the particles can't possibly be explained in terms of local hidden variables defining each one's separate properties. The information contained in the entangled pair must be shared nonlocally between the particles.

Bell also proved a similar theorem about contextuality. He and, separately, Simon Kochen and Ernst Specker showed that it is impossible for a quantum system to have hidden variables that define the values of all its properties in all possible contexts.

In Kochen and Specker's version of the proof, they considered a single particle with a quantum property called spin, which has both a magnitude and a direction. Measuring the spin's magnitude along any direction always results in one of two outcomes: 1 or 0. The researchers then asked: Is it possible that the particle secretly knows what the result of every possible measurement will be before it is measured? In other words, could they assign a fixed value -- a hidden variable -- to all outcomes of all possible measurements at once?

Quantum theory says that the magnitudes of the spins along three perpendicular directions must obey the 101 rule: The outcomes of two of the measurements must be 1 and the other must be 0. Kochen and Specker used this rule to arrive at a contradiction. First, they assumed that each particle had a fixed, intrinsic value for each direction of spin. They then conducted a hypothetical spin measurement along some unique direction, assigning either 0 or 1 to the outcome. They then repeatedly rotated the direction of their hypothetical measurement and measured again, each time either freely assigning a value to the outcome or deducing what the value must be in order to satisfy the 101 rule together with the directions they had previously considered.

They continued until, in the 117th direction, the contradiction cropped up. While they had previously assigned a value of 0 to the spin along this direction, the 101 rule was now dictating that the spin must be 1. The outcome of a measurement could not possibly return both 0 and 1. So the physicists concluded that there is no way a particle can have fixed hidden variables that remain the same regardless of context.

While the proof indicated that quantum theory demands contextuality, there was no way to actually demonstrate this through 117 simultaneous measurements of a single particle. Physicists have since devised more practical, experimentally implementable versions of the original Bell-Kochen-Specker theorem involving multiple entangled particles, where a particular measurement on one particle defines a context for the others.

In 2009, contextuality, a seemingly esoteric aspect of the underlying fabric of reality, got a direct application: One of the simplified versions of the original Bell-Kochen-Specker theorem was shown to be equivalent to a basic quantum computation.

The proof, named Mermin's star after its originator, David Mermin, considered various combinations of contextual measurements that could be made on three entangled quantum bits, or qubits. The logic of how earlier measurements shape the outcomes of later measurements has become the basis for an approach called measurement-based quantum computing. The discovery suggested that contextuality might be key to why quantum computers can solve certain problems faster than classical computers, an advantage that researchers have struggled mightily to understand.

Robert Raussendorf, a physicist at the University of British Columbia and a pioneer of measurement-based quantum computing, showed that contextuality is necessary for a quantum computer to beat a classical computer at some tasks, but he doesn't think it's the whole story. Whether contextuality powers quantum computers is probably not exactly the right question to ask, he said. But we need to get there question by question. So we ask a question that we understand how to ask; we get an answer. We ask the next question.

Some researchers have suggested loopholes around Bell, Kochen and Specker's conclusion that the world is contextual. They argue that context-independent hidden variables haven't been conclusively ruled out.

In February, Cabello and Kim announced that they had closed every plausible loophole by performing a loophole-free Bell-Kochen-Specker experiment.

The experiment entailed measuring the spins of two entangled trapped ions in various directions, where the choice of measurement on one ion defined the context for the other ion. The physicists showed that, although making a measurement on one ion does not physically affect the other, it changes the context and hence the outcome of the second ion's measurement.

Skeptics would ask: How can you be certain that the context created by the first measurement is what changed the second measurement outcome, rather than other conditions that might vary from experiment to experiment? Cabello and Kim closed this sharpness loophole by performing thousands of sets of measurements and showing that the outcomes don't change if the context doesn't. After ruling out this and other loopholes, they concluded that the only reasonable explanation for their results is contextuality.

Cabello and others think that these experiments could be used in the future to test the level of contextuality, and hence the power, of quantum computing devices.

If you want to really understand how the world is working, said Cabello, you really need to go into the detail of quantum contextuality.


Global Quantum Computing Market is estimated to be US$ 4531.04 billion by 2030 with a CAGR of 28.2% during the forecast period – By PMI -…

Covina, June 22, 2022 (GLOBE NEWSWIRE) -- The discovery of potential COVID-19 therapeutics has a bright future due to quantum computing. New approaches to drug discovery are being investigated with funding from the Penn State Institute for Computational and Data Sciences, coordinated through the Penn State Huck Institutes of the Life Sciences. For businesses in the quantum computing market, these tendencies are turning into lucrative opportunities during the forecast period. Research initiatives that are assisting in the screening of billions of chemical compounds to uncover suitable medication candidates have been made possible by the convergence of machine learning and quantum physics. Stakeholders in the quantum computing business are expanding the availability of supercomputers and growing R&D in artificial intelligence (AI) to support these studies. The energy and electricity sector offers lucrative potential for businesses in the quantum computing market. With regard to whole assets, workovers, and infrastructure, this technology is assisting players in the energy and power sector in making crucial investment decisions. Budgetary considerations, resource constraints, and contractual commitments may all be factors in the issues that quantum computing can help to resolve.

Region Analysis:

North America is predicted to hold a large market share for quantum computing due to its early adoption of cutting-edge technology. Additionally, the existence of a competitive market and end-user acceptance of cutting-edge technology may promote market growth. Sales are anticipated to increase throughout Europe as a result of the rise of multiple startups, favourable legislative conditions, and the growing use of cloud technology. In addition, it is anticipated that leading companies' expansion will accelerate market growth. The market is anticipated to grow in Asia Pacific as a result of the growing need for quantum computing solutions for simulation, optimization, and machine learning.

Key Highlights:


Key Market Insights from the report:

Global Quantum Computing Market size accounted for US$ 387.3 billion in 2020 and is estimated to be US$ 4531.04 billion by 2030, registering a CAGR of 28.2%. The Global Quantum Computing Market is segmented based on component, application, end-user industry, and region.

Competitive Landscape & their strategies of Quantum Computing Market:

Key players in the global quantum computing market include Wave Systems Corp, 1QB Information Technologies Inc, QC Ware, Corp, Google Inc, QxBranch LLC, Microsoft Corporation, International Business Machines Corporation, Huawei Technologies Co., Ltd, ID Quantique SA, and Atos SE.

Scope of the Report:

Global Quantum Computing Market, By Component, 2019-2029 (US$ Mn)


Some Important Points Answered in this Market Report Are Given Below:

Browse Related Reports:

1.Photonic Integrated Circuit Market, By Integration (Monolithic Integration, Hybrid Integration, and Module Integration), By Raw Material (Gallium Arsenide, Indium Phosphide, Silica On Silicon, Silicon On Insulator, and Lithium Niobate), By Application (Optical Fiber Communication, Optical Fiber Sensors, Biomedical, and Quantum Computing), and By Region (North America, Europe, Asia-Pacific, Latin America, and Middle East & Africa) - Trends, Analysis, and Forecast till 2029

2.Edge Computing Market, By Component (Hardware, Services, Platform, and Solutions), By Application (Location Services, Analytics, Data Caching, Smart Cities, Environmental Monitoring, Optimized Local Content, Augmented Reality, Optimized Local Content, and Others), By End-User (Telecommunication & IT, Healthcare, Government & Public, Retail, Media & Entertainment, Transportation, Energy & Utilities, and Manufacturing), and By Region (North America, Europe, Asia Pacific, Latin America, and Middle East & Africa) - Trends, Analysis, and Forecast till 2029

3.Global 5G Technology Infrastructure Market, By Communication Infrastructure (Small Cell, Macro Cell, Radio Access Network, and Distributed Antenna System), By Network Technology (Software Defined Networking & Network Function Virtualization, Mobile Edge Computing, and Fog Computing), By Application (Automotive, Energy & Utilities, Healthcare, Retail, and Others), and By Region (North America, Europe, Asia Pacific, Latin America, and Middle East & Africa) - Trends, Analysis and Forecast till 2029


Quantum Error Correction: Time to Make It Work – IEEE Spectrum

Dates chiseled into an ancient tombstone have more in common with the data in your phone or laptop than you may realize. They both involve conventional, classical information, carried by hardware that is relatively immune to errors. The situation inside a quantum computer is far different: The information itself has its own idiosyncratic properties, and compared with standard digital microelectronics, state-of-the-art quantum-computer hardware is more than a billion trillion times as likely to suffer a fault. This tremendous susceptibility to errors is the single biggest problem holding back quantum computing from realizing its great promise.

Fortunately, an approach known as quantum error correction (QEC) can remedy this problem, at least in principle. A mature body of theory built up over the past quarter century now provides a solid theoretical foundation, and experimentalists have demonstrated dozens of proof-of-principle examples of QEC. But these experiments still have not reached the level of quality and sophistication needed to reduce the overall error rate in a system.

The two of us, along with many other researchers involved in quantum computing, are trying to move definitively beyond these preliminary demos of QEC so that it can be employed to build useful, large-scale quantum computers. But before describing how we think such error correction can be made practical, we need to first review what makes a quantum computer tick.

Information is physical. This was the mantra of the distinguished IBM researcher Rolf Landauer. Abstract though it may seem, information always involves a physical representation, and the physics matters.

Conventional digital information consists of bits, zeros and ones, which can be represented by classical states of matter, that is, states well described by classical physics. Quantum information, by contrast, involves qubitsquantum bitswhose properties follow the peculiar rules of quantum mechanics.

A classical bit has only two possible values: 0 or 1. A qubit, however, can occupy a superposition of these two information states, taking on characteristics of both. Polarized light provides intuitive examples of superpositions. You could use horizontally polarized light to represent 0 and vertically polarized light to represent 1, but light can also be polarized on an angle and then has both horizontal and vertical components at once. Indeed, one way to represent a qubit is by the polarization of a single photon of light.

These ideas generalize to groups of n bits or qubits: n bits can represent any one of 2^n possible values at any moment, while n qubits can include components corresponding to all 2^n classical states simultaneously in superposition. These superpositions provide a vast range of possible states for a quantum computer to work with, albeit with limitations on how they can be manipulated and accessed. Superposition of information is a central resource used in quantum processing and, along with other quantum rules, enables powerful new ways to compute.

Researchers are experimenting with many different physical systems to hold and process quantum information, including light, trapped atoms and ions, and solid-state devices based on semiconductors or superconductors. For the purpose of realizing qubits, all these systems follow the same underlying mathematical rules of quantum physics, and all of them are highly sensitive to environmental fluctuations that introduce errors. By contrast, the transistors that handle classical information in modern digital electronics can reliably perform a billion operations per second for decades with a vanishingly small chance of a hardware fault.

Of particular concern is the fact that qubit states can roam over a continuous range of superpositions. Polarized light again provides a good analogy: The angle of linear polarization can take any value from 0 to 180 degrees.

Pictorially, a qubit's state can be thought of as an arrow pointing to a location on the surface of a sphere. Known as a Bloch sphere, its north and south poles represent the binary states 0 and 1, respectively, and all other locations on its surface represent possible quantum superpositions of those two states. Noise causes the Bloch arrow to drift around the sphere over time. A conventional computer represents 0 and 1 with physical quantities, such as capacitor voltages, that can be locked near the correct values to suppress this kind of continuous wandering and unwanted bit flips. There is no comparable way to lock the qubit's arrow to its correct location on the Bloch sphere.
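For readers who want that arrow made concrete, here is a small, illustrative Python helper (not from the article) that maps a qubit's two amplitudes to its Bloch-sphere angles, using the standard parameterization a = cos(theta/2), b = e^(i*phi) * sin(theta/2):

```python
# Convert a pure qubit state a|0> + b|1> into Bloch-sphere angles.
# theta is the polar angle (0 = north pole = |0>, pi = south pole = |1>);
# phi is the azimuthal angle around the sphere's axis.
import cmath
import math

def bloch_angles(a, b):
    theta = 2 * math.acos(abs(a))
    phi = cmath.phase(b) - cmath.phase(a)
    return theta, phi

theta, phi = bloch_angles(1.0, 0.0)      # the |0> state sits at the north pole
assert abs(theta) < 1e-12

s = 1 / math.sqrt(2)
theta, phi = bloch_angles(s, s)          # an equal superposition sits on the equator
assert abs(theta - math.pi / 2) < 1e-9
```

The continuous range of (theta, phi) values is exactly the continuous drift problem the paragraph describes: noise can nudge these angles by any amount, however small.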

Early in the 1990s, Landauer and others argued that this difficulty presented a fundamental obstacle to building useful quantum computers. The issue is known as scalability: Although a simple quantum processor performing a few operations on a handful of qubits might be possible, could you scale up the technology to systems that could run lengthy computations on large arrays of qubits? A type of classical computation called analog computing also uses continuous quantities and is suitable for some tasks, but the problem of continuous errors prevents the complexity of such systems from being scaled up. Continuous errors with qubits seemed to doom quantum computers to the same fate.

We now know better. Theoreticians have successfully adapted the theory of error correction for classical digital data to quantum settings. QEC makes scalable quantum processing possible in a way that is impossible for analog computers. To get a sense of how it works, it's worthwhile to review how error correction is performed in classical settings.

Simple schemes can deal with errors in classical information. For instance, in the 19th century, ships routinely carried clocks for determining the ship's longitude during voyages. A good clock that could keep track of the time in Greenwich, in combination with the sun's position in the sky, provided the necessary data. A mistimed clock could lead to dangerous navigational errors, though, so ships often carried at least three of them. Two clocks reading different times could detect when one was at fault, but three were needed to identify which timepiece was faulty and correct it through a majority vote.

The use of multiple clocks is an example of a repetition code: Information is redundantly encoded in multiple physical devices such that a disturbance in one can be identified and corrected.

As you might expect, quantum mechanics adds some major complications when dealing with errors. Two problems in particular might seem to dash any hopes of using a quantum repetition code. The first problem is that measurements fundamentally disturb quantum systems. So if you encoded information on three qubits, for instance, observing them directly to check for errors would ruin them. Like Schrödinger's cat when its box is opened, their quantum states would be irrevocably changed, spoiling the very quantum features your computer was intended to exploit.

The second issue is a fundamental result in quantum mechanics called the no-cloning theorem, which tells us it is impossible to make a perfect copy of an unknown quantum state. If you know the exact superposition state of your qubit, there is no problem producing any number of other qubits in the same state. But once a computation is running and you no longer know what state a qubit has evolved to, you cannot manufacture faithful copies of that qubit except by duplicating the entire process up to that point.

Fortunately, you can sidestep both of these obstacles. We'll first describe how to evade the measurement problem using the example of a classical three-bit repetition code. You don't actually need to know the state of every individual code bit to identify which one, if any, has flipped. Instead, you ask two questions: "Are bits 1 and 2 the same?" and "Are bits 2 and 3 the same?" These are called parity-check questions because two identical bits are said to have even parity, and two unequal bits have odd parity.

The two answers to those questions identify which single bit has flipped, and you can then counterflip that bit to correct the error. You can do all this without ever determining what value each code bit holds. A similar strategy works to correct errors in a quantum system.
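As a concrete sketch (our own illustration, not the article's code), the two parity answers form a "syndrome" that pinpoints the flipped bit without ever reading any bit's value:

```python
def parity_checks(bits):
    """Ask: are bits 1 and 2 the same? Are bits 2 and 3 the same?"""
    s1 = bits[0] ^ bits[1]   # 0 = even parity (same), 1 = odd (different)
    s2 = bits[1] ^ bits[2]
    return (s1, s2)

# The pair of answers identifies which single bit (if any) flipped:
SYNDROME_TO_FLIP = {
    (0, 0): None,  # no error
    (1, 0): 0,     # bit 1 flipped
    (1, 1): 1,     # bit 2 flipped
    (0, 1): 2,     # bit 3 flipped
}

def correct(bits):
    """Counterflip the errant bit, never learning the bits' actual values."""
    flip = SYNDROME_TO_FLIP[parity_checks(bits)]
    if flip is not None:
        bits[flip] ^= 1
    return bits

assert correct([0, 1, 0]) == [0, 0, 0]   # middle bit restored
```

Note that the same syndrome table works whether the encoded bit is 0 or 1, which is why the parity checks reveal nothing about the stored value itself.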

Learning the values of the parity checks still requires quantum measurement, but importantly, it does not reveal the underlying quantum information. Additional qubits can be used as disposable resources to obtain the parity values without revealing (and thus without disturbing) the encoded information itself.

What about no-cloning? It turns out it is possible to take a qubit whose state is unknown and encode that hidden state in a superposition across multiple qubits in a way that does not clone the original information. This process allows you to record what amounts to a single logical qubit of information across three physical qubits, and you can perform parity checks and corrective steps to protect the logical qubit against noise.

Quantum errors consist of more than just bit-flip errors, though, making this simple three-qubit repetition code unsuitable for protecting against all possible quantum errors. True QEC requires something more. That came in the mid-1990s when Peter Shor (then at AT&T Bell Laboratories, in Murray Hill, N.J.) described an elegant scheme to encode one logical qubit into nine physical qubits by embedding a repetition code inside another code. Shors scheme protects against an arbitrary quantum error on any one of the physical qubits.

Since then, the QEC community has developed many improved encoding schemes, which use fewer physical qubits per logical qubit (the most compact use five) or enjoy other performance enhancements. Today, the workhorse of large-scale proposals for error correction in quantum computers is called the surface code, developed in the late 1990s by borrowing exotic mathematics from topology and high-energy physics.

It is convenient to think of a quantum computer as being made up of logical qubits and logical gates that sit atop an underlying foundation of physical devices. These physical devices are subject to noise, which creates physical errors that accumulate over time. Periodically, generalized parity measurements (called syndrome measurements) identify the physical errors, and corrections remove them before they cause damage at the logical level.

A quantum computation with QEC then consists of cycles of gates acting on qubits, syndrome measurements, error inference, and corrections. In terms more familiar to engineers, QEC is a form of feedback stabilization that uses indirect measurements to gain just the information needed to correct errors.
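This feedback loop can be modeled with a toy simulation, using the classical three-bit code as a stand-in for a logical qubit (the function names and the independent bit-flip noise model are our own assumptions, not the article's):

```python
import random

SYNDROME_TO_FLIP = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}

def noise(bits, p):
    """Physical errors: each bit flips independently with probability p."""
    return [b ^ (random.random() < p) for b in bits]

def correct(bits):
    """Syndrome measurement plus correction: the feedback step of the cycle."""
    syndrome = (bits[0] ^ bits[1], bits[1] ^ bits[2])
    flip = SYNDROME_TO_FLIP[syndrome]
    if flip is not None:
        bits[flip] ^= 1
    return bits

def run(cycles, p, seed=0):
    """Repeated QEC cycles on an encoded logical 0; count logical failures."""
    random.seed(seed)
    failures = 0
    bits = [0, 0, 0]
    for _ in range(cycles):
        bits = correct(noise(bits, p))
        if bits != [0, 0, 0]:     # two or more flips in one cycle: logical error
            failures += 1
            bits = [0, 0, 0]      # re-prepare and continue
    return failures
```

With zero physical noise the logical bit survives every cycle; as the physical error rate rises, uncorrectable two-flip events start slipping through, which is the failure mode discussed next.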

QEC is not foolproof, of course. The three-bit repetition code, for example, fails if more than one bit has been flipped. What's more, the resources and mechanisms that create the encoded quantum states and perform the syndrome measurements are themselves prone to errors. How, then, can a quantum computer perform QEC when all these processes are themselves faulty?

Remarkably, the error-correction cycle can be designed to tolerate errors and faults that occur at every stage, whether in the physical qubits, the physical gates, or even in the very measurements used to infer the existence of errors! Called a fault-tolerant architecture, such a design permits, in principle, error-robust quantum processing even when all the component parts are unreliable.

A long quantum computation will require many cycles of quantum error correction (QEC). Each cycle would consist of gates acting on encoded qubits (performing the computation), followed by syndrome measurements from which errors can be inferred, and corrections. The effectiveness of this QEC feedback loop can be greatly enhanced by including quantum-control techniques (represented by the thick blue outline) to stabilize and optimize each of these processes.

Even in a fault-tolerant architecture, the additional complexity introduces new avenues for failure. The effect of errors is therefore reduced at the logical level only if the underlying physical error rate is not too high. The maximum physical error rate that a specific fault-tolerant architecture can reliably handle is known as its break-even error threshold. If error rates are lower than this threshold, the QEC process tends to suppress errors over the entire cycle. But if error rates exceed the threshold, the added machinery just makes things worse overall.
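A common heuristic (our own illustration; the 1 percent threshold, the prefactor, and the code distances are arbitrary assumptions) models the logical error rate of a distance-d code as A(p/p_th)^((d+1)/2), which makes the two regimes vivid: below threshold, adding redundancy suppresses errors exponentially; above it, adding redundancy makes things worse.

```python
def p_logical(p, p_th=1e-2, d=11, A=0.1):
    """Heuristic logical error rate for a distance-d code (illustrative only)."""
    return A * (p / p_th) ** ((d + 1) // 2)

# Below threshold, a bigger code (larger d) helps...
assert p_logical(1e-3, d=11) < p_logical(1e-3, d=3)
# ...above threshold, the added machinery hurts:
assert p_logical(3e-2, d=11) > p_logical(3e-2, d=3)
```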

The theory of fault-tolerant QEC is foundational to every effort to build useful quantum computers because it paves the way to building systems of any size. If QEC is implemented effectively on hardware exceeding certain performance requirements, the effect of errors can be reduced to arbitrarily low levels, enabling the execution of arbitrarily long computations.

At this point, you may be wondering how QEC has evaded the problem of continuous errors, which is fatal for scaling up analog computers. The answer lies in the nature of quantum measurements.

In a typical quantum measurement of a superposition, only a few discrete outcomes are possible, and the physical state changes to match the result that the measurement finds. With the parity-check measurements, this change helps.

Imagine you have a code block of three physical qubits, and one of these qubit states has wandered a little from its ideal state. If you perform a parity measurement, just two results are possible: Most often, the measurement will report the parity state that corresponds to no error, and after the measurement, all three qubits will be in the correct state, whatever it is. Occasionally the measurement will instead indicate the odd parity state, which means an errant qubit is now fully flipped. If so, you can flip that qubit back to restore the desired encoded logical state.

In other words, performing QEC transforms small, continuous errors into infrequent but discrete errors, similar to the errors that arise in digital computers.

Researchers have now demonstrated many of the principles of QEC in the laboratory, from the basics of the repetition code through to complex encodings, logical operations on code words, and repeated cycles of measurement and correction. Current estimates of the break-even threshold for quantum hardware place it at about 1 error in 1,000 operations. This level of performance hasn't yet been achieved across all the constituent parts of a QEC scheme, but researchers are getting ever closer, achieving multiqubit logic with rates of fewer than about 5 errors per 1,000 operations. Even so, passing that critical milestone will be the beginning of the story, not the end.

On a system with a physical error rate just below the threshold, QEC would require enormous redundancy to push the logical rate down very far. It becomes much less challenging with a physical rate further below the threshold. So just crossing the error threshold is not sufficientwe need to beat it by a wide margin. How can that be done?

If we take a step back, we can see that the challenge of dealing with errors in quantum computers is one of stabilizing a dynamic system against external disturbances. Although the mathematical rules differ for the quantum system, this is a familiar problem in the discipline of control engineering. And just as control theory can help engineers build robots capable of righting themselves when they stumble, quantum-control engineering can suggest the best ways to implement abstract QEC codes on real physical hardware. Quantum control can minimize the effects of noise and make QEC practical.

In essence, quantum control involves optimizing how you implement all the physical processes used in QEC, from individual logic operations to the way measurements are performed. For example, in a system based on superconducting qubits, a qubit is flipped by irradiating it with a microwave pulse. One approach uses a simple type of pulse to move the qubit's state from one pole of the Bloch sphere, along the Greenwich meridian, to precisely the other pole. Errors arise if the pulse is distorted by noise. It turns out that a more complicated pulse, one that takes the qubit on a well-chosen meandering route from pole to pole, can result in less error in the qubit's final state under the same noise conditions, even when the new pulse is imperfectly implemented.

One facet of quantum-control engineering involves careful analysis and design of the best pulses for such tasks in a particular imperfect instance of a given system. It is a form of open-loop (measurement-free) control, which complements the closed-loop feedback control used in QEC.

This kind of open-loop control can also change the statistics of the physical-layer errors to better comport with the assumptions of QEC. For example, QEC performance is limited by the worst-case error within a logical block, and individual devices can vary a lot. Reducing that variability is very beneficial. In an experiment our team performed using IBM's publicly accessible machines, we showed that careful pulse optimization reduced the difference between the best-case and worst-case error in a small group of qubits by more than a factor of 10.

Some error processes arise only while carrying out complex algorithms. For instance, crosstalk errors occur on qubits only when their neighbors are being manipulated. Our team has shown that embedding quantum-control techniques into an algorithm can improve its overall success by orders of magnitude. This technique makes QEC protocols much more likely to correctly identify an error in a physical qubit.

For 25 years, QEC researchers have largely focused on mathematical strategies for encoding qubits and efficiently detecting errors in the encoded sets. Only recently have investigators begun to address the thorny question of how best to implement the full QEC feedback loop in real hardware. And while many areas of QEC technology are ripe for improvement, there is also growing awareness in the community that radical new approaches might be possible by marrying QEC and control theory. One way or another, this approach will turn quantum computing into a reality, and you can carve that in stone.

This article appears in the July 2022 print issue as Quantum Error Correction at the Threshold.

Quantum Error Correction: Time to Make It Work - IEEE Spectrum

Quantum computing: Definition, facts & uses | Live Science

Quantum computing is a new generation of technology that involves a type of computer 158 million times faster than the most sophisticated supercomputer we have in the world today. It is a device so powerful that it could do in four minutes what it would take a traditional supercomputer 10,000 years to accomplish.

For decades, our computers have all been built around the same design. Whether it is the huge machines at NASA, or your laptop at home, they are all essentially just glorified calculators, but crucially they can only do one thing at a time.

The key to the way all computers work is that they process and store information made of binary digits called bits. These bits only have two possible values, a one or a zero. It is these numbers that create binary code, which a computer needs to read in order to carry out a specific task, according to the book Fundamentals of Computers.

Quantum theory is a branch of physics which deals in the tiny world of atoms and the smaller (subatomic) particles inside them, according to the journal Documenta Mathematica. When you delve into this minuscule world, the laws of physics are very different to what we see around us. For instance, quantum particles can exist in multiple states at the same time. This is known as superposition.

Instead of bits, quantum computers use something called quantum bits, 'qubits' for short. While a traditional bit can only be a one or a zero, a qubit can be a one, a zero or it can be both at the same time, according to a paper presented at the IEEE International Conference on Big Data.
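One minimal way to picture this (our own sketch, not from the article) is to model a qubit as a pair of amplitudes whose squared magnitudes give the probabilities of reading a zero or a one:

```python
import math

# A qubit as two amplitudes (alpha, beta) with |alpha|^2 + |beta|^2 = 1.
zero = (1.0, 0.0)                                # definitely a zero
one = (0.0, 1.0)                                 # definitely a one
plus = (1 / math.sqrt(2), 1 / math.sqrt(2))      # both at once: superposition

def measure_probabilities(qubit):
    """Squared magnitudes give the odds of reading 0 or 1."""
    alpha, beta = qubit
    return abs(alpha) ** 2, abs(beta) ** 2

# The superposition reads 0 half the time and 1 half the time:
p0, p1 = measure_probabilities(plus)
assert abs(p0 - 0.5) < 1e-9 and abs(p1 - 0.5) < 1e-9
```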

This means that a quantum computer does not have to wait for one process to end before it can begin another; it can work on them at the same time.

Imagine you had lots of doors which were all locked except for one, and you needed to find out which one was open. A traditional computer would keep trying each door, one after the other, until it found the one which was unlocked. It might take five minutes, it might take a million years, depending on how many doors there were. But a quantum computer could try all the doors at once. This is what makes them so much faster.
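In practice the speedup for this kind of unstructured search is quadratic rather than literally "all doors at once": Grover's quantum search algorithm needs on the order of the square root of N tries, versus up to N classically. A quick illustrative comparison (our own sketch):

```python
import math

def classical_queries(n_doors):
    """Worst case: try the doors one after another."""
    return n_doors

def grover_queries(n_doors):
    """Grover's algorithm needs about (pi/4) * sqrt(N) quantum queries."""
    return math.ceil((math.pi / 4) * math.sqrt(n_doors))

assert classical_queries(1_000_000) == 1_000_000
assert grover_queries(1_000_000) == 786   # ~sqrt(N): a million doors, 786 tries
```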

As well as superposition, quantum particles also exhibit another strange behaviour called entanglement which also makes this tech so potentially ground-breaking. When two quantum particles are entangled, they form a connection to each other no matter how far apart they are. When you alter one, the other responds the same way even if they're thousands of miles apart. Einstein called this particle property "spooky action at a distance", according to the journal Nature.

As well as speed, another advantage quantum computers have over traditional computers is size. Moore's Law holds that computing power doubles roughly every two years, according to the journal IEEE Annals of the History of Computing. But in order to enable this, engineers have to fit more and more transistors onto a circuit board. A transistor is like a microscopic light switch which can be either off or on. This is how a computer processes a zero or a one in binary code.

To solve more complex problems, you need more of those transistors. But no matter how small you make them, there are only so many you can fit onto a circuit board. So what does that mean? It means sooner or later, traditional computers are going to be as smart as we can possibly make them, according to the Young Scientists Journal. That is where quantum machines can change things.

The quest to build quantum computers has turned into something of a global race, with some of the biggest companies and indeed governments on the planet vying to push the technology ever further, prompting a rise in interest in quantum computing stocks on the money markets.

One example is the device created by D-Wave. It has built the Advantage system, which it says is the first and only quantum computer designed for business use, according to a press release from the company.

D-Wave said it has been designed with a new processor architecture with over 5,000 qubits and 15-way qubit connectivity, which it said enables companies to solve their largest and most complex business problems.

The firm claims the machine is the first and only quantum computer that enables customers to develop and run real-world, in-production quantum applications at scale in the cloud. The firm said the Advantage is 30 times faster and delivers equal or better solutions 94% of the time compared to its previous generation system.

But despite the huge, theoretical computational power of quantum computers, there is no need to consign your old laptop to the wheelie bin just yet. Conventional computers will still have a role to play in any new era, and are far more suited to everyday tasks such as spreadsheets, emailing and word processing, according to Quantum Computing Inc. (QCI).

Where quantum computing could really bring about radical change though is in predictive analytics. Because a quantum computer can make analyses and predictions at breakneck speeds, it would be able to predict weather patterns and perform traffic modelling, things where there are millions if not billions of variables that are constantly changing.

Standard computers can do what they are told well enough if they are fed the right computer programme by a human. But when it comes to predicting things, they are not so smart. This is why the weather forecast is not always accurate. There are too many variables, too many things changing too quickly for any conventional computer to keep up.

Because of their limitations, there are some computations which an ordinary computer may never be able to solve, or it might take literally a billion years. Not much good if you need a quick prediction or piece of analysis.

But a quantum computer is so fast, almost infinitely so, that it could respond to changing information quickly and examine a limitless number of outcomes and permutations simultaneously, according to research by Rigetti Computing.

Quantum computers are also relatively small because they do not rely on transistors like traditional machines. They also consume comparatively less power, meaning they could in theory be better for the environment.

You can read about how to get started in quantum computing in this article by Nature. To learn more about the future of quantum computing, you can watch this TED Talk by PhD student Jason Ball.


Alan Turing’s Everlasting Contributions to Computing, AI and Cryptography – NIST

An enigma machine on display outside the Alan Turing Institute entrance inside the British Library, London.


Suppose someone asked you to devise the most powerful computer possible. Alan Turing, whose reputation as a central figure in computer science and artificial intelligence has only grown since his untimely death in 1954, applied his genius to problems such as this one in an age before computers as we know them existed. His theoretical work on this problem and others remains a foundation of computing, AI and modern cryptographic standards, including those NIST recommends.

The road from devising the most powerful computer possible to cryptographic standards has a few twists and turns, as does Turing's brief life.


In Turing's time, mathematicians debated whether it was possible to build a single, all-purpose machine that could solve all problems that are computable. For example, we can compute a car's most energy-efficient route to a destination, and (in principle) the most likely way in which a string of amino acids will fold into a three-dimensional protein. Another example of a computable problem, important to modern encryption, is whether or not bigger numbers can be expressed as the product of two smaller numbers. For example, 6 can be expressed as the product of 2 and 3, but 7 cannot be factored into smaller integers and is therefore a prime number.
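The factoring example can be checked with simple trial division (an illustrative sketch; the integers used in real cryptography are far too large for this approach):

```python
def smallest_factor(n):
    """Trial division: return the smallest factor > 1, or n itself if prime."""
    for d in range(2, int(n ** 0.5) + 1):
        if n % d == 0:
            return d
    return n

assert smallest_factor(6) == 2   # 6 = 2 * 3
assert smallest_factor(7) == 7   # 7 has no smaller factors: prime
```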

Some prominent mathematicians proposed elaborate designs for universal computers that would operate by following very complicated mathematical rules. It seemed overwhelmingly difficult to build such machines. It took the genius of Turing to show that a very simple machine could in fact compute all that is computable.

His hypothetical device is now known as a Turing machine. The centerpiece of the machine is a strip of tape, divided into individual boxes. Each box contains a symbol (such as A, C, T, G for the letters of genetic code) or a blank space. The strip of tape is analogous to today's hard drives that store bits of data. Initially, the string of symbols on the tape corresponds to the input, containing the data for the problem to be solved. The string also serves as the memory of the computer. The Turing machine writes onto the tape data that it needs to access later in the computation.


The device reads an individual symbol on the tape and follows instructions on whether to change the symbol or leave it alone before moving to another symbol. The instructions depend on the current state of the machine. For example, if the machine needs to decide whether the tape contains the text string "TC" it can scan the tape in the forward direction while switching between the states "previous letter was T" and "previous letter was not T." If while in state "previous letter was T" it reads a C, it goes to a state "found it" and halts. If it encounters the blank symbol at the end of the input, it goes to the state "did not find it" and halts. Nowadays we would recognize the set of instructions as the machine's program.
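The "TC" scanner described above can be sketched as a tiny state machine (our own illustrative Python; the state names follow the text, and the transition table plays the role of the machine's program):

```python
def contains_TC(tape):
    """Scan right, tracking whether the previous letter was T."""
    RULES = {
        # (state, symbol) -> next state; unlisted pairs reset to "not_T"
        ("not_T", "T"): "was_T",
        ("was_T", "T"): "was_T",
        ("was_T", "C"): "found",
    }
    state, head = "not_T", 0
    tape = tape + "_"            # a blank marks the end of the input
    while state not in ("found", "not_found"):
        symbol = tape[head]
        if symbol == "_":
            state = "not_found"  # reached the blank without seeing "TC"
        else:
            state = RULES.get((state, symbol), "not_T")
            head += 1            # move right to the next cell
    return state == "found"

assert contains_TC("GATTC")      # contains the substring "TC"
assert not contains_TC("CAT")    # ends after a T with no C following
```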

It took some time, but eventually it became clear to everyone that Turing was right: The Turing machine could indeed compute all that seemed computable. No number of additions or extensions to this machine could extend its computing capability.

To understand what can be computed it is helpful to identify what cannot be computed. In a previous life as a university professor I had to teach programming a few times. Students often encounter the following problem: My program has been running for a long time; is it stuck? This is called the Halting Problem, and students often wondered why we simply couldn't detect infinite loops without actually getting stuck in them. It turns out a program to do this is an impossibility. Turing showed that there does not exist a machine that detects whether or not another machine halts. From this seminal result followed many other impossibility results. For example, logicians and philosophers had to abandon the dream of an automated way of detecting whether an assertion (such as whether there are infinitely many prime numbers) is true or false, as that is uncomputable. If you could do this, then you could solve the Halting Problem simply by asking whether the statement "this machine halts" is true or false.
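Turing's impossibility argument can be sketched in a few lines of pseudocode (the `halts` oracle here is hypothetical and cannot actually be implemented; that is precisely the point):

```python
# Suppose, for contradiction, that halts(program, input) always answers
# correctly whether program(input) eventually stops.

def paradox(program):
    if halts(program, program):   # hypothetical oracle call
        while True:               # ...then deliberately loop forever
            pass
    # ...otherwise halt immediately

# Now run paradox(paradox). If halts says it halts, it loops forever;
# if halts says it loops, it halts. Either answer is wrong, so no
# such halts() can exist.
```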

Turing went on to make fundamental contributions to AI, theoretical biology and cryptography. His involvement with this last subject brought him honor and fame during World War II, when he played a very important role in adapting and extending cryptanalytic techniques invented by Polish mathematicians. This work broke the German Enigma machine encryption, making a significant contribution to the war effort.

Turing was gay. After the war, in 1952, the British government convicted him for having sex with a man. He stayed out of jail only by submitting to what is now called chemical castration. He died in 1954 at age 41 by cyanide poisoning, which was initially ruled a suicide but may have been an accident according to subsequent analysis. More than 50 years would pass before the British government apologized and pardoned him (after years of campaigning by scientists around the world). Today, the highest honor in computer science is called the Turing Award.

Turing's computability work provided the foundation for modern complexity theory. This theory tries to answer the question: Among those problems that can be solved by a computer, which ones can be solved efficiently? Here, "efficiently" means not in billions of years but in milliseconds, seconds, hours or days, depending on the computational problem.

For example, much of the cryptography that currently safeguards our data and communications relies on the belief that certain problems, such as decomposing an integer number into its prime factors, cannot be solved before the Sun turns into a red giant and consumes the Earth (currently forecast for 4 billion to 5 billion years). NIST is responsible for cryptographic standards that are used throughout the world. We could not do this work without complexity theory.

Technology sometimes throws us a curve, such as the discovery that if a sufficiently big and reliable quantum computer is built it would be able to factor integers, thus breaking some of our cryptography. In this situation, NIST scientists must rely on the world's experts (many of them in-house) in order to update our standards. There are deep reasons to believe that quantum computers will not be able to break the cryptography that NIST is about to roll out. Among these reasons is that Turing's machine can simulate quantum computers. This implies that complexity theory gives us limits on what a powerful quantum computer can do.

But that is a topic for another day. For now, we can celebrate how Turing provided the keys to much of todays computing technology and even gave us hints on how to solve looming technological problems.


Quantum computing: D-Wave shows off prototype of its next quantum annealing computer – ZDNet


Quantum-computing outfit D-Wave has announced commercial access to an "experimental prototype" of its Advantage2 quantum annealing computer.

D-Wave is beating its own path to qubit processors with its quantum annealing approach. According to D-Wave, the Advantage2 prototype available today features over 500 qubits. It's a preview of a much larger 7,000-qubit Advantage2 that it hopes to make available by 2024.

Access to the Advantage2 prototype is restricted to customers with a subscription to D-Wave's Leap cloud service, but developers interested in trying D-Wave's quantum cloud can sign up to get "one minute of free use of the actual quantum processing units (QPUs) and quantum hybrid solvers" that run on its earlier Advantage QPU.

The Advantage2 prototype is built with D-Wave's Zephyr connection technology that it claims offers higher connectivity between qubits than its predecessor topology called Pegasus, which is used in its Advantage QPU.

D-Wave says the Zephyr design enables shorter chains in its Advantage2 quantum chips, which can make them friendlier for calculations that require extra precision.

"The Advantage2 prototype is designed to share what we're learning and gain feedback from the community as we continue to build towards the full Advantage2 system," says Emile Hoskinson, director of quantum annealing products at D-Wave.

"With Advantage2, we're pushing that envelope again demonstrating that connectivity and reduction in noise can be a delivery vehicle for even greater performance once the full system is available. The Advantage2 prototype is an opportunity for us to share our excitement and give a sneak peek into the future for customers bringing quantum into their applications."

While quantum computing is still experimental, senior execs are preparing for it as a business disruptor by 2030, according to a survey by consultancy EY. The firm found that 81% of senior UK executives expect quantum computing to play a significant role in their industry by 2030.

Fellow consultancy McKinsey this month noted funding for quantum technology startups doubled in the past two years, from $700 million in 2020 to $1.4 billion in 2021. McKinsey sees quantum computing shaking up pharmaceuticals, chemicals, automotive, and finance industries, enabling players to "capture nearly $700 billion in value as early as 2035" through improved simulation and better machine learning. It expects revenues from quantum computing to exceed $90 billion by 2040.

D-Wave's investors include PSP Investments, Goldman Sachs, BDC Capital, NEC Corp, Aegis Group Partners, and the CIA's VC firm, In-Q-Tel.


Businesses brace for quantum computing disruption by end of decade – The Register

While business leaders expect quantum computing to play a significant role in industry by 2030, some experts don't believe the tech is going to be ready for production deployment in the near future.

The findings, from a survey titled "2022 Quantum Readiness" commissioned by consultancy EY, refer to UK businesses, although it is likely that the conclusions are equally applicable to global organizations.

According to EY, 81 percent of senior UK executives expect quantum computing to have a significant impact in their industry within seven and a half years, with almost half (48 percent) believing that quantum technology will begin to transform industries as soon as 2025.

The naysayers who say quantum tech won't be ready for live deployment any time soon can also point to the industry's hype problem: capabilities are sometimes exaggerated, and accusations of falsification have even surfaced, as when quantum startup IonQ was recently accused by Scorpion Capital of misleading investors about the effectiveness of its quantum hardware.

Joseph Reger, Fujitsu Fellow, CTO of Central and Eastern Europe, and a member of the World Economic Forum's Quantum Computing Council, told The Register he is getting some "heat" for saying quantum is not nearly a thing yet.

"There are impressive advantages that pre-quantum or quantum-inspired technologies provide. They are less sexy, but very powerful."

He added: "Some companies are exaggerating the time scales. If quantum computing gets overhyped, we are likely to face the first quantum winter."

Fujitsu is itself developing quantum systems, and announced earlier this year that it was working to integrate quantum computing with traditional HPC technology. The company also unveiled a high performance quantum simulator based on its PRIMEHPC FX 700 systems that it said will serve as an important bridge towards the development of quantum computing applications in future.

Meanwhile, EY claims that respondents were "almost unanimous" in their belief that quantum computing will create a moderate or high level of disruption for their own organization, industry sector, and the broader economy in the next five years.

Despite this, the survey finds that strategic planning for quantum computing is still at an embryonic stage for most organizations: only 33 percent are involved in strategic planning for how quantum will affect them, and only a quarter have appointed specialist leaders or set up pilot teams.

The survey conducted in February-March 2022 covered 501 UK-based executives, all with senior roles in their organisations, who had to demonstrate at least a moderate (but preferably a high) level of understanding of quantum computing. EY said they originally approached 1,516 executives, but only 501 met this requirement, which in and of itself tells a tale.

EY's Quantum Computing Leader, Piers Clinton-Tarestad, said the survey reveals a disconnect between the pace at which some industry leaders expect quantum to start affecting business and their preparedness for those impacts.

"Maximizing the potential of quantum technologies will require early planning to build responsive and adaptable organisational capabilities," he said, adding that this is a challenge because the progress of quantum has accelerated, but it is "not following a steady trajectory."

For example, companies with quantum processors have increased the power of their hardware dramatically over the past several years, from just a handful of qubits to over a hundred in the case of IBM, which expects to deliver a 4,158-qubit system by 2025. Yet despite these advances, quantum computers remain a curiosity, with most operational systems deployed in research laboratories or made available via a cloud service for developers to experiment with.

Clinton-Tarestad said "quantum readiness" is "not so much a gap to be assessed as a road to be walked," with the next steps in the process being regularly revisited as the landscape evolves. He warned that businesses expecting disruption in their industry within the next three to five years need to act now.

According to EY's report, executives in consumer and retail markets are those most likely to believe that quantum will play a significant role by 2025, with just over half of technology, media and telecommunications (TMT) executives expecting an impact within the same time frame. Most respondents among health and life sciences companies think this is more likely to happen later, between 2026 and 2035.

Most organizations surveyed expect to start their quantum preparations within the next two years, with 72 percent aiming to start by 2024.

However, only a quarter of organizations have got as far as recruiting people with the necessary skills to lead quantum computing efforts, although 68 percent said they are aiming to set up pilot teams to explore the potential of quantum for their business by 2024.

Fear of falling behind rivals that are developing their own quantum capabilities is driving some respondents to start quantum projects. The applications industry leaders anticipate most would advance operations involving AI and machine learning, especially among financial services, automotive and manufacturing companies, while TMT respondents cited cryptography and encryption as the most likely uses of quantum computing.

While the EY report warns about companies potentially losing out to rivals on the benefits of quantum computing, there are also dangers that organizations should be preparing for now, as Intel warned during its Intel Vision conference last month.

One of these is that quantum computers could be used to break current cryptographic algorithms, meaning that the confidentiality of both personal and enterprise data could be at risk. This is not a far-off threat, but something that organizations need to consider right now, according to Sridhar Iyengar, VP of Intel Labs and Director of Security and Privacy Research.

"Adversaries could be harvesting encrypted data right now, so that they can decrypt it later when quantum computers are available. This could be sensitive data, such as your social security number or health records, which are required to be protected for a long period of time," Iyengar told us.

Organizations may want to address threats like this by taking steps such as evaluating post-quantum cryptography algorithms and increasing the key sizes for current crypto algorithms like AES.
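As a back-of-the-envelope illustration of why increasing key sizes is the standard advice here: Grover's algorithm effectively halves the brute-force security level of a symmetric cipher, which is the usual argument for moving from AES-128 to AES-256. A minimal sketch (the function name is ours, for illustration only):

```python
def effective_security_bits(key_bits: int, quantum: bool) -> int:
    """Brute-forcing an n-bit symmetric key costs ~2^n operations
    classically; Grover's search reduces that to ~2^(n/2) on a large
    fault-tolerant quantum computer, halving the security level."""
    return key_bits // 2 if quantum else key_bits

# AES-128 drops to ~64-bit effective security against a quantum
# attacker, while AES-256 retains a comfortable ~128 bits:
print(effective_security_bits(128, quantum=True))   # 64
print(effective_security_bits(256, quantum=True))   # 128
```

This is why "harvest now, decrypt later" matters for long-lived data: the key size chosen today must still be adequate when large quantum machines exist.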

Or they may simply decide to adopt a wait and see attitude. EY will no doubt be on hand to sell consultancy services to help clarify their thinking.

Read the original post:
Businesses brace for quantum computing disruption by end of decade - The Register

Quantum computing can solve EVs safety woes – Times of India

Recent incidents of electric vehicles (EVs) catching fire have shocked the Indian ecosystem and hindered the broad adoption of these vehicles. Before March of this year, there had been a substantial rise in demand for electric vehicles and rapid advances in innovation and technology. Improvements in battery technology, through increased efficiency and range, have made EVs more accessible to the public; the sector is currently dominated by two-wheelers and three-wheelers in India. According to Mordor Intelligence, India's electric vehicle market was valued at $1.4 billion in 2021, and it is expected to reach $15.4 billion by 2027, recording a CAGR of 47.09% over the forecast period (2022-2027). Since March, the challenge for EVs has shifted from affordability, charging, and range anxiety to safety. Safety is of prime importance, and an EV catching fire has led to dire consequences, including fatalities.

The question is, why is this happening?

A report by the Defence Research and Development Organisation's (DRDO) Centre for Fire, Explosive and Environment Safety attributes the fires to the EV batteries. The issues highlighted include poor-quality cells, lack of fuses, issues with thermal management, and flaws in the battery management system (BMS).

The highlighted issues cause the batteries to experience thermal runaway, leading to the fires. This phenomenon occurs when an increase in temperature changes conditions in a way that causes a further increase in temperature, often with destructive results. The issues highlighted in the DRDO report are all potential causes of thermal runaway. Let's explain why.

Local atmospheric temperature directly affects the operating temperature of the battery. For efficient performance, a battery's operating temperature should be around 20-35°C. To keep the battery in this range, EVs need a battery thermal management system (BTMS). With rising temperatures in our cities, BTMS designs are being challenged, and poor thermal management of EV batteries may be what tips them into thermal runaway.
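The feedback loop at the heart of thermal runaway can be sketched with a toy model: heat generation that grows exponentially with temperature pitted against cooling that grows only linearly with it. All constants below are illustrative, not measured cell data; this is a sketch of the phenomenon, not a battery model.

```python
import math

def simulate_cell(cooling_coeff: float, t_end: float = 600.0, dt: float = 0.1):
    """Toy Euler integration of a cell's temperature (degrees C).

    Self-heating grows exponentially with temperature (Arrhenius-like),
    while the BTMS removes heat in proportion to the excess over ambient.
    Returns (final_temperature, runaway_flag).
    """
    ambient = 30.0
    T = ambient
    for _ in range(int(t_end / dt)):
        heating = 0.5 * math.exp((T - ambient) / 40.0)  # illustrative units
        cooling = cooling_coeff * (T - ambient)
        T += (heating - cooling) * dt
        if T > 200.0:  # treat crossing 200 C as thermal runaway
            return T, True
    return T, False

T_ok, runaway_ok = simulate_cell(cooling_coeff=0.2)     # adequate BTMS
T_bad, runaway_bad = simulate_cell(cooling_coeff=0.01)  # weak BTMS
```

With adequate cooling the cell settles a few degrees above ambient, because the linear cooling term catches up with the exponential heating term at a low, stable equilibrium; with weak cooling the same cell runs away.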

Another possible cause of thermal runaway is rapid battery charging. As battery technology evolves, charging technology is advancing with it. While fast charging greatly improves the convenience of EVs, it increases battery-related risks: fast charging can overheat the battery system enough to melt electrical wires and cause short circuits, with explosive consequences, as several charging-related incidents have already shown.

While hot weather and an inadequate battery thermal management system can degrade performance and shorten battery life, they alone cannot cause thermal runaway. As the DRDO report notes, an inefficient fuse, or the absence of one, removes the fail-safe mechanism that would otherwise interrupt a developing thermal runaway.

The causes of thermal runaway highlighted above may stem from inefficient design or insufficient testing by EV manufacturers. But manufacturers cannot spend more time on testing because of time-to-market constraints.

Whats the solution?

As stated, design and testing are critical phases of any product's manufacturing. Since the era of Industry 4.0, design and testing have moved to digital workflows carried out on large-scale, powerful computers through what are called engineering simulations (referred to as simulations hereafter). Simulations come in various types, including thermal (the effect of heat and temperature on an object), structural (an object's strength, stress, and failure), fluid (flow in and around an object), and electrochemical (the effect of chemistry on electricity). Thermal runaway is a complex engineering problem entailing all of these. With the right tools, simulations allow engineers to mimic every relevant physical condition (rising temperature, fast charging, fuse placement) and locate problem areas. Once a problem is identified, simulation can also aid in testing different solutions and thus avoid thermal runaway altogether.
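For a concrete taste of what a thermal simulation does under the hood, here is a minimal explicit finite-difference step for the 1-D heat equation. Production battery simulations solve far richer coupled models, but the core idea of discretizing a physical field and stepping it through time is the same. All numbers are illustrative.

```python
def heat_step(u, alpha=1e-4, dx=0.01, dt=0.1):
    """One explicit finite-difference step of u_t = alpha * u_xx.

    Endpoints are held fixed (Dirichlet boundaries). The scheme is
    stable only when r = alpha*dt/dx^2 <= 0.5; here r = 0.1.
    """
    r = alpha * dt / dx ** 2
    new = list(u)
    for i in range(1, len(u) - 1):
        new[i] = u[i] + r * (u[i - 1] - 2 * u[i] + u[i + 1])
    return new

# A hot spot in the middle of a rod diffuses outward over time:
u = [30.0] * 10 + [100.0] + [30.0] * 10
for _ in range(1000):
    u = heat_step(u)
# The peak cools well below 100 C but stays above the 30 C boundaries.
```

The stability constraint on r is exactly the kind of accuracy/cost trade-off the article describes: a finer grid (smaller dx) forces a smaller time step, which is what makes high-accuracy simulations slow.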

The question then becomes why are we seeing the news at all?

The biggest issue EV manufacturers face with performing numerous simulations is time. Running a series of high-accuracy simulations, meaning results with minimal flaws and defects, can take months. Manufacturers cannot afford this, as it greatly hampers time to market. Companies therefore opt for low-accuracy simulations that still provide answers but carry several minor flaws and defects, which can lead to large mishaps like EV fires and system failures, affecting human lives. And when companies do find the time to perform high-accuracy simulations, the cost they incur is very high, due to the need for supercomputers, whether on-premises (setup and maintenance costs) or in the cloud (long compute durations).

So the real issue is a computing-technology bottleneck. This is where the next generation of computing, quantum computers, could step in and transform industries like EV and battery design, promising substantial speedups for certain classes of simulation problems.

Prospect of Quantum-powered simulations

The power of quantum computers lies in their potential to perform the same simulations in much less time than classical supercomputers. Hence, this technology could significantly reduce EV manufacturers' time to market.

Moreover, the ability to obtain high accuracy from simulations is vital to using them in the product development process. Where high-accuracy simulations were previously prohibitively slow, quantum-powered simulations could enable manufacturers to perform accurate simulations in a reasonable time, in hours instead of months. The added accuracy would not only help companies create more efficient designs and improve the reliability of their vehicles, but also help save something invaluable: lives. In addition, the speedup from quantum computation means lower compute usage, decreasing the overall cost and making such simulations affordable for EV manufacturers.

Whats next?

In the computing sphere, quantum computing is changing our understanding of computation and shows tremendous potential across various use cases. While the prospect of quantum-powered simulations offers the promise of better, faster, and cheaper, development is very challenging because quantum computers work in entirely different ways.

The good news is that companies are already developing and building quantum-powered simulation software that could address thermal runaway and the optimization of BTMS. Quantum computing is here and now!

Views expressed above are the author's own.


McKinsey thinks quantum computing could create $80b in revenue … eventually – The Register

In the hype-tastic world of quantum computing, consulting giant McKinsey & Company claims that the still-nascent field has the potential to create $80 billion in new revenue for businesses across industries.

It's a claim McKinsey has repeated nearly two dozen times on Twitter since March to promote its growing collection of research diving into various aspects of quantum computing, from startup and government funding to use cases and its potential impact on a range of industries.

The consulting giant believes this $80 billion figure represents the "value at stake" for quantum computing players but not the actual value that use cases could create [PDF]. This includes companies working in all aspects of quantum computing, from component makers to service providers.

Despite wildly optimistic numbers, McKinsey does ground the report in a few practical realities. For instance, in a Wednesday report, the firm says the hardware for quantum systems "remains too immature to enable a significant number of use cases," which, in turn, limits the "opportunities for fledgling software players." The authors add that this is likely one of the reasons why the rate of new quantum startups entering the market has begun to slow.

Even the top of McKinsey's page for quantum computing admits that capable systems won't be ready until 2030, which is in line with what various industry players, including Intel, are expecting. Like fusion, it's always a decade or so away.

McKinsey, like every company trying to work out whether quantum computing has any real-world value, is walking a fine line: exploring the possibilities of quantum computing while acknowledging the ways the tech remains disconnected from ordinary enterprise reality.

"While quantum computing promises to help businesses solve problems that are beyond the reach and speed of conventional high-performance computers, use cases are largely experimental and hypothetical at this early stage. Indeed, experts are still debating the most foundational topics for the field," McKinsey wrote in a December 2021 article about how use cases "are getting real."

One could argue the report is something of a metaphor for the quantum industry in 2022: wild optimism about future ecosystem profitability without really understanding what the tech will mean, to whom, and at what scale.


Global Quantum Computing in Communication Market 2022 Precise Scenario Covering Trends, Opportunities and Growth Forecast 2028 Ripon College Days -…

The MarketandResearch.biz recent analysis of the Global Quantum Computing in Communication Market anticipates exponential growth from 2022 to 2028. The report presents a market-share estimation in terms of volumes for the forecast period, and focuses on past and present market research, which serves as a framework for evaluating the market's potential. The study is based on in-depth research of market dynamics, market size, problems, challenges, competition, and the companies involved, and takes a close look at a number of critical factors that drive the global Quantum Computing in Communication market's growth.

The report provides a comprehensive analysis of the global Quantum Computing in Communication market, including market trends, market size, market value, and market growth over the forecast period, both on a compound and yearly basis, along with the market's future prospects. The research defines the market situation and forecast details for the main regions, with a logical presentation of leading producers, product categories, and end users.

DOWNLOAD FREE SAMPLE REPORT: https://www.marketandresearch.biz/sample-request/221953

The study looks at competing factors that are important for pushing your business to the next level of innovation. This report then forecasts the market development patterns for this industry. An analysis of upstream raw resources, downstream demand, and current market dynamics is also included in this part.

The major regions covered in the report are:

The product can be segmented into the following market segments based on its type:

Market segmentation by application, broken down into:

The following are the prominent players profiled in the market report:

ACCESS FULL REPORT: https://www.marketandresearch.biz/report/221953/global-quantum-computing-in-communication-market-growth-status-and-outlook-2021-2027

To estimate and forecast the market size, variables such as product price, production, consumption/adoption, import & export, penetration rate, regulations, innovations, technical advancements, demand in specific countries, demand by specific end-use, socio-economic factors, inflation, legal factors, historic data, and regulatory framework were examined.

Customization of the Report:

This report can be customized to meet the client's requirements. Please connect with our sales team (sales@marketandresearch.biz), who will ensure that you get a report that suits your needs. You can also get in touch with our executives at 1-201-465-4211 to share your research requirements.

Contact Us
Mark Stone
Head of Business Development
Phone: 1-201-465-4211
Email: sales@marketandresearch.biz


Chip startups using light instead of wires gaining speed and investments – Reuters

April 26 (Reuters) - Computers using light rather than electric currents for processing, seen only years ago as research projects, are gaining traction, and startups that have solved the engineering challenges of using photons in chips are getting big funding.

In the latest example, Ayar Labs, a startup developing this technology called silicon photonics, said on Tuesday it had raised $130 million from investors including chip giant Nvidia Corp (NVDA.O).

While the transistor-based silicon chip has increased computing power exponentially over past decades, with transistors now only several atoms wide, shrinking them further is challenging. Not only is it hard to make something so minuscule, but as transistors get smaller, signals can bleed between them.

So, Moore's law, which said every two years the density of the transistors on a chip would double and bring down costs, is slowing, pushing the industry to seek new solutions to handle increasingly heavy artificial intelligence computing needs.
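The cadence Moore's law describes is simple compounding, which is worth making concrete; a quick sketch of the idealized math (function name ours, for illustration):

```python
def projected_density(base_density: float, years: float,
                      doubling_period: float = 2.0) -> float:
    """Idealized Moore's-law projection: transistor density doubles
    every `doubling_period` years."""
    return base_density * 2 ** (years / doubling_period)

# At the classic two-year cadence, a decade means five doublings, i.e. 32x:
print(projected_density(1.0, 10.0))  # 32.0
```

The industry's problem is that the real doubling period has been stretching, so the curve flattens, hence the search for alternatives like photonics.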

According to data firm PitchBook, last year silicon photonics startups raised over $750 million, doubling from 2020. In 2016 that was about $18 million.

"A.I. is growing like crazy and taking over large parts of the data center," Ayar Labs CEO Charles Wuischpard told Reuters in an interview. "The data movement challenge and the energy consumption in that data movement is a big, big issue."

The challenge is that many large machine-learning algorithms can use hundreds or thousands of chips for computing, and there is a bottleneck on the speed of data transmission between chips or servers using current electrical methods.

Light has been used to transmit data through fiber-optic cables, including undersea cables, for decades, but bringing it to the chip level was hard as devices used for creating light or controlling it have not been as easy to shrink as transistors.

PitchBook's senior emerging-technology analyst Brendan Burke expects silicon photonics to become common hardware in data centers by 2025 and estimates the market will reach $3 billion by then, similar to the size of the AI graphics chip market in 2020.

Beyond connecting transistor chips, startups using silicon photonics for building quantum computers, supercomputers, and chips for self-driving vehicles are also raising big funds.

PsiQuantum raised about $665 million so far, although the promise of quantum computers changing the world is still years out.

Lightmatter, which builds processors using light to speed up AI workloads in the datacenter, raised a total of $113 million and will release its chips later this year and test with customers soon after.

Luminous Computing, a startup building an AI supercomputer using silicon photonics backed by Bill Gates, raised a total of $115 million.

It is not just the startups pushing this technology forward. Semiconductor manufacturers are also gearing up to use their silicon chip-making technology for photonics.

GlobalFoundries Head of Computing and Wired Infrastructure Amir Faintuch said collaboration with PsiQuantum, Ayar, and Lightmatter has helped build up a silicon photonics manufacturing platform for others to use. The platform was launched in March.

Peter Barrett, founder of venture capital firm Playground Global, an investor in Ayar Labs and PsiQuantum, believes in the long-term prospects for silicon photonics for speeding up computing, but says it is a long road ahead.

"What the Ayar Labs guys do so well ... is they solved the data interconnect problem for traditional high-performance (computing)," he said. "But it's going to be a while before we have pure digital photonic compute for non-quantum systems."

Reporting by Jane Lanhee Lee; Editing by Stephen Coates


Aalto University Wins a €2.5 Million ($2.66M USD) Grant to Develop a New Type of Superconducting Qubit – Quantum Computing Report

The award was made by the European Research Council for a project named ConceptQ. It covers a five-year period to research a new superconducting quantum device concept with increased anharmonicity, a simple structure, and insensitivity to charge and flux noise. One problem with superconducting qubits is that they can sometimes end up in states other than |0> or |1>. Such three-level devices are sometimes called qutrits, which can be in a superposition of the states |0>, |1>, and |2>. In current quantum processors, the |2> state is not desired and can cause a loss of qubit fidelity. Aalto's new qubit design is meant to reduce or eliminate occupation of the |2> state, which would remove a source of errors and help increase the accuracy of calculations. Another aspect of the project is developing low-temperature cryoCMOS electronics that can control qubits inside a dilution refrigerator. More information about this grant and the ConceptQ project is available in a news release on the Aalto University website.

April 26, 2022
