Businesses brace for quantum computing disruption by end of decade – The Register

While business leaders expect quantum computing to play a significant role in industry by 2030, some experts don't believe the tech is going to be ready for production deployment in the near future.

The findings, from a survey titled "2022 Quantum Readiness" commissioned by consultancy EY, refer to UK businesses, although it is likely that the conclusions are equally applicable to global organizations.

According to EY, 81 percent of senior UK executives expect quantum computing to have a significant impact in their industry within seven and a half years, with almost half (48 percent) believing that quantum technology will begin to transform industries as soon as 2025.

As for the naysayers who say quantum tech won't be ready for live deployment any time soon, they can point to the industry's hype problem: capabilities are exaggerated, and accusations of outright falsification have even begun to fly, as when quantum startup IonQ was recently accused by short-seller Scorpion Capital of misleading investors about the effectiveness of its quantum hardware.

Joseph Reger, Fujitsu Fellow, CTO for Central and Eastern Europe, and a member of the World Economic Forum's Quantum Computing Council, told The Register he is getting some "heat" for saying quantum is not nearly a thing yet.

"There are impressive advantages that pre-quantum or quantum-inspired technologies provide. They are less sexy, but very powerful."

He added: "Some companies are exaggerating the time scales. If quantum computing gets overhyped, we are likely to face the first quantum winter."

Fujitsu is itself developing quantum systems, and announced earlier this year that it was working to integrate quantum computing with traditional HPC technology. The company also unveiled a high-performance quantum simulator based on its PRIMEHPC FX 700 systems, which it said will serve as an important bridge towards the development of quantum computing applications in the future.

Meanwhile, EY claims that respondents were "almost unanimous" in their belief that quantum computing will create a moderate or high level of disruption for their own organization, industry sector, and the broader economy in the next five years.

Despite this, the survey finds that strategic planning for quantum computing is still at an embryonic stage for most organizations: only 33 percent are involved in strategic planning for how quantum will affect them, and only a quarter have appointed specialist leaders or set up pilot teams.

The survey, conducted in February-March 2022, covered 501 UK-based executives, all with senior roles in their organisations, who had to demonstrate at least a moderate (but preferably a high) level of understanding of quantum computing. EY said it originally approached 1,516 executives, but only 501 met this requirement, which in and of itself tells a tale.

EY's Quantum Computing Leader, Piers Clinton-Tarestad, said the survey reveals a disconnect between the pace at which some industry leaders expect quantum to start affecting business and their preparedness for those impacts.

"Maximizing the potential of quantum technologies will require early planning to build responsive and adaptable organisational capabilities," he said, adding that this is a challenge because the progress of quantum has accelerated, but it is "not following a steady trajectory."

For example, companies with quantum processors have increased the power of their hardware dramatically over the past several years, from just a handful of qubits to over a hundred in the case of IBM, which expects to deliver a 4,158-qubit system by 2025. Yet despite these advances, quantum computers remain a curiosity, with most operational systems deployed in research laboratories or made available via a cloud service for developers to experiment with.

Clinton-Tarestad said "quantum readiness" is "not so much a gap to be assessed as a road to be walked," with the next steps in the process being regularly revisited as the landscape evolves. He warned that businesses expecting to see disruption in their industry within the next three to five years need to act now.

According to EY's report, executives in consumer and retail markets are those most likely to believe that quantum will play a significant role by 2025, with just over half of technology, media and telecommunications (TMT) executives expecting an impact within the same time frame. Most respondents among health and life sciences companies think this is more likely to happen later, between 2026 and 2035.

Most organizations surveyed expect to start their quantum preparations within the next two years, with 72 percent aiming to start by 2024.

However, only a quarter of organizations have got as far as recruiting people with the necessary skills to lead quantum computing efforts, although 68 percent said they are aiming to set up pilot teams to explore the potential of quantum for their business by 2024.

Fear of falling behind rivals that are developing their own quantum capabilities is driving some respondents to start quantum projects. The applications anticipated by industry leaders would advance operations involving AI and machine learning, especially among financial services, automotive and manufacturing companies, while TMT respondents cited potential applications in cryptography and encryption as the most likely use of quantum computing.

While the EY report warns about companies potentially losing out to rivals on the benefits of quantum computing, there are also dangers that organizations should be preparing for now, as Intel warned about during its Intel Vision conference last month.

One of these is that quantum computers could be used to break current cryptographic algorithms, meaning that the confidentiality of both personal and enterprise data could be at risk. This is not a far-off threat, but something that organizations need to consider right now, according to Sridhar Iyengar, VP of Intel Labs and Director of Security and Privacy Research.

"Adversaries could be harvesting encrypted data right now, so that they can decrypt it later when quantum computers are available. This could be sensitive data, such as your social security number or health records, which are required to be protected for a long period of time," Iyengar told us.

Organizations may want to address threats like this by taking steps such as evaluating post-quantum cryptography algorithms and increasing the key sizes for current crypto algorithms like AES.
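
To make the second of those steps concrete, here is a minimal sketch of moving symmetric encryption to 256-bit keys. Grover's algorithm roughly halves the effective strength of symmetric keys, so AES-256 retains a comfortable security margin; the Python "cryptography" package used below is an assumption for illustration, not something the survey or Intel prescribes.

```python
# Minimal sketch: AES-256-GCM via the third-party "cryptography" package.
# (Assumed dependency: pip install cryptography. Illustrative only.)
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # 256-bit key: ~128-bit strength even against Grover
aesgcm = AESGCM(key)
nonce = os.urandom(12)                     # 96-bit nonce, the standard size for GCM

# Encrypt a long-lived record; the third argument is authenticated metadata.
ciphertext = aesgcm.encrypt(nonce, b"long-lived health record", b"record-id:42")
plaintext = aesgcm.decrypt(nonce, ciphertext, b"record-id:42")
assert plaintext == b"long-lived health record"
```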

Or they may simply decide to adopt a wait and see attitude. EY will no doubt be on hand to sell consultancy services to help clarify their thinking.

Global Next Generation Computing Market Size, Share & Industry Trends Analysis Report By Type, By Component, By Offering, By Organization Size, By…

New York, June 14, 2022 (GLOBE NEWSWIRE) -- Reportlinker.com announces the release of the report "Global Next Generation Computing Market Size, Share & Industry Trends Analysis Report By Type, By Component, By Offering, By Organization Size, By End User, By Regional Outlook and Forecast, 2022-2028" - https://www.reportlinker.com/p06283453/?utm_source=GNW

Next-generation computing is also known as high-performance computing, and it employs quantum computing technologies, processing data with quantum bits rather than the binary bits of traditional computers. Furthermore, next-generation computing is more capable of performing complex computations than traditional computers, which is a primary driving force behind the market's expansion. It is also used in aerospace and defence, banking and financial services, healthcare and life science, energy and utilities, manufacturing, information technology and telecommunications, and other fields.

The next-generation computing industry is driven by several factors, including increased investments in next-generation computing technology, increased demand for high-performance computing, and increased demand from the scientific and capital sectors. Over the forecast period, factors such as rising expenditures in artificial intelligence (AI), the Industrial Internet of Things (IIoT), and engineering fields that involve electronic design automation (EDA) are expected to boost the market.

Without the necessary tools and advanced technologies, meeting the escalating need for short product life cycles (PLCs) and maintaining consistent quality in real time becomes nearly impossible. Various sectors, such as automotive and medical robotics, are using next-generation computing systems with computer-aided engineering programs for high-fidelity modelling and simulation.

Machine learning (ML), physical modelling, and optimization in a variety of industrial applications, including financial modelling and life-science simulation, are just a few examples of how next-generation computing may help solve complicated problems quickly. In addition, regulatory standards for energy consumption, sustainability, and safety, along with cost pressure, are at an all-time high around the world and rising rapidly, leading to increased complexity for development engineers.

COVID-19 Impact

The market for next-generation computing has grown in recent years, but following the outbreak of the COVID-19 pandemic, the sector saw a minor fall in software sales in 2020. This is because governments in the majority of countries put their populations on lockdown and shut down cities to prevent the virus from spreading. Following the recovery from the COVID-19 pandemic, the next-generation computing sector is expected to thrive in the coming years. Moreover, several firms across Asia are implementing modern computing technologies to improve their business processes and operational efficiency. Furthermore, several countries have deployed quantum computing applications and solutions for their health and life sciences operations while working to limit the spread of the virus among the general population.

Market Growth Factors

High demand in the science and healthcare sectors

Advances in genetics and personalized medicine, the mass adoption of electronic health records (EHRs) and digital imaging, and the growing proliferation of medical IoT and mobile devices have resulted in massive growth of structured and unstructured healthcare-related data. The healthcare business has been at the cutting edge of technology adoption for the past two decades, due to the rising need for data analysis. Furthermore, one of the primary elements that complemented the acceptance of powerful computational solutions in the industry was the necessity to hasten drug development and genomics-related research. The use of AI in the medical field to assist healthcare professionals with diagnoses has been a big enabler for the adoption of these technologies in clinical settings.

Innovations in next-generation computing technology

The rise of next-generation computing technologies such as high-performance processing and quantum technology, as well as the continuous advances seen by major sectors, are driving market expansion. For example, industry heavyweights like NASA, Lockheed Martin, Goldman Sachs Group, and various government agencies are investing in this technology's research and development. In another example, Google LLC teamed up with NASA and Oak Ridge National Laboratory in October 2019 to create the greatest quantum information service in the world. Sandia National Laboratories will also receive funding from the US Department of Energy's Innovative Scientific Computing Research Programs.

Market Restraining Factors

Lack of Utilization by SMEs

Many SMEs are unaware of the importance of next-generation computing and lack the financial resources to put up such systems. Due to the high investment costs, many SMEs in underdeveloped countries are still hesitant to embrace next-generation computing. Many of them are ignorant of its numerous benefits, such as improved performance and customized delivery. These businesses also lack the skills and know-how required to set up and maintain a next-generation computing system. As a result, the lack of awareness among SMEs hinders market expansion. Cloud computing, on the other hand, has the potential to increase adoption among SMEs by significantly lowering prices.

Type Outlook

Based on Type, the market is segmented into High Performance Computing, Quantum Computing, Energy Efficiency Computing, Memory Based Computing, Approximate & Probabilistic Computing, Brain Type Computing, Optical Computing, Thermodynamic Computing, and Others. The high performance computing segment acquired the highest revenue share in the Next-Generation Computing Market in 2021. Parallel computing and supercomputing techniques, processing algorithms, and systems are used in high-performance computing to address complicated computational problems. Next-generation computing uses a variety of approaches, such as computer modelling, simulation, and analysis, to solve complex computational problems and conduct research while allowing multiple users to access computing resources at the same time.

Component Outlook

Based on Component, the market is segmented into Hardware, Software, and Services. The software segment witnessed a substantial revenue share in the Next-Generation Computing Market in 2021. The implementation of this software improves customer satisfaction in large verticals such as IT & telecommunications, BFSI, and healthcare, maximizing value for existing customers while lowering operating costs. This supports demand for the solutions required to properly manage the software.

Offering Outlook

Based on Offering, the market is segmented into On-premise and Cloud. The On-premise segment acquired the maximum revenue share in the Next-Generation Computing Market in 2021. Governments remain keen to secure sensitive data and the private details of citizens, and businesses are worried about the protection of their administrative data. The on-premise model offers a variety of benefits here, including a strong level of data protection and safety; as a result, on-premise infrastructure is preferred over cloud-based technology. In the coming years, such factors are expected to boost the on-premise segment's growth.

Organization size Outlook

Based on Organization size, the market is segmented into Large Enterprises and Small & Medium Enterprises. The small & medium enterprises segment registered a substantial revenue share in the Next-Generation Computing Market in 2021. This is because SMEs are migrating their organizations to a digital platform and implementing next-generation computing solutions, permitting businesses to become more productive, intelligent, and efficient.

End User Outlook

Based on End User, the market is segmented into Government, BFSI & Telecom, Space & Defense, Energy & Power, Chemicals, Healthcare, Academia, and Others. The government segment garnered the highest revenue share in the Next-Generation Computing Market in 2021. The rise of next-generation computing is being fueled by governments investing in breakthrough technologies for military and defense, law enforcement, and regulators such as the Securities and Exchange Commission to detect fraud risk and identify trade infractions. This reflects government and defense agencies' active adoption of cutting-edge IT systems to improve computing efficiency.

Regional Outlook

Based on Region, the market is segmented into North America, Europe, Asia Pacific, and Latin America, Middle East & Africa. North America garnered the largest revenue share in the Next-Generation Computing Market in 2021. To increase their regional coverage and reach, regional market vendors have formed new partnerships with other companies. For example, Graphcore developed its partner programme to expand its ability to reach new customers and help them scale up using its intelligence processing unit (IPU) products. To increase its presence in the North American market, the company added many new partners to its partner network, including Applied Data Systems and Images ET Technologies, among others.

The major strategy followed by market participants is partnerships. Based on the analysis presented in the Cardinal matrix, Google LLC and Microsoft Corporation are the forerunners in the Next-Generation Computing Market. Companies such as IBM Corporation, Oracle Corporation and Intel Corporation are some of the key innovators in the market.

The market research report covers the analysis of key stakeholders of the market. Key companies profiled in the report include IBM Corporation, Atos Group, Cisco Systems, Inc., Hewlett-Packard Enterprise Company, Amazon Web Services, Inc., Microsoft Corporation, Intel Corporation, Oracle Corporation, Google LLC, and Alibaba Group Holding Limited.

Recent strategies deployed in Next-Generation Computing Market

Partnerships, Collaborations and Agreements:

Feb-2022: Hewlett Packard Enterprise joined hands with Ayar Labs, the leader in chip-to-chip optical connectivity. Together, the companies aimed to usher in a new era of data centre innovation by developing silicon photonics solutions based on optical I/O technology. The maturation of these technologies would support future demands for high-performance computing and artificial intelligence solutions.

Feb-2022: Amazon Web Services entered a partnership with Kyndryl, the information technology infrastructure services provider. Together, the companies aimed to combine their skills, expertise, and global resources to help customers modernize their enterprises through industry-focused cloud services and solutions. AWS would provide solutions for Kyndryl's top industry customers across the world, and Kyndryl plans to build out its internal architecture in the cloud, using AWS as a preferred cloud supplier.

Jan-2022: Amazon signed a multi-year agreement with Stellantis, a leading global automaker and mobility provider. Through this agreement, the companies aimed to transform the in-vehicle experience for millions of Stellantis customers and accelerate the mobility industry's transition to a sustainable, software-defined future.

Jan-2022: Oracle Cloud Infrastructure entered a partnership with Syntax, a leading provider of multi-cloud and mission-critical application managed services. Through this partnership, the companies would allow on-premises Oracle E-Business Suite customers to move or extend their solutions by taking advantage of OCI's low price, better performance, enhanced scalability, and broad array of platform services.

Jan-2022: Microsoft joined hands with Qualcomm Technologies, an American multinational corporation. Together, the companies aimed to boost the adoption of augmented reality in both the consumer and business sectors. Qualcomm Technologies is working with Microsoft on several fronts to advance the ecosystem: developing custom AR chips to enable a new wave of power-efficient, lightweight AR glasses that deliver rich, immersive experiences, and planning to integrate software such as Microsoft Mesh and the Snapdragon Spaces XR Developer Platform.

Nov-2021: Amazon Web Services formed a partnership with Nasdaq, an online global marketplace for buying and trading securities. Through this partnership, the companies aimed to build the next generation of cloud-enabled infrastructure for the world's capital markets, helping to boost innovation and enhance enterprise processes.

Nov-2021: IBM formed a partnership with Amazon Web Services, a subsidiary of Amazon providing on-demand cloud computing platforms. Together, the companies aimed to bring the advantages of IBM Open Data for Industries for IBM Cloud Pak for Data to the AWS Cloud to serve energy customers. The solution would run on the AWS Cloud and streamline customers' ability to run workloads in the AWS cloud and on-premises. Moreover, the companies would collaborate on further development of future functionality to offer better flexibility and choice on where to run OSDU applications.

Nov-2021: Google Cloud joined hands with Genesys Telecommunications, a full-service contact centre solution provider. Together, the companies aimed to create new next-generation AI, data analytics and machine learning applications that would help enterprises provide stronger, more intuitive and proactive experiences.

Oct-2021: Cisco Systems extended its partnership with Tata Communications, an Indian telecommunications company. Together, the companies aimed to equip enterprises with IT infrastructure that is simple to manage, deploy, and analyse, providing anytime, anywhere access. Cisco Meraki technology will combine with the Tata Communications environment to provide a leading portfolio of next-generation cloud-managed Wi-Fi services, based on advanced Wi-Fi 6 technology, and SD-WAN services across multiple enterprises. The integrated expertise assures smooth lifecycle management and an improved customer experience for the company's stakeholders, with greater efficiency, security, and agility.

Jul-2021: Google Cloud entered into a partnership with AT&T, an American multinational conglomerate holding company. Through this partnership, the companies aimed to provide transformative capabilities that help enterprises drive real value and build industry-changing experiences in healthcare, retail, entertainment, manufacturing, and more, with the ability to use Android, Google Maps, augmented reality, Pixel, virtual reality, and other Google solutions for more immersive consumer experiences.

Jun-2021: Google Quantum AI formed a partnership with Boehringer Ingelheim, a world-leading contract manufacturer of biopharmaceuticals. Together, the companies aimed to implement and research cutting-edge use cases for quantum computing in pharmaceutical research and development, particularly molecular dynamics simulations.

Mar-2021: Intel teamed up with IBM, an American multinational technology corporation. Through this collaboration, the companies aimed to boost semiconductor manufacturing innovation across the ecosystem, address the challenges facing the U.S. semiconductor industry and support key U.S. government initiatives.

Product Launches and Product Expansions:

Apr-2022: IBM introduced IBM z16, the first system with an integrated on-chip AI accelerator. The chip delivers latency-optimized inferencing designed to allow customers to analyze real-time transactions at scale on critical applications. IBM z16 is also designed to protect customers from "harvest now, decrypt later" attacks as the company's first quantum-safe system.

Feb-2022: Atos introduced the BullSequana XH3000, a new exascale-class supercomputer. The hybrid computing platform provides unmatched flexibility and performance to allow top researchers and scientists to advance research in sectors including weather forecasting and climate change, genomics, and new drug discovery. The BullSequana XH3000 is designed and manufactured in Europe at Atos' factory in Angers, France; it is Atos' most powerful and efficient supercomputer and a crucial element in securing digital and economic sovereignty.

Nov-2021: Oracle introduced Oracle Cloud Infrastructure AI services, a collection of services that make it easier for developers to apply AI to applications without requiring data science expertise.

Oct-2021: Intel introduced the 12th Gen Intel Core family, led by the Intel Core i9-12900K. The new processors feature a performance hybrid architecture that delivers leaps in multi-threaded performance, allowing up to two times faster content creation compared to the prior generation.

Sep-2021: Oracle introduced the Oracle Exadata X9M platforms, billed as the industry's fastest and most affordable systems. The Exadata X9M portfolio comprises Oracle Exadata Database Machine X9M and Exadata Cloud@Customer X9M, the only platform that runs Oracle Autonomous Database in customer data centres.

May-2021: IBM introduced a 2-nanometer nanosheet semiconductor technology. Such chips play a crucial role in everything from computing to communication devices, appliances, transportation systems, and critical infrastructure.

May-2021: Google introduced its next-generation Tensor Processing Unit AI chips. The fourth generation of the custom chip is twice as fast as the previous one.

Mar-2021: Cisco introduced a new portfolio of networking systems. The suite offers three core elements: full-stack visibility of network applications, an expanded secure access service edge architecture, and new network-as-a-service solutions designed to simplify IT and give flexible access to customers looking for speed, scale, and agility. The portfolio is designed to let customers build enterprise infrastructure for the new world defined by the Covid-19 pandemic.

Acquisitions and Mergers:

Mar-2022: Google signed an agreement to acquire Mandiant, a publicly traded American cybersecurity firm. The acquisition of Mandiant would complement Google Cloud's existing strengths in security. Google Cloud already provides customers with a robust set of services, including advanced capabilities such as BeyondCorp Enterprise for Zero Trust and VirusTotal for malicious content and software vulnerabilities.

Feb-2022: Intel Corporation agreed to acquire Tower Semiconductor, a leading foundry for analogue semiconductor solutions. This acquisition would advance Intel's IDM 2.0 strategy as the company further expands its manufacturing capabilities, global footprint and technology portfolio to address unprecedented industry demand.

Oct-2021: Atos acquired DataSentics, a technology consultancy specialising in data science. Through this acquisition, Atos would improve its AI/ML and computer vision offering with new AI-intensive products and data science capabilities, welcoming approximately 100 highly skilled AI/ML engineers and data scientists.

Nov-2020: IBM completed the acquisition of Instana, a leading enterprise application performance monitoring and observability platform. The acquisition aimed to help enterprises better manage the complexity of modern applications that span the hybrid cloud landscape.

Mar-2020: Microsoft completed the acquisition of Affirmed Networks, a provider of cloud-native networking solutions for telecom operators. With the acquisition, the company aimed to serve large cloud providers looking to push deeper into the telco business.

Scope of the Study

Market Segments covered in the Report:

By Type

High Performance Computing

Quantum Computing

Energy Efficiency Computing

Memory Based Computing

Approximate & Probabilistic Computing

Brain Type Computing

Optical Computing

Thermodynamic Computing

Others

By Component

Hardware

Software

Services

By Offering

On-premise

Cloud

By Organization size

Large Enterprises

Small & Medium Enterprises

By End User

Government

BFSI & Telecom

Space & Defense

Energy & Power

Chemicals

Healthcare

Academia

Others

By Geography

North America

o US

o Canada

o Mexico

o Rest of North America

Europe

o Germany

o UK

o France

o Russia

o Spain

What’s So Great About Quantum Computing? A Q&A with NIST Theorist Alexey Gorshkov – HPCwire

The following is a Q&A originally published on Taking Measure, the official blog of the National Institute of Standards and Technology (NIST). Photo credit: NIST.

As the rise of quantum computers becomes the subject of more and more news articles, especially those that prophesy these devices' ability to crack the encryption that protects secure messages, such as our bank transfers, it's illuminating to speak with one of the quantum experts who is actually developing the ideas behind these as-yet-unrealized machines. Whereas ordinary computers work with bits of data that can be either 0 or 1, quantum computers work with bits called qubits that can be 0 and 1 simultaneously, enabling them to perform certain functions exponentially faster, such as trying out the different keys that can break encryption.
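
To make "0 and 1 simultaneously" concrete, here is a minimal state-vector sketch in Python with NumPy. It is a pen-and-paper illustration of a single qubit, not code for real quantum hardware:

```python
import numpy as np

# A qubit starts in |0>; a Hadamard gate puts it in an equal superposition
# of 0 and 1, so each measurement outcome then has probability 0.5.
ket0 = np.array([1.0, 0.0])                   # the classical-like "0" state
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate
psi = H @ ket0                                # superposed state (|0> + |1>)/sqrt(2)

probs = np.abs(psi) ** 2                      # Born rule: amplitude -> probability
print(probs)                                  # [0.5 0.5] -- "0 and 1" until measured
```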

Simple quantum computers already exist, but it has been extremely challenging to build powerful versions of them. That's because the quantum world is so delicate; the tiniest disturbances from the outside world, such as stray electrical signals, can cause a quantum computer to crash before it can carry out useful calculations.

National Institute of Standards and Technology (NIST) public affairs specialist Chad Boutin interviewed Alexey Gorshkov, a NIST theorist at the NIST/University of Maryland Joint Center for Quantum Information and Computer Science (QuICS) and Joint Quantum Institute, who works at the intersection of physics and computer science research. His efforts are helping in the design of quantum computers, revealing what capabilities they might possess, and showing why we all should be excited about their creation.

We all hear about quantum computers and how many research groups around the world are trying to help build them. What has your theoretical work helped clarify about what they can do and how?

I work on ideas for quantum computer hardware. Quantum computers will be different from the classical computers we all know, and they will use memory units called qubits. One thing I do is propose ideas for various qubit systems made up of different materials, such as neutral atoms. I also talk about how to make logic gates, and how to connect qubits into a big computer.

Another thing my group does is propose quantum algorithms: software that one can potentially run on a quantum computer. We also study large quantum systems and figure out which ones have promise for doing useful computations faster than is possible with classical computers. So, our work covers a lot of ground, but there's a lot to do. You have this big, complicated beast in front of you and you're trying to chip away at it with whatever tools you have.

You focus on quantum systems. What are they?

I usually start by saying, at very small scales the world obeys quantum mechanics. People know about atoms and electrons, which are small quantum systems. Compared to the big objects we know, they are peculiar because they can be in two seemingly incompatible states at once, such as particles being in two places at the same time. The way these systems work is weird at first, but you get to know them.

Large systems, made up of a bunch of atoms, are different from individual particles. Those weird quantum effects we want to harness are hard to maintain in bigger systems. Let's say you have one atom that's working as a quantum memory bit. A small disturbance like a nearby magnetic field has a chance of causing the atom to lose its information. But if you have 500 atoms working together, that disturbance is 500 times as likely to cause a problem. That's why classical physics worked well enough for so many years: because classical effects overwhelm weird quantum effects so easily, usually classical physics is enough for us to understand the big objects we know from our everyday life.
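
Putting rough numbers on that "500 times as likely" intuition: if each atom is independently disturbed with some small probability, the chance that at least one atom loses its information grows almost linearly with the number of atoms. The per-atom probability below is an assumed, illustrative figure, not one from the interview:

```python
# If one atom fails with probability p, the chance that at least one of
# N independent atoms fails is 1 - (1 - p)**N, roughly N*p for small N*p.
p = 0.001  # assumed per-atom disturbance probability (illustrative)
for n in (1, 500):
    p_any = 1 - (1 - p) ** n
    print(n, round(p_any, 4))  # 1 -> 0.001, 500 -> ~0.3936
```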

What we're doing is trying to understand and build large quantum systems that stay quantum, something we specialists call coherent, even when they are large. We want to combine lots of ingredients, say 300 qubits, and yet ensure that the environment doesn't mess up the quantum effects we want to harness. Large coherent systems that are not killed by the environment are hard to create or even simulate on a classical computer, but coherence is also what will make the large systems powerful as quantum computers.
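
The claim that such systems are hard even to simulate classically follows from simple bookkeeping: describing n qubits exactly takes 2^n complex amplitudes. A short Python calculation shows how quickly that explodes:

```python
# Memory needed to store an n-qubit state vector at 16 bytes per complex
# amplitude (two 64-bit floats). The qubit counts are illustrative.
for n in (30, 50, 300):
    amplitudes = 2 ** n
    bytes_needed = amplitudes * 16
    print(f"{n} qubits: {amplitudes:.3e} amplitudes, ~{bytes_needed:.3e} bytes")
# 30 qubits fit in ~17 GB of RAM; 300 qubits would need ~3e91 bytes,
# vastly more than any classical machine could ever hold.
```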

What is compelling about a large quantum system?

One of the first motivations for trying to understand large quantum systems is potential technological applications. So far quantum computers haven't done anything useful, but people think they will very soon, and it's very interesting. A quantum internet would be a secure internet, and it also would allow you to connect many quantum computers to make them more powerful. I'm fascinated by these possibilities.

It's also fascinating because of fundamental physics. You try to understand why this system does some funny stuff. I think a lot of scientists just enjoy doing that.

Why are you personally so interested in quantum research?

I got my first exposure to it after my junior year in college. I quickly found it has a great mix of math, physics, computer science and interactions with experimentalists. The intersection of all these fields is why it's so much fun. I like seeing the connections. You end up pulling an idea from one field and applying it to another and it becomes this beautiful thing.

Lots of people worry that a quantum computer will be able to break all our encryption, revealing all our digitized secrets. What are some less worrying things they might be able to do that excite you?

Before I get into what excites me, let me say first that it's important to remember that not all of our encryption will break. Some encryption protocols are based on math problems that will be vulnerable to a quantum computer, but other protocols aren't. NIST's post-quantum cryptography project is working on encryption algorithms that could foil a quantum computer.

As for what excites me, lots does! But here are a couple of examples.

One thing we can do is simulation. We might be able to simulate really complicated things in chemistry, materials science and nuclear physics. If you have a big complex chemical reaction and you want to figure out how it's taking place, you have to be able to simulate a big molecule that has lots of electrons in a cloud around it. It's a mess, and it's hard to study. A quantum computer can in principle answer these questions. So maybe you could use it to find a new drug.

Another possibility is finding better solutions to what are called classical optimization problems, which give classical computers a lot of trouble. An example is: What are more efficient ways to direct shipments in a complex supply chain network? It's not clear whether quantum computers will be able to answer this question any better than classical computers, but there's hope.

A follow-up to the previous question: If quantum computers aren't actually built yet, how do we know anything about their abilities?

We know, or think we know, the microscopic quantum theory that qubits rely on, so if you put these qubits together, we can describe their capabilities mathematically, and that tells us what quantum computers might be able to do. It's a combination of math, physics and computer science. You just use the equations and go to town.

There are skeptics who say that there might be effects we don't know about yet that would destroy the ability of large systems to remain coherent. It's unlikely that these skeptics are right, but the way to disprove them is to run experiments on larger and larger quantum systems.

Are you chasing a particular research goal? Any dreams you'd like to realize someday, and why?

The main motivation is a quantum computer that does something useful. We're living in an exciting time. But another motivation is just having fun. As a kid in eighth grade, I would try to solve math problems for fun. I just couldn't stop working on them. And as you have fun, you discover things. The types of problems we are solving now are just as fun and exciting to me.

Lastly, why NIST? Why is working at a measurement lab on this research so important?

Quantum is at the heart of NIST, and its people are why. We have top experimentalists here, including multiple Nobel laureates. NIST gives us the resources to do great science. And it's good to work for a public institution, where you can serve society.

In many ways, quantum computing came out of NIST and measurement: It came out of trying to build better clocks. Dave Wineland's work with ions is important here. Jun Ye's work with neutral atoms is too. Their work led to the development of amazing control over ions and neutral atoms, and this is very important for quantum computing.

Measurement is at the heart of quantum computing. An exciting open question that lots of people are working on is how to measure the quantum advantage, as we call it. Suppose someone says, "Here is a quantum computer." Just how big is its advantage over a classical computer? We're proposing how to measure that.

The race toward a new computing technology is heating up and Asia is jumping on the trend – CNBC

A quantum computer in a vibration-free building. Quantum computing will ultimately speed up the computational power that drives many industries and could affect everything from drug discovery to how data is secured.

Quantum computing was already gathering pace in Japan and elsewhere in Asia when the University of Tokyo and IBM launched their new quantum computer last year.

The computer was the second such system built outside the United States by IBM, the latest in a string of key moves in quantum research.

The university and IBM have led the Quantum Innovation Initiative Consortium alongside heavyweights of Japanese industry like Toyota and Sony, all with a view to nailing the quantum question.

Quantum computing refers to the use of quantum mechanics to run calculations. Quantum computers can run multiple processes at once by using quantum bits, unlike the binary bits that power traditional computing.

The new technology will ultimately speed up the computational power that drives many industries and could affect everything from drug discovery to how data is secured. Several countries are racing to get quantum computers fully operational.

Christopher Savoie, CEO of quantum computing firm Zapata, who spent much of his career in Japan, said technological development has been very U.S.-centric. But now, Asian nations don't want to be left behind on quantum computing, he added.

"Nation states like India, Japan and China are very much interested in not being the only folks without a capability there. They don't want to see the kind of hegemony that's arisen where the large cloud aggregators by and large are only US companies," Savoie said, referring to the likes of Amazon Web Services and Microsoft Azure.

China, for example, has committed a great deal of brainpower to the quantum race. Researchers have touted breakthroughs and debates are simmering over whether China has surpassed the U.S. on some fronts.

India, for its part, announced plans earlier this year to invest $1 billion in a five-year plan to develop a quantum computer in the country.

James Sanders, an analyst at S&P Global Market Intelligence, told CNBC that governments around the world have been taking more interest in quantum computing in recent years.

In March, Sanders published a report that found governments have pledged around $4.2 billion to support quantum research. Notable examples include South Korea's $40 million investment in the field and Singapore's Ministry of Education funding a research center, the Centre for Quantum Technologies.

All of these efforts have a long lens on the future. And for some, the benefits of quantum can seem nebulous.

According to Sanders, the benefits of quantum computing aren't going to be immediately evident for everyday consumers.

"On a bad day, I'm talking people down from the idea of quantum cell phones. That's not realistic, that's not going to be a thing," he said.

"What is likely to happen is that quantum computers will wind up utilized in designing products that consumers eventually buy."

There are two major areas where quantum's breakthrough will be felt: industry and defense.

A staff member of tech company Q.ant puts a chip for quantum computing in a test station in Stuttgart, Germany, on Sept. 14, 2021. It's expected that the power of quantum computing will be able to decrypt RSA encryption, one of the most common encryption methods for securing data.

"Areas where you have HPC [high-performance computing] are areas where we will be seeing quantum computers having an impact. It's things like material simulation, aerodynamic simulation, these kinds of things, very high, difficult computational problems, and then machine learning artificial intelligence," Savoie said.

In pharmaceuticals, traditional systems for calculating the behavior of drug molecules can be time-consuming. The speed of quantum computing could rapidly increase these processes around drug discovery and, ultimately, the timeline for drugs coming to market.

On the flip side, quantum could present security challenges. As computing power advances, so too does the risk to existing security methods.

"The longer-term [motivation] but the one that that everyone recognizes as an existential threat, both offensively and defensively, is the cryptography area. RSA will be eventually compromised by this," Savoie added.

RSA refers to one of the most common encryption methods for securing data, developed in 1977, that could be upended by quantum's speed. It is named after its inventors Ron Rivest, Adi Shamir and Leonard Adleman.

"You're seeing a lot of interest from governments and communities that don't want to be the last people on the block to have that technology because [other nations] will be able to decrypt our messages," Savoie said.

Magda Lilia Chelly, chief information security officer at Singaporean cybersecurity firm Responsible Cyber, told CNBC that there needs to be a twin track of encryption and quantum research and development so that security isn't outpaced.

"Some experts believe that quantum computers will eventually be able to break all forms of encryption, while others believe that new and more sophisticated forms of encryption will be developed that cannot be broken by quantum computers," Chelly said.

A quantum processor on a prototype of a quantum computer. There needs to be a twin track of encryption and quantum research and development so that security isn't outpaced, said Magda Lilia Chelly, chief information security officer at Singaporean cybersecurity firm Responsible Cyber.

"In particular, [researchers] have been looking at ways to use quantum computers to factor large numbers quickly. This is important because many of the modern encryption schemes used today rely on the fact that it is very difficult to factor large numbers," she added.

If successful, this would make it possible to break most current encryption schemes and unlock messages that were thought to be securely encrypted.
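
A toy example makes the connection plain. The sketch below, in Python with deliberately tiny, insecure numbers, builds an RSA key pair and then "breaks" it by factoring the public modulus; Shor's algorithm on a large quantum computer would make that factoring step fast even for real 2,048-bit moduli:

```python
# Toy RSA (insecure sizes, illustration only): the public key is n = p*q,
# and anyone who factors n can recompute the private key.
p, q, e = 61, 53, 17
n = p * q                        # public modulus (3233)
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)              # private exponent (modular inverse, Python 3.8+)

m = 42
c = pow(m, e, n)                 # encrypt with the public key
assert pow(c, d, n) == m         # decrypt with the private key

# "Attack": factor n by trial division -- trivial here, classically
# infeasible at real key sizes, and exactly the step Shor's speeds up.
f = next(i for i in range(2, n) if n % i == 0)
d_stolen = pow(e, -1, (f - 1) * (n // f - 1))
assert pow(c, d_stolen, n) == m  # attacker now decrypts everything
```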

Sanders said the development and eventual commercialization of quantum computing will not be a straight line.

Issues like the threat to encryption can garner attention from governments, but research and breakthroughs, as well as mainstream interest, can be "stop-start," he said.

Progress can also be affected by the fluctuating interest of private investors, as quantum computing won't deliver a quick return on investment.

"There are a lot of situations in this industry where you might have a lead for a week and then another company will come out with another type of the advancement and then everything will go quiet for a little bit."

Another looming challenge for quantum research is finding the right talent with specific skills for this research.

"Quantum scientists that can do quantum computing don't grow on trees," Savoie said, adding that cross-border collaboration is necessary in the face of competing government interests.

"Talent is global. People don't get to choose what country they're born in or what nationality they have."

What is the Orca PT-1 computer and how does quantum computing work? – The National

Britain's Ministry of Defence said on Thursday it will work with UK tech firm Orca Computing to investigate the scope to apply quantum technology in defence.

Here, The National explains what quantum computing is and why the UK MoD has agreed to work with Orca.

The pioneer of quantum computing was Paul Benioff of Argonne National Labs, who in the early 1980s theorised the possibility of designing a computer based exclusively on quantum theory.

In the simplest of terms, quantum computing is extremely high-performance computing, so high-performing that it has the potential to revolutionise global industry.

Normal computers process data in bits, which have a binary value of zero or one. Quantum computers, by contrast, use a two-state unit called a qubit, which can represent both values simultaneously.

This means that quantum computers have far greater processing power than their regular counterparts.

This extra processing power comes at a hefty cost. Quantum computing firm SEEQC says "a single qubit costs around $10,000 and needs to be supported by a host of microwave controller electronics, coaxial cabling and other materials that require large controlled rooms in order to function".

It estimates that in terms of pure hardware, "a useful quantum computer costs tens of billions of dollars to build".

Orca Computing is a 2-year-old UK company that is seeking to scale and integrate quantum computers with real-world technology.

This is a challenge, as qubits must be kept at extremely cold temperatures or they will become unstable.

However, Orca says it has found an alternative to conventional quantum computing whereby its software allows small-scale photonic processors to use single units of light to power the process at room temperature.

Investors have been persuaded by Orca's Series A funding round, raising $15 million and attracting investment from the likes of Octopus Ventures, Oxford Science Enterprises, Quantonation and Verve Ventures.

The UK's MoD is also seemingly persuaded by Orca's proposition, having agreed to work in concert with the firm to develop future data-processing capabilities, using Orca's small PT-1 quantum computer.

"Our partnership with MoD gives us the type of hands-on, close interaction, working with real hardware which will help us to jointly discover new applications of this revolutionary new technology," said Richard Murray, chief executive of Orca Computing.

Stephen Till, of the MoD's science and technology lab, said access to the PT-1 would accelerate his ministry's understanding of the technology.

"We expect the Orca system to provide significantly improved latency the speed at which we can read and write to the quantum computer," he said.

Mphasis accelerates the world-leading Quantum Computing Ecosystem in partnership with the University of Calgary and the Government of Alberta – Yahoo…

~ The Quantum Lab is set to accelerate the development of quantum skills in the city to enable job creation

CALGARY, AB, June 9, 2022 /CNW/ -- Mphasis (BSE: 526299; NSE: MPHASIS), an Information Technology (IT) solutions provider specializing in cloud and cognitive services, today joined the Government of Alberta and the University of Calgary to announce the launch of the world-leading Quantum City in Canada. Quantum City will further establish Alberta as a leading technology hub and will accelerate the development of the quantum ecosystem in Calgary.

The partnership aims to utilize the synergy between academia, industry, and government to put the process of ideation to market at the forefront. This will include assessment, consulting, and joint development of quantum computing solutions, along with exploring possible industry solutions in the areas of machine learning, optimization, simulation, and cryptography, among others. Additionally, to enable capability building in next-gen technologies, the joint design and development of an industry-focused quantum computing curriculum, leveraging Mphasis' TalentNEXT training framework at the University of Calgary, will help build an industry-ready workforce to operationalize the development and delivery of quantum solutions for real-world problems.

Further, Mphasis will help with Go-To-Market activities through its sales, partner, and analyst channels for commercialization and adoption of Quantum computing solutions by the public and private sectors developed under this partnership. Mphasis has built a host of industry-focused IPs in areas including AI and Quantum computing and will extend those to the University of Calgary to jumpstart innovation and ideation. The collaboration will also accelerate the university's innovation ecosystem to build a quantum start-up incubation center.

Quantum City will cultivate a national network of researchers, spur economic, technological, and infrastructural development, and act as a focal point for attracting talent through high-quality mentoring, training, and skills development in the country. The center will focus on developing cutting-edge solutions, collaborations, and skills-building initiatives in crucial areas of research such as Health, Energy, Environment, Agriculture & Food, Clean Tech, Oil & Gas, Social Sciences, Space, Finance, Logistics, and Transportation, etc.

"Alberta's tech sector is one of the fastest growing in the world, and that is thanks to the ingenuity, know-how, and hard work of Alberta's innovators and job creators. With this new support, the University of Calgary and its partners will play a key role in making Alberta a world-renowned technology and innovation hub diversifying our economy today to create more jobs tomorrow," said the HonorableJason Kenney, Premier of Alberta, Canada.

"The partnership in quantum computing will foster economic growth and job creation in the region. Our aim is to leverage our engineering DNA to help advance the adoption of digital technologies and talent for the future. Mphasis seeks to strengthen and contribute to Alberta's quantum computing, machine learning, and artificial intelligence ecosystems. Our aim is to help organizations harness the power of rapidly advancing digital technologies to gain competitive advantage and advance their business strategies," said Nitin Rakesh, Chief Executive Officer and Managing Director, Mphasis.

The collaboration between Mphasis and the University of Calgary will also focus on hosting quantum consulting workshops for use case identification, assessment, and infrastructure requirements. In addition, Mphasis will identify several industry partners and prospective clients for the commercialization of jointly developed quantum solutions.

"Quantum City is a leading example of how world-class talent, investment, and advanced technology are coming into Calgary. Calgary's economy will grow and diversify as a result of the technologies developed through Quantum City. The University of Calgary will offer deep research expertise that can bring innovations to life and will reap immense benefits for people and societies. UCalgary is excited to be partnering with Mphasis and the Government of Alberta to leapfrog towards the future of innovation," said Ed McCauley, President and Vice-Chancellor, University of Calgary.

"Our collaboration with the University of Calgary and Govt. of Alberta will enable us to tap into quantum computing's enormous potential, allowing us to create cutting-edge capabilities and talents for the future. The opening of our center illustrates our commitment to bringing the most creative, game-changing solutions to market and to investing in skills early on, to stay ahead of the curve. With the world on the verge of a new age of computing, Quantum Computers will soon be able to tackle issues that were previously unsolvable by traditional computers. Building a unified quantum computing business demands a concerted effort to grow the ecosystem across industries, which is what our partnership aims to do. The Quantum Computing Centre is another example of our dedication to fostering open innovation ecosystems to address the big problems of our time," said Rohit Jayachandran, Senior Vice President and Head Strategic Accounts, Mphasis.

The center will work in alignment with the goals of the government and harness the computational technology for traffic management, vehicle routing, financial services portfolio, and social network analysis. The confluence of machine learning and quantum simulation & modeling will be utilized for supply chain demand prediction, anomaly detection, drug development, human mobility modeling, cybersecurity, and climate modeling.

Mphasis is at the forefront of leveraging the power of quantum computing to solve complex business problems in areas such as machine learning, optimization, and simulation. As a pioneer in delivering AI/ML solutions, Mphasis foresees quantum computing as a major driver in solving clients' business problems. The Mphasis EON (Energy Optimized Network) quantum computing framework is a patent-pending framework that overcomes the limitation of quantum computing systems being unable to work on varied input datasets. It consists of quantum-assisted machine learning, quantum circuit, and deep neural network layers.

About Mphasis

Mphasis' purpose is to be the "Driver in a Driverless Car" for global enterprises by applying next-generation design, architecture, and engineering services to deliver scalable and sustainable software and technology solutions. Customer-centricity is foundational to Mphasis, and it is reflected in Mphasis' Front2Back Transformation approach. Front2Back uses the exponential power of cloud and cognitive computing to provide a hyper-personalized (C=X2C2™=1) digital experience to clients and their end customers. Mphasis' Service Transformation approach helps 'shrink the core' through the application of digital technologies across legacy environments within an enterprise, enabling businesses to stay ahead in a changing world. Mphasis' core reference architectures and tools, speed and innovation with domain expertise and specialization, combined with an integrated sustainability and purpose-led approach across its operations and solutions, are key to building strong relationships with marquee clients. (BSE: 526299; NSE: MPHASIS)

About the University of Calgary

Founded in 1966, the University of Calgary is a broad-based research university with several campuses in Calgary and the surrounding area. It is one of the highest-ranked universities in North America. About 35,000 students experience an innovative learning environment here, made rich by research and hands-on experiences. For more information, visit ucalgary.ca. Stay up to date with UCalgary's news headlines on Twitter @UCalgary. For access to news releases, details on faculties, and experts, go to our media center at ucalgary.ca/newsroom.

SOURCE Mphasis

Quantum Computing Inc. Unveils Software Built to Expand Quantum Processing Power By Up to 20x – insideHPC

LEESBURG, Va., June 07, 2022 -- Quantum Computing Inc. today unveiled QAmplify, a suite of quantum software technologies designed to expand the processing power of current quantum computers by up to 20x. QAmplify is intended to supercharge any quantum computer to solve business problems today. The company is actively working with customers and partners in scaling the amplification capabilities of its ready-to-run Qatalyst software, which is designed to eliminate the need for complex quantum programming and runs seamlessly across a variety of quantum computers. QCI has filed for patents on QAmplify technology.

Currently there are two primary technology approaches spanning the Quantum Processing Unit (QPU) hardware landscape: gate-model (e.g., IBM, IonQ, Rigetti, OQC) and annealing (e.g., D-Wave) quantum computers. Both are limited in the size of problems (i.e., the number of variables and the complexity of computations) they can process. For example, gate-model machines can typically process 10 to 120 data variables, and annealing machines can process approximately 400 variables in a simple problem set. These small problem sets restrict the size of the problems that today's QPUs can solve, limiting businesses' ability to explore the value of quantum computing.

QCI's patent-pending QAmplify suite of QPU-expansion software technologies overcomes these challenges, dramatically increasing the problem-set size that each type of machine can process. The gate-model expansion's demonstrated capabilities have been benchmarked at a 500 percent (5x) increase, and the annealing expansion has been benchmarked at up to a 2,000 percent (20x) increase.

QAmplify maximizes end-user investment in current QPUs by allowing quantum users to move from science experiments to solving real-world problems without waiting for the quantum hardware industry to catch up. In terms of real-world applications, this means that an IBM quantum computer with QAmplify could solve a problem with over 600 variables, versus the current limit of 127, and a D-Wave annealing computer with QAmplify could solve an optimization with over 4,000 variables, versus the current limit of 200 for a dense matrix problem set.
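
As a rough sanity check on those figures, the arithmetic is simple multiplication of today's variable limits by the benchmarked amplification factors. The sketch below (plain Python, not QCI's software) reproduces the headline numbers; the baseline counts and factors come from the article itself.

```python
# Back-of-the-envelope check of the amplification figures quoted above.
# Baselines and factors are the ones the article cites, nothing more.
QPU_BASELINES = {
    "gate model (IBM Eagle, 127 qubits)": 127,
    "annealer (dense-matrix problem)": 200,
}
AMPLIFICATION = {
    "gate model (IBM Eagle, 127 qubits)": 5,   # benchmarked 500% increase
    "annealer (dense-matrix problem)": 20,     # benchmarked up to 2,000% increase
}

for qpu, base_vars in QPU_BASELINES.items():
    amplified = base_vars * AMPLIFICATION[qpu]
    print(f"{qpu}: {base_vars} -> ~{amplified} variables")
# gate model: 127 -> ~635 variables  (the article says "over 600")
# annealer:   200 -> ~4000 variables (the article says "over 4,000")
```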

"It is central to QCI's mission to deliver practical and sustainable value to the quantum computing industry," said William McGann, Chief Operating and Technology Officer of QCI. "QCI's innovative software solutions deliver expansive compute capabilities for today's state-of-the-art QPU systems and offer great future scalability as those technologies continually advance. The use of our QAmplify algorithm in the 2021 BMW Group Quantum Computing Challenge for vehicle sensor optimization provided proof of performance by expanding the effective capability of the annealer by 20-fold, to 2,888 qubits."

More here:
Quantum Computing Inc. Unveils Software Built to Expand Quantum Processing Power By Up to 20x - insideHPC

Inside how IBM's engineers are designing quantum computers – Vox.com

A few weeks ago, I woke up unusually early in the morning in Brooklyn, got in my car, and headed up the Hudson River to the small Westchester County community of Yorktown Heights. There, amid the rolling hills and old farmhouses, sits the Thomas J. Watson Research Center, the Eero Saarinen-designed, 1960s Jet Age-era headquarters for IBM Research.

Deep inside that building, through endless corridors and security gates guarded by iris scanners, is where the company's scientists are hard at work developing what IBM director of research Dario Gil told me is the next branch of computing: quantum computers.

I was at the Watson Center to preview IBM's updated technical roadmap for achieving large-scale, practical quantum computing. This involved a great deal of talk about qubit count, quantum coherence, error mitigation, software orchestration, and other topics you'd need to be an electrical engineer with a background in computer science and a familiarity with quantum mechanics to fully follow.

I am not any of those things, but I have watched the quantum computing space long enough to know that the work being done here by IBM researchers, along with their competitors at companies like Google and Microsoft and countless startups around the world, stands to drive the next great leap in computing. Which, given that computing is "a horizontal technology that touches everything," as Gil told me, will have major implications for progress in everything from cybersecurity to artificial intelligence to designing better batteries.

Provided, of course, they can actually make these things work.

The best way to understand a quantum computer, short of setting aside several years for grad school at MIT or Caltech, is to compare it to the kind of machine I'm typing this piece on: a classical computer.

My MacBook Air runs on an M1 chip, which is packed with 16 billion transistors. Each of those transistors can represent either the 1 or 0 of binary information at a single time: a bit. The sheer number of transistors is what gives the machine its computing power.

Sixteen billion transistors packed onto a 120.5 sq. mm chip is a lot; TRADIC, the first transistorized computer, had fewer than 800. The semiconductor industry's ability to engineer ever more transistors onto a chip, a trend forecast by Intel co-founder Gordon Moore in the law that bears his name, is what has made possible the exponential growth of computing power, which in turn has made possible pretty much everything else.
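
For a sense of what that growth implies, the back-of-the-envelope calculation below counts the doublings between TRADIC and the M1. The dates are approximations I've assumed for illustration, not figures from the article.

```python
import math

# From TRADIC's ~800 transistors (mid-1950s, an assumed date) to the
# M1's 16 billion (2020) is about 24 doublings, i.e. one doubling
# roughly every 2.7 years -- in the ballpark of Moore's famous cadence.
tradic, m1 = 800, 16_000_000_000
doublings = math.log2(m1 / tradic)
years = 2020 - 1955
print(f"{doublings:.1f} doublings over {years} years "
      f"-> one every {years / doublings:.1f} years")
```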

But there are things classic computers can't do and will never be able to do, no matter how many transistors get stuffed onto a square of silicon in a Taiwan semiconductor fabrication plant (or "fab," in industry lingo). And that's where the unique and frankly weird properties of quantum computers come in.

Instead of bits, quantum computers process information using qubits, which can represent 0 and 1 simultaneously. How do they do that? You're straining my level of expertise here, but essentially qubits make use of the quantum mechanical phenomenon known as superposition, whereby the properties of some subatomic particles are not defined until they're measured. Think of Schrödinger's cat, simultaneously dead and alive until you open its box.
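
For readers who think in code, a toy simulation captures the idea: a qubit is a pair of amplitudes, and measurement collapses it to 0 or 1 with probabilities given by the squared amplitudes. This is a minimal illustrative sketch, not how real hardware or a real simulator library works.

```python
import random

# A single qubit as two amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1. An equal superposition behaves like a fair
# coin that doesn't "decide" until measured -- the software analogue
# of the dead-and-alive cat.
alpha = beta = 2 ** -0.5   # equal superposition

def measure():
    """Collapse to 0 or 1 with probabilities |alpha|^2 and |beta|^2."""
    return 0 if random.random() < abs(alpha) ** 2 else 1

counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure()] += 1
print(counts)   # roughly {0: 5000, 1: 5000}
```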

A single qubit is cute, but things get really exciting when you start adding more. Classic computing power increases linearly with the addition of each transistor, but a quantum computer's power increases exponentially with the addition of each new reliable qubit. That's because of another quantum mechanical property called entanglement, whereby the individual probabilities of each qubit can be affected by the other qubits in the system.

All of which means that the upper limit of a workable quantum computer's power far exceeds what would be possible in classic computing.
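
The exponential claim can be made concrete by counting what a classical machine would need just to keep up: an n-qubit state is described by 2^n complex amplitudes. The sketch below assumes 16 bytes per amplitude, the usual size of a double-precision complex number.

```python
# The classical cost of simulating n qubits: the state vector holds
# 2**n complex amplitudes (assumed 16 bytes each). This is the sense
# in which each added reliable qubit doubles, rather than linearly
# grows, the machine's power.
for n in (10, 30, 50):
    amplitudes = 2 ** n
    print(f"{n} qubits -> {amplitudes:.3g} amplitudes "
          f"(~{amplitudes * 16 / 2**30:.3g} GiB to store)")
# 10 qubits fit in kilobytes; 50 qubits would need ~16 million GiB,
# which is why classical simulation hits a wall.
```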

So quantum computers could theoretically solve problems that a classic computer, no matter how powerful, never could. What kind of problems? How about the fundamental nature of material reality, which, after all, ultimately runs on quantum mechanics, not classical mechanics? (Sorry, Newton.) "Quantum computers simulate problems that we find in nature and in chemistry," said Jay Gambetta, IBM's vice president of quantum computing.

Quantum computers could simulate the properties of a theoretical battery to help design one that is far more efficient and powerful than today's versions. They could untangle complex logistical problems, discover optimal delivery routes, or enhance forecasts for climate science.

On the security side, quantum computers could break cryptography methods, potentially rendering everything from emails to financial data to national secrets insecure, which is why the race for quantum supremacy is also an international competition, one that the Chinese government is pouring billions into. Those concerns helped prompt the White House earlier this month to release a new memorandum to architect national leadership in quantum computing and prepare the country for quantum-assisted cybersecurity threats.

Beyond the security issues, the potential financial upsides could be significant. Companies are already offering early quantum-computing services via the cloud for clients like Exxon Mobil and the Spanish bank BBVA. While the global quantum-computing market was worth less than $500 million in 2020, International Data Corporation projects that it will reach $8.6 billion in revenue by 2027, with more than $16 billion in investments.

But none of that will be possible unless researchers can do the hard engineering work of turning a quantum computer from what is still largely a scientific experiment into a reliable industry.

Inside the Watson building, Jerry Chow, who directs IBM's experimental quantum computer center, opened a 9-foot glass cube to show me something that looked like a chandelier made out of gold: IBM's Quantum System One. Much of the chandelier is essentially a high-tech fridge, with coils that carry superfluids capable of cooling the hardware to a hundredth of a degree Celsius above absolute zero (colder, Chow told me, than outer space).

Refrigeration is key to making IBM's quantum computers work, and it also demonstrates why doing so is such an engineering challenge. While quantum computers are potentially far more powerful than their classic counterparts, they're also far, far more finicky.

Remember what I said about the quantum properties of superposition and entanglement? While qubits can do things a mere bit could never dream of, the slightest variation in temperature or noise or radiation can cause them to lose those properties through something called decoherence.

That fancy refrigeration is designed to keep the system's qubits from decohering before the computer has completed its calculations. The very earliest superconducting qubits lost coherence in less than a nanosecond, while today IBM's most advanced quantum computers can maintain coherence for as many as 400 microseconds. (Each second contains 1 million microseconds.)
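
Those microseconds matter because every gate operation spends part of the coherence budget. The sketch below estimates how many sequential operations fit inside a 400-microsecond window; the 200-nanosecond gate time is my assumption of a typical superconducting gate duration, not a figure from IBM or the article.

```python
# How many sequential operations fit inside a coherence window?
# The 400-microsecond coherence figure is from the article; the gate
# duration is an assumed, typical superconducting-hardware value.
coherence_us = 400      # microseconds of coherence
gate_ns = 200           # assumed gate duration, nanoseconds
budget = coherence_us * 1_000 / gate_ns
print(f"~{budget:.0f} sequential gates before decoherence")   # ~2000
```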

The challenge IBM and other companies face is engineering quantum computers that are less error-prone while scaling the systems beyond thousands or even tens of thousands of qubits to perhaps millions of them, Chow said.

That could be years off. Last year, IBM introduced the Eagle, a 127-qubit processor, and in its new technical roadmap it aims to unveil a 433-qubit processor called the Osprey later this year and a 4,000-plus qubit computer by 2025. By that time, quantum computing could move beyond the experimentation phase, IBM CEO Arvind Krishna told reporters at a press event earlier this month.

Plenty of experts are skeptical that IBM or any of its competitors will ever get there, raising the possibility that the engineering problems presented by quantum computers are simply too hard for the systems to ever be truly reliable. "What's happened over the last decade is that there have been a tremendous number of claims about the more immediate things you can do with a quantum computer, like solve all these machine learning problems," Scott Aaronson, a quantum computing expert at the University of Texas, told me last year. "But these claims are about 90 percent bullshit." To fulfill that promise, "you're going to need some revolutionary development."

In an increasingly digital world, further progress will depend on our ability to get ever more out of the computers we create. And that will depend on the work of researchers like Chow and his colleagues, toiling away in windowless labs to achieve a revolutionary new development around some of the hardest problems in computer engineering, and, along the way, trying to build the future.

A version of this story was initially published in the Future Perfect newsletter.

Original post:
Inside how IBMs engineers are designing quantum computers - Vox.com

Quantum computing just might save the planet – McKinsey

The emerging technology of quantum computing could revolutionize the fight against climate change, transforming the economics of decarbonization and becoming a major factor in limiting global warming to the target temperature of 1.5°C (see sidebar, "What is quantum computing?").

Even though the technology is in the early stages of development (experts estimate the first generation of fault-tolerant quantum computing will arrive in the second half of this decade), breakthroughs are accelerating, investment dollars are pouring in, and start-ups are proliferating. Major tech companies have already developed small, so-called noisy intermediate-scale quantum (NISQ) machines, though these aren't capable of performing the type of calculations that fully capable quantum computers are expected to perform.

Countries and corporates set ambitious new targets for reducing emissions at the 2021 United Nations Climate Change Conference (COP26). Those goals, if fully met, would represent an extraordinary annual investment of $4 trillion by 2030, the largest reallocation of capital in human history. But the measures would only reduce warming to between 1.7°C and 1.8°C by 2050, far short of the 1.5°C level believed necessary to avoid catastrophic, runaway climate change.

Meeting the net-zero emissions goal that countries and some industries have committed to won't be possible without huge advances in climate technology that aren't achievable today. Even the most powerful supercomputers available now cannot solve some of these problems. Quantum computing could be a game changer in those areas. In all, we think quantum computing could help develop climate technologies able to abate carbon on the order of 7 gigatons a year of additional CO2 impact by 2035, with the potential to bring the world in line with the 1.5°C target.

Quantum computing could help reduce emissions in some of the most challenging or emissions-intensive areas, such as agriculture or direct-air capture, and could accelerate improvements in technologies required at great scale, such as solar panels or batteries. This article offers a look at some of the breakthroughs the technology could permit and attempts to quantify the impact of quantum-computing technologies that are expected to become available this decade.

Quantum computing could bring about step changes throughout the economy that would have a huge impact on carbon abatement and carbon removal: helping to curb the methane produced by agriculture, making the production of cement emissions-free, improving electric batteries for vehicles, developing significantly better renewable solar technology, finding a faster way to bring down the cost of hydrogen so that it becomes a viable alternative to fossil fuels, and enabling the use of green ammonia as a fuel and a fertilizer.

Addressing the five areas designated in the Climate Math Report as key for decarbonization, we have identified quantum-computing use cases that can pave the way to a net-zero economy. We project that by 2035 the use cases listed below could make it possible to eliminate more than 7 gigatons of CO2 equivalent (CO2e) from the atmosphere a year, compared with the current trajectory, or in aggregate more than 150 gigatons over the next 30 years (Exhibit 1).

Exhibit 1
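
The aggregate figure can be roughly reconstructed from the annual one. The sketch below assumes, purely for illustration, that abatement ramps linearly from zero in 2025 to the stated 7 gigatons a year in 2035 and then holds flat; McKinsey's actual model is surely more detailed.

```python
# Rough reconstruction of the ">150 gigatons over 30 years" aggregate.
# The linear ramp-then-flat profile is my assumption, not McKinsey's.
def abatement(year):
    if year < 2035:
        return 7.0 * (year - 2025) / 10   # linear ramp 2025 -> 2035
    return 7.0                            # flat at 7 GtCO2e/year after

total = sum(abatement(y) for y in range(2025, 2055))   # 30 years
print(f"~{total:.0f} GtCO2e cumulative")   # ~172, consistent with >150
```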

Batteries are a critical element of achieving zero-carbon electrification. They are required to reduce CO2 emissions from transportation and to obtain grid-scale energy storage for intermittent energy sources such as solar cells or wind.

Improving the energy density of lithium-ion (Li-ion) batteries enables applications in electric vehicles and energy storage at an affordable cost. Over the past ten years, however, innovation has stalled: battery energy density improved 50 percent between 2011 and 2016, but only 25 percent between 2016 and 2020, and it is expected to improve by just 17 percent between 2020 and 2025.

Recent research has shown that quantum computing will be able to simulate the chemistry of batteries in ways that can't be achieved now. Quantum computing could allow breakthroughs by providing a better understanding of electrolyte complex formation, by helping to find a replacement material for the cathode or anode with the same properties, and/or by eliminating the battery separator.

As a result, we could create batteries with 50 percent higher energy density for use in heavy-goods electric vehicles, which could substantially bring forward their economic use. The carbon benefits to passenger EVs wouldn't be huge, as these vehicles are expected to reach cost parity in many countries before the first generation of quantum computers is online, but consumers might still enjoy cost savings.

In addition, higher-density batteries can serve as a grid-scale storage solution. The impact on the world's grids could be transformative. Halving the cost of grid-scale storage could enable a step change in the use of solar power, which is becoming economically competitive but is challenged by its generation profile. Our modeling suggests that halving the cost of solar panels could increase their use by 25 percent in Europe by 2050, but halving the cost of both solar and batteries might increase solar use by 60 percent (Exhibit 2). Geographies without such a high carbon price will see even greater impacts.

Exhibit 2

Through the combination of use cases described above, improved batteries could bring about an additional reduction in carbon dioxide emissions of 1.4 gigatons by 2035.

Many parts of industry produce emissions that are either extremely expensive or logistically challenging to abate.

Cement is a case in point. During calcination, the kiln process that turns raw materials into clinker, the powder used to make cement, CO2 is released from the raw materials themselves. This process accounts for approximately two-thirds of cement emissions.

Alternative cement-binding materials (or clinkers) can eliminate these emissions, but there's currently no mature alternative clinker that can significantly reduce emissions at an affordable cost.

There are many possible permutations for such a product, but testing by trial and error is time-consuming and costly. Quantum computing can help to simulate theoretical material combinations to find one that overcomes today's challenges: durability, availability of raw materials, and efflorescence (in the case of alkali-activated binders). This would have an estimated additional impact of 1 gigaton a year by 2035.

Solar cells will be one of the key electricity-generation sources in a net-zero economy. But even though they are getting cheaper, they are still far from their theoretical maximum efficiency.

Today's solar cells rely on crystalline silicon and have an efficiency on the order of 20 percent. Solar cells based on perovskite crystal structures, which have a theoretical efficiency of up to 40 percent, could be a better alternative. They present challenges, however: they lack long-term stability, they could, in some varieties, be more toxic, and the technology has not yet been mass produced.

Quantum computing could help tackle these challenges by allowing for precise simulation of perovskite structures in all combinations of base atoms and doping, thereby identifying higher-efficiency, higher-durability, and nontoxic solutions. If the theoretical efficiency increase can be reached, the levelized cost of electricity (LCOE) would decrease by 50 percent.
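
The link between efficiency and LCOE follows from a simple proportionality: if the cost of building and operating a plant is fixed per unit of panel area, the levelized cost of electricity falls as the energy yield per area rises. The sketch below uses made-up cost and irradiance numbers to show only the ratio; it deliberately ignores balance-of-system and financing effects and is not McKinsey's model.

```python
# Why doubling cell efficiency can roughly halve LCOE:
# energy yield scales with efficiency, so LCOE = cost / energy falls
# in inverse proportion. All input numbers are illustrative.
def lcoe(cost_per_m2, efficiency, insolation_kwh_per_m2):
    return cost_per_m2 / (efficiency * insolation_kwh_per_m2)

silicon = lcoe(100, 0.20, 1500)      # ~20% efficient crystalline silicon
perovskite = lcoe(100, 0.40, 1500)   # ~40% theoretical perovskite
print(f"relative LCOE: {perovskite / silicon:.2f}")   # 0.50 -> a 50% drop
```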

By simulating the impact of cheaper and more efficient quantum-enabled solar panels, we see a significant increase in use in areas with lower carbon prices (China, for example). This is also true of countries in Europe with high irradiance (Spain, Greece) or poor conditions for wind energy (Hungary). The impact is magnified when combined with cheap battery storage, as discussed above.

This technology could abate an additional 0.4 gigatons of CO2 emissions by 2035.

Hydrogen is widely considered to be a viable replacement for fossil fuels in many parts of the economy, especially in industry where high temperatures are needed and electrification isn't possible or sufficient, or where hydrogen is needed as a feedstock, as in steelmaking or ethylene production.

Before the 2022 gas price spikes, green hydrogen was about 60 percent more expensive than natural gas. But improving electrolysis could significantly decrease the cost of hydrogen.

Polymer electrolyte membrane (PEM) electrolyzers, which split water, are one way to make green hydrogen. They have improved in recent times but still face two major challenges.

Quantum computing can help model the energy state of pulse electrolysis to optimize catalyst usage, which would increase efficiency. It could also model the chemical composition of catalysts and membranes to ensure the most efficient interactions. Together, these improvements could push the efficiency of the electrolysis process up to 100 percent and reduce the cost of hydrogen by 35 percent. If combined with the cheaper solar cells discovered by quantum computing (discussed above), the cost of hydrogen could be reduced by 60 percent (Exhibit 3).

Exhibit 3

Increased hydrogen use as a result of these improvements could reduce CO2 emissions by an additional 1.1 gigatons by 2035.

Ammonia is best known as a fertilizer, but it could also be used as a fuel, potentially making it one of the best decarbonization solutions for the world's ships. Today, it represents 2 percent of total global final energy consumption.

For the moment, ammonia is made through the energy-intensive Haber-Bosch process, using natural gas. There are several options for creating green ammonia, but they rely on similar processes: for example, green hydrogen can be used as a feedstock, or the carbon dioxide emissions the process causes can be captured and stored.

However, there are other potential approaches, such as nitrogenase bioelectrocatalysis, which mimics how nitrogen fixation works naturally: plants take nitrogen gas directly from the air, and nitrogenase enzymes catalyze its conversion into ammonia. This method is attractive because it can be done at room temperature and at 1 bar of pressure, compared with 500°C at high pressure for Haber-Bosch, which consumes large amounts of energy in the form of natural gas (Exhibit 4).

Exhibit 4

Innovation has reached a stage where it might be possible to replicate nitrogen fixation artificially, but only if we can overcome challenges such as enzyme stability, oxygen sensitivity, and low rates of ammonia production by nitrogenase. The concept works in the lab but not at scale.

Quantum computing can help simulate ways of enhancing the stability of the enzyme, protecting it from oxygen, and improving the rate of ammonia production by nitrogenase. That would result in a 67 percent cost reduction compared with today's green ammonia produced through electrolysis, making green ammonia even cheaper than traditionally produced ammonia. Such a cost reduction could not only lessen the CO2 impact of producing ammonia for agricultural use but could also bring forward the breakeven point for ammonia in shipping, where it is expected to be a major decarbonization option, by ten years.

Using quantum computing to facilitate cheaper green ammonia as a shipping fuel could abate an additional 0.4 gigatons of CO2 by 2035.

Carbon capture is required to achieve net zero. Both types of carbon capture, point source and direct air, could be aided by quantum computing.

Point-source carbon capture allows CO2 to be captured directly from industrial sources such as a cement kiln or steel blast furnace. But the vast majority of CO2 capture is too expensive to be viable for now, mainly because it is energy intensive.

One possible solution is novel solvents, such as water-lean and multiphase solvents, which could lower the energy requirements; however, it is difficult to predict the properties of a candidate material at the molecular level.

Quantum computing promises to enable more accurate modeling of molecular structure to design new, effective solvents for a range of CO2 sources, which could reduce the cost of the process by 30 to 50 percent.

We believe this has significant potential to decarbonize industrial processes, including cement, and could deliver additional decarbonization of up to 1.5 gigatons a year. Even if the alternative-clinker approach described above succeeds, point-source capture would still contribute 0.5 gigatons a year for cement because of fuel emissions; in addition, alternative clinkers may not be available in some regions.

Direct-air capture, which involves sucking CO2 from the air, is a way to address carbon removals. While the Intergovernmental Panel on Climate Change says this approach is required to achieve net zero, it is very expensive (ranging from $250 to $600 per ton today) and even more energy intensive than point-source capture.

Adsorbents are best suited for effective direct-air capture, and novel approaches, such as metal-organic frameworks, or MOFs, have the potential to greatly reduce the energy requirements and the capital cost of the infrastructure. MOFs act like a giant sponge (as little as a gram can have a surface area larger than a football field) and can absorb and release CO2 with far smaller temperature changes than conventional technology.

Quantum computing can help advance research on novel adsorbents such as MOFs and resolve challenges arising from sensitivity to oxidation, water, and degradation caused by CO2.

Novel adsorbents with a higher adsorption rate could reduce the cost of the technology to $100 per ton of CO2e captured. This could be a critical threshold for uptake, given that corporate climate leaders such as Microsoft have publicly announced an expectation to pay $100 a ton long term for the highest-quality carbon removals. This would lead to an additional CO2 reduction of 0.7 gigatons a year by 2035.

Twenty percent of annual greenhouse-gas emissions come from agriculture, and methane emitted by cattle and dairy farming is the primary contributor (7.9 gigatons of CO2e, based on 20-year global-warming potential).

Research has established that low-methane feed additives could effectively stop up to 90 percent of methane emissions. Yet applying those additives to free-range livestock is particularly difficult.

An alternative solution is an antimethane vaccine that produces methanogen-targeting antibodies. This method has had some success in lab conditions, but in a cow's gut, churning with gastric juices and food, the antibodies struggle to latch on to the right microbes. Quantum computing could accelerate the search for the right antibodies through precise molecule simulation instead of a costly and long trial-and-error process. With uptake estimated according to data from the US Environmental Protection Agency, we arrive at a carbon reduction of up to an additional 1 gigaton a year by 2035.

Another prominent use case in agriculture is green ammonia, discussed as a fuel above; today's Haber-Bosch process uses large amounts of natural gas. Using an alternative process could have an additional impact of up to 0.25 gigatons a year by 2035 by replacing conventionally produced fertilizers.

There are many more ways that quantum computing could be applied to the fight against climate change. Future possibilities include the identification of new thermal-storage materials, high-temperature superconductors as a future basis for lower losses in grids, and simulations to support nuclear fusion. Use cases aren't limited to climate mitigation; they can also apply to adaptation, for example, improving weather prediction to give greater warning of major climatic events. But progress on those innovations will have to wait, because first-generation machines will not be powerful enough for such breakthroughs (see sidebar, "Methodology").

The leap in CO2 abatement could be a major opportunity for corporates. With $3 trillion to $5 trillion in value at stake in sustainability, according to McKinsey research, climate investment is an imperative for big companies. The use cases presented above represent major shifts and potential disruptions in these areas, and they are associated with huge value for the players who take the lead. This opportunity is recognized by industry leaders who are already developing capabilities and talent.

Nevertheless, quantum technology is at an early stage and comes with the risks linked to leading-edge technology development, as well as tremendous cost. We have highlighted the state of the industry in the Quantum Technology Monitor. The risk to investors can be mitigated somewhat through steps such as onboarding technical experts to run in-depth diligence, forming joint investments with public entities or consortia, and investing in companies that bundle various ventures under one roof and provide the necessary experience to set up and scale those ventures.

In addition, governments have an important role to play by creating programs at universities to develop quantum talent and by providing incentives for quantum innovation for climate, particularly for use cases that today lack natural corporate partners, such as disaster prediction, or that aren't yet economical, such as direct-air capture. Governments could start more research programs like the partnership between IBM and the United Kingdom, the collaboration between IBM and Fraunhofer-Gesellschaft, the public-private partnership Quantum Delta in the Netherlands, and the collaboration between the United States and the United Kingdom. By tapping into quantum computing for sustainability, countries can accelerate the green transition, achieve national commitments, and get a head start in export markets. But even with those measures, the risk and expense remain high (Exhibit 5).

Exhibit 5

Here are some questions corporates and investors need to ask before taking a leap into quantum computing.

Is quantum computing relevant for you?

Determine whether there are use cases that could disrupt your industry or your investments and address your organization's decarbonization challenges. This article has highlighted illustrative use cases across several categories to showcase the potential impact of quantum computing, but we've identified more than 100 sustainability-relevant use cases where quantum computing could play a major role. Quickly identifying the use cases that apply to you and deciding how to address them can be highly valuable, as talent and capacity will be scarce this decade.

How do I approach quantum computing now, if it is relevant?

Once you have decided to engage on quantum computing, building the right kind of approach, mitigating risk, and securing access to talent and capacity are key.

Because of the high cost of this research, corporates can maximize their impact by forming partnerships with other players in their value chains and pooling expense and talent. For example, major consumers of hydrogen might join up with electrolyzer manufacturers to bring down the cost and share the value. These arrangements will require companies to figure out how to share innovation without losing competitive advantage; collaborations such as joint ventures or precompetitive R&D could be an answer. We also foresee investors willing to support such endeavors, potentially removing some of the risk for corporates. And there are large amounts of dedicated climate finance available, judging by pledges made at COP26 that aim to reach the target of $100 billion a year in spending.

Do I have to start now?

While the first fault-tolerant quantum computer is several years away, it is important to start development work now. There is significant prework to be done to get a maximal return on the significant investment that applying quantum computing will require.

Determining the exact parameters of a given problem and finding the best possible application will require collaboration between application experts and quantum-computing technicians well versed in algorithm development. We estimate that algorithm development alone could take up to 18 months, depending on the complexity.

It will also take time to set up the value chain, production, and go-to-market to ensure they are ready when quantum computing can be deployed and to fully benefit from the value created.

Quantum computing is a revolutionary technology that could allow for precise molecular-level simulation and a deeper understanding of nature's basic laws. As this article shows, its development over the next few years could help solve scientific problems that until recently were believed to be insoluble. Clearing away these roadblocks could make the difference between a sustainable future and climate catastrophe.

Making quantum computing a reality will require an exceptional mobilization of resources, expertise, and funds. Only close cooperation between governments, scientists, academics, and investors in developing this technology can make it possible to reach the emissions target that will keep global warming at 1.5°C and save the planet.

More here:
Quantum computing just might save the planet - McKinsey

Special Operations Command trying to prepare for quantum computing threat – FedScoop

Written by Jon Harper May 19, 2022 | FEDSCOOP

U.S. Special Operations Command is worried about the future threat from adversaries' quantum technologies, and officials are trying to get out ahead of the problem.

Improving intelligence fusion through real-time data integration is a key pillar of SOCOM's plans for digital transformation. That data must not only be gathered, fused, and transferred to the appropriate end users; it also has to be secured, a challenge that will grow with the development of quantum computing capabilities.

"How do we get after the way those bits and bytes interact with each other and create the intelligence that we need, while at the same time protecting that data, you know, ensuring that the data is trustworthy?" Thomas Kenney, chief data officer at Special Operations Command, said Thursday at the SOFIC conference.

"Here's a really interesting aspect of this that we're looking at today, because we know in a few years this is going to become really important. By some accounts, we're less than eight years away from quantum cryptography being able to break the non-quantum cryptography that we have today. We need an answer for that," he said.

When the technology is ready for prime time, officials say it could be a game changer.

"Data may very easily be decrypted by a capability that has a quantum decrypt capability," Kenney warned.

"The time is now to be thinking about that problem before adversaries have already acquired that capability," he added.

Technology developers are putting a lot of effort into quantum computing, he noted, highlighting the implications of quantum processing.

"One of the really interesting tenets of quantum computing is that you can compute multiple outcomes simultaneously. And when you think about the speed of battle and where we're going to, that ability will be absolutely essential," Kenney said.

"Quantum computing is being played with right now. And as we look at where we're going for quantum cryptography, we need about a factor of 1,000 qubits to be able to get to that next level," he said.

A qubit is a computing unit that leverages the principle of superposition, the ability of quantum systems to exist in two or more states simultaneously, to encode information, the Congressional Research Service explained in a recent report on the technology.

"Whereas a classical computer encodes information in bits that can represent binary states of either 0 or 1, a quantum computer encodes information in qubits, each of which can represent 0, 1, or a combination of both at the same time. As a result, the power of a quantum computer increases exponentially with the addition of each qubit," according to CRS.

"Being able to have multiple outcomes calculated at the same time on a battlefield that's happening extremely fast is going to be mission essential to us. Are the technologies there today? Maybe not. But they certainly need to be there in the future, so it's something that we're taking a look at," Kenney said.

Earlier this month, President Biden signed two new policy directives aimed at advancing U.S. quantum technologies and the ability to defend U.S. infrastructure against the threat posed by quantum computers.

See more here:
Special Operations Command trying to prepare for quantum computing threat - FedScoop