Does the Butterfly Effect Exist? Maybe, But Not in the Quantum Realm – Discover Magazine

In "A Sound of Thunder," the short story by Ray Bradbury, the main character travels back in time to hunt dinosaurs. He crushes a butterfly underfoot in the prehistoric jungle, and when he returns to the present, the world he knows is changed: the feel of the air, a sign in an office, the election of a U.S. president. The butterfly was "a small thing that could upset balances and knock down a line of small dominoes and then big dominoes and then gigantic dominoes, all down the years across Time."

This butterfly effect that Bradbury illustrated, in which a small change in the past produces enormous effects in the future, is not reserved for fiction. As the famed mathematician and meteorologist Edward Lorenz discovered by accident, natural systems do exist in which tiny shifts in initial conditions can lead to wildly different outcomes. These systems, including the weather and even the way fluids mix, are known as chaotic. Chaotic systems are normally understood within the realm of classical physics, the framework we use to predict how objects will move to a certain degree of accuracy (think motion, force or momentum from your high school science class).

But a new study shows that the effect doesn't work in the quantum realm. Two researchers at Los Alamos National Laboratory in New Mexico created a simulation in which a qubit, a quantum bit, moved backward and forward in time on a quantum computer. Despite being damaged, the qubit held on to its original information instead of becoming unrecognizable like the time traveler's world after he killed the butterfly. In the study, the process used to simulate time travel forward and backward is known as evolution.

"From the point of view of classical physics, it's very unexpected because classical physics predicts that complex evolution has a butterfly effect, so that small changes deep in the past lead to huge changes in our world," says Nikolai Sinitsyn, a theoretical physicist and one of the researchers who conducted the study.

The finding furthers our understanding of quantum systems, and also has potential applications in securing information systems and even determining the quantum-ness of a quantum processor.

The rules of the quantum realm, which describe how subatomic particles move, can be truly mind-boggling because they defy traditional logic. But briefly: particles as small as electrons and protons don't just exist at one point in space; they can occupy many at a time. The mathematical framework of quantum mechanics tries to explain the motion of these particles.

The laws of quantum mechanics can also be applied to quantum computers. These are very different from the computers we use today, and can solve certain problems exponentially faster than normal computers can because they adhere to these completely different laws of physics. A standard computer uses bits with a value of either 0 or 1. A quantum computer uses qubits, which can occupy a combined state of 0 and 1 at once, a unique characteristic of quantum systems (an electron, for example) known as superposition.
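For readers who like to see the arithmetic, a superposition can be sketched as a two-component vector of complex amplitudes; squaring the magnitudes gives the probability of reading out 0 or 1. This toy snippet (plain NumPy, not tied to the study) illustrates it:

```python
import numpy as np

# A qubit is described by two complex amplitudes, one for |0> and one for |1>.
# An equal superposition has amplitude 1/sqrt(2) on each basis state.
qubit = np.array([1.0, 1.0]) / np.sqrt(2)

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(qubit) ** 2
print(probs)          # equal odds of reading 0 or 1
```

Reading out the qubit collapses this combined state to a plain 0 or 1, which is why measurement itself counts as a disturbance.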

In a quantum system, small changes to qubits (even looking at or measuring them) can have immense effects. So in the new study, the researchers wanted to see what would happen when they simulated sending a qubit back in time while also damaging it. Researchers constructing quantum experiments often use the stand-ins Alice and Bob to illustrate their theoretical process. In this case, they let Alice bring her qubit back in time, scrambling the information as part of what they call reverse evolution. Once in the past, Bob, an intruder, measures Alice's qubit, changing it. Alice then brings her qubit forward in time.

If the butterfly effect had held, the original information in Alice's qubit would have been exponentially changed. But instead, the evolution forward in time allowed Alice to recover the original information, even though Bob's intrusion had destroyed all the connections between her qubit and the others that travelled with hers.
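A toy version of this back-and-forth protocol can be simulated classically for a handful of qubits. The sketch below is an illustration of the general idea, not the authors' exact procedure: a Haar-random unitary stands in for the scrambling "evolution," Bob's intrusion is a projective measurement on one qubit in the "past," and the fidelity of Alice's recovered qubit stays well above the 0.5 expected of a completely randomized qubit.

```python
import numpy as np

rng = np.random.default_rng(7)

def haar_unitary(dim):
    """Haar-random unitary via QR decomposition of a complex Gaussian matrix."""
    z = (rng.standard_normal((dim, dim)) + 1j * rng.standard_normal((dim, dim))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    d = np.diag(r)
    return q * (d / np.abs(d))

def z_on(k, n):
    """Pauli Z on qubit k of an n-qubit register."""
    z = np.array([[1.0, 0.0], [0.0, -1.0]])
    op = np.array([[1.0]])
    for i in range(n):
        op = np.kron(op, z if i == k else np.eye(2))
    return op

n = 7
dim = 2 ** n
U = haar_unitary(dim)                        # stands in for the scrambling "evolution"

# Alice's qubit (qubit 0) carries the information; the rest start in |0...0>.
alice = np.array([1.0, 1.0]) / np.sqrt(2)
rest = np.zeros(dim // 2); rest[0] = 1.0
psi = np.kron(alice, rest)

psi = U.conj().T @ psi                       # evolve "backward in time"
P = (np.eye(dim) + z_on(n - 1, n)) / 2       # Bob measures the last qubit;
psi = P @ psi                                # keep the +1 outcome for simplicity
psi /= np.linalg.norm(psi)
psi = U @ psi                                # evolve "forward in time"

# Reduced state of Alice's qubit, and its overlap with her original state.
M = psi.reshape(2, dim // 2)
rho_A = M @ M.conj().T
fidelity = float(np.real(alice.conj() @ rho_A @ alice))
print(f"recovered fidelity: {fidelity:.3f}")
```

With a strongly scrambling U the printed fidelity typically lands around 0.75 for a handful of qubits, noticeably above the 0.5 a fully randomized qubit would give, which is the classical intuition the study overturns.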

"So normally, many people believe that if you go back in time, and scramble the information, that information is lost forever," says Jordan Kyriakidis, an expert in quantum computing and former physicist at Dalhousie University in Nova Scotia. "What they have shown in this paper is that for quantum systems, under certain circumstances, if you go back in time, you can recover the original information even though someone tried to scramble it on you."

So does this mean that the butterfly effect doesn't exist at all? No. Sinitsyn and his coauthor, Bin Yan, showed that it doesn't exist within the quantum realm, specifically.

But this does have implications for real-world problems. One is information encryption. Encryption has two important principles: information should be hidden so well that no one can get to it, but the intended recipient should be able to reliably decipher it. For example, explains Kyriakidis, if a hacker attempts to crack a code that hides information in today's world, the hacker might not be able to get to it, but they could damage it irreparably, preventing anyone from reading the original message. This study may point to a way to avoid this by protecting information even after it's damaged, so the intended recipient can still interpret it.

And because this effect (or non-effect) is so particular to quantum systems, it could theoretically be used to test the integrity of a quantum computer. If one were to replicate Yan and Sinitsyn's protocol on a quantum computer, according to the study, it would confirm that the system was truly operating by quantum principles. Because quantum computers are highly prone to errors, a tool to easily test how well they work has huge value. A reliable quantum computer could solve incredibly complex problems, with applications from chemistry and medicine to traffic direction and financial strategy.

Quantum computing is only in its infancy, but if Yan and Sinitsyn's quantum time machine can exist in a realm usually reserved for subatomic particles, well, the possibilities could be endless.

Quantum Information Processing Market Forecast 2020-2026| Post Impact of Worldwide COVID-19 Spread Analysis- 1QB Information Technologies, Airbus,…

The latest market research study launched by Reports and Markets, the Global Quantum Information Processing Market Report 2019, provides a detailed analysis of current market conditions, business plans, investment analysis, size, share, industry growth drivers, COVID-19 impact analysis, and the global as well as regional outlook.

This report will help you take informed decisions, understand opportunities, plan effective business strategies, plan new projects, analyse drivers and restraints, and give you a vision of the industry forecast. Further, the Quantum Information Processing market report also covers the marketing strategies followed by top Quantum Information Processing players, distributor analysis, Quantum Information Processing marketing channels, potential buyers and Quantum Information Processing development history.

Get an exclusive sample report on the Quantum Information Processing Market at https://www.reportsandmarkets.com/sample-request/global-quantum-information-processing-market-report-2019?utm_source=thedailychronicle&utm_medium=38

Along with Quantum Information Processing Market research analysis, the buyer also gets valuable information about global Quantum Information Processing production and its market share, revenue, price and gross margin, supply, consumption, export, and import volumes and values for the following regions: North America, Europe, China, Japan, Middle East & Africa, India, South America, Others

In the Quantum Information Processing Market research report, the following points are enclosed along with an in-depth study of each: market opportunities, market risk and market overview. Production of Quantum Information Processing is analyzed with respect to various regions, types and applications. The sales, revenue, and price analysis by types and applications of Quantum Information Processing market key players is also covered.

Quantum Information Processing Market covers the following major key players: 1QB Information Technologies, Airbus, Anyon Systems, Cambridge Quantum Computing, D-Wave Systems, Google, Microsoft, IBM, Intel, QC Ware, Quantum, Rigetti Computing, Strangeworks, Zapata Computing.

COVID-19 can affect the global economy in 3 main ways: by directly affecting production and demand, by creating supply chain and market disturbance, and by its financial impact on firms and financial markets.

The objectives of the report are:

To analyze and forecast the market size of Quantum Information Processing Industry in the global market.

To study the global key players, SWOT analysis, value and global market share for leading players.

To determine, explain and forecast the market by type, end use, and region.

To analyze the market potential and advantage, opportunity and challenge, restraints and risks of global key regions.

To find out significant trends and factors driving or restraining the market growth.

To analyze the opportunities in the market for stakeholders by identifying the high growth segments.

To critically analyze each submarket in terms of individual growth trend and their contribution to the market.

To understand competitive developments such as agreements, expansions, new product launches, and acquisitions in the market.

To strategically outline the key players and comprehensively analyze their growth strategies.

Major Points from Table of Contents

1 Market Overview

2 Global and Regional Market by Company

3 Global and Regional Market by Type

4 Global and Regional Market by Application

5 Regional Trade

6 Key Manufacturers

7 Industry Upstream

Continue.

List of Tables and Figures

Inquire more about this report at https://www.reportsandmarkets.com/enquiry/global-quantum-information-processing-market-report-2019?utm_source=thedailychronicle&utm_medium=38

If you have any special requirements for this Quantum Information Processing Market report, please let us know and we can provide a custom report.

About Us

Market research is the new buzzword in the market, which helps in understanding the market potential of any product in the market. This helps in understanding the market players and the growth forecast of the products and so the company. This is where market research companies come into the picture. Reports And Markets is not just another company in this domain but is a part of a veteran group called Algoro Research Consultants Pvt. Ltd. It offers premium progressive statistical surveying, market research reports, analysis & forecast data for a wide range of sectors both for the government and private agencies all across the world.

Contact Us

Sanjay Jain

Manager, Partner Relations & International Marketing

http://www.reportsandmarkets.com

Ph: +1-352-353-0818 (US)

Is Machine Learning The Quantum Physics Of Computer Science ? – Forbes

Preamble: Intermittently, I will be publishing columns that explore some seemingly outlandish concepts. The purpose is a bit of humor, but also to provoke some thought. Enjoy.

"God does not play dice with the universe," Albert Einstein is reported to have said about the field of quantum physics. He was referring to the great divide at the time in the physics community between general relativity and quantum physics. General relativity was a theory which beautifully explained a great deal of physical phenomena in a deterministic fashion. Meanwhile, quantum physics grew out of a model which fundamentally had a probabilistic view of the world. Since Einstein made that statement, quantum physics has proven to be quite a durable theory, and in fact, it is used in a variety of applications such as semiconductors.

One might imagine a past leader in computer science such as Donald Knuth exclaiming, "Algorithms should be deterministic." That is, given any input, the output should be exact and known. Indeed, since its formation, the field of computer science has focused on building elegant deterministic algorithms which have a clear view of the transformation between inputs and outputs. Even in regimes of non-determinism such as parallel processing, the objective of the overall algorithm is to be deterministic: despite the fact that operations can run out of order, the outputs are still exact and known. Computer scientists work very hard to make that a reality.
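The out-of-order-but-still-exact property is easy to state concretely: an integer reduction yields the same answer no matter what order a parallel scheduler visits the elements in. A minimal illustration:

```python
import random

data = list(range(100))

# Three different "execution orders" of the same reduction.
orders = [random.sample(data, len(data)) for _ in range(3)]
totals = [sum(order) for order in orders]

# Deterministic output despite non-deterministic ordering.
print(totals[0], all(t == sum(data) for t in totals))
```

This holds because integer addition is associative and commutative; the hard engineering work in parallel systems is preserving exactly this guarantee for operations that are not so forgiving.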

As computer scientists have engaged with the real world, they frequently face very noisy inputs such as sensors or, even worse, human beings. Computer algorithms continue to focus on faithfully and precisely translating input noise to output noise. This has given rise to the Junk In, Junk Out (JIJO) paradigm. One of the key motivations for pursuing such a structure has been the notion of causality and diagnosability. After all, if the algorithms are noisy, how is one to know the issue is not a bug in the algorithm? Good point.

With machine learning, computer science has transitioned to a model where one trains a machine to build an algorithm, and this machine can then be used to transform inputs to outputs. Since the process of training is dynamic and often ongoing, the data and the algorithm are intertwined in a manner which is not easily unwound. Similar to quantum physics, there is a class of applications where this model seems to work. Recognizing patterns seems to be a good application. This is a key building block for autonomous vehicles, but the results are probabilistic in nature.

In quantum physics, there is an implicit understanding that the answers are often probabilistic. Perhaps this is the key insight which can allow us to leverage the power of machine learning techniques and avoid the pitfalls. That is, if the requirements of the algorithm must be exact, perhaps machine learning methods are not appropriate. As an example, if your bank statement was correct with somewhat high probability, this may not be comforting. However, if machine learning algorithms can provide, with high probability, the instances of potential fraud, the job of a forensic CPA is made quite a bit more productive. Similar analogies exist in the area of autonomous vehicles.
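The bank-statement-versus-fraud-triage contrast can be made concrete. The scores and transaction IDs below are purely hypothetical; the point is that a probabilistic output is useful when it only needs to prioritize, not to be exact:

```python
# Hypothetical fraud probabilities produced by some trained classifier.
scores = {"txn-001": 0.02, "txn-002": 0.91, "txn-003": 0.40, "txn-004": 0.87}

# The forensic CPA only needs the high-probability cases, not exact answers.
flagged = sorted(t for t, p in scores.items() if p >= 0.8)
print(flagged)  # ['txn-002', 'txn-004']
```

A wrong probability here wastes an investigator's hour; a wrong digit on a bank statement is a different class of failure, which is exactly the distinction the column draws.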

Overall, machine learning seems to define the notion of probabilistic algorithms in computer science in a similar manner as quantum physics. The critical challenge for computing is to find the correct mechanisms to design and validate probabilistic results.

Flux-induced topological superconductivity in full-shell nanowires – Science Magazine

INTRODUCTION

Majorana zero modes (MZMs) localized at the ends of one-dimensional topological superconductors are promising candidates for fault-tolerant quantum computing. One approach among the proposals to realize MZMs, based on semiconducting nanowires with strong spin-orbit coupling subject to a Zeeman field and the superconducting proximity effect, has received considerable attention, yielding increasingly compelling experimental results over the past few years. An alternative route to MZMs aims to create vortices in topological superconductors, for instance, by coupling a vortex in a conventional superconductor to a topological insulator.

We introduce a conceptually distinct approach to generating MZMs by threading magnetic flux through a superconducting shell fully surrounding a spin-orbit-coupled semiconducting nanowire core; this approach contains elements of both the proximitized-wire and vortex schemes. We show experimentally and theoretically that the winding of the superconducting phase around the shell induced by the applied flux gives rise to MZMs at the ends of the wire. The topological phase sets in at relatively low magnetic fields, is controlled by moving from zero to one phase twist around the superconducting shell, and does not require a large g factor in the semiconductor, which broadens the landscape of candidate materials.

In the destructive Little-Parks regime, the modulation of critical temperature with flux applied along the hybrid nanowire results in a sequence of lobes with reentrant superconductivity. Each lobe is associated with a quantized number of twists of the superconducting phase in the shell, determined by the external field. The result is a series of topologically locked boundary conditions for the proximity effect in the semiconducting core, with a dramatic effect on the subgap density of states.

Tunneling into the core in the zeroth superconducting lobe, around zero flux, we measure a hard proximity-induced gap with no subgap features. In the superconducting regions around one quantum of applied flux, Φ0 = h/2e, corresponding to phase twists of 2π in the shell, tunneling spectra into the core show stable zero-bias peaks, indicating a discrete subgap state fixed at zero energy.

Theoretically, we find that a Rashba field arising from the breaking of local radial inversion symmetry at the semiconductor-superconductor interface, along with 2π phase twists in the boundary condition, can induce a topological state supporting MZMs. We calculate the topological phase diagram of the system as a function of Rashba spin-orbit coupling, radius of the semiconducting core, and band bending at the superconductor-semiconductor interface. Our analysis shows that topological superconductivity extends over a reasonably large portion of the parameter space. Transport simulations of the tunneling conductance in the presence of MZMs qualitatively reproduce the experimental data in the entire voltage-bias range.

We obtain further experimental evidence that the zero-energy states are delocalized at wire ends by investigating Coulomb blockade conductance peaks in full-shell wire islands of various lengths. In the zeroth lobe, Coulomb blockade peaks show 2e spacing; in the first lobe, peak spacings are roughly 1e-periodic, with slight even-odd alternation that vanishes exponentially with island length, consistent with overlapping Majorana modes at the two ends of the Coulomb island. The exponential dependence on length, as well as incompatibility with a power-law dependence, provides compelling evidence that MZMs reside at the ends of the hybrid islands.
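The distinction the authors lean on, exponential versus power-law dependence on length, is easy to see numerically. With purely illustrative numbers (not the paper's data), an exponential decay has a constant slope on semilog axes but a drifting slope on log-log axes:

```python
import numpy as np

lengths = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # hypothetical island lengths
splitting = 0.8 * np.exp(-lengths / 1.5)        # hypothetical even-odd splitting

# Slope of log(splitting) vs length: constant for an exponential decay.
semilog_slopes = np.diff(np.log(splitting)) / np.diff(lengths)
# Slope of log(splitting) vs log(length): constant only for a power law.
loglog_slopes = np.diff(np.log(splitting)) / np.diff(np.log(lengths))

print(np.ptp(semilog_slopes) < 1e-9, np.ptp(loglog_slopes) > 0.5)
```

A measured splitting that is flat in semilog coordinates but curved in log-log coordinates is therefore evidence for exponential, Majorana-like length dependence rather than a power law.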

While being of similar simplicity and practical feasibility as the original nanowire proposals with a partial shell coverage, full-shell nanowires provide several key advantages. The modest magnetic field requirements, protection of the semiconducting core from surface defects, and locked phase winding in discrete lobes together suggest a relatively easy route to creating and controlling MZMs in hybrid materials. Our findings open the possibility of studying an interplay of mesoscopic and topological physics in this system.

(A) Colorized electron micrograph of a tunneling device composed of a hybrid nanowire with a hexagonal semiconducting core and full superconducting shell. (B) Tunneling conductance (color) into the core as a function of applied flux (horizontal axis) and source-drain voltage (vertical axis) reveals a hard induced superconducting gap near zero applied flux and a gapped region with a discrete zero-energy state around one applied flux quantum, Φ0. (C) Realistic transport simulations in the presence of MZMs reproduce key features of the experimental data.

The future’s bright for quantum computing but it will need big backing – The Union Journal

IT stakeholders across industries are excited by the prospects of quantum computing, but it will take a great deal more resources to ensure both that the technology is ready for a large pool of users, and that those same users are ready to deploy it.

That's according to a new study by the International Data Corporation (IDC) entitled Quantum Computing Adoption Trends: 2020 Survey Findings, which compiled data and end-user metrics from over 2,700 European entities involved in the quantum sphere, and from the individuals managing quantum investments.

Despite the slower rate of quantum adoption overall (investments comprise between 0 and 2 percent of annual budgets), end users are hopeful that quantum computing will put them at a competitive advantage, provided that early seed investment is on hand.

The positive outlook follows the development of new prototypes and early progress in industries such as FinTech, cybersecurity and manufacturing.

Made up of those who would oversee investment in quantum in their organisations, respondents cited better business intelligence data gathering, enhanced artificial intelligence (AI) capabilities, as well as increased efficiency and performance of their cloud-based systems and services, as the most exciting applications.

While the technology itself still has a long way to go before it is practical for organisations, even when it is, IT directors worry about high costs denying them access, limited knowledge of the field, and a scarcity of essential resources, as well as the high level of detail involved in the technology itself.

However, with such broad applications and potential for the technology, quantum hardware makers and vendors are determined to make it available to as wide a swathe of users as possible. That means making it easy to use, and available to businesses with more limited resources, as cloud-based Quantum-Computing-as-a-Service (QCaaS).

According to Heather Wells, the IDC's senior research analyst for Infrastructure Systems, Platforms, and Technology, quantum computing is the future market and infrastructure disruptor for companies looking to use large quantities of data, artificial intelligence, and machine learning to accelerate real-time business intelligence and innovative product development.

Many organizations from many industries are already experimenting with its potential.

These insights further point to the most prominent applications and approaches of quantum technology, which include cloud-centric quantum computing, quantum networks, complex quantum algorithms, and hybrid quantum computing, which combines two or more adaptations of quantum technological capabilities.

The future appears increasingly promising for mass adoption of quantum computing; however, the companies developing it must act quickly to make its early capabilities accessible to organisations in order to secure the investment that will drive the technology's true future potential.

Research by University of Chicago PhD Student and EPiQC Wins IBM Q Best Paper – Quantaneo, the Quantum Computing Source

The interdisciplinary team of researchers from UChicago, University of California, Berkeley, Princeton University and Argonne National Laboratory won the $2,500 first-place award for Best Paper. Their research examined how the VQE quantum algorithm could improve the ability of current and near-term quantum computers to solve highly complex problems, such as finding the ground state energy of a molecule, an important and computationally difficult chemical calculation the authors refer to as a killer app for quantum computing.

Quantum computers are expected to perform complex calculations in chemistry, cryptography and other fields that are prohibitively slow or even impossible for classical computers. A significant gap remains, however, between the capabilities of today's quantum computers and the algorithms proposed by computational theorists.

"VQE can perform some pretty complicated chemical simulations in just 1,000 or even 10,000 operations, which is good," Gokhale says. "The downside is that VQE requires millions, even tens of millions, of measurements, which is what our research seeks to correct by exploring the possibility of doing multiple measurements simultaneously."

With their approach, the authors reduced the computational cost of running the VQE algorithm by 7 to 12 times. When they validated the approach on one of IBM's cloud-accessible 20-qubit quantum computers, they also found lower error compared to traditional methods of solving the problem. The authors have shared their Python and Qiskit code for generating circuits for simultaneous measurement, and have already received numerous citations in the months since the paper was published.
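The core idea behind simultaneous measurement is that Hamiltonian terms written as Pauli strings can share one measurement circuit when they "qubit-wise commute," i.e. agree or have the identity at every position. A greedy grouping over a made-up list of terms (not the molecule studied in the paper) sketches the saving:

```python
def qubitwise_commute(a, b):
    """True if two Pauli strings agree or have I at every qubit position."""
    return all(x == y or x == "I" or y == "I" for x, y in zip(a, b))

# Hypothetical 3-qubit Hamiltonian terms.
terms = ["XXI", "XIX", "IZZ", "ZZI", "IXX", "ZIZ"]

groups = []                                   # greedy first-fit grouping
for t in terms:
    for g in groups:
        if all(qubitwise_commute(t, other) for other in g):
            g.append(t)
            break
    else:
        groups.append([t])

print(len(groups), "measurement settings instead of", len(terms))
```

The paper's technique is more sophisticated than this first-fit sketch (it also optimizes which commuting sets to form and how to measure them), but the intuition of trading many measurement settings for a few shared ones is the same.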

For more on the research and the IBM Q Best Paper Award, see the IBM Research Blog. Additional authors on the paper include Professor Fred Chong and PhD student Yongshan Ding of UChicago CS, Kaiwen Gui and Martin Suchara of the Pritzker School of Molecular Engineering at UChicago, Olivia Angiuli of University of California, Berkeley, and Teague Tomesh and Margaret Martonosi of Princeton University.

Scientists Have Discovered a Brand New Electronic State of Matter – ScienceAlert

Scientists have observed a new state of electronic matter on the quantum scale, one that forms when electrons clump together in transit, and it could advance our understanding and application of quantum physics.

Movement is key to this new quantum state. When electric current is applied to semiconductors or metals, the electrons inside usually travel slowly and somewhat haphazardly in one direction.

Not so in a special type of medium known as a ballistic conductor, where the movement is faster and more uniform.

The new study shows how, in very thin ballistic conducting wires, electrons can gang up, creating a whole new quantum state of matter made solely from speeding electrons.

"Normally, electrons in semiconductors or metals move and scatter, and eventually drift in one direction if you apply a voltage," says physicist Jeremy Levy, from the University of Pittsburgh. "But in ballistic conductors the electrons move more like cars on a highway."

"The discovery we made shows that when electrons can be made to attract one another, they can form bunches of two, three, four and five electrons that literally behave like new types of particles, new forms of electronic matter."

Ballistic conductors can be used for stretching the boundaries of what's possible in electronics and classical physics, and the one used in this particular experiment was made from lanthanum aluminate and strontium titanate.

Interestingly, when the researchers measured the levels of conductance, they found they followed one of the most well-known patterns in mathematics: Pascal's triangle. As conductance increased, it stepped up in a pattern that matches one of the diagonals of Pascal's triangle, following the order 1, 3, 6, 10 and so on.
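The sequence 1, 3, 6, 10 runs down the third diagonal of Pascal's triangle, i.e. the binomial coefficients C(k, 2). A two-line check (the variable name is just illustrative):

```python
from math import comb

# Third diagonal of Pascal's triangle: C(k, 2) for k = 2, 3, 4, ...
conductance_steps = [comb(k, 2) for k in range(2, 8)]
print(conductance_steps)  # [1, 3, 6, 10, 15, 21]
```

These are the triangular numbers, which is why each step up in conductance grows by one more unit than the last.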

"The discovery took us some time to understand but it was because we initially did not realise we were looking at particles made up of one electron, two electrons, three electrons and so forth," says Levy.

This clumping of electrons is similar to the way that quarks bind together to form neutrons and protons, according to the researchers. Electrons in superconductors can team up like this too, joining together in pairs to coordinate movement.

The findings may have something to teach us about quantum entanglement, which in turn is key to making progress with quantum computing and a super-secure, super-fast quantum internet.

According to Levy, it's another example of how we're reverse-engineering the world based on what we've learned from the discovery of the fundamentals of quantum physics, building on important work done in the last few decades.

"Now in the 21st century, we're looking at all the strange predictions of quantum physics and turning them around and using them," says Levy.

"When you talk about applications, we're thinking about quantum computing, quantum teleportation, quantum communications, quantum sensing ideas that use the properties of the quantum nature of matter that were ignored before."

The research has been published in Science.

The view of quantum threats from the front lines – JAXenter

The future is here. Or just about. After a number of discoveries, researchers have proven that quantum computing is possible and on its way. The wider world did not pause long on this discovery: Goldman Sachs, Amazon, Google, and IBM have just announced their own intentions to embark on their own quantum developments.

Now that it's within our reach, we have to start seriously considering what that means in the real world. Certainly, we all stand to gain from the massive benefits that quantum capabilities can bring, but so do cybercriminals.

Scalable quantum computing will defeat much of modern-day encryption, such as RSA 2048-bit keys, which secure computer networks everywhere. The U.S. National Institute of Standards and Technology says as much, projecting that quantum computers in this decade will be able to break the protocols on which the modern internet relies.

The security profession hasn't taken the news lying down either. Preparations have begun in earnest. The DigiCert 2019 Post Quantum Cryptography (PQC) Survey aimed to examine exactly how companies were doing. Researchers surveyed 400 enterprises, each with 1,000 or more employees, across the US, Germany and Japan to get answers. They also conducted a focus group of nine different IT managers to further explore those preparations.

SEE ALSO:DevSecOps Panel Best DevOps Security Practices & Best Tools

An encouraging development is that 35 percent of respondents already have a PQC budget, and a further 56 percent are discussing one in their organisations. Yet many are still very early in the process of PQC planning. An IT manager within a manufacturing company said, "We have a budget for security overall. There's a segment allotted to this, but it's not to the level or expense that is appropriate and should be there yet."

The time to start preparing, including inquiring about your vendors' readiness for quantum computing threats, is now. One of the respondents, an IT security manager at a financial services company, told surveyors, "We're still in the early discussion phases because we're not the only ones who are affected. There are third-party partners and vendors that we're in early discussions with on how we can be proactive and beef up our security. And quantum cryptology is one of the topics that we are looking at."

Others expanded upon that, noting that their early preparations heavily involve discussing the matter with third parties and vendors. Another focus group member, an IT manager at an industrial construction company, told the group, "We have third-party security companies that are working with us to come up with solutions to be proactive. So obviously, knock on wood, nothing has happened yet. But we are definitely always proactive from a security standpoint and we're definitely trying to make sure that we're ready once a solution is available."

Talking to your vendors and third parties should be a key part of any organisation's planning process. To that end, organisations should be checking whether their partners will keep supporting and securing customers' operations into the age of quantum.

The data itself was still at the centre of respondents' minds when it came to protection from quantum threats: asked what they were focusing on in their preparations, respondents said that above all they were monitoring their own data. One respondent told surveyors, "The data is everything for anybody that's involved in protecting it. And so you just have to stay on top of it along with your vendors and continue to communicate."

One of the prime preparatory best practices that respondents called out was monitoring. Knowing what kinds of data flow within your environment, how they are used and how they are currently protected are all things an enterprise has to establish as it prepares.


To be sure, overhauling an enterprise's cryptographic infrastructure is no small feat, but respondents listed understanding their organisation's level of crypto agility as a priority. Quantum might be a few years off, but becoming crypto-agile may take just as long.

Organisations will have to plan for a system that can easily swap out, integrate and change cryptographic algorithms within an organisation. Moreover, it must be able to do so quickly, cheaply and without any significant changes to the broader system. Practically, this means installing automated platforms that track your cryptographic deployments so that you can remediate, revoke, renew, reissue or otherwise control any and all of your certificates at scale.
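In code, crypto agility usually boils down to indirection: callers name an algorithm, and a registry supplies the implementation, so a post-quantum scheme can be swapped in later without rewriting the callers. Here is a minimal Python sketch of that pattern; the registry and the HMAC stand-in "algorithms" are illustrative assumptions, not part of any product mentioned above.

```python
# Minimal sketch of a crypto-agile signing layer (illustrative, not a real
# product API): algorithm names map to implementations in a registry, so a
# quantum-resistant scheme can later be swapped in by registering a new
# entry rather than rewriting every caller.
import hashlib
import hmac

SIGNERS = {}  # registry: algorithm name -> (sign_fn, verify_fn)

def register(name, sign_fn, verify_fn):
    SIGNERS[name] = (sign_fn, verify_fn)

def sign(algorithm, key, msg):
    return SIGNERS[algorithm][0](key, msg)

def verify(algorithm, key, msg, sig):
    return SIGNERS[algorithm][1](key, msg, sig)

# Stand-in "algorithms" built on HMAC; a real deployment would register RSA
# or ECDSA today and a NIST post-quantum signature scheme later.
def _hmac_pair(digest):
    def sign_fn(key, msg):
        return hmac.new(key, msg, digest).digest()
    def verify_fn(key, msg, sig):
        return hmac.compare_digest(sign_fn(key, msg), sig)
    return sign_fn, verify_fn

register("hmac-sha256", *_hmac_pair(hashlib.sha256))
register("hmac-sha3-512", *_hmac_pair(hashlib.sha3_512))

# Callers name the algorithm instead of hard-coding it:
tag = sign("hmac-sha256", b"secret", b"invoice-42")
assert verify("hmac-sha256", b"secret", b"invoice-42", tag)
```

The point of the design is that retiring an algorithm is a registry change plus re-signing, not a code rewrite across every system that produces or checks signatures.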

Many organisations are still taking their first tentative steps, and others have yet to take any. Now is the time for organisations to assess their deployments of cryptography and digital certificates so that they have proper crypto agility and are ready to deploy quantum-resistant algorithms, rather than being caught lacking when the quantum threat finally arrives.

More:
The view of quantum threats from the front lines - JAXenter

How Quantum Computers Work | HowStuffWorks

The massive amount of processing power generated by computer manufacturers has not yet been able to quench our thirst for speed and computing capacity. In 1947, American computer engineer Howard Aiken said that just six electronic digital computers would satisfy the computing needs of the United States. Others have made similar errant predictions about the amount of computing power that would support our growing technological needs. Of course, Aiken didn't count on the large amounts of data generated by scientific research, the proliferation of personal computers or the emergence of the Internet, which have only fueled our need for more, more and more computing power.

Will we ever have the amount of computing power we need or want? If, as Moore's Law states, the number of transistors on a microprocessor continues to double every 18 months, the year 2020 or 2030 will find the circuits on a microprocessor measured on an atomic scale. And the logical next step will be to create quantum computers, which will harness the power of atoms and molecules to perform memory and processing tasks. Quantum computers have the potential to perform certain calculations significantly faster than any silicon-based computer.
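The arithmetic behind that projection is easy to check with a few lines of Python. The 1971 baseline of roughly 2,300 transistors (the Intel 4004) is an illustrative assumption added here, not a figure from the article.

```python
# Back-of-the-envelope Moore's-law projection: transistor counts doubling
# every 18 months. The 1971 baseline (~2,300 transistors, roughly the
# Intel 4004) is an illustrative assumption.
def transistors(year, base_year=1971, base_count=2300, months_per_doubling=18):
    doublings = (year - base_year) * 12 / months_per_doubling
    return base_count * 2 ** doublings

for year in (1971, 2000, 2020, 2030):
    print(year, f"{transistors(year):.3g}")
```

By 2030 the projection lands in the tens of quadrillions of transistors, which is why circuit features would have to shrink toward atomic dimensions well before then.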

Scientists have already built basic quantum computers that can perform certain calculations; but a practical quantum computer is still years away. In this article, you'll learn what a quantum computer is and just what it'll be used for in the next era of computing.

You don't have to go back too far to find the origins of quantum computing. While computers have been around for the majority of the 20th century, quantum computing was first theorized less than 30 years before this article was written. Paul Benioff, a physicist at Argonne National Laboratory, is credited with first applying quantum theory to computers in 1981, when he theorized about creating a quantum Turing machine. Most digital computers, like the one you are using to read this article, are based on the Turing machine model.

More:
How Quantum Computers Work | HowStuffWorks

Quantum Computing Market 2020 Trends, Market Share, Industry Size, Opportunities, Analysis and Forecast by 2026 – Instant Tech News

Quantum Computing Market Overview:

Global Quantum Computing Market was valued at USD 89.35 million in 2016 and is projected to reach USD 948.82 million by 2025, growing at a CAGR of 30.02% from 2017 to 2025.

In the report, we thoroughly examine and analyze the global market for Quantum Computing so that market participants can improve their business strategies and ensure long-term success. The report's authors have used easy-to-understand language and clear statistical images while providing detailed information and data on the global Quantum Computing market. The report gives players useful information and suggests results-based ideas to give them a competitive advantage in the global Quantum Computing market, showing how other players compete and explaining the strategies those players use to differentiate themselves.

The researchers provide quantitative and qualitative analyses along with evaluations of the absolute dollar opportunity in the report. The report also includes Porter's Five Forces and PESTLE analyses for more detailed comparisons and other important studies. Each section of the report offers players something to improve their gross margins, sales and marketing strategies, and profit margins. As a tool for insightful market analysis, this report enables players to identify the changes they need to make to their businesses and operations, and to identify key market pockets in which to compete with other players in the global Quantum Computing market.

Request a Report Brochure @ https://www.verifiedmarketresearch.com/download-sample/?rid=24845&utm_source=ITN&utm_medium=001

Top 10 Companies in the Quantum Computing Market Research Report:

QC Ware Corp., D-Wave Systems, Cambridge Quantum Computing, IBM Corporation, MagiQ Technologies, QxBranch, Google (Research at Google), Rigetti Computing, Microsoft Corporation (Station Q), 1QB Information Technologies

Quantum Computing Market Competition:

Each company evaluated in the report is examined for various factors such as the product and application portfolio, market share, growth potential, future plans and recent developments. Readers gain a comprehensive understanding and knowledge of the competitive environment. Most importantly, this report describes the strategies that key players in the global Quantum Computing market use to maintain their advantage. It shows how market competition will change in the coming years and how players are preparing to anticipate the competition.

Quantum Computing Market Segmentation:

The analysts who wrote the report segmented the global Quantum Computing market by product, application, and region. All segments were examined in detail, with a focus on CAGR, market size, growth potential, market share and other important factors. The segment studies included in the report will help players focus on the lucrative areas of the global Quantum Computing market, while the regional analysis will help them strengthen their base in the major regional markets. It shows unexplored growth opportunities in local markets and how capital can be deployed during the forecast period.

Regions Covered by the Global Quantum Computing Market:

Middle East and Africa (GCC countries and Egypt)
North America (USA, Mexico and Canada)
South America (Brazil, etc.)
Europe (Turkey, Germany, Russia, Great Britain, Italy, France, etc.)
Asia Pacific (Vietnam, China, Malaysia, Japan, Philippines, Korea, Thailand, India, Indonesia and Australia)

Ask for Discount @ https://www.verifiedmarketresearch.com/ask-for-discount/?rid=24845&utm_source=ITN&utm_medium=001

Table of Content

1 Introduction of Quantum Computing Market

1.1 Overview of the Market
1.2 Scope of Report
1.3 Assumptions

2 Executive Summary

3 Research Methodology of Verified Market Research

3.1 Data Mining
3.2 Validation
3.3 Primary Interviews
3.4 List of Data Sources

4 Quantum Computing Market Outlook

4.1 Overview
4.2 Market Dynamics
4.2.1 Drivers
4.2.2 Restraints
4.2.3 Opportunities
4.3 Porter's Five Force Model
4.4 Value Chain Analysis

5 Quantum Computing Market, By Deployment Model

5.1 Overview

6 Quantum Computing Market, By Solution

6.1 Overview

7 Quantum Computing Market, By Vertical

7.1 Overview

8 Quantum Computing Market, By Geography

8.1 Overview
8.2 North America
8.2.1 U.S.
8.2.2 Canada
8.2.3 Mexico
8.3 Europe
8.3.1 Germany
8.3.2 U.K.
8.3.3 France
8.3.4 Rest of Europe
8.4 Asia Pacific
8.4.1 China
8.4.2 Japan
8.4.3 India
8.4.4 Rest of Asia Pacific
8.5 Rest of the World
8.5.1 Latin America
8.5.2 Middle East

9 Quantum Computing Market Competitive Landscape

9.1 Overview
9.2 Company Market Ranking
9.3 Key Development Strategies

10 Company Profiles

10.1.1 Overview
10.1.2 Financial Performance
10.1.3 Product Outlook
10.1.4 Key Developments

11 Appendix

11.1 Related Research

Get a Complete Market Report in your Inbox within 24 hours @ https://www.verifiedmarketresearch.com/product/Quantum-Computing-Market/?utm_source=ITN&utm_medium=001

About Us:

Verified Market Research partners with clients to provide insight into strategic and growth analytics: data that helps achieve business goals and targets. Our core values include trust, integrity, and authenticity for our clients.

Analysts with high expertise in data gathering and governance utilize industry techniques to collate and examine data at all stages. Our analysts are trained to combine modern data collection techniques, superior research methodology, subject expertise and years of collective experience to produce informative and accurate research reports.

Contact Us:

Mr. Edwyne Fernandes
Call: +1 (650) 781 4080
Email: [emailprotected]


Here is the original post:
Quantum Computing Market 2020 Trends, Market Share, Industry Size, Opportunities, Analysis and Forecast by 2026 - Instant Tech News

Explainer: What is a quantum computer? – MIT Technology Review

This is the first in a series of explainers on quantum technology. The other two are on quantum communication and post-quantum cryptography.

A quantum computer harnesses some of the almost-mystical phenomena of quantum mechanics to deliver huge leaps forward in processing power. Quantum machines promise to outstrip even the most capable of today's, and tomorrow's, supercomputers.

They won't wipe out conventional computers, though. Using a classical machine will still be the easiest and most economical solution for tackling most problems. But quantum computers promise to power exciting advances in various fields, from materials science to pharmaceuticals research. Companies are already experimenting with them to develop things like lighter and more powerful batteries for electric cars, and to help create novel drugs.

The secret to a quantum computer's power lies in its ability to generate and manipulate quantum bits, or qubits.

What is a qubit?

Today's computers use bits: a stream of electrical or optical pulses representing 1s or 0s. Everything from your tweets and e-mails to your iTunes songs and YouTube videos is essentially a long string of these binary digits.

Quantum computers, on the other hand, use qubits, which are typically subatomic particles such as electrons or photons. Generating and managing qubits is a scientific and engineering challenge. Some companies, such as IBM, Google, and Rigetti Computing, use superconducting circuits cooled to temperatures colder than deep space. Others, like IonQ, trap individual atoms in electromagnetic fields on a silicon chip in ultra-high-vacuum chambers. In both cases, the goal is to isolate the qubits in a controlled quantum state.

Qubits have some quirky quantum properties that mean a connected group of them can provide way more processing power than the same number of binary bits. One of those properties is known as superposition and another is called entanglement.

Qubits can represent numerous possible combinations of 1 and 0 at the same time. This ability to simultaneously be in multiple states is called superposition. To put qubits into superposition, researchers manipulate them using precision lasers or microwave beams.

Thanks to this counterintuitive phenomenon, a quantum computer with several qubits in superposition can crunch through a vast number of potential outcomes simultaneously. The final result of a calculation emerges only once the qubits are measured, which immediately causes their quantum state to collapse to either 1 or 0.
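The bookkeeping behind those claims can be sketched classically: n qubits are described by 2**n amplitudes, and measurement picks one basis state with probability equal to the squared amplitude. A toy Python simulation of the math follows; it is an illustration, not a real quantum device.

```python
import random

# Toy state-vector bookkeeping: n qubits need 2**n complex amplitudes, and
# measuring collapses the register to one n-bit string with probability
# |amplitude|**2 (the Born rule).
def uniform_superposition(n):
    dim = 2 ** n
    return [dim ** -0.5] * dim  # equal amplitude on every n-bit string

def measure(state):
    probs = [abs(a) ** 2 for a in state]
    outcome = random.choices(range(len(state)), weights=probs)[0]
    collapsed = [0.0] * len(state)
    collapsed[outcome] = 1.0  # post-measurement state: a single basis state
    return outcome, collapsed

state = uniform_superposition(3)   # 3 qubits -> 8 amplitudes
print(len(state))                  # prints 8
outcome, state = measure(state)
print(format(outcome, "03b"))      # one random 3-bit string
```

Note the asymmetry the article describes: the superposition holds all 2**n amplitudes at once, but a single measurement hands back only one n-bit answer.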

Researchers can generate pairs of qubits that are entangled, which means the two members of a pair exist in a single quantum state. Changing the state of one of the qubits will instantaneously change the state of the other one in a predictable way. This happens even if they are separated by very long distances.

Nobody really knows quite how or why entanglement works. It even baffled Einstein, who famously described it as "spooky action at a distance." But it's key to the power of quantum computers. In a conventional computer, doubling the number of bits doubles its processing power. But thanks to entanglement, adding extra qubits to a quantum machine produces an exponential increase in its number-crunching ability.
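The perfect correlation described above can be illustrated by sampling from the textbook two-qubit Bell state, (|00> + |11>)/sqrt(2). This short Python sketch reproduces the measurement statistics only; it is not meant to suggest entanglement can transmit information.

```python
import random

# Sampling the textbook Bell state (|00> + |11>)/sqrt(2): the two measured
# bits always agree, mirroring the perfect correlation between entangled
# qubits. Entanglement does not allow faster-than-light signaling; only the
# correlation in the outcomes is reproduced here.
amplitudes = {"00": 2 ** -0.5, "01": 0.0, "10": 0.0, "11": 2 ** -0.5}

def sample(amps):
    outcomes = list(amps)
    weights = [abs(a) ** 2 for a in amps.values()]
    return random.choices(outcomes, weights=weights)[0]

results = [sample(amplitudes) for _ in range(1000)]
assert all(bits in ("00", "11") for bits in results)  # the qubits always agree
```

Measuring either qubit alone still looks like a fair coin flip; the structure only shows up when the two outcomes are compared.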

Quantum computers harness entangled qubits in a kind of quantum daisy chain to work their magic. The machines' ability to speed up calculations using specially designed quantum algorithms is why there's so much buzz about their potential.

That's the good news. The bad news is that quantum machines are way more error-prone than classical computers because of decoherence.

The interaction of qubits with their environment in ways that cause their quantum behavior to decay and ultimately disappear is called decoherence. Their quantum state is extremely fragile. The slightest vibration or change in temperature (disturbances known as "noise" in quantum-speak) can cause them to tumble out of superposition before their job has been properly done. That's why researchers do their best to protect qubits from the outside world in those supercooled fridges and vacuum chambers.

But despite their efforts, noise still causes lots of errors to creep into calculations. Smart quantum algorithms can compensate for some of these, and adding more qubits also helps. However, it will likely take thousands of standard qubits to create a single, highly reliable one, known as a logical qubit. This will sap a lot of a quantum computer's computational capacity.

And there's the rub: so far, researchers haven't been able to generate more than 128 standard qubits. So we're still many years away from quantum computers that will be broadly useful.

That hasn't dented pioneers' hopes of being the first to demonstrate quantum supremacy.

What is quantum supremacy?

It's the point at which a quantum computer can complete a mathematical calculation that is demonstrably beyond the reach of even the most powerful supercomputer.

It's still unclear exactly how many qubits will be needed to achieve this, because researchers keep finding new algorithms to boost the performance of classical machines, and supercomputing hardware keeps getting better. But researchers and companies are working hard to claim the title, running tests against some of the world's most powerful supercomputers.

There's plenty of debate in the research world about just how significant achieving this milestone will be. Rather than wait for supremacy to be declared, companies are already starting to experiment with quantum computers made by companies like IBM, Rigetti, and D-Wave, a Canadian firm. Chinese firms like Alibaba are also offering access to quantum machines. Some businesses are buying quantum computers, while others are using ones made available through cloud computing services.

Where is a quantum computer likely to be most useful first?

One of the most promising applications of quantum computers is simulating the behavior of matter down to the molecular level. Auto manufacturers like Volkswagen and Daimler are using quantum computers to simulate the chemical composition of electric-vehicle batteries to help find new ways to improve their performance. And pharmaceutical companies are leveraging them to analyze and compare compounds that could lead to the creation of new drugs.

The machines are also great for optimization problems because they can crunch through vast numbers of potential solutions extremely fast. Airbus, for instance, is using them to help calculate the most fuel-efficient ascent and descent paths for aircraft. And Volkswagen has unveiled a service that calculates the optimal routes for buses and taxis in cities in order to minimize congestion. Some researchers also think the machines could be used to accelerate artificial intelligence.

It could take quite a few years for quantum computers to achieve their full potential. Universities and businesses working on them face a shortage of skilled researchers in the field, and a lack of suppliers of some key components. But if these exotic new computing machines live up to their promise, they could transform entire industries and turbocharge global innovation.

Read the original here:
Explainer: What is a quantum computer? - MIT Technology Review

What Is Quantum Computing and How Does it Work? – Built In

Accustomed to imagining worst-case scenarios, many cryptography experts are more concerned than usual these days: one of the most widely used schemes for safely transmitting data is poised to become obsolete once quantum computing reaches a sufficiently advanced state.

The cryptosystem known as RSA provides the safety structure for a host of privacy and communication protocols, from email to internet retail transactions. Current standards rely on the fact that no one has the computing power to test every possible way to de-scramble your data once encrypted, but a mature quantum computer could try every option within a matter of hours.

It should be stressed that quantum computers haven't yet hit that level of maturity, and won't for some time, but when a large, stable device is built (or if it's built, as an increasingly diminishing minority argue), its unprecedented ability to factor large numbers would essentially leave the RSA cryptosystem in tatters. Thankfully, the technology is still a ways away, and the experts are on it.
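The factoring dependence is easy to see with a toy example: textbook RSA with tiny primes. Real moduli are hundreds of digits long, and Shor's algorithm, not trial division, is what a quantum computer would bring to the attack; the numbers below are purely illustrative.

```python
# Textbook RSA with tiny primes, purely to show why factoring n breaks it.
p, q = 61, 53
n = p * q                      # public modulus: 3233
phi = (p - 1) * (q - 1)        # 3120; secret once p and q are discarded
e = 17                         # public exponent
d = pow(e, -1, phi)            # private exponent; computing it requires phi

message = 65
cipher = pow(message, e, n)          # anyone can encrypt with (e, n)
assert pow(cipher, d, n) == message  # only the holder of d can decrypt

# An attacker who factors n recovers everything. Trial division works here
# only because n is tiny; Shor's algorithm is what would make this step
# feasible for real key sizes.
p_found = next(c for c in range(2, n) if n % c == 0)
q_found = n // p_found
d_found = pow(e, -1, (p_found - 1) * (q_found - 1))
assert pow(cipher, d_found, n) == message
```

Everything public-facing (e, n, cipher) stays public; security rests entirely on the attacker being unable to recover p and q from n, which is exactly the assumption a mature quantum computer would break.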

"Don't panic." That's what Mike Brown, CTO and co-founder of quantum-focused cryptography company ISARA Corporation, advises anxious prospective clients. The threat is far from imminent. "What we hear from the academic community and from companies like IBM and Microsoft is that a 2026-to-2030 timeframe is what we typically use from a planning perspective in terms of getting systems ready," he said.

Cryptographers from ISARA are among several contingents currently taking part in the Post-Quantum Cryptography Standardization project, a contest of quantum-resistant encryption schemes. The aim is to standardize algorithms that can resist attacks levied by large-scale quantum computers. The competition was launched in 2016 by the National Institute of Standards and Technology (NIST), a federal agency that helps establish tech and science guidelines, and is now gearing up for its third round.

Indeed, the level of complexity and stability required of a quantum computer to launch the much-discussed RSA attack is very extreme, according to John Donohue, scientific outreach manager at the University of Waterloo's Institute for Quantum Computing. Even granting that timelines in quantum computing, particularly in terms of scalability, are points of contention, "the community is pretty comfortable saying that's not something that's going to happen in the next five to 10 years," he said.

When Google announced that it had achieved quantum supremacy (that is, that it had used a quantum computer to run, in minutes, an operation that would take thousands of years to complete on a classical supercomputer), that machine operated on 54 qubits, the computational bedrock of quantum computing. While IBM's 53-qubit Q system operates at a similar level, many current prototypes operate on as few as 20 or even five qubits.

But how many qubits would be needed to crack RSA? "Probably on the scale of millions of error-tolerant qubits," Donohue told Built In.

Scott Aaronson, a computer scientist at the University of Texas at Austin, underscored the same point last year on his popular blog after presidential candidate Andrew Yang tweeted that "no code is uncrackable" in the wake of Google's proof-of-concept milestone.

That's the good news. The bad news is that, while cryptography experts gain more time to keep our data secure from quantum computers, the technology's numerous potential upsides, ranging from drug discovery to materials science to financial modeling, are also largely forestalled. And that question of error tolerance continues to stand as quantum computing's central, Herculean challenge. But before we wrestle with that, let's get a better elemental sense of the technology.

Quantum computers process information in a fundamentally different way than classical computers. Traditional computers operate on binary bits: information processed in the form of ones or zeroes. But quantum computers transmit information via quantum bits, or qubits, which can exist either as one or zero or both simultaneously. That's a simplification, and we'll explore some nuances below, but that capacity, known as superposition, lies at the heart of quantum's potential for exponentially greater computational power.

Such fundamental complexity both cries out for and resists succinct laymanization. When the New York Times asked 10 experts to explain quantum computing in the length of a tweet, some responses raised more questions than they answered:

Microsoft researcher David Reilly:

"A quantum machine is a kind of analog calculator that computes by encoding information in the ephemeral waves that comprise light and matter at the nanoscale."

D-Wave Systems executive vice president Alan Baratz:

"If we're honest, everything we currently know about quantum mechanics can't fully describe how a quantum computer works."

Quantum computing also cries out for a digestible metaphor. Quantum physicist Shohini Ghose, of Wilfrid Laurier University, has likened the difference between quantum and classical computing to light bulbs and candles: "The light bulb isn't just a better candle; it's something completely different."

Rebecca Krauthamer, CEO of quantum computing consultancy Quantum Thought, compares quantum computing to a crossroads that allows a traveler to take both paths. "If you're trying to solve a maze, you'd come to your first gate, and you can go either right or left," she said. "We have to choose one, but a quantum computer doesn't have to choose one. It can go right and left at the same time."

It can, in a sense, "look at these different options simultaneously and then instantly find the most optimal path," she said. "That's really powerful."

The most commonly used example of quantum superposition is Schrödinger's cat: a cat sealed in a box with a quantum-triggered poison that, until the box is opened and observed, is both alive and dead at the same time.

Despite its ubiquity, many in the QC field aren't so taken with Schrödinger's cat. "The more interesting fact about superposition, rather than the two-things-at-once point of focus, is the ability to look at quantum states in multiple ways, and ask it different questions," said Donohue. That is, rather than having to perform tasks sequentially, like a traditional computer, quantum computers can run vast numbers of parallel computations.

Part of Donohue's professional charge is clarifying quantum's nuances, so it's worth quoting him here at length:

"In superposition I can have state A and state B. I can ask my quantum state, 'Are you A or B?' And it will tell me, 'I'm A' or 'I'm B.' But I might have a superposition of A + B, in which case, when I ask it, 'Are you A or B?' it'll tell me A or B randomly.

"But the key of superposition is that I can also ask the question, 'Are you in the superposition state of A + B?' And then in that case, it'll tell me, 'Yes, I am the superposition state A + B.'

"But there's always going to be an opposite superposition. So if it's A + B, the opposite superposition is A - B."

That's about as simplified as we can get before trotting out equations. But the top-line takeaway is that superposition is what lets a quantum computer try all paths at once.
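For readers who do want a little math, Donohue's two questions translate directly into measuring the same state in two different bases. Here is a small Python sketch; the labels A and B follow his wording, and the amplitudes are the standard single-qubit values.

```python
import random

# Donohue's two questions as measurements of one state in two bases.
# Take the superposition (A + B)/sqrt(2): asked "A or B?" (the computational
# basis), it answers at random; asked "A+B or A-B?" (the superposition
# basis), it answers "A+B" every time.
s = 2 ** -0.5
state = (s, s)  # amplitudes on A and B

def ask_A_or_B(amps):
    a, b = amps
    return random.choices(["A", "B"], weights=[abs(a) ** 2, abs(b) ** 2])[0]

def ask_plus_or_minus(amps):
    a, b = amps
    p_plus = abs((a + b) * s) ** 2   # overlap with (A + B)/sqrt(2)
    p_minus = abs((a - b) * s) ** 2  # overlap with (A - B)/sqrt(2)
    return random.choices(["A+B", "A-B"], weights=[p_plus, p_minus])[0]

print({ask_A_or_B(state) for _ in range(100)})  # a random mix of A and B
assert all(ask_plus_or_minus(state) == "A+B" for _ in range(100))
```

The same state gives random answers to one question and a certain answer to the other, which is exactly the "ask it different questions" point from the quote above.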

That's not to say that such unprecedented computational heft will displace or render moot classical computers. "One thing that we can really agree on in the community is that it won't solve every type of problem that we run into," said Krauthamer.

But quantum computing is particularly well suited for certain kinds of challenges. Those include probability problems, optimization (what is, say, the best possible travel route?) and the incredible challenge of molecular simulation for use cases like drug development and materials discovery.

The cocktail of hype and complexity has a way of fuzzing outsiders' conception of quantum computing, which makes this point worth underlining: quantum computers exist, and they are being used right now.

They are not, however, presently solving climate change, turbocharging financial forecasting probabilities or performing other similarly lofty tasks that get bandied about in reference to quantum computing's potential. QC may have commercial applications related to those challenges, which we'll explore further below, but that's well down the road.

Today, we're still in what's known as the NISQ era: Noisy, Intermediate-Scale Quantum. In a nutshell, quantum noise makes such computers incredibly difficult to stabilize. As such, NISQ computers can't be trusted to make decisions of major commercial consequence, which means they're currently used primarily for research and education.

"The technology just isn't quite there yet to provide a computational advantage over what could be done with other methods of computation at the moment," said Donohue. "Most [commercial] interest is from a long-term perspective. [Companies] are getting used to the technology so that when it does catch up (and that timeline is a subject of fierce debate) they're ready for it."

Also, it's fun to sit next to the cool kids. "Let's be frank. It's good PR for them, too," said Donohue.

But NISQ computers' R&D practicality is demonstrable, if decidedly small-scale. Donohue cites the molecular modeling of lithium hydride. That's a small enough molecule that it can also be simulated using a supercomputer, but the quantum simulation provides an important opportunity to check the answers from a classical-computer simulation. NISQs have also delivered some results for problems in high-energy particle physics, Donohue noted.

One breakthrough came in 2017, when researchers at IBM modeled beryllium hydride, the largest molecule simulated on a quantum computer to date. Another key step arrived in 2019, when IonQ researchers used quantum computing to go bigger still, by simulating a water molecule.

"These are generally still small problems that can be checked using classical simulation methods. But it's building toward things that will be difficult to check without actually building a large particle physics experiment, which can get very expensive," Donohue said.

And curious minds can get their hands dirty right now. Users can operate small-scale quantum processors via the cloud through IBM's online Q Experience and its open-source software Qiskit. Late last year, Microsoft and Amazon both announced similar platforms, dubbed Azure Quantum and Braket. "That's one of the cool things about quantum computing today," said Krauthamer. "We can all get on and play with it."


Quantum computing may still be in its fussy, uncooperative stage, but that hasn't stopped commercial interests from diving in.

IBM announced at the recent Consumer Electronics Show that its so-called Q Network had expanded to more than 100 companies and organizations. Partners now range from Delta Air Lines to Anthem health to Daimler AG, which owns Mercedes-Benz.

Some of those partnerships hinge on quantum computing's aforementioned promise in terms of molecular simulation. Daimler, for instance, is hoping the technology will one day yield a way to produce better batteries for electric vehicles.

Elsewhere, partnerships between quantum computing startups and leading companies in the pharmaceutical industry (like those established between 1QBit and Biogen, and between ProteinQure and AstraZeneca) point to quantum molecular modeling's drug-discovery promise, distant though it remains. (Today, drug development is done through expensive, relatively low-yield trial and error.)

Researchers would need millions of qubits to compute the chemical properties of a novel substance, noted theoretical physicist Sabine Hossenfelder in the Guardian last year. But the conceptual underpinning, at least, is there. "A quantum computer knows quantum mechanics already, so I can essentially program in how another quantum system would work and use that to echo the other one," explained Donohue.

There's also hope that large-scale quantum computers will help accelerate AI, and vice versa, although experts disagree on this point. "The reason there's controversy is, things have to be redesigned in a quantum world," said Krauthamer, who considers herself an AI-quantum optimist. "We can't just translate algorithms from regular computers to quantum computers because the rules are completely different, at the most elemental level."

Some believe quantum computers can help combat climate change by improving carbon capture. Jeremy O'Brien, CEO of Palo Alto-based PsiQuantum, wrote last year that quantum simulation of larger molecules, if achieved, could help build a catalyst for scrubbing carbon dioxide directly from the atmosphere.

Long-term applications tend to dominate headlines, but they also lead us back to quantum computing's defining hurdle, and the reason coverage remains littered with terms like "potential" and "promise": error correction.

Qubits, it turns out, are higher-maintenance than even the most meltdown-prone rock star. Any number of simple actions or variables can send error-prone qubits falling into decoherence, or the loss of a quantum state (mainly that all-important superposition). Things that can cause a quantum computer to crash include measuring qubits and running operations; in other words, using it. Even small vibrations and temperature shifts will cause qubits to decohere, too.

That's why quantum computers are kept isolated, and why the ones that run on superconducting circuits (the most prominent method, favored by Google and IBM) have to be kept at near absolute zero (a cool -460 degrees Fahrenheit).

The challenge is two-fold, according to Jonathan Carter, a scientist at Berkeley Quantum. First, individual physical qubits need to have better fidelity. That would conceivably happen through better engineering, discovering optimal circuit layouts, or finding the optimal combination of components. Second, we have to arrange them to form logical qubits.

Estimates range from hundreds to thousands to tens of thousands of physical qubits required to form one fault-tolerant qubit. "I think it's safe to say that none of the technology we have at the moment could scale out to those levels," Carter said.
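The intuition for why so many physical qubits buy one reliable logical qubit shows up already in the classical repetition code: copy a noisy bit many times and read it back by majority vote. Real quantum error correction (the surface code, for example) is far more involved, since quantum states cannot simply be copied, but the redundancy-for-reliability trade is the same. A Python sketch of the classical analogy:

```python
import random

# Classical repetition-code analogy for logical qubits: store one logical
# bit in many noisy physical bits and read it back by majority vote.
def noisy_copy(bit, error_rate):
    return bit ^ (random.random() < error_rate)  # flip with given probability

def logical_read(bit, n_physical, error_rate):
    copies = [noisy_copy(bit, error_rate) for _ in range(n_physical)]
    return int(sum(copies) > n_physical / 2)     # majority vote

random.seed(0)
trials = 10_000
physical_errors = sum(noisy_copy(1, 0.1) != 1 for _ in range(trials))
logical_errors = sum(logical_read(1, 15, 0.1) != 1 for _ in range(trials))
print(physical_errors, logical_errors)  # the encoded bit fails far less often
```

With a 10 percent error rate per physical bit, 15-way majority voting drives the logical error rate down by orders of magnitude; the cost, as in the quantum case, is that most of the hardware is spent on redundancy rather than computation.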

From there, researchers would also have to build ever-more complex systems to handle the increase in qubit fidelity and numbers. So how long will it take until hardware-makers actually achieve the necessary error correction to make quantum computers commercially viable?

"Some of these other barriers make it hard to say yes to a five- or 10-year timeline," Carter said.

Donohue invokes, and rejects, the same figure. "Even the optimist wouldn't say it's going to happen in the next five to 10 years," he said. At the same time, some small optimization problems, specifically in terms of random number generation, could happen very soon.

"We've already seen some useful things in that regard," he said.

For people like Michael Biercuk, founder of quantum-engineering software company Q-CTRL, the only technical commercial milestone that matters now is quantum advantage: as he uses the term, the point when a quantum computer provides some time or cost advantage over a classical computer. Count him among the optimists: he foresees a five-to-eight-year time scale to achieve such a goal.

Another open question: Which method of quantum computing will become standard? While superconducting has borne the most fruit so far, researchers are exploring alternative methods that involve trapped ions, quantum annealing or so-called topological qubits. In Donohue's view, it's not necessarily a question of which technology is better so much as one of finding the best approach for different applications. For instance, superconducting chips naturally dovetail with the magnetic field technology that underpins neuroimaging.

The challenges that quantum computing faces, however, aren't strictly hardware-related. The magic of quantum computing resides in algorithmic advances, not speed, as Greg Kuperberg, a mathematician at the University of California, Davis, is quick to underscore.

"If you come up with a new algorithm, for a question that it fits, things can be exponentially faster," he said, using "exponential" literally, not metaphorically. (There are currently 63 algorithms listed and 420 papers cited at Quantum Algorithm Zoo, an online catalog of quantum algorithms compiled by Microsoft quantum researcher Stephen Jordan.)
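What an algorithmic advantage looks like can be shown in the smallest possible example: Deutsch's problem, where one quantum query decides whether a function f: {0,1} -> {0,1} is constant or balanced, while any classical algorithm needs two evaluations. The pure-Python statevector sketch below simulates the phase-kickback variant; it is an illustration, not a run on real hardware.

```python
# Deutsch's algorithm, one-qubit phase-kickback form: interference of
# amplitudes answers a global question about f with a single query.
import math

H = 1 / math.sqrt(2)  # Hadamard amplitude

def deutsch(f):
    a0, a1 = H, H                    # H|0> = (|0> + |1>)/sqrt(2)
    a0 *= (-1) ** f(0)               # single oracle call:
    a1 *= (-1) ** f(1)               # |x> -> (-1)^f(x) |x>
    prob0 = abs(H * (a0 + a1)) ** 2  # second Hadamard interferes the paths
    return "constant" if prob0 > 0.5 else "balanced"

print(deutsch(lambda x: 0))      # constant
print(deutsch(lambda x: x))      # balanced
```

The two paths either reinforce (constant f) or cancel (balanced f); that cancellation of amplitudes is exactly the resource Kuperberg's "exponentially faster" algorithms scale up.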

Another roadblock, according to Krauthamer, is a general lack of expertise. "There's just not enough people working at the software level or at the algorithmic level in the field," she said. Tech entrepreneur Jack Hidary's team set out to count the number of people working in quantum computing and found only about 800 to 850 people, according to Krauthamer. "That's a bigger problem to focus on, even more than the hardware," she said. "Because the people will bring that innovation."

While the community underscores the importance of outreach, the term "quantum supremacy" has itself come under fire. "In our view, 'supremacy' has overtones of violence, neocolonialism and racism through its association with 'white supremacy,'" 13 researchers wrote in Nature late last year. The letter has kickstarted an ongoing conversation among researchers and academics.

But the field's attempt to attract and expand also comes at a time of uncertainty in terms of broader information-sharing.

Quantum computing research is sometimes framed in the same adversarial terms as conversations about trade and other emerging tech: that is, U.S. versus China. An oft-cited statistic from patent analytics consultancy Patinformatics states that, in 2018, China filed 492 patents related to quantum technology, compared to just 248 in the United States. That same year, the think tank Center for a New American Security published a paper that warned, "China is positioning itself as a powerhouse in quantum science." By the end of 2018, the U.S. had passed and signed into law the National Quantum Initiative Act. Many in the field believe legislators were compelled by China's perceived growing advantage.

The initiative has spurred domestic research (the Department of Energy recently announced up to $625 million in funding to establish up to five quantum information research centers), but the geopolitical tensions give some in the quantum computing community pause, namely for fear of collaboration-chilling regulation. "As quantum technology has become prominent in the media, among other places, there has been a desire suddenly among governments to clamp down," said Biercuk, who has warned of poorly crafted and nationalistic export controls in the past.

"What they don't understand often is that quantum technology, and quantum information in particular, really are deep research activities where open transfer of scientific knowledge is essential," he added.

The National Science Foundation, one of the agencies given additional funding and directives under the act, generally has a positive track record in terms of avoiding draconian security controls, Kuperberg said. Even still, the antagonistic framing tends to obscure the on-the-ground facts. "The truth behind the scenes is that, yes, China would like to be doing good research in quantum computing, but a lot of what they're doing is just scrambling for any kind of output," he said.

Indeed, the majority of the aforementioned Chinese patents are for quantum technology broadly, not quantum computing specifically, which is where the real promise lies.

The Department of Energy has an internal list of sensitive technologies that it could potentially restrict DOE researchers from sharing with counterparts in China, Russia, Iran and North Korea. It has not yet implemented that curtailment, however, as DOE Office of Science director Chris Fall told the House Committee on Science, Space and Technology in January, and later clarified to Science.

Along with such multi-agency government spending, there's been a tsunami of venture capital directed toward commercial quantum-computing interests in recent years. A Nature analysis found that, in 2017 and 2018, private funding in the industry hit at least $450 million.

Still, funding concerns linger in some corners. Even as Google's quantum supremacy proof of concept has helped heighten excitement among enterprise investors, Biercuk has also flagged the beginnings of a contraction in investment in the sector.

Even as exceptional cases dominate headlines (he points to PsiQuantum's recent $230 million venture windfall), there are lesser-reported signs of struggle. "I know of probably four or five smaller shops that started and closed within about 24 months; others were absorbed by larger organizations because they struggled to raise," he said.

At the same time, signs of at least moderate investor agitation and internal turmoil have emerged. The Wall Street Journal reported in January that much-buzzed quantum computing startup Rigetti Computing saw its CTO and COO, among other staff, depart amid concerns that the company's tech wouldn't be commercially viable in a reasonable time frame.

Investor expectations had become inflated in some instances, according to experts. "Some very good teams have faced more investor skepticism than I think has been justified. This is not six months to mobile application development," Biercuk said.

In Kuperberg's view, part of the problem is that venture capital and quantum computing operate on completely different timelines. "Putting venture capital into this in the hope that some profitable thing would arise quickly, that doesn't seem very natural to me in the first place," he said, adding the caveat that he considers the majority of QC money prestige investment rather than strictly ROI-focused.

But some startups themselves may have had some hand in driving financiers' over-optimism. "I won't name names, but there definitely were some people giving investors outsize expectations, especially when people started coming up with some pieces of hardware, saying that advantages were right around the corner," said Donohue. "That very much rubbed the academic community the wrong way."

Scott Aaronson recently called out two prominent startups for what he described as a sort of calculated equivocation. He wrote of a pattern in which a party will speak of a quantum algorithm's promise "without asking whether there are any indications that your approach will ever be able to exploit interference of amplitudes to outperform the best classical algorithm."

And, mea culpa, some blame for the hype surely lies with tech media. "Trying to crack an area for a lay audience means you inevitably sacrifice some scientific precision," said Biercuk. (Thanks for understanding.)

It's all led to a willingness to serve up a glass of cold water now and again. As Juani Bermejo-Vega, a physicist and researcher at the University of Granada in Spain, recently told Wired, the machine on which Google ran its milestone proof of concept is mostly still a useless quantum computer for practical purposes.

Bermejo-Vega's quote came in a story about the emergence of a Twitter account called Quantum Bullshit Detector, which decrees, @artdecider-like, a "bullshit" or "not bullshit" quote tweet of various quantum claims. The fact that leading quantum researchers are among the account's 9,000-plus base of followers would seem to indicate that some weariness exists among the ranks.

But even with the various challenges, cautious optimism seems to characterize much of the industry. "For good and ill, I'm vocal about maintaining scientific and technical integrity while also being a true optimist about the field and sharing the excitement that I have and to excite others about what's coming," Biercuk said.

This year could prove to be formative in the quest to use quantum computers to solve real-world problems, said Krauthamer. "Whenever I talk to people about quantum computing, without fail, they come away really excited. Even the biggest skeptics who say, 'Oh no, they're not real. It's not going to happen for a long time.'"


Link:
What Is Quantum Computing and How Does it Work? - Built In

Quantum Computing: How To Invest In It, And Which Companies Are Leading the Way? – Nasdaq

Insight must precede application. ~ Max Planck, Father of Quantum Physics

Quantum computing is no ordinary technology. It has attracted huge interest at the national level with funding from governments. Today, some of the biggest technology giants are working on the technology, investing substantial sums into research and development and collaborating with state agencies and corporates for various projects across industries.

Here's an overview of quantum computing, as well as the players exploring this revolutionary technology and ways to invest in it.

Understanding Quantum Computing

Let's begin by understanding quantum computing. While standard computers are built on classical bits, every quantum computer has the qubit, or quantum bit, as its building block. Unlike a classical computer, where information is stored as binary 0 or 1 using bits, a quantum computer harnesses the unique ability of subatomic particles in the form of a qubit, which can exist in a superposition of 0 and 1 at the same time. As a result, quantum computers can achieve higher information density and handle very complex operations at speeds exponentially higher than conventional computers while consuming much less energy.
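The "information density" claim can be put in state-vector terms: describing an n-qubit state takes 2**n complex amplitudes, and measurement probabilities are the squared magnitudes of those amplitudes. The following is a minimal sketch using plain Python lists; real simulators and real hardware are of course far more sophisticated.

```python
# An n-qubit register in equal superposition: 2**n amplitudes, each
# equally likely to be observed on measurement.
import math

def equal_superposition(n):
    """State after a Hadamard on each of n qubits: 2**n equal amplitudes."""
    dim = 2 ** n
    return [1 / math.sqrt(dim)] * dim

state = equal_superposition(3)
print(len(state))                                  # 8 amplitudes for 3 qubits
print(round(sum(abs(a) ** 2 for a in state), 10))  # probabilities sum to 1.0
```

Adding one qubit doubles the number of amplitudes, which is why the descriptive power of a quantum register grows exponentially with its size.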

It is believed that quantum computing will have a huge impact on areas such as logistics, military affairs, pharmaceuticals (drug design and discovery), aerospace (design), utilities (nuclear fusion), financial modeling, chemicals (polymer design), Artificial Intelligence (AI), cybersecurity, fault detection, Big Data, and capital goods, especially digital manufacturing. The productivity gains by end users of quantum computing, in the form of both cost savings and revenue opportunities, are expected to surpass $450 billion annually.

"It will be a slow build for the next few years: we anticipate value for end users in these sectors to reach a relatively modest $2 billion to $5 billion by 2024. But value will then increase rapidly as the technology and its commercial viability mature," reports BCG.

The market for quantum computing is projected to reach $64.98 billion by 2030, up from just $507.1 million in 2019, growing at a CAGR of 56.0% during the forecast period (2020-2030). According to a CIR estimate, revenue from quantum computing is pegged at $8 billion by 2027.
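The quoted growth rate can be sanity-checked from the two endpoints: the compound annual growth rate implied by going from $507.1 million in 2019 to $64.98 billion in 2030, i.e. over 11 annual compounding periods.

```python
# CAGR implied by the forecast endpoints: (end/start)^(1/years) - 1.
def cagr(start, end, years):
    return (end / start) ** (1 / years) - 1

rate = cagr(507.1e6, 64.98e9, 2030 - 2019)
print(f"implied CAGR: {rate:.1%}")  # ~55.5%, consistent with the quoted 56.0%
```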

Which Nations Are Investing In Quantum Computing?

In the race to gain the quantum advantage, China has been at the forefront of the technology. The first quantum satellite was launched by China in 2016. A paper by the Center for a New American Security (CNAS) highlights how "China is positioning itself as a powerhouse in quantum science."

Understanding the strategic potential that quantum science holds, the U.S., Germany, Russia, India and the European Union have intensified efforts towards quantum computing. In the U.S., President Trump established the National Quantum Initiative Advisory Committee in 2019 in accordance with the National Quantum Initiative Act, signed into law in late 2018, which authorizes $1.2 billion to be spent on quantum science over the next five years.

The Indian government in its 2020 budget announced a National Mission on Quantum Technologies & Applications with a total budget outlay of 8,000 crore rupees ($1.12 billion) over a period of five years, while Europe has a 1 billion euro initiative providing funding for the entire quantum value chain over the next ten years. In October 2019, the first prototype of a quantum computer was launched in Russia, while in Germany, the Fraunhofer-Gesellschaft, Europe's leading organization for applied research, partnered with IBM to advance research in the field of quantum computing.

The Companies Leading the Way

IBM has been one of the pioneers in the field of quantum computing. In January 2019, IBM (IBM) unveiled the IBM Q System One, the world's first integrated universal approximate quantum computing system designed for scientific and commercial use. In September, it opened the IBM Quantum Computation Center in New York to expand its quantum computing systems for commercial and research activity. It has also recently invested in Cambridge Quantum Computing, which was one of the first startups to become a part of IBM's Q Network in 2018.

In October 2019, Google (GOOG, GOOGL) made an announcement claiming the achievement of "quantum supremacy." It published the results of this quantum supremacy experiment in the Nature article "Quantum Supremacy Using a Programmable Superconducting Processor." The term "quantum supremacy" was coined in 2012 by John Preskill, who wrote that one way to achieve it "would be to run an algorithm on a quantum computer which solves a problem with a super-polynomial speedup relative to classical computers." The claim was countered by IBM.

Vancouver, Canada-headquartered D-Wave is the world's first commercial supplier of quantum computers, and its systems are being used by organizations such as NEC, Volkswagen, DENSO, Lockheed Martin, USRA, USC, Los Alamos National Laboratory and Oak Ridge National Laboratory. In February 2019, D-Wave announced a preview of its next-generation quantum computing platform incorporating hardware, software and tools to accelerate and ease the delivery of quantum computing applications. In September 2019, it named its next-generation quantum system Advantage, which will be available in the Leap quantum cloud service in mid-2020. In December 2019, the company signed an agreement with NEC to accelerate commercial quantum computing.

Amazon (AMZN) introduced its service Amazon Braket in late 2019, which is designed to let its users get some hands-on experience with qubits and quantum circuits. It allows users to build and test circuits in a simulated environment and then run them on an actual quantum computer.

Around the same time, Intel (INTC) unveiled a first-of-its-kind cryogenic control chip, code-named Horse Ridge, that will speed up development of full-stack quantum computing systems.

In addition, companies such as Microsoft (MSFT), Alibaba (BABA), Tencent (TCEHY), Nokia (NOK), Airbus, HP (HPQ), AT&T (T), Toshiba, Mitsubishi, SK Telecom, Raytheon, Lockheed Martin, Rigetti, Biogen, Volkswagen and Amgen are researching and working on applications of quantum computing.

Final Word

Investors looking to invest in the technology can either look at individual stocks or consider the Defiance Quantum ETF (QTUM) to gain exposure to companies developing and applying quantum computing and other advanced technologies. Launched in April 2018, QTUM is a liquid, low-cost and efficient way to invest in the technology. The ETF tracks the BlueStar Quantum and Machine Learning Index, which tracks approximately 60 globally listed stocks across all market capitalizations.

While quantum computing is not mainstream yet, the quest to harness its potential is on, and the constant progress made is shrinking the gap between research labs and real-world applications.

Disclaimer: The author has no position in any stocks mentioned. Investors should consider the above information not as a de facto recommendation, but as an idea for further consideration. The report has been carefully prepared, and any exclusions or errors in reporting are unintentional.

The views and opinions expressed herein are the views and opinions of the author and do not necessarily reflect those of Nasdaq, Inc.

Read the rest here:
Quantum Computing: How To Invest In It, And Which Companies Are Leading the Way? - Nasdaq

What Is Quantum Computing, And How Can It Unlock Value For Businesses? – Computer Business Review


We are at an inflection point

Ever since Professor Alan Turing proposed the principle of the modern computer in 1936, computing has come a long way. While advancements to date have been promising, the future is even brighter, all thanks to quantum computing, which performs calculations based on the behaviour of particles at the sub-atomic level, writes Kalyan Kumar, CVP and CTO of IT Services, HCL Technologies.

Quantum computing promises to unleash unimaginable computing power that's not only capable of addressing current computational limits, but of unearthing new solutions to unsolved scientific and social mysteries. What's more, thanks to increasing advancement since the 1980s, quantum computing can now drive some incredible social and business transformations.

Quantum computing holds immense promise in defining a positive, inclusive and human-centric future, which is what the WEF Future Council on Quantum Computing envisages. The most anticipated uses of quantum computing are driven by its potential to simulate quantum structures and behaviours across chemicals and materials. This promise is viewed guardedly by current scientists, who caution that quantum computing is still far from making a meaningful impact.

That said, quantum computing is expected to open amazing and much-needed possibilities in medical research. Drug development time, which usually takes 10 to 12 years with billions of dollars of investment, is expected to shrink considerably, alongside the potential to explore unique chemical compositions that may just be beyond the limits of current classical computing. Quantum computing can also help with more accurate weather forecasting, providing information that can help save tremendous amounts of agricultural production from damage.

Quantum computing promises a better and improved future, and while humans are poised to benefit greatly from this revolution, businesses too can expect unparalleled value.

When it comes to quantum computing, it can be said that much of the world is at the "they don't know what they don't know" stage. Proof points are appearing, and it is becoming clear that quantum computing solves problems that cannot be addressed by today's computers. Within transportation, for example, quantum computing is being used to develop battery and self-driving technologies, while Volkswagen has also been using quantum computing to match patterns and predict traffic conditions in advance, ensuring a smoother movement of traffic. In supply chains, logistics and trading are receiving a significant boost from the greater computing power and high-resolution modelling quantum computing provides, adding a huge amount of intelligence using new approaches to machine learning.

The possibilities for businesses are immense and go way beyond the examples mentioned above, into domains such as healthcare, financial services and IT. Yet a new approach is required. The companies that succeed in quantum computing will be those that create value chains to exploit the new insights, and form a management system to match the high-resolution view of the business that will emerge.

While there are some initial-stage quantum devices already available, these are still far from what the world has been envisaging. Top multinational technology companies have been investing considerably in this field, but they still have some way to go. There has recently been talk of prototype quantum computers performing, in just 200 seconds, computations that would previously have taken 10,000 years. Though of course impressive, this is just one of the many steps needed to achieve the highest success in quantum computing.

It is vital to understand how and when we are going to adopt quantum computing, so we know the right time to act. The aforementioned prototype should be a wake-up call to early adopters who are seeking ways to create a durable competitive advantage. We even recently saw a business announcing its plans to make a prototype quantum computer available on its cloud, something we will all be able to buy or access some time from now. If organisations truly understand the value and applications of quantum computing, they will be able to create new products and services that nobody else has. However, productising and embedding quantum computing into products may take a little more time.

One important question arises from all this: are we witnessing the beginning of the end for classical computing? When looking at the facts, it seems not. With the advent of complete and practical quantum computers, we're seeing a hybrid computing model emerging, where digital binary computers will co-process and co-exist with quantum qubit-based computers. The processing and resource-sharing needs are expected to be optimised using real-time analysis, where quantum takes over exponential computational tasks. To say the least, quantum computing is not about replacing digital computing, but about coexistence enabling composed computing that handles different tasks at the same time, similar to humans having left and right brains for analytical and artistic dominance.

If one thing's for sure, it's that we are at an inflection point, witnessing what could arguably be one of the most disruptive changes in human existence. Having a systematic and planned approach to the adoption of quantum computing will not only take some of its mystery away, but reveal its true strategic value, helping us to know when and how to become part of this once-in-a-lifetime revolution.

Continued here:
What Is Quantum Computing, And How Can It Unlock Value For Businesses? - Computer Business Review

LIVE FROM DAVOS: Henry Blodget leads panel on the next decade of tech – Business Insider Nordic

The past decade saw technological advancements that transformed how we work, live, and learn. The next one will bring even greater change as quantum computing, cloud computing, 5G, and artificial intelligence mature and proliferate. These changes will happen rapidly, and the work to manage their impact will need to keep pace.

This session at the World Economic Forum, in Davos, Switzerland, brought together industry experts to discuss how these technologies will shape the next decade, followed by a panel discussion about the challenges and benefits this era will bring and if the world can control the technology it creates.

Henry Blodget, CEO, cofounder, and editorial director, Insider Inc.

This interview is part of a partnership between Business Insider and Microsoft at the 2020 World Economic Forum. Business Insider editors independently decided on the topics broached and questions asked.

Below, find each of the panelists' most memorable contributions:

Julie Love, senior director of quantum business development, Microsoft

Julie Love believes global problems such as climate change can potentially be solved far more quickly and easily through developments in quantum computing.

She said: "We [Microsoft] think about problems that we're facing: problems that are caused by the destruction of the environment; by climate change, and [that require] optimization of our natural resources, [such as] global food production."

"It's quantum computing that really a lot of us scientists and technologists are looking to for solving these problems. We can have the promise of solving them exponentially faster, which is incredibly profound. And the reason is this: [quantum] technology speaks the language of nature.

"By computing the way that nature computes, there's so much information contained in these atoms and molecules. Nature doesn't think about a chemical reaction; nature doesn't have to do some complex computation. It's inherent in the material itself."

Love claimed that, if harnessed in this way, quantum computing could allow scientists to design a compound that could remove carbon from the air. She added that researchers will need to be "really pragmatic and practical about how we take this from science fiction into the here-and-now."

Justine Cassell, a professor specializing in AI and linguistics

"I believe the future of AI is actually interdependence, collaboration, and cooperation between people and systems, both at the macro [and micro] levels," said Cassell, who is also a faculty member of the Human-Computer Interaction Institute at Carnegie Mellon University.

"At the macro-level, [look], for example, at robots on the factory floor," she said. "Today, there's been a lot of fear about how autonomous they actually are. First of all, they're often dangerous. They're so autonomous, you have to get out of their way. And it would be nice if they were more interdependent, if we could be there at the same time as they are. But also, there is no factory floor where any person is autonomous."

In Cassell's view, AI systems could also end up being built collaboratively with experts from non-tech domains, such as psychologists.

"Today, tools [for building AI systems] are mostly machine learning tools," she noted. "And they are, as you've heard a million times, black boxes. You give [the AI system] lots of examples. You say: 'This is somebody being polite. That is somebody being impolite. Learn about that.' But when they build a system that's polite, you don't know why they did that.

"What I'd like to see is systems that allow us to have these bottom-up, black-box approaches from machine learning, but also have, for example, psychologists in there, saying 'that's not actually really polite,' or 'it's polite in the way that you don't ever want to hear.'"

Microsoft president Brad Smith

"One thing I constantly wish is that there was a more standardized measurement for everybody to report how much they're spending per employee on employee training because that really doesn't exist, when you think about it," said Smith, Microsoft's president and chief legal officer since 2015.

"I think, anecdotally, one can get a pretty strong sense that if you go back to the 1980s and 1990s, employers invested a huge amount in employee training around technology. It was teaching you how to use MS-DOS, or Windows, or how to use Word or Excel; interestingly, these are things that employers don't really feel obliged to teach employees today.

"Learning doesn't stop when you leave school. We're going to have to work a little bit harder. And that's true for everyone."

He added that this creates a further requirement: to make sure the skills people do pick up as they navigate life are easily recognizable by other employers.

"Ultimately, there's a wide variety of post-secondary credentials. The key is to have credentials that employers recognize as being valuable. It's why LinkedIn and others are so focused on new credentialing systems. Now, the good news is that should make things cheaper. It all should be more accessible.

"But I do think that to go back to where I started employers are going to have to invest more [in employee training]. And we're going to have to find some ways to do it in a manner that perhaps is a little more standardized."

Nokia president and CEO, Rajeev Suri

Suri said 5G will be able to help develop industries that go far beyond entertainment and telecoms, and will impact physical or manual industries such as manufacturing.

"The thing about 5G is that it's built for machine-type communications. When we conceived the whole idea of 5G, it was: how do we get not just human beings to interact with each other, but also large machines?" he said.

"So we think that there is a large economic boost possible from 5G and 5G-enabled technologies because it would underpin many of these other technologies, especially in the physical industries."

Suri cited manufacturing, healthcare, and agriculture as just some of the industries 5G could help become far more productive within a decade.

He added: "Yes, we'll get movies and entertainment faster, but it is about a lot of physical industries that didn't quite digitize yet. Especially in the physical industries, we [Nokia] think that the [productivity] gains could be as much as 35% starting in the year 2028, with the US first, and then going out into other geographies, like India, China, the European Union, and so on."

More:
LIVE FROM DAVOS: Henry Blodget leads panel on the next decade of tech - Business Insider Nordic

Alibaba’s 10 Tech Trends to Watch in… – Alizila

The Alibaba DAMO Academy, Alibaba Group's global program for tackling ambitious, high-impact technology research, has made some predictions about the trends that will shape the industry in the year ahead. From more-advanced artificial intelligence to large-scale blockchain applications, here's what you can expect in 2020.

1. Artificial Intelligence Gets More Human
2020 is set to be a breakthrough year for AI, according to DAMO. Researchers will be taking inspiration from a host of new areas to upgrade the technology, namely cognitive psychology and neuroscience combined with insights into human behavior and history. They'll also adopt new machine-learning techniques, such as continual learning, which allows machines to remember what they've learned in order to more quickly learn new things, something humans take for granted. With these advances in cognitive intelligence, machines will be able to better understand and make use of knowledge rather than merely perceive and express information.

2. The Next Generation of Computation
Computers these days send information back and forth between the processor and the memory in order to complete tasks. The problem? Computing demands have grown to such an extent in the digital age that our computers can't keep up. Enter processing-in-memory architecture, which integrates the processor and memory into a single chip for faster processing speed. PIM innovations will play a critical role in spurring next-generation AI, DAMO said.

3. Hyper-Connected Manufacturing
The rapid deployment of 5G, Internet of Things and cloud- and edge-computing applications will help manufacturers go digital, including everything from automating equipment, logistics and production scheduling to integrating their factory, IT and communications systems. In turn, DAMO predicts they'll be faster to react to changes in demand and coordinate with suppliers in real time to help productivity and profitability.


4. Machines Talking to Machines at Scale
More-advanced IoT and 5G will enable more large-scale deployments of connected devices, which bring with them a range of benefits for governments, companies and consumers. For example, traffic-signal systems could be optimized in real time to keep drivers moving (and happy), while driverless cars could access roadside sensors to better navigate their surroundings. These technologies would also allow warehouse robots to maneuver around obstacles and sort parcels, and fleets of drones to efficiently and securely make last-mile deliveries.

5. Chip Design Gets Easier
Have you heard? Moore's Law is dying. It is now becoming too expensive to build faster and smaller semiconductors. In its place, chipmakers are now piecing together smaller chiplets into single wafers to handle more-demanding tasks. Think Legos. Another advantage of chiplets is that they often use already-inspected silicon, speeding up time to market. Barriers to entry in chipmaking are dropping, too, as open-source communities provide alternatives to traditional, proprietary design. And as more companies design their own custom chips, they are increasingly contributing to a growing ecosystem of development tools, product information and related software that will enable still easier and faster chip design in the future.

6. Blockchain Moves Toward Mainstream
The nascent blockchain industry is about to see some changes of its own. For one, expect the rise of the blockchain-as-a-service model to make these applications more accessible to businesses. Also, there will be a rise in specialized hardware chips for cloud and edge computing, powered by core algorithms used in blockchain technologies. Scientists at DAMO forecast that the number of new blockchain applications will grow significantly this year, as well, while blockchain-related collaborations across industries will become more common. Lastly, the academy expects large-scale blockchain applications to see wide-scale adoption.

7. A Turning Point for Quantum Computing
Recent advancements in this field have stirred up hopes for making large-scale quantum computers a reality, which will prompt more investments into quantum R&D, according to DAMO. That will result in increased competition and ecosystem growth around quantum technologies, as well as more attempts to commercialize the technology. DAMO predicts that after a difficult but critical period of intensive research in the coming years, quantum information science will deliver breakthroughs such as computers that can correct computation errors in real time.

8. More Revolution in Semiconductors
Demand is surging for computing power and storage, but major chipmakers still haven't developed a better solution than 3-nanometer node silicon-based transistors. Experiments in design have led to the discovery of other materials that might boost performance. Topological insulators and two-dimensional superconducting materials, for example, may become connective materials, as their properties allow electrical currents to flow without resistance. New magnetic and resistive switching materials might also be used to create next-generation magnetic memory technologies, which can run on less power than their predecessors.

9. Data Protection Powered by AI
As businesses face a growing number of data-protection regulations, and the rising compliance costs to meet them, interest is growing in new solutions that support data security. AI algorithms can do that. They help organizations manage and filter through information, protect user information shared across multiple parties and make regulatory compliance easier, or even automatic. These technologies can help companies promote trust in the reuse and sharing of analytics, as well as overcome problems such as data silos, where certain information is not accessible to an entire organization and causes inefficiencies as a result.

10. Innovation Starts on the Cloud
Cloud computing has evolved far beyond its intended purpose as technological infrastructure to take on a defining role in IT innovation. Today, the cloud's computing power is the backbone of the digital economy as it transforms the newest, most-advanced innovations into accessible services. From semiconductor chips, databases and blockchain to IoT and quantum computing, nearly all technologies are now tied to cloud computing. It has also given rise to new technologies, such as serverless computing architecture and cloud-powered robotic automation.

Originally posted here:
Alibaba's 10 Tech Trends to Watch in... - Alizila

Why India is falling behind in the Y2Q race – Livemint

Now, the world faces a new scare that some scientists are calling the Y2Q ("years to quantum") moment. Y2Q, say experts, could be the next major cyber disruption. When this moment will come is not certain; most predictive estimates range from 10 to 20 years. But one thing is certain: as things stand, India has not woken up to the implications (both positive and negative) of quantum computing.

What is quantum computing? Simply put, it is a future technology that will exponentially speed up the processing power of classical computers, and solve problems in a few seconds that today's fastest supercomputers can't.

Most importantly, a quantum computer would be able to factor the product of two big prime numbers. And that means the underlying assumptions powering modern encryption won't hold when a practical quantum computer becomes a reality. Encryption forms the backbone of a secure cyberspace. It helps to protect the data we send, receive or store.
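To see why factoring matters, consider a toy sketch (purely illustrative, not real cryptography; the `factor` helper and the tiny primes are hypothetical): multiplying two primes is easy, while recovering them from the product by classical search is slow, and that gap is the security assumption quantum computers would erase.

```python
# Illustrative only: RSA-style security rests on the difficulty of
# recovering two secret primes p and q from their public product n.
def factor(n: int) -> tuple[int, int]:
    """Trial division: fine for tiny n, hopeless for 2048-bit RSA moduli."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1  # n itself is prime

p, q = 101, 113   # toy "secret" primes
n = p * q         # the public modulus: 11413
print(factor(n))  # a classical computer cracks this toy case instantly
```

Trial division takes roughly sqrt(n) steps, so a real 2048-bit modulus (over 600 decimal digits) is far beyond any classical search, whereas Shor's algorithm on a quantum computer would factor it in polynomial time.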

So, a quantum computer could translate into a complete breakdown of current encryption infrastructure. Cybersecurity experts have been warning about this nightmarish scenario since the late 1990s.

In October, Google announced a major breakthrough, claiming its quantum computer can solve a problem in 200 seconds that would take even the fastest classical computer 10,000 years. That means their computer had achieved "quantum supremacy", claimed the company's scientists. IBM, its chief rival in the field, responded that the claims should be taken with "a large dose of skepticism". Clearly, Google's news suggests a quantum future is not a question of if, but when.

India lags behind

As the US and China lead the global race in quantum technology, and other developed nations follow by investing significant intellectual and fiscal resources (see Future Danger), India lags far behind. "Indian government is late, but efforts have begun in the last two years," said Debajyoti Bera, a professor at Indraprastha Institute of Information Technology (IIIT) Delhi, who researches quantum computing.

Mint's interviews with academic researchers, private sector executives and government officials paint a bleak picture of India's ability to be a competent participant. For one, the ecosystem is ill-equipped: just a few hundred researchers in the country work in this domain, and they do so in discrete silos.

There are legacy reasons: India's weakness in building hardware and manufacturing technology impedes efforts to turn theoretical ideas into real products. Whatever little is moving is moving primarily through the government: private sector participation, and investment, remains lacklustre. And, of course, there's a funding crunch.

All this has left India's top security officials concerned. Lieutenant General (retd) Rajesh Pant, national cybersecurity coordinator, who reports to the Prime Minister's Office, identified many gaps in the Indian quantum ecosystem. "There is an absence of a quantum road map. There is no visibility in the quantum efforts and successes, and there is a lack of required skill power," Pant said at an event in December, while highlighting the advances China has made in the field. "As the national cybersecurity coordinator, this is a cause of concern for me."

The task at hand

In a traditional computer (for instance, your phone or laptop), every piece of information, be it text or video, is ultimately a long string of "bits": each bit can be either zero or one. No other value is possible. In a quantum computer, "bits" are replaced by "qubits", where each unit can exist in both states, zero and one, at the same time. That makes the processing superfast: qubits can encode and process more information than bits.
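One rough way to see the gap, sketched in Python (a classical simulation, so only an illustration): a classical n-bit register holds exactly one of 2^n values at a time, while an n-qubit register is described by 2^n amplitudes at once, which is why simulating qubits classically becomes intractable so quickly.

```python
import itertools
import math

# A classical 3-bit register stores one of 8 values; a 3-qubit register
# in an equal superposition carries an amplitude for all 8 basis states
# simultaneously. Simulating n qubits needs 2**n numbers of memory.
n = 3
equal_superposition = [1 / math.sqrt(2**n)] * (2**n)

for bits, amp in zip(itertools.product("01", repeat=n), equal_superposition):
    # Each outcome would be measured with probability amplitude**2 = 1/8.
    print("".join(bits), round(amp**2, 3))
```

The probabilities across all outcomes sum to one; doubling the qubit count squares the number of amplitudes a classical simulator must track.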

What's most vulnerable is information generated today that has long-term value: diplomatic and military secrets or sensitive financial and healthcare data. "The information circulating on the internet that is protected with classical encryption can be harvested by an adversary. Whenever the decryption technology becomes available with the advent of quantum computers, today's secrets will break apart," explains Vadim Makarov, the chief scientist running Russia's quantum hacking lab.

From a national security perspective, there are two threads in global efforts. One is to build a quantum computer: whoever gets there first will have the capability to decrypt the secrets of the rest. Two, every country is trying to make its own communications hack-proof and secure.

The Indian game plan

There are individual programmes operating across government departments in India. "The ministry of electronics and information technology is interested in computing aspects; DRDO in encryption products and Isro in satellite communication," said a senior official at the department of science and technology (DST) who is directly involved in formulating India's quantum policy initiatives, on condition of anonymity. DRDO is the Defence Research and Development Organisation, and Isro is the Indian Space Research Organisation. DST, which works under the aegis of the central ministry of science and technology, has a mandate that revolves around making advances in scientific research.

To that end, in 2019, DST launched Quantum Information Science and Technology (QuEST), a programme wherein the government will invest ₹80 crore over the next three years to fund research directed at building quantum computers, channels for quantum communication and cryptography, among other things. Some 51 projects were selected for funding under QuEST. A quarter of the money has been released, said the DST official.

K. VijayRaghavan, principal scientific adviser, declined to be interviewed for this story. However, in a recent interview to The Print, he said: "It [QuEST] will ensure that the nation reaches, within a span of 10 years, the goal of achieving the technical capacity to build quantum computers and communications systems comparable with the best in the world, and hence earn a leadership role."

Not everyone agrees. "While QuEST is a good initiative and has helped build some momentum in academia, it is too small to make any meaningful difference to the country," said Sunil Gupta, co-founder and chief executive of QNu Labs, a Bengaluru-based startup building quantum-safe encryption products. "India needs to show their confidence and trust in startups." He added that the country needs to up the ante by committing at least $1 billion in this field over the next three years "if India wants to make any impact on the global level".

More recently, DRDO announced a new initiative: five DRDO Young Scientists Laboratories, launched by Prime Minister Narendra Modi in January with the aim of researching and developing futuristic defence technologies. One of the labs, set up at the Indian Institute of Technology Bombay, is dedicated to quantum technology.

The DST official said that the government is planning to launch a national mission on quantum technology. "It will be a multi-departmental initiative to enable different agencies to work together and focus on the adoption of research into technology," the official said, adding that the mission will have "clearly defined deliverables for the next 5 to 10 years." While the details are still in the works, the official said equipping India for building quantum-secure systems is on the cards.

The flaws in the plan

Why is India lagging behind? First, India doesn't have enough people working on quantum technology: the estimates differ, but they fall in the range of 100-200 researchers. "That is not enough to compete with IBM," said Anirban Pathak, a professor at Jaypee Institute of Information Technology, and a recipient of DST's QuEST funding.

Contrast that with China. "One of my former students is now a faculty member in a Chinese university. She joined a group that started just two years ago and they are already 50 faculty members in the staff," added Pathak. "In India, at no place, you will find more than three faculty members working in quantum."

IIIT Delhi's Bera noted: "A lot of Indians in quantum are working abroad. Many are working in IBM to build a quantum computer. India needs to figure out a way to get those people back here."

Secondly, there's the lack of a coordinated effort. "There are many isolated communities in India working on various aspects: quantum hardware, quantum key distribution, information theory and other fields," said Bera. "But there is not much communication across various groups. We cross each other mostly at conferences."

Jaypee's Pathak added: "In Delhi, there are eight researchers working in six different institutes. Quantum requires many kinds of expertise, and that is needed under one roof. We need an equivalent of Isro (for space) and Barc (for atomic research) for quantum."

Third is India's legacy problem: strong on theory, but weak in hardware. That has a direct impact on the country's ability to advance in building quantum technology. The lack of research is not the impediment to preparing for a quantum future, say experts. Implementation is the challenge, the real bottleneck. The DST official quoted earlier acknowledged that some Indian researchers he works with are frustrated.

"They need infrastructure to implement their research. For that, we need to procure equipment, install it and then set it up. That requires money and time," said the official. "Indian government has recognized the gap and is working towards it."

Bera said that India should start building a quantum computer. But the problem is that the country doesn't even have good fabrication labs. "If we want to design chips, Indians have to outsource," he said. "Hardware has never been India's strong point." QNu Labs is trying to fill that gap. The technology it is developing is based on research done over a decade ago: the effort is to build hardware and make it usable.

Finally, India's private sector and investors have not stepped up their game. "If India wants something bigger, Indian tech giants like Wipro and Infosys need to step in. They have many engineers on the bench who can be involved. Academia alone or DST-funded projects can't compete with IBM," said Pathak.

The DST official agreed. "R&D is good for building prototypes. But industry partnership is crucial for implementing it in the real world," he said. One aim of the national quantum mission in the works would be to spin off startup companies and feed innovation into the ecosystem. "We plan to bring venture capitalists (VCs) under one umbrella."

In conclusion

Pant, the national cybersecurity chief, minced no words at the event in December 2019 on quantum technology.

"In 1993, there was an earthquake in Latur and we created the National Disaster Management Authority, which now has a presence across the country." He added: "Are we waiting for a cybersecurity earthquake to strike before we get our act together?"

Samarth Bansal is a freelance journalist based in Delhi. He writes about technology, politics and policy


Inside the race to quantum-proof our vital infrastructure – www.computing.co.uk

"We were on the verge of giving up a few years ago because people were not interested in quantum at the time. Our name became a joke," said Andersen Cheng, CEO of the UK cybersecurity firm Post-Quantum. After all, he continued, how can you be post- something that hasn't happened yet?

But with billions of pounds, renminbi, euros and dollars (US, Canadian and Australian) being pumped into the development of quantum computers by both governments and the private sector and with that research starting to bear fruit, exemplified by Google's achievement of quantum supremacy, no-one's laughing now.

One day, perhaps quite soon, the tried and trusted public-key cryptography algorithms that protect internet traffic will be rendered obsolete. Overnight, a state in possession of a workable quantum computer could start cracking open its stockpiles of encrypted secrets harvested over the years from rival nations. Billions of private conversations and passwords would be laid bare and critical national infrastructure around the world would be open to attack.

The situation is often compared with the Y2K problem, and the impact could be disastrous. Like Y2K, no-one can be quite sure what the exact consequences will be; unlike Y2K, the timing is unclear. But with possible scenarios ranging from massive database hacks to unstoppable cyberattacks on the military, transport systems, power generation and health services, clearly this is a risk not to be taken lightly.

Critical infrastructure including power generation would be vulnerable to quantum computers

Post-quantum cryptography uses mathematical theory and computer science to devise algorithms that are as hard to crack as possible, even when faced with the massive parallel processing power of a quantum computer. However, such algorithms must also be easy to deploy and use or they will not gain traction.

In 2016, the US National Institute of Standards and Technology (NIST) launched its competition for Public-Key Post-Quantum Cryptographic Algorithms, with the aim of arriving at quantum-safe standards across six categories by 2024. The successful candidates will supplement or replace the three standards considered most vulnerable to quantum attack: FIPS 186-4 (digital signatures), plus NIST SP 800-56A and NIST SP 800-56B (public-key cryptography).

Not all types of cryptography are threatened by quantum computers. Symmetric algorithms (where the same key is used for encryption and decryption) such as AES, which are often deployed to protect data at rest, and hashing algorithms like SHA, used to prove the integrity of files, should be immune to the quantum menace, although they will eventually need larger keys to withstand increases in classical computing power. But the asymmetric cryptosystems like RSA and elliptic curve cryptography (ECC) which form the backbone of secure communications are certainly in danger.

Asymmetric cryptography and public-key infrastructure (PKI) address the problem of how parties can exchange encryption keys where there's a chance that an eavesdropper could intercept and use them. Two keys (a keypair) are generated at the same time: a public key for encrypting data and a private key for decrypting it. These keys are related by a mathematical function that's trivial to perform in one direction (as when generating the keys) but very difficult in the other (trying to derive the private key from the corresponding public key). One example of such a 'one-way' function is factorising very large integers into primes. This is used in the ubiquitous RSA algorithms that form the basis of the secure internet protocols SSL and TLS. Another such function, deriving the relationship between points on a mathematical elliptic curve, forms the basis of ECC, which is sometimes used in place of RSA where short keys and reduced load on the CPU are required, as in IoT and mobile devices.
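The keypair relationship can be sketched with textbook RSA numbers (wildly insecure and purely illustrative; real moduli are thousands of bits): generating the keys from the primes is trivial, while an attacker holding only the public pair (n, e) must factorise n to recover the private exponent.

```python
# Textbook RSA with toy numbers, to show the one-way asymmetry.
p, q = 61, 53
n = p * q                 # public modulus (3233)
phi = (p - 1) * (q - 1)   # easy to compute only if you know p and q
e = 17                    # public exponent
d = pow(e, -1, phi)       # private exponent: modular inverse (Python 3.8+)

m = 65                    # a message encoded as a number smaller than n
c = pow(m, e, n)          # anyone can encrypt with the public key (n, e)
assert pow(c, d, n) == m  # only the holder of d can decrypt
print(f"n={n} e={e} d={d} ciphertext={c}")
```

Deriving d required phi, and phi required the factors of n; that is exactly the step a quantum computer running Shor's algorithm would make easy.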

It is no exaggeration to say that in the absence of SSL and TLS the modern web with its ecommerce and secure messaging could not exist. These protocols allow data to be transmitted securely between email correspondents and between customers and their banks, with all the encryption and decryption happening smoothly and seamlessly in the background. Unfortunately, though, factorising large integers and breaking ECC will be a simple task for a quantum computer. Such a device running something like Shor's algorithm will allow an attacker to decrypt data locked with RSA-2048 in minutes or hours rather than the billions of years theoretically required by a classical computer to do the same. This explains NIST's urgency in seeking alternatives that are both quantum-proof and flexible enough to replace RSA and ECC.

NIST is not the only organisation trying to get to grips with the issue. The private sector has been involved too. Since 2016 Google has been investigating post-quantum cryptography in the Chrome browser using NewHope, one of the NIST candidates. Last year Cloudflare announced it was collaborating with Google in evaluating the performance of promising key-exchange algorithms in the real world on actual users' devices.

Of the original 69 algorithms submitted to NIST in 2016, 26 have made it through the vetting process as candidates for replacing the endangered protocols; this number includes NewHope in the 'Lattice-based' category.

One of the seven remaining candidates in the 'Code-based' category is Post-Quantum's Never-The-Same Key Encapsulation Mechanism (NTS-KEM), which is based on the McEliece cryptosystem. First published in 1978, McEliece never really took off at the time because of the large size of the public and private keys (100kB to several MB). However, it is a known quantity to cryptographers, who have had plenty of time to attack it, and it's agreed to be 'NP-hard' (a mathematical term that in this context translates very roughly as 'extremely difficult to break in a human timescale, even with a quantum computer'). This is because it introduces randomisation into the ciphertext with error correction codes.

"We actually introduce random errors every time we encrypt the same message," Cheng (pictured) explained. "If I encrypt the letters ABC I might get a ciphertext of 123. And if I encrypt ABC again you'd expect to get 123, right? But we introduce random errors so this time we get 123, next time we get 789."

The error correction codes allow the recipient of the encrypted message to cut out the random noise added to the message when decrypting it, a facility not available to any eavesdropper intercepting the message.
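The idea can be illustrated with a far simpler error-correcting scheme than McEliece actually uses; here a 5x repetition code stands in for the real Goppa-code machinery, so treat this strictly as an analogy. The sender deliberately sprinkles random errors into each block, so the same message encrypts differently every time, yet a receiver who knows the code strips the noise out.

```python
import random

# Analogy only (NOT McEliece): deliberate random errors plus an
# error-correcting code. Each bit is repeated 5 times and exactly 2
# positions per block are flipped at random.
def encode(bits, noise=2):
    blocks = []
    for b in bits:
        block = [b] * 5
        for i in random.sample(range(5), noise):  # fresh errors every call
            block[i] ^= 1
        blocks.append(block)
    return blocks

def decode(blocks):
    # Majority vote: 2 flips cannot outvote the 3 untouched copies.
    return [1 if sum(block) >= 3 else 0 for block in blocks]

msg = [1, 0, 1, 1]
assert decode(encode(msg)) == msg  # receiver always recovers the message
print(encode(msg))                 # a different "ciphertext" on every run
```

An eavesdropper without the correcting structure sees only ever-changing noisy blocks, which is the property Cheng describes with his ABC example.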

With today's powerful computers McEliece's large key size is much less of an issue than in the past. Indeed, McEliece has some advantages of its own (encryption/decryption is quicker than RSA, for example) but it still faces implementation challenges compared with RSA, particularly for smaller devices. So for the past decade, Cheng's team has been working on making the technology easier to implement. "We have patented some know-how in order to make our platform work smoothly and quickly to shorten the keys to half the size," he said.

Post-Quantum has open-sourced its code (a NIST requirement, so that the successful algorithms can be swiftly distributed) and packaged it into libraries to make it as 'drop-in' as possible and backwards-compatible with existing infrastructure.

Nevertheless, whichever algorithms are chosen, replacing the incumbents like-for-like won't be easy. "RSA is very elegant," Cheng admits. "You can do both encryption and signing. For McEliece and its derivatives, because it's so powerful in doing encryption, you cannot do signing."

An important concept in quantum resistance is 'crypto-agility': the facility to change and upgrade defences as the threat landscape evolves. Historically, industry has been the very opposite of crypto-agile: upgrading US bank ATMs from insecure DES to 3DES took an entire decade to complete. Such leisurely timescales are not an option now that a quantum computer capable of cracking encryption could be just three to five years away.

Because of the wide range of environments, bolstering defences for the quantum age is not as simple as switching crypto libraries. In older infrastructure and applications encryption may be hard-coded, for example. Some banks and power stations still rely on yellowing ranks of servers that they dare not decommission but where the technicians who understand how the encryption works have long since retired. Clearly, more than one approach is needed.

It's worth pointing out that the threat to existing cryptosystems comes not only from quantum computers. The long-term protection afforded by encryption algorithms has often been wildly overestimated, even against 'bog standard' classical supercomputers. RSA 768, introduced in the 1970s, was thought to be safe for 7,000 years, yet it was broken in 2010.

For crypto-agility algorithms need to be swappable

Faced with the arrival of quantum computers and a multiplicity of use cases and environments, cryptographers favour a strength-in-depth or hybridised approach. Cheng uses the analogy of a universal electrical travel plug, which can be used in many different countries.

"You can have your RSA, the current protocol, with a PQ [post-quantum] wrapper and make the whole thing almost universal, like a plug with round pins, square pins or a mixture of both. Then when the day comes customers can just turn off RSA and switch over to the chosen PQ algorithm".

Code-based systems like NTS-KEM are not the only type being tested by NIST. The others fall into two main categories: multivariate cryptography, which involves solving complex polynomial equations, and lattice-based cryptography, which is a geometric approach to encrypting data. According to Cheng, the latter offers advantages of adaptability but at the expense of raw encryption power.

"Lattice is less powerful but you can do both encryption and signing,

but it has not been proven to be NP-hard," he said, adding: "In the PQ world everyone's concluded you need to mix-and-match your crypto protocols in order to cover everything."

Professor Alan Woodward (pictured) of Surrey University's Department of Computing said that it's still too early to guess which will ultimately prove successful.

"Lattice-based schemes seem to be winning favour, if you go by numbers still in the race, but there is a lot of work being done on the cryptanalysis and performance issues to whittle it down further," he said. "If I had to bet, I'd say some combination of lattice-based crypto and possibly supersingular isogeny-based schemes will emerge for both encryption and signature schemes."

Quantum mechanics can be an aid in the generation of secure classical encryption keys. Because of their deterministic nature, classical computers cannot generate truly random numbers; instead they produce pseudo-random numbers that are predictable, even if only to a tiny degree. One of Edward Snowden's revelations was that the NSA had cracked the random number generator used by RSA. More recently, weaknesses in RSA's random number generation were discovered in some IoT devices, where one in 172 was found to use the same factor to generate keys. However, a quantum random number generator (QRNG) produces numbers that are truly random, according to quantum theory, resolving this key area of vulnerability.
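The determinism at the root of the problem is easy to demonstrate. This sketch uses Python's `random` module, which is not cryptographic and serves only as an illustration: anyone who learns the seed (or internal state) of a pseudo-random generator can replay every "random" key it ever produced, whereas a QRNG has no seed to steal.

```python
import random

# Pseudo-random generators are deterministic: the same seed always
# yields the same stream. A leaked seed means every derived key leaks.
random.seed(42)
key1 = [random.getrandbits(8) for _ in range(4)]

random.seed(42)                                  # attacker replays the seed
key2 = [random.getrandbits(8) for _ in range(4)]

assert key1 == key2  # identical "random" keys: predictability is the flaw
print(key1)
```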

QKD commonly uses polarised photons to represent ones and zeros

Whereas post-quantum cryptography is based on maths, the other major area of research interest, quantum key distribution (QKD), is rooted in physics, specifically the behaviour of subatomic particles. QKD is concerned with key exchange, using quantum mechanics to ensure that eavesdroppers cannot intercept the keys without being noticed.

In BB84, the first proposed QKD scheme and still the basis for many implementations, a quantum mechanical property of a subatomic particle, such as the polarisation of a photon, is manipulated to represent either a zero or a one. A stream of such photons, polarised at random, is then sent by one party to a detector controlled by the other.

Before they reach the detector, each photon must pass through a filter. One type of filter will allow 'ones' to pass, the other 'zeros'; as with the polarisation process, the filters are selected at random, so we'd expect half of the photons to be blocked by the filtering process. Counterintuitively, however, their quantum mechanical properties mean that even those photons that are 'blocked' by a filter still have a 50 per cent chance of passing their correct value to the detector. Thus, we'd expect an overall agreement between transmission and detection of 75 per cent (50 per cent that pass straight through plus 25 per cent that are 'blocked' but still communicate their correct value).

Once enough photons have been transmitted to produce a key of the required length, the parties compare, over a separate channel, the sequence of emitted ones and zeros with the filter used for each, discarding the individual results where they disagree. A classical symmetric encryption key is then created from the remaining string of ones and zeros. This key can be used as an uncrackable 'one-time pad', which is then used to encrypt data such as a message or a login.

Should a man-in-the-middle intercept the stream of photons, the parties will be alerted because of the observer effect: measuring the state of a quantum particle will change it. Statistically, the number of photons registered as 'correct' by the detector will drop from 75 per cent to around 62.5 per cent, and this will be noticed when the two parties compare a random sample of their results at the end of the process. Any such discrepancy will cause the key to be rejected. Properly implemented, QKD can be considered a provably unbreakable method of exchanging keys.
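The 75 and 62.5 per cent figures above can be checked with a short simulation of the intercept-and-resend scenario (an illustrative sketch of the simplified scheme described here, not a faithful BB84 implementation):

```python
import random

# Each photon is prepared in one of two bases. Measuring in the matching
# basis returns the encoded bit; measuring in the wrong basis returns a
# 50/50 outcome and re-prepares the photon in the measuring basis.
def measure(bit, prep_basis, meas_basis):
    if meas_basis == prep_basis:
        return bit, meas_basis
    return random.randint(0, 1), meas_basis

def run(n, eavesdrop):
    hits = 0
    for _ in range(n):
        sent = random.randint(0, 1)
        basis = random.randint(0, 1)
        photon_bit, photon_basis = sent, basis
        if eavesdrop:  # Eve measures in a random basis and resends
            photon_bit, photon_basis = measure(
                photon_bit, photon_basis, random.randint(0, 1))
        got, _ = measure(photon_bit, photon_basis, random.randint(0, 1))
        hits += (got == sent)
    return hits / n

print(run(100_000, eavesdrop=False))  # ≈ 0.75 without an eavesdropper
print(run(100_000, eavesdrop=True))   # ≈ 0.625 with intercept-and-resend
```

The drop from roughly 75 to 62.5 per cent is exactly the statistical fingerprint the two parties look for when they compare samples of their results.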

Switzerland is a QKD pioneer, deploying the technology to secure electoral votes as far back as 2007. The company that helped to achieve this feat, Geneva University spin-off ID Quantique (IDQ), has since become one of the main manufacturers of QKD and QRNG hardware. CEO Grégoire Ribordy (pictured) has seen a recent upsurge of interest beginning in 2016, when the European Commission unveiled its €1 billion, ten-year Quantum Flagship programme. The market is now starting to mature, he said, adding that his company boasts customers in government, finance and "other organisations that have high-value IP to protect".

There's a certain rivalry between physics and maths, between QKD and post-quantum encryption, not least because funding has been hard to come by. Being hardware-based, QKD has so far gobbled up the lion's share of the research grants, but it's possible that when NIST returns its verdicts more money will flow into PQ. Arguments also rage over the practical limits of security.

"The physicists tend to talk about QKD as being perfectly secure' which sets the cryptographers on edge as there is no such thing in practice," Woodward said.

Ribordy is adamant that both techniques will be required. As with the hybrid approach to adopting algorithms, it's not an either-or situation; it all depends on the use case.

"I think they're actually complementary. Quantum crypto [another name for QKD] will provide a higher security and should be used maybe in backbone networks where there's a lot of at stake, big pipes must be protected with more security, and then the quantum-resistant algorithms can find an application in areas where security is not as critical or maybe where there's less data at stake."

One company that's looking to scale up QKD on a national basis is the startup Quantum Xchange. Based in Bethesda, Maryland, USA, it was founded in 2018 with VC funding to provide ultra-secure data networks. President and CEO John Prisco (pictured) bemoaned the fact that his country, while forging ahead with quantum computers, is behind the curve when it comes to defending against them. It's possible that by 2024, when NIST selects its winning algorithms, the game will already be up.

"Everybody is saying, OK, let's fight quantum with quantum and I subscribe to that," he said. "We've got quantum computers that are offensive weapons and quantum keys that are the defensive of counterpart to that. The rest of the world outside of the United States is embracing this a lot more quickly - Europe, Japan and China."

Quantum particles are uniquely sensitive to any kind of disturbance, so while China may have successfully transmitted quantum keys between Earth and the Micius satellite, this was only possible because of ideal weather conditions at the time (although, interestingly, Woodward believes it could ultimately be the winning approach).

Particles transmitted through the more common fibreoptic cable are also limited by the tendency of the polarised photons to react with the medium. Even with the most pristine fibre, this limits real-world transmission distance to around 100km. After that, you need intermediary repeaters and 'trusted nodes' to relay the signal. Since it's not possible to directly clone quantum states, the quantum signal must be converted to classical and then back to quantum again, representing a weak point in the otherwise unbreakable chain. So trusted nodes must be very thoroughly secured, which inevitably increases costs and limits current applications. It is also possible for an attacker to interfere with emitters and detectors to corrupt the key generation process.

Other issues? Well, there's a lack of standards and certifications and the equipment is costly. Also, without some sort of secure signature process, how can parties exchanging keys be sure who they are exchanging them with? In addition, it's restricted to point-to-point communications and it's also incompatible with existing networks.

The theory is sound, said Woodward, but the engineering is still a challenge.

"It's in practice that QKD is encountering difficulties. For example, QKD is not yet at a stage where it is using single photons - it uses pulses of light. Hence, the very basis of not being able to clone the quantum state of a photon is put in question as there is more than one of them."

Woodward added that even after the kinks in QKD - be that via satellite, fibreoptic cables or over the airwaves - have been ironed out, the technology will still likely be confined to highly sensitive data and backbone networks because PQ cryptography will be easier to slot into existing infrastructure.

"Whichever [QKD] scheme proves most reliable and robust they all require that expensive infrastructure over what we have now, and so I can envisage it being used for, possibly, government communications but not for home users whose machines are picking a means to communicate securely with their bank's website," he said.

"The post-quantum schemes in the NIST competition would simply replace the software we already have in places such as TLS so the cost would be much lower, and the level of disruption needed for adoption by end-users would be far less."

However, Quantum Xchange is working on overcoming some of these limitations. The firm already operates a small number of high security QKD connections between financial institutions in New York and datacentres in nearby New Jersey over dedicated fibreoptic cables using trusted nodes to extend the reach of its QKD infrastructure. But it is also working on a hybrid system called Phio TX. This will allow the transmission of electronic quantum keys (i.e. keys created using a QRNG) or classical symmetric keys created from the quantum key via a secure channel separate from that used for the encrypted data. The idea is to make the technology more widely applicable by straddling the QKD-PQ divide and removing the point-to-point restrictions.

"The point is to be crypto-agile," Prisco said. "If a company is trying to come up with a quantum-safe strategy they can implement this product that has quantum-resistant algorithms, electronic quantum keys and optical quantum keys, so it becomes a level-of-service discussion. If you have a link that absolutely has to be protected by the laws of physics, you'd use an optical quantum key. If there's virtually no chance of someone intercepting the data with your key you could use a trusted exchange and the combination of the quantum-resistant algorithm with the quantum random number generated key is very powerful."

Edit: the original article stated that the $1.2 billion National Quantum Initiative Act was passed by the House of Representatives in December 2019, whereas this took place in December 2018.

Original post:
Inside the race to quantum-proof our vital infrastructure - http://www.computing.co.uk

IBM research director at CES 2020: We will hit the quantum advantage this decade – TechRepublic

Q Network users are studying carbon chemistry, route optimization, and risk analysis on IBM's quantum computer.

IBM research director Dario Gil started his session at CES 2020 with a science lesson to explain the basics of quantum computing. He hit the highlights of superposition, interference, and entanglement.

After this primer, Gil said that the promise of quantum computing is that it offers the power to model natural processes and understand how they work.

"Quantum is the only technology we know that alters the equation of what is possible to solve versus impossible to solve," he said.


Jeannette Garcia, senior manager for quantum applications, algorithms and theory at IBM Research, shared some of the real-world problems that IBM is working on.

Garcia's focus is battery research, which is also the topic of a new IBM partnership with Daimler. She said researchers are using quantum computing to figure out quantum chemistry.

"We are looking at the fundamental behavior of atoms on a molecular scale," she said.

IBM launched the Q Network a year ago, and the growth has been impressive.

Gil said that the numbers show that people want access to quantum hardware. Users in the IBM Q Network are studying topics including carbon chemistry, route optimization, and risk analysis.

Gil said that it's easy to print more qubits; the hard part is making the interactions among qubits high quality. IBM reports that researchers are making progress on that metric as well.

Quantum volume is a metric that incorporates the number of qubits and the error rate of interactions. IBM has doubled the quantum volume of the system every year, with the latest improvement increasing the current volume to 32.
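IBM defines the base-2 logarithm of quantum volume as the largest circuit width n for which width-n, depth-n model circuits still pass its heavy-output benchmark, so doubling the metric each year means adding roughly one qubit's worth of usable circuit size annually. A toy sketch of the arithmetic behind the figures above:

```python
def quantum_volume(n: int) -> int:
    """IBM's quantum volume is 2**n, where n is the largest width for
    which width-n, depth-n model circuits pass the heavy-output test."""
    return 2 ** n

# Doubling once a year: four doublings from an initial QV of 2
# reproduce the article's current figure of 32.
qv = 2
for year in range(2017, 2021):
    qv *= 2
print(qv)  # 32, i.e. quantum_volume(5)
```

The starting value and years here are back-solved from the article's "fourth doubling" and "volume of 32" figures, not taken from IBM's published roadmap.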

"This is the fourth time we doubled the quantum volume of a quantum computer," he said.

"We call this Gambetta's Law after our head of science and technology who came up with the methodology of measuring the power of quantum computing," he said.

Gil said that the "quantum ready" era started in 2016 and the next phase will start when the technology improves enough to achieve "quantum advantage."

"First, a whole generation of developers is going to need to learn how to program these computers," he said. "Then when we hit quantum advantage, we'll be able to solve real-world problems and it's absolutely going to happen this decade."

For more CES 2020 coverage, check out this list of the top products of CES 2020.


At CES 2020, IBM research director Dario Gil gave the audience a primer on quantum computing and predicted that the industry will achieve quantum advantage this decade.

Image: IBM

See the original post:
IBM research director at CES 2020: We will hit the quantum advantage this decade - TechRepublic

AI, ML and quantum computing to cement position in 2020 – Tech Observer

From the emergence of cognitive intelligence, in-memory computing, fault-tolerant quantum computing, and new materials-based semiconductor devices, to the faster growth of industrial IoT, large-scale collaboration between machines, production-grade blockchain applications, modular chip design, and AI technologies that protect data privacy, more technology advancements and breakthroughs are expected to gain momentum and have a big impact on our daily lives.

"We are in an era of rapid technology development. In particular, technologies such as cloud computing, artificial intelligence, blockchain, and data intelligence are expected to accelerate the pace of the digital economy," said Jeff Zhang, Head of Alibaba DAMO Academy and President of Alibaba Cloud Intelligence.

The following are highlights from the Alibaba DAMO Academy predictions for the top 10 trends in the tech community for this year:

Artificial intelligence has reached or surpassed human performance in areas of perceptual intelligence such as speech-to-text, natural language processing, and video understanding, but in the field of cognitive intelligence, which requires external knowledge, logical reasoning, or domain transfer, it is still in its infancy. Cognitive intelligence will draw inspiration from cognitive psychology, brain science, and human social history, combined with techniques such as cross-domain knowledge graphs, causal inference, and continual learning, to establish effective mechanisms for the stable acquisition and expression of knowledge. These will enable machines to understand and utilize knowledge, achieving key breakthroughs from perceptual intelligence to cognitive intelligence.

In Von Neumann architecture, memory and processor are separate and the computation requires data to be moved back and forth. With the rapid development of data-driven AI algorithms in recent years, it has come to a point where the hardware becomes the bottleneck in the explorations of more advanced algorithms. In Processing-in-Memory (PIM) architecture, in contrast to the Von Neumann architecture, memory and processor are fused together and computations are performed where data is stored with minimal data movement. As such, computation parallelism and power efficiency can be significantly improved. We believe the innovations on PIM architecture are the tickets to next-generation AI.

In 2020, 5G, the rapid development of IoT devices, cloud computing and edge computing will accelerate the fusion of information systems, communication systems, and industrial control systems. Through advanced Industrial IoT, manufacturing companies can automate machines, in-factory logistics, and production scheduling, as a way to realize C2B smart manufacturing. In addition, interconnected industrial systems can adjust and coordinate the production capacity of both upstream and downstream vendors. Ultimately this will significantly increase manufacturers' productivity and profitability. For manufacturers whose production goods are valued at hundreds of trillions of RMB, even a 5-10% productivity increase would mean additional trillions of RMB.

Traditional single-agent intelligence cannot meet the real-time perception and decision-making needs of large-scale intelligent devices. The development of collaborative sensing technology in the Internet of Things and 5G communication technology will enable collaboration among multiple agents: machines will cooperate and compete with one another to complete target tasks. The group intelligence that emerges from this cooperation will further amplify the value of intelligent systems: large-scale intelligent traffic-light dispatching will allow dynamic, real-time adjustment, warehouse robots will work together to sort cargo more efficiently, driverless cars will perceive overall traffic conditions on the road, and unmanned aerial vehicle (UAV) swarms will handle last-mile delivery more efficiently.

The traditional model of chip design cannot efficiently respond to the fast-evolving, fragmented and customized needs of chip production. Open-source SoC chip design based on RISC-V, high-level hardware description languages, and IP-based modular chip design methods have accelerated the development of agile design methods and the ecosystem of open-source chips. In addition, the modular design method based on chiplets uses advanced packaging to combine chiplets with different functions, so that chips meeting the specific requirements of different applications can be customized and delivered quickly.

BaaS (Blockchain-as-a-Service) will further reduce the barriers to entry for enterprise blockchain applications. A variety of hardware chips embedded with core algorithms, used at the edge and in the cloud and designed specifically for blockchain, will also emerge, allowing assets in the physical world to be mapped to assets on blockchain, further expanding the boundaries of the Internet of Value and realizing multi-chain interconnection. In the future, a large number of innovative blockchain application scenarios with multi-dimensional collaboration across different industries and ecosystems will emerge, and large-scale production-grade blockchain applications with more than 10 million DAI (Daily Active Items) will gain mass adoption.

In 2019, the race to reach Quantum Supremacy brought the focus back to quantum computing. The demonstration, using superconducting circuits, boosts overall confidence in superconducting quantum computing as a path to a large-scale quantum computer. In 2020, the field of quantum computing will receive increasing investment, which comes with heightened competition. The field is also expected to see a speed-up in industrialization and the gradual formation of an ecosystem. In the coming years, the next milestones will be the realization of fault-tolerant quantum computing and the demonstration of quantum advantage on real-world problems. Either is a great challenge given present knowledge. Quantum computing is entering a critical period.

Under the pressure of both Moore's Law and the explosive demand for computing power and storage, it is difficult for classic silicon-based transistors to sustain the development of the semiconductor industry. So far, major semiconductor manufacturers have no clear answer for chips beyond 3nm. New materials will enable new logic, storage, and interconnect devices through new physical mechanisms, driving continuous innovation in the semiconductor industry. For example, topological insulators and two-dimensional superconducting materials that can achieve lossless transport of electrons and spin could become the basis for new high-performance logic and interconnect devices, while new magnetic materials and new resistive-switching materials could enable high-performance magnetic memory such as SOT-MRAM and resistive memory.

The compliance costs demanded by recent data protection laws and regulations related to data transfer are higher than ever before. In light of this, there is growing interest in using AI technologies to protect data privacy. The essence is to enable a data user to compute a function over input data from different data providers while keeping those data private. Such AI technologies promise to solve the problems of data silos and the lack of trust in today's data-sharing practices, and will truly unleash the value of data in the foreseeable future.

With the ongoing development of cloud computing technology, the cloud has grown far beyond the scope of IT infrastructure and gradually evolved into the center of all IT technology innovation. The cloud has a close relationship with almost all IT technologies, including new chips, new databases, self-driving adaptive networks, big data, AI, IoT, blockchain, quantum computing and so forth. Meanwhile, it creates new technologies, such as serverless computing, cloud-native software architecture, integrated software-hardware design, and intelligent automated operations. Cloud computing is redefining every aspect of IT, making new IT technologies more accessible to the public. The cloud has become the backbone of the entire digital economy.

See the original post:
AI, ML and quantum computing to cement position in 2020 - Tech Observer