The Quantum Dream: Are We There Yet? – Toolbox

The emergence of quantum computing has led industry heavyweights to fast-track their research and innovations. This week, Google conducted the largest chemical simulation on a quantum computer to date. Meanwhile, the U.S. Department of Energy launched five new Quantum Information Science (QIS) Research Centers. Will this accelerate quantum computing's progress?

Quantum technology is the next big wave in the tech landscape. Traditional computers store all information (emails, tweets, YouTube videos, and Facebook photos) as streams of electrical pulses representing binary digits, 1s and 0s; quantum computers instead rely on quantum bits, or qubits, to store information. Qubits are built from subatomic particles, such as electrons or photons, which can occupy a superposition of states, so a qubit can effectively be 1 and 0 at the same time. This enables quantum computers to work through certain complex computational tasks far faster than digital computers, mainframes, and servers.
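As a toy illustration of superposition (a minimal sketch in plain Python, not how real quantum hardware stores information), a single qubit can be modeled as a two-component complex state vector whose squared amplitudes give the measurement probabilities:

```python
import math

# A qubit's state can be written as a pair of complex amplitudes (alpha, beta):
# measuring it yields 0 with probability |alpha|^2 and 1 with probability |beta|^2.
zero = (1 + 0j, 0 + 0j)                                       # behaves like a classical 0
superposed = (1 / math.sqrt(2) + 0j, 1 / math.sqrt(2) + 0j)   # equal superposition of 0 and 1

def measure_probabilities(state):
    """Return (P(0), P(1)) for a qubit state vector."""
    alpha, beta = state
    return abs(alpha) ** 2, abs(beta) ** 2

print(measure_probabilities(zero))        # (1.0, 0.0)
print(measure_probabilities(superposed))  # roughly (0.5, 0.5)
```

The "1 and 0 at the same time" language in the article corresponds to the equal-amplitude state above: neither outcome is determined until measurement.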

Introduced in the 1980s, quantum computing promises to unlock complexities across different industries much faster than traditional computers. A quantum computer could decipher complex encryption systems, which would directly affect the digital banking, cryptocurrency, and e-commerce sectors that depend heavily on encrypted data. Quantum computers could also expedite the discovery of new medicines, aid the fight against climate change, power AI, transform logistics, and help design new materials. In the U.S., technology giants including IBM, Google, Honeywell, Microsoft, Intel, IonQ, and Rigetti Computing are leading the race to build quantum computers and gain a foothold in the space, while Alibaba, Baidu, and Huawei lead the field in China.

For a long time, the U.S. and its allies, such as Japan and Germany, had been working hard to compete with China to dominate the quantum technology space. In 2018, the U.S. government released the National Strategy Overview for Quantum Information Science to reduce technical skills gaps and accelerate quantum computing research and development.

In 2019, Google claimed quantum supremacy over supercomputers when the company's Sycamore processor performed a specific task in 200 seconds that would have taken a supercomputer 10,000 years to complete. In the same year, Intel rolled out Horse Ridge, a cryogenic quantum control chip, to reduce quantum computing complexities and accelerate quantum practicality.


What's 2020 Looking Like for Quantum Computing?

In July 2020, IBM announced a research partnership with Japanese business and academia to advance quantum computing innovations. The alliance will deepen ties between the two countries and build an ecosystem to improve quantum skills and advance research and development.

Earlier, in June 2020, Honeywell announced the development of the world's highest-performing quantum computer. AWS, Microsoft, and several other IaaS providers have announced quantum cloud services, an initiative to advance quantum computing adoption. In August 2020, AWS announced the general availability of Amazon Braket, a quantum cloud service that allows developers to design, develop, test, and run quantum algorithms.

Since last year, auto manufacturers such as Daimler and Volkswagen have been leveraging quantum computers to identify new ways to improve electric vehicle battery performance. Pharmaceutical companies are also using the technology to develop new medicines and drugs.

Last week, the Google AI Quantum team used its quantum processor, Sycamore, to simulate changes in the configuration of the chemical molecule diazene. During the process, the computer was able to accurately describe the changes in the positions of the hydrogen atoms. The computer also gave an accurate description of the binding energy of hydrogen in bigger chains.

If quantum computers develop the ability to predict chemical processes, it would advance the development of a wide range of new materials with as-yet-unknown properties. Current quantum computers, unfortunately, lack the scale required for such a task. Although today's computers are not ready to take on such a challenge, computer scientists hope to accomplish this in the near future as tech giants like Google invest in quantum computing research.


It therefore came as a relief to many computer scientists when the U.S. Department of Energy announced an investment of $625 million over the next five years in five newly formed Quantum Information Science (QIS) Research Centers in the U.S. The new hubs are an amalgam of research universities, national labs, and tech titans in quantum computing. The research hubs are led by the Energy Department's Argonne, Oak Ridge, Brookhaven, Fermi, and Lawrence Berkeley National Laboratories, with industry partners including Microsoft, IBM, Intel, Rigetti, and ColdQuanta. The partnership aims to advance the commercialization of quantum computing.

Chetan Nayak, general manager of Quantum Hardware at Microsoft, says, "While quantum computing will someday have a profound impact, today's quantum computing systems are still nascent technologies. To scale these systems, we must overcome a number of scientific challenges. Microsoft has been tackling these challenges head-on through our work towards developing topological qubits, classical information processing devices for quantum control, new quantum algorithms, and simulations."

At the start of this year, Daniel Newman, principal analyst and founding partner at Futurum Research, predicted that 2020 would be a big year for investors and Silicon Valley to put money into quantum computing companies. He said, "It will be incredibly impactful over the next decade, and 2020 should be a big year for advancement and investment."

Quantum computing is still in the development phase, and a shortage of suppliers and skilled researchers may slow its establishment. However, if tech giants and researchers continue to collaborate, quantum technology could turbocharge innovation on a massive scale.

What are your thoughts on the progress of quantum computing? Comment below or let us know on LinkedIn, Twitter, or Facebook. We'd love to hear from you!

Originally posted here:
The Quantum Dream: Are We There Yet? - Toolbox

How Amazon Quietly Powers The Internet – Forbes


What was the last thing you heard about Amazon (AMZN)?

Let me guess. Its battle with Walmart (WMT)? Or was it the FAA's approval of Amazon's delivery drones? Most of this news about Amazon's store is just noise that distracts investors from Amazon's real force.

As I'll show, Amazon is running an operating system that powers some of today's most important technologies, such as virtual reality, machine learning, and even quantum computing. Behind the scenes, it is used by over a million companies, including tech giants Apple (AAPL), Netflix (NFLX), and Facebook (FB).

This is Amazon's key and ever-growing moneymaker, the one that has been driving Amazon stock to the moon. But before I pull back the curtain, let's step back for a moment.

First, how Amazon makes money, for real

For all the online shopping fuss, Amazon doesn't earn much from its store. Yes, Amazon.com flips hundreds of billions of dollars' worth of products every year, and its revenues are on a tear. But Amazon turns only a sliver of that into profits.

In the past year, Amazon's store generated a record $282 billion in revenue. That translated to just $5.6 billion in profits, and keep in mind that was Amazon.com's most profitable year ever.

Meanwhile, most of Amazon's profits came from the lesser-known side of its business, Amazon Web Services (AWS), as you can see below:

Amazon's profits from AWS vs Amazon.com

It's Amazon's cloud arm, which serves over a million companies across the world. You may have heard that AWS has something to do with storing data in the cloud. But it's much, much more than that.

AWS is the operating system of the internet

To get an idea of how AWS works, take your computer as an example.

Like every other computer, it runs on an operating system such as Windows or macOS, which comes with a set of programs. This software puts your computer's resources to use and helps you carry out daily tasks, such as sending emails or sorting out your files.

Now, think of AWS as an operating system that's running not one, but hundreds of thousands of big computers (in tech lingo: servers). It gives companies nearly unlimited computing power and storage, as well as tools to build and run their software on the internet.

The difference is that these big computers sit in Amazon's warehouses, and companies work on them remotely, via the cloud. In other words, AWS is like the operating system of the internet.

Amazon's operating system now powers AI, blockchain, and other next-gen technologies

In 2003, when Amazon's AWS first started out, it offered only a couple of basic cloud services for storage and mail. Today, the system offers an unmatched set of 175+ tools that help companies build software that harnesses today's top technologies.

The list includes blockchain, VR, machine learning (AI), quantum computing, augmented reality (AR), and other technologies that are the building blocks of today's internet.

For example, Netflix is using AWS for more than simply storing and streaming its shows on the internet. It's also employing AWS machine-learning technology to recommend movies and shows to you.

You've also probably heard of Slack (WORK), the most popular messaging app for business. Slack recently announced it will use Amazon's media technology to introduce video and audio calls in its app.

And it's not just tech companies that are using Amazon's AWS tools.

Take GE Power. The world's energy leader is using AWS analytics technology to store and sift through avalanches of data from its plants. Or Fidelity: America's mutual fund giant is experimenting with Amazon's VR technology to build VR chat rooms for its clients.

In a picture, Amazon's AWS works like this:

How Amazon's AWS powers the internet

Amazon's AWS is earning more and more... and more

Amazon is not the only company running a cloud service. Google, Microsoft (MSFT), Alibaba, IBM, and other tech giants are all duking it out for a slice of this lucrative business. But Amazon's is the biggest and most feature-rich.

Today, Amazon controls 33% of the market, leaving its closest competitors, Microsoft (2nd with 18%) and Google (3rd with 9%), in the dust. That means nearly one third of the internet is running on Amazon's AWS.

And it doesn't appear that Amazon will step down from its cloud throne anytime soon. Amazon's sales from AWS soared 10X in the past six years. And last year, Amazon reported a bigger sales gain from AWS (dollar-wise) than any other cloud company.

Here's the main takeaway for investors

If you are looking into Amazon stock, don't get caught up in the online shopping fuss.

For years, AWS has been the linchpin of Amazon's business. And this invisible side of Amazon is where Amazon's largest gears turn.

Problem is, AWS is like a black box. Amazon reports very little on its operations. So if you want to dig deeper, you'll have to do your own research.

You'll also have to weigh a couple of risks before putting your money into Amazon stock.

Other than that, Amazon is an outstanding stock, killing it in one of the most lucrative businesses on the planet. And it's proven resilient to Covid, whose spread could hit the markets again.

Get investing tips that make you go "Hmm..."

Every week, I put out a big-picture story to help explain what's driving the markets. Subscribe here to get my analysis and stock picks right in your inbox.

Go here to read the rest:
How Amazon Quietly Powers The Internet - Forbes

Could Quantum Computing Progress Be Halted by Background Radiation? – Singularity Hub

Doing calculations with a quantum computer is a race against time, thanks to the fragility of the quantum states at their heart. And new research suggests we may soon hit a wall in how long we can hold those states together, due to interference from natural background radiation.

While quantum computing could one day enable us to carry out calculations beyond even the most powerful supercomputer imaginable, we're still a long way from that point. And a big reason for that is a phenomenon known as decoherence.

The superpowers of quantum computers rely on holding the qubits (quantum bits) that make them up in exotic quantum states like superposition and entanglement. Decoherence is the process by which interference from the environment causes them to gradually lose their quantum behavior, along with any information that was encoded in them.

It can be caused by heat, vibrations, magnetic fluctuations, or a host of other environmental factors that are hard to control. Currently, we can keep superconducting qubits (the technology favored by the field's leaders, like Google and IBM) stable for up to 200 microseconds in the best devices, which is still far too short to do any truly meaningful computations.
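To make that 200-microsecond figure concrete, here is a toy model (an illustrative sketch only; real decoherence involves distinct relaxation and dephasing processes) treating coherence as a simple exponential decay:

```python
import math

T2 = 200e-6  # seconds: the coherence time quoted for the best superconducting qubits

def coherence_remaining(t_seconds, t2=T2):
    """Fraction of coherence left after t_seconds, in a simple exp(-t/T2) model."""
    return math.exp(-t_seconds / t2)

# A single ~100-nanosecond gate barely dents the state...
print(coherence_remaining(100e-9))  # ~0.9995
# ...but after one millisecond the quantum state is essentially gone.
print(coherence_remaining(1e-3))    # ~0.0067
```

This is why coherence time matters: the useful length of a computation is bounded by how many gates fit inside the decay window.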

But new research from scientists at Massachusetts Institute of Technology (MIT) and Pacific Northwest National Laboratory (PNNL), published last week in Nature, suggests we may struggle to get much further. They found that background radiation from cosmic rays and more prosaic sources like trace elements in concrete walls is enough to put a hard four-millisecond limit on the coherence time of superconducting qubits.

"These decoherence mechanisms are like an onion, and we've been peeling back the layers for the past 20 years, but there's another layer that, left unabated, is going to limit us in a couple years, which is environmental radiation," William Oliver from MIT said in a press release. "This is an exciting result, because it motivates us to think of other ways to design qubits to get around this problem."

Superconducting qubits rely on pairs of electrons flowing through a resistance-free circuit. But radiation can knock these pairs out of alignment, causing them to split apart, which is what eventually results in the qubit decohering.

To determine how significant an impact background radiation could have on qubits, the researchers first tried to work out the relationship between coherence times and radiation levels. They exposed qubits to irradiated copper whose emissions dropped over time in a predictable way, which showed them that coherence times rose as radiation levels fell, up to a maximum of four milliseconds, after which background effects kicked in.
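The logic of the experiment can be sketched in a few lines, assuming (as a simplification) that decoherence rates from independent sources simply add; the ~4 ms constant below is the radiation-imposed ceiling reported in the study:

```python
T_RADIATION = 4e-3  # seconds: the radiation-imposed coherence ceiling from the study

def total_coherence_time(t_other_sources):
    """Decoherence rates from independent sources add: 1/T_total = 1/T_other + 1/T_rad."""
    return 1.0 / (1.0 / t_other_sources + 1.0 / T_RADIATION)

# As every other noise source is engineered away (t_other grows large),
# the total coherence time saturates at the ~4 ms radiation ceiling.
for t_other in (0.2e-3, 1e-3, 10e-3, 1.0):
    print(f"other-sources limit {t_other:g} s -> total {total_coherence_time(t_other) * 1e3:.2f} ms")
```

With today's ~0.2 ms qubits, other noise dominates; once those sources improve past a few milliseconds, radiation becomes the binding constraint.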

To check if this coherence time was really caused by the natural radiation, they built a giant shield out of lead brick that could block background radiation to see what happened when the qubits were isolated. The experiments clearly showed that blocking the background emissions could boost coherence times further.

At the moment, a host of other problems like material impurities and electronic disturbances cause qubits to decohere before these effects kick in, but given the rate at which the technology has been improving, we may hit this new wall in just a few years.

"Without mitigation, radiation will limit the coherence time of superconducting qubits to a few milliseconds, which is insufficient for practical quantum computing," Brent VanDevender from PNNL said in a press release.

Potential solutions to the problem include building radiation shielding around quantum computers or locating them underground, where cosmic rays aren't able to penetrate so easily. But if you need a few tons of lead or a large cavern in order to install a quantum computer, that's going to make it considerably harder to roll them out widely.

It's important to remember, though, that this problem has only been observed in superconducting qubits so far. In July, researchers showed they could get a spin-orbit qubit implemented in silicon to last for about 10 milliseconds, while trapped-ion qubits can stay stable for as long as 10 minutes. And MIT's Oliver says there's still plenty of room for building more robust superconducting qubits.

"We can think about designing qubits in a way that makes them rad-hard," he said. "So it's definitely not game over, it's just the next layer of the onion we need to address."

Image Credit: Shutterstock

Read more here:
Could Quantum Computing Progress Be Halted by Background Radiation? - Singularity Hub

For the first time, the researchers at Google have used a quantum computer to simulate a chemical reaction!!! – Gizmo Posts 24

For the first time, researchers at Google have used a quantum computer to simulate a chemical reaction, marking a step forward in the search for practical uses of quantum computers. Read on for the details of the reaction.

Before moving further, we first need a brief idea of what quantum computers are.

Quantum computers are machines that use quantum bits, or qubits, to store data and perform computations.

This is the first time Google researchers have used a quantum computer to simulate a chemical reaction, a step forward in the quest for practical applications of quantum computers. The reaction simulated was quite simple, but it is nonetheless a notable step.

Quantum computers are expected to be the best way to precisely simulate the quantum mechanics that govern the behavior of atoms and molecules.

To perform the simulation, the research team used the company's 54-qubit quantum processor, Sycamore, to simulate changes in the configuration of a molecule called diazene (a nitrogen hydride).

Ryan Babbush at Google has said that this particular simulation could even be performed without a quantum computer; the reaction is a simple one, but it is still a notable step for quantum computing. He added that the team is now doing quantum computations of chemistry at a fundamentally different scale: earlier work consisted of basic calculations that could be done with pencil and paper, whereas the demonstrations the team is looking at now would genuinely require a computer.

Read this article:
For the first time, the researchers at Google have used a quantum computer to simulate a chemical reaction!!! - Gizmo Posts 24

We Just Found Another Obstacle For Quantum Computers to Overcome – And It’s Everywhere – ScienceAlert

Keeping qubits (the quantum equivalents of classical computing bits) stable will be key to realising the potential of quantum computing. Now scientists have found a new obstacle to this stability: natural radiation.

Natural or background radiation comes from all sorts of sources, both natural and artificial. Cosmic rays contribute to natural radiation, for example, and so do concrete buildings. It's around us all the time, and so this poses something of a problem for future quantum computers.

Through a series of experiments that altered the level of natural radiation around qubits, physicists have been able to establish that this background buzz does indeed nudge qubits off balance in a way that stops them from functioning properly.

"Our study is the first to show clearly that low-level ionising radiation in the environment degrades the performance of superconducting qubits," says physicist John Orrell, from the Pacific Northwest National Laboratory (PNNL).

"These findings suggest that radiation shielding will be necessary to attain long-sought performance in quantum computers of this design."

Natural radiation is by no means the most significant or the only threat to qubit stability, which is technically known as coherence. Everything from temperature fluctuations to electromagnetic fields can break the qubit 'spell'.

But the scientists say if we're to reach a future where quantum computers are taking care of our most advanced computing needs, then this interference from natural radiation is going to have to be dealt with.

It was after experiencing problems with superconducting qubit decoherence that the team behind the new study decided to investigate the possible problem with natural radiation. They found it breaks up a key quantum binding called a Cooper pair of electrons.

"The radiation breaks apart matched pairs of electrons that typically carry electric current without resistance in a superconductor," says physicist Brent VanDevender, from PNNL. "The resistance of those unpaired electrons destroys the delicately prepared state of a qubit."

Classical computers can be disrupted by the same issues that affect qubits, but quantum states are much more delicate and sensitive. One of the reasons that we don't have genuine full-scale quantum computers today is that no one can keep qubits stable for more than a few milliseconds at a time.

If we can improve on that, the benefits in terms of computing power could be huge: whereas classical computing bits can only be set as 1 or 0, qubits can be set as 1, 0 or both at the same time (known as superposition).

Scientists have been able to get it happening, but only for a very short space of time and in a very tightly controlled environment. The good news is that researchers like those at PNNL are committed to the challenge of figuring out how to make quantum computers a reality and now we know a bit more about what we're up against.

"Practical quantum computing with these devices will not be possible unless we address the radiation issue," says VanDevender. "Without mitigation, radiation will limit the coherence time of superconducting qubits to a few milliseconds, which is insufficient for practical quantum computing."

The research has been published in Nature.

Read more:
We Just Found Another Obstacle For Quantum Computers to Overcome - And It's Everywhere - ScienceAlert

This Equation Calculates The Chances We Live In A Computer Simulation – Discover Magazine

The Drake equation is one of the more famous reckonings in science. It calculates the likelihood that we are not alone in the universe by estimating the number of other intelligent civilizations in our galaxy that might exist now.

Some of the terms in this equation are well known or becoming better understood, such as the number of stars in our galaxy and the proportion that have planets in the habitable zone. But others are unknown, such as the proportion of planets that develop intelligent life; and some may never be known, such as the proportion of civilizations that destroy themselves before they can be discovered.

Nevertheless, the Drake equation allows scientists to place important bounds on the numbers of intelligent civilizations that might be out there.

However, there is another sense in which humanity could be linked with an alien intelligence: our world may just be a simulation inside a massively powerful supercomputer run by such a species. Indeed, various scientists, philosophers, and visionaries have said that the probability of such a scenario could be close to one. In other words, we probably are living in a simulation.

The accuracy of these claims is somewhat controversial. So a better way to determine the probability that we live in a simulation would be much appreciated.

Enter Alexandre Bibeau-Delisle and Gilles Brassard at the University of Montreal in Canada. These researchers have derived a Drake-like equation that calculates the chances that we live in a computer simulation. And the results throw up some counterintuitive ideas that are likely to change the way we think about simulations, how we might determine whether we are in one and whether we could ever escape.

Bibeau-Delisle and Brassard begin with a fundamental estimate of the computing power available to create a simulation. They say, for example, that a kilogram of matter, fully exploited for computation, could perform 10^50 operations per second.

By comparison, the human brain, which is also kilogram-sized, performs up to 10^16 operations per second. "It may thus be possible for a single computer the mass of a human brain to simulate the real-time evolution of 1.4 × 10^25 virtual brains," they say.

In our society, a significant number of computers already simulate entire civilizations, in games such as Civilization VI, Hearts of Iron IV, Humankind, and so on. So it may be reasonable to assume that in a sufficiently advanced civilization, individuals will be able to run games that simulate societies like ours, populated with sentient, conscious beings.

So an interesting question is this: of all the sentient beings in existence, what fraction are likely to be simulations? To derive the answer, Bibeau-Delisle and Brassard start with the total number of real sentient beings, NRe; multiply that by the fraction with access to the necessary computing power, fCiv; multiply this by the fraction of that power that is devoted to simulating consciousness, fDed (because these beings are likely to be using their computers for other purposes too); and then multiply this by the number of brains they could simulate, RCal.

The resulting equation, where fSim is the fraction of simulated brains, works out (after the NRe terms cancel) to:

fSim = (fCiv × fDed × RCal) / (fCiv × fDed × RCal + 1)

Here RCal is the huge number of brains that fully exploited matter should be able to simulate.

The sheer size of this number, ~10^25, pushes Bibeau-Delisle and Brassard towards an inescapable conclusion. "It is mathematically inescapable from [the above] equation and the colossal scale of RCal that fSim ≈ 1 unless fCiv × fDed ≈ 0," they say.
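A few lines of Python make the argument concrete (an illustrative sketch; the parameter values below are arbitrary except for the order-of-magnitude RCal taken from the article):

```python
R_CAL = 1e25  # order of magnitude of brains simulable per brain-sized computer

def f_sim(f_civ, f_ded, r_cal=R_CAL):
    """Fraction of sentient beings that are simulated; the NRe factor cancels out."""
    x = f_civ * f_ded * r_cal
    return x / (x + 1)

# Even absurdly tiny fractions of capable, willing civilizations push f_sim toward 1:
print(f_sim(1e-9, 1e-9))   # ~1.0
# Only if f_civ * f_ded is effectively zero does f_sim stay small:
print(f_sim(1e-30, 1e-9))  # ~1e-14
```

The colossal RCal swamps everything else, which is exactly why the authors see only the two outcomes described below.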

So there are two possible outcomes. Either we live in a simulation or a vanishingly small proportion of advanced computing power is devoted to simulating brains.

It's not hard to imagine why the second option might be true. "A society of beings similar to us (but with a much greater technological development) could indeed decide it is not very ethical to simulate beings with enough precision to make them conscious while fooling them and keeping them cut off from the real world," say Bibeau-Delisle and Brassard.

Another possibility is that advanced civilizations never get to the stage where their technology is powerful enough to perform these kinds of computations. Perhaps they destroy themselves through war or disease or climate change long before then. There is no way of knowing.

But suppose we are in a simulation. Bibeau-Delisle and Brassard ask whether we might escape while somehow hiding our intentions from our overlords. They assume that the simulating technology will be quantum in nature. "If quantum phenomena are as difficult to compute on classical systems as we believe them to be, a simulation containing our world would most probably run on quantum computing power," they say.

This raises the possibility that it may be possible to detect our alien overlords since they cannot measure the quantum nature of our world without revealing their presence. Quantum cryptography uses the same principle; indeed, Brassard is one of the pioneers of this technology.

That would seem to make it possible for us to make encrypted plans that are hidden from the overlords, such as secretly transferring ourselves into our own simulations.

However, the overlords have a way to foil this. All they need to do is rewire their simulation to make it look as if we are able to hide information, even though they are aware of it all the time. "If the simulators are particularly angry at our attempted escape, they could also send us to a simulated hell, in which case we would at least have the confirmation we were truly living inside a simulation and our paranoia was not unjustified..." conclude Bibeau-Delisle and Brassard, with their tongues firmly in their cheeks.

In that sense, we are the ultimate laboratory guinea pigs: forever trapped and forever fooled by the evil genius of our omnipotent masters.

Time for another game of Civilization VI.

Ref: arxiv.org/abs/2008.09275 : Probability and Consequences of Living Inside a Computer Simulation

More here:
This Equation Calculates The Chances We Live In A Computer Simulation - Discover Magazine

Why quantum computing matters – Axios

A new government initiative will direct hundreds of millions of dollars to support new centers for quantum computing research.

Why it matters: Quantum information science represents the next leap forward for computing, opening the door to powerful machines that can help provide answers to some of our most pressing questions. The nation that takes the lead in quantum will claim pole position for the future.

Details: The five new quantum research centers established in national labs across the country are part of a $1 billion White House program announced Wednesday morning that includes seven institutes that will explore different facets of AI, including precision agriculture and forecast prediction.

How it works: While AI is better known and increasingly integrated into our daily lives ("Hey, Siri"), quantum computing is just as important, promising huge leaps forward in computer processing power.

Of note: Albert Einstein famously hated the concept of entanglement, describing it as "spooky action at a distance." But the idea has held up over decades of research in quantum science.

Quantum computers won't replace classical ones wholesale, in part because the process of manipulating quantum particles is still highly tricky, but as they develop, they'll open up new frontiers in computing.

What they're saying: "Quantum is the biggest revolution in computers since the advent of computers," says Dario Gil, director of IBM Research. "With the quantum bit, you can actually rethink the nature of information."

The catch: While the underlying science behind quantum computers is decades old, quantum computers are only just now beginning to be used commercially.

What to watch: Who ultimately wins out on quantum supremacy, the act of demonstrating that a quantum computer can solve a problem that even the fastest classical computer would be unable to solve in a feasible time frame.

The bottom line: The age of quantum computers isn't quite here yet, but it promises to be one of the major technological drivers of the 21st century.

See the rest here:
Why quantum computing matters - Axios

Concerns about the impact of quantum computing on cryptography – Explica

A DigiCert study found that 55% of business information technology (IT) specialists are concerned about the impact of quantum computing on cryptography. The company explained in a statement that 71% consider the technology a future threat, and that while many have heard of quantum computing, few know what it is.

Although the technology is not yet widely used, physicists have been talking about quantum computing for more than 30 years. But how can this new kind of computing help? "Quantum computing will fundamentally increase processing power, which could mean exciting advances in everything from particle physics to machine learning to medical science," the company noted. It added that companies can prepare for and anticipate the challenges quantum computing poses by increasing their crypto-agility: identifying and replacing outdated cryptographic algorithms when necessary. Hardware security modules (HSMs) can also be used to protect the keys in a public key infrastructure (PKI).

That is why it is important for companies to investigate how their certificates and keys are being used, whether they can be upgraded to support quantum-safe encryption and, if so, how quickly those upgrades could occur, the company said. It recommended relying on SSL/TLS certificates, which let website visitors know that a site is authentic and that the data they enter will be encrypted. "An important approach to preparing for post-quantum cryptographic threats is to gain encryption agility. A properly implemented always-on SSL makes it easy to update encryption algorithms in response to future threats from quantum computing," said Avesta Hojjati, head of R&D at DigiCert.
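As a minimal sketch of what crypto-agility can look like in code (using Python's standard hashlib; the profile names are hypothetical, and real certificate agility also covers signature algorithms and key rotation, not just digests), the idea is to keep the algorithm choice behind a single configuration point so an outdated primitive can be swapped without touching the rest of the codebase:

```python
import hashlib

# Hypothetical profile names for illustration; the algorithm choice lives in
# one table instead of being hard-coded throughout the application.
ALGORITHMS = {
    "legacy": "sha1",       # deprecated; shown only as a migration source
    "current": "sha256",
    "future": "sha3_256",   # stand-in for a later upgrade target
}

def digest(data: bytes, profile: str = "current") -> str:
    """Hash data with whichever algorithm the active profile selects."""
    h = hashlib.new(ALGORITHMS[profile])
    h.update(data)
    return h.hexdigest()

# Migrating the whole system is a one-line change to ALGORITHMS, not a rewrite:
print(digest(b"hello"))            # SHA-256 hex digest
print(digest(b"hello", "future"))  # SHA3-256 hex digest
```

The same indirection applies to signature schemes: code that asks a registry for "the current algorithm" can adopt a post-quantum scheme the day one is standardized.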


Read the original here:
Concerns about the impact of quantum computing on cryptography - Explica

US begins $1 billion quantum computing plan to get ahead …

Quantum computing, as shown by this Google machine, is still in its infancy. A five-year US program aims to hasten its maturity by combining $625 million in federal funds with $340 million in company contributions.

When big technologies like mobile phones, 5G networks and e-commerce arrive, it's important to get in on the ground floor. That's why the US government is establishing 12 new research centers, funded with hundreds of millions of dollars, to boost artificial intelligence and quantum computing.

Congress already has appropriated most of the funds for the projects. But the White House on Wednesday detailed what work will be done, the names of the labs and universities that competed to house the 12 centers, and the reasons it believes the two technologies are so important for the US economy and national security.


The Department of Energy's five quantum computing centers, housed at US national laboratories, are funded by a five-year, $625 million project bolstered by $340 million worth of help from companies including IBM, Microsoft, Intel, Applied Materials and Lockheed Martin. The funds come from the $1.2 billion allocated by the National Quantum Initiative Act, which President Donald Trump signed in 2018, but the private-sector contributions add some new clout.

Artificial intelligence is already broadly used for tasks such as voice recognition and spam filtering, and it's a top priority at Google, Facebook, Tesla and every other tech giant. Quantum computing is now a hotly competitive subject, and even though it's very immature, plenty of researchers believe the weird physics of the ultrasmall will revolutionize new-materials development, financial predictions and delivery services. Although businesses are interested in both areas already, the government programs aim to boost more basic research than what's already happening.

The idea is to link government, private and university research to accelerate key areas in the US. It's the same recipe used for earlier US technology triumphs like the Manhattan Project to build the atomic bomb in World War II, the Apollo program to send humans to the moon and the military-funded effort to establish what became the internet.

"The US will continue to be the home for the next great advancements in technology," US Chief Technology Officer Michael Kratsios said in a press conference. "We know our adversaries are pursuing their own advancements."

US-led AI improvements wouldn't stop the Chinese government from using face recognition to identify protesters in Hong Kong or members of the Uyghur ethnic minority. But they could mean breakthroughs benefit US companies -- both those building next-gen products and those using them. And AI has military applications like identifying targets.

When it comes to quantum computing, several national security uses are possible: navigation sensors that work even if GPS satellites are disabled; a new class of secure communications; and quantum computers that can decrypt others' previously secure communications.

The US government has a big cautionary tale about American technology leadership: 5G. The biggest players building the important new mobile network technology are outside the US. That includes China-based Huawei, which the US considers a security risk.

The private sector already is sinking billions of dollars into AI and quantum computing research on its own. The federal funds will multiply those investments, helping reach areas beyond today's commercialization plans.

"That'll give us a road map beyond the next three to five years," said Dario Gil, head of IBM's quantum computing program.

The five quantum computing centers will be located at Argonne, Brookhaven, Fermi, Oak Ridge and Lawrence Berkeley national laboratories. Areas of research include materials science, quantum networking and quantum sensor networks.

The AI centers will be at universities, including the University of Oklahoma at Norman, the University of Texas at Austin, the University of Colorado at Boulder, the University of Illinois at Urbana-Champaign, the University of California at Davis and the Massachusetts Institute of Technology.

"Advancing quantum practicality will be a team sport," said James Clarke, Intel's director of quantum hardware, in a statement.

One big quantum computing partner from industry is IBM, which has been aggressively investing in the technology for years. It's involved in three areas, Gil said.

First is the Brookhaven center's work to improve quantum computing error correction, a key technology to making big, widely useful quantum computers. The Argonne center will work on quantum networking to link multiple quantum computers for greater power. And the Oak Ridge center will work on quantum algorithms, applications and sensors.
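Error correction, the focus of the Brookhaven work, is what turns many noisy physical qubits into one reliable logical qubit. Real quantum codes such as the surface code are far more involved, since quantum states cannot simply be copied, but the underlying redundancy principle can be sketched classically with the three-bit repetition code, where any single flip is repaired by majority vote:

```python
# Classical sketch of the redundancy idea behind quantum error correction:
# encode one logical bit as three physical bits; a single flipped bit is
# outvoted on decode.
def encode(bit: int) -> list[int]:
    return [bit, bit, bit]

def flip(codeword: list[int], i: int) -> list[int]:
    noisy = codeword.copy()
    noisy[i] ^= 1  # simulate one bit-flip error
    return noisy

def decode(codeword: list[int]) -> int:
    return 1 if sum(codeword) >= 2 else 0  # majority vote

# Any single error is corrected; two simultaneous errors defeat the code.
print(decode(flip(encode(1), 0)))  # prints 1
```

The quantum versions must detect errors without measuring (and thus destroying) the encoded state, which is why scalable error correction is described here as a key technology still under research.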

Although fierce competitors are involved, the centers are for cooperative work.

"The idea is to do fundamental research, advance the state of the art, and share it," Gil said.

More:
US begins $1 billion quantum computing plan to get ahead ...

The future of artificial intelligence and quantum computing – Military & Aerospace Electronics

NASHUA, N.H. - Until the 21st Century, artificial intelligence (AI) and quantum computers were largely the stuff of science fiction, although quantum theory and quantum mechanics had been around for about a century. A century of great controversy, largely because Albert Einstein rejected quantum theory as originally formulated, leading to his famous statement, "God does not play dice with the universe."

Today, however, the debate over quantum computing is largely about when, not if, these kinds of devices will come into full operation. Meanwhile, other forms of quantum technology, such as sensors, already are finding their way into military and civilian applications.

"Quantum technology will be as transformational in the 21st Century as harnessing electricity was in the 19th," Michael J. Biercuk, founder and CEO of Q-CTRL Pty Ltd in Sydney, Australia, and professor of Quantum Physics & Quantum Technologies at the University of Sydney, told the U.S. Office of Naval Research in a January 2019 presentation.

On that, there is virtually universal agreement. But when and how remains undetermined.

For example, asked how and when quantum computing eventually may be applied to high-performance embedded computing (HPEC), Tatjana Curcic, program manager for Optimization with Noisy Intermediate-Scale Quantum devices (ONISQ) at the U.S. Defense Advanced Research Projects Agency in Arlington, Va., says it's an open question.

"Until just recently, quantum computing stood on its own, but as of a few years ago people are looking more and more into hybrid approaches," Curcic says. "I'm not aware of much work on actually getting quantum computing into HPEC architecture, however. It's definitely not mainstream, probably because it's too early."

As to how quantum computing eventually may influence the development, scale, and use of AI, she adds:

"That's another open question. Quantum machine learning is a very active research area, but is quite new. A lot of people are working on that, but it's not clear at this time what the results will be. The interface between classical data, which AI is primarily involved with, and quantum computing is still a technical challenge."

Quantum information processing

According to DARPA's ONISQ webpage, the program aims to exploit quantum information processing before fully fault-tolerant quantum computers are realized.

[Photo caption] This quantum computer, based on superconducting qubits, is inserted into a dilution refrigerator and cooled to a temperature of less than 1 Kelvin. It was built at IBM Research in Zurich.

"This effort will pursue a hybrid concept that combines intermediate-sized quantum devices with classical systems to solve a particularly challenging set of problems known as combinatorial optimization. ONISQ seeks to demonstrate the quantitative advantage of quantum information processing by leapfrogging the performance of classical-only systems in solving optimization challenges," the agency states. ONISQ researchers will be tasked with developing quantum systems that are scalable to hundreds or thousands of qubits with longer coherence times and improved noise control.

Researchers will also be required to efficiently implement a quantum optimization algorithm on noisy intermediate-scale quantum devices, optimizing allocation of quantum and classical resources. Benchmarking will also be part of the program, with researchers making a quantitative comparison of classical and quantum approaches. In addition, the program will identify classes of problems in combinatorial optimization where quantum information processing is likely to have the biggest impact. It will also seek to develop methods for extending quantum advantage on limited size processors to large combinatorial optimization problems via techniques such as problem decomposition.
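A concrete feel for combinatorial optimization, the problem class ONISQ targets, comes from Max-Cut, the standard benchmark for hybrid quantum-classical optimizers such as QAOA: split a graph's vertices into two sets so that as many edges as possible cross the split. The exhaustive classical search below scales as 2^n in the number of vertices, which is exactly why even modest quantum speedups on such problems would matter. (The example graph is made up for illustration.)

```python
from itertools import product

def max_cut(n_vertices, edges):
    """Brute-force Max-Cut: try every 2-coloring of the vertices and
    count edges whose endpoints get different colors."""
    best_value, best_assignment = -1, None
    for bits in product([0, 1], repeat=n_vertices):  # 2**n_vertices cases
        value = sum(1 for u, v in edges if bits[u] != bits[v])
        if value > best_value:
            best_value, best_assignment = value, bits
    return best_value, best_assignment

# A 4-vertex ring: alternating colors cut all 4 edges.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
print(max_cut(4, edges))  # best cut value is 4
```

A noisy intermediate-scale quantum device would replace the exhaustive loop with a parameterized quantum circuit whose measurements are steered by a classical optimizer; the benchmarking task described above is precisely comparing that hybrid loop against classical searches like this one.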

The U.S. government has been the leader in quantum computing research since the founding of the field, but that too is beginning to change.

"In the mid-90s, NSA [the U.S. National Security Agency at Fort Meade, Md.] decided to begin an open academic effort to see if such a thing could be developed. All that research has been conducted by universities for the most part, with a few outliers, such as IBM," says Q-CTRL's Biercuk. "In the past five years, there has been a shift toward industry-led development, often in cooperation with academic efforts. Microsoft has partnered with universities all over the world and Google bought a university program. Today many of the biggest hardware developments are coming from the commercial sector."

"Quantum computing remains deep in the research space, but there are hardware demonstrations all over the world. In the next five years, we expect the performance of these machines to be augmented to the point where we believe they will demonstrate a quantum advantage for the first time. For now, however, quantum computing has no advantages over standard computing technology. Quantum computers are research demonstrators and do not solve any computing problems at all. Right now, there is no reason to use quantum computers except to be ready when they are truly available."

AI and quantum computing

Nonetheless, the race to develop and deploy AI and quantum computing is global, with the world's leading military powers expecting that these technologies, along with other breakthroughs such as hypersonics, could make the first nation to successfully deploy them as dominant as the U.S. was following the first detonations of atomic bombs. That is especially true for autonomous mobile platforms, such as unmanned aerial vehicles (UAVs), interfacing with those vehicles' onboard HPEC.

Of the two, AI is the closest to deployment, but also the most controversial. A growing number of the worlds leading scientists, including the late Stephen Hawking, warn real-world AI could easily duplicate the actions of the fictional Skynet in the Terminator movie series. Launched with total control over the U.S. nuclear arsenal, Skynet became sentient and decided the human race was a dangerous infestation that needed to be destroyed.

"The development of full artificial intelligence could spell the end of the human race. Once humans develop artificial intelligence, it will take off on its own and redesign itself at an ever-increasing rate. Humans, who are limited by slow biological evolution, couldn't compete and would be superseded." - Stephen Hawking (2014)

Such dangers have been recognized at least as far back as the publication of Isaac Asimov's short story "Runaround" in 1942, which included his Three Laws of Robotics, designed to control otherwise autonomous robots. In the story, the laws were set down in 2058:

First Law: A robot may not injure a human being or, through inaction, allow a human being to come to harm.

Second Law: A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.

Third Law: A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

Whether it would be possible to embed and ensure unbreakable compliance with such laws in an AI system is unknown. But limited degrees of AI, known as machine learning, already are in widespread use by the military and advanced stages of the technology, such as deep learning, almost certainly will be deployed by one or more nations as they become available. More than 50 nations already are actively researching battlefield robots.

Military quantum computing

AI-HPEC would give UAVs, next-generation cruise missiles, and even maneuverable ballistic missiles the ability to alter course to new targets at any point after launch, recognize countermeasures, and avoid, misdirect, or even destroy them.

Quantum computing, on the other hand, is seen by some as providing little, if any, advantage over traditional computer technologies; by many as requiring cooling and size, weight, and power (SWaP) improvements not possible with current technologies before it can be applied to mobile platforms; and by most as being little more than a research tool for perhaps decades to come.

Perhaps the biggest stumbling block to mobile platform-based quantum computing is cooling: it currently requires a cooling unit, operating at near absolute zero and the size of a refrigerator, to handle a fractional piece of quantum computing.

[Photo caption] Military trusted computing experts are considering new generations of quantum computing for creating nearly unbreakable encryption for super-secure defense applications.

"A lot of work has been done and things are being touted as operational, but the most important thing to understand is this isn't some simple physical thing you throw in suddenly and it works. That makes it harder to call it deployable: you're not going to strap a quantum computer to a handheld device. A lot of solutions are still trying to deal with cryogenics and how you deal with deployment of cryo," says Tammy Carter, senior product manager for GPGPUs and software products at Curtiss-Wright Defense Solutions in Ashburn, Va.

"AI is now a technology in deployment. Machine learning is pretty much in use worldwide," Carter says. "We're in a migration of figuring out how to use it with the systems we have. Quantum computing will require a lot of engineering work, and demand may not be great enough to push the effort. From a cryogenically cooled electronics perspective, I don't think there is any insurmountable problem. It absolutely can be done; it's just a matter of decision making to do it, prioritization to get it done. These are not easily deployed technologies, but they certainly can be deployed."

Given its current and expected near-term limitations, research has increased on the development of hybrid systems.

"The longer-term reality is a hybrid approach, with the quantum system not going mobile any time soon," says Brian Kirby, physicist in the Army Research Laboratory Computational & Informational Sciences Directorate in Adelphi, Md. "It's a mistake to forecast a timeline, but I'm not sure putting a quantum computer on such systems would be valuable. Having the quantum computer in a fixed location and linked to the mobile platform makes more sense, for now at least. There can be multiple quantum computers throughout the country; while individually they may have trouble solving some problems, networking them would be more secure and able to solve larger problems."

"Broadly, however, a quantum computer can't do anything a practical home computer can't do, but it can potentially solve certain problems more efficiently," Kirby continues. "So you're looking at potential speed-up, but there is no problem a quantum computer can solve that a normal computer can't. Beyond the basics of code-breaking and quantum simulations affecting material design, right now we can't necessarily predict military applications."

Raising concerns

In some ways similar to AI, quantum computing raises nearly as many concerns as it does expectations, especially in the area of security. The latest Thales Data Threat Report says 72 percent of surveyed security experts worldwide believe quantum computing will have a negative impact on data security within the next five years.

At the same time, quantum computing is forecast to offer more robust cryptography and security solutions. For HPEC, that duality is significant: quantum computing can make it more difficult to break the security of mobile platforms, while simultaneously making it easier to do just that.

"Quantum computers that can run Shor's algorithm [leveraging quantum properties to factor very large numbers efficiently] are expected to become available in the next decade. These algorithms can be used to break conventional digital signature schemes (e.g. RSA or ECDSA), which are widely used in embedded systems today. This puts these systems at risk when they are used in safety-relevant long-term applications, such as automotive systems or critical infrastructures. To mitigate this risk, classical digital signature schemes must be replaced by schemes secure against quantum computing-based attacks," according to the Post-Quantum Cryptography in Embedded Systems report in the August 2019 proceedings of the 14th International Conference on Availability, Reliability & Security.
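Why Shor's algorithm threatens RSA can be seen in its classical skeleton: factoring N reduces to finding the multiplicative order r of a base a modulo N, after which gcd computations extract a factor. The quantum speedup lies entirely in the order-finding step; the sketch below finds the order by brute force (hopeless for real key sizes, trivial for N = 15) just to show that the reduction works.

```python
from math import gcd

def order(a, n):
    """Smallest r > 0 with a**r == 1 (mod n), found by brute force.
    This is the step a quantum computer performs exponentially faster."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n, a):
    """Classical reduction from factoring to order-finding."""
    r = order(a, n)
    if r % 2 != 0:
        return None            # odd order: would retry with another base
    y = pow(a, r // 2, n)
    f = gcd(y - 1, n)
    return f if 1 < f < n else None

# Order of 7 mod 15 is 4; 7**2 = 49 = 4 (mod 15); gcd(4 - 1, 15) = 3.
print(shor_classical(15, 7))  # prints 3, a factor of 15
```

Post-quantum signature schemes avoid this attack by resting on problems, such as lattice problems, for which no analogous quantum reduction is known.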

The security question is not quite so clean-cut as armor/anti-armor, but there is a developing bifurcation between defensive and offensive applications. On the defense side, deployed quantum systems are looked to for encoded communications. Experts say the level of activity in China around quantum communications, which has been a major focus there for years, likely runs up against the development of quantum computing in the U.S. The two aspects are not clearly pitted one against the other, but are moving independently.

Google's quantum supremacy demonstration has led to a rush to find algorithms robust against quantum attack. On the quantum communications side, the development of attacks on such systems has been underway for years, leading to a whole field of research based on identifying and exploiting quantum attacks.

Quantum computing could also help develop revolutionary AI systems. "Recent efforts have demonstrated a strong and unexpected link between quantum computation and artificial neural networks, potentially portending new approaches to machine learning. Such advances could lead to vastly improved pattern recognition, which in turn would permit far better machine-based target identification. For example, the hidden submarine in our vast oceans may become less hidden in a world with AI-empowered quantum computers, particularly if they are combined with vast data sets acquired through powerful quantum-enabled sensors," according to Q-CTRL's Biercuk.

"Even the relatively mundane near-term development of new quantum-enhanced clocks may impact security, beyond just making GPS devices more accurate," Biercuk continues. "Quantum-enabled clocks are so sensitive that they can discern minor gravitational anomalies from a distance. They thus could be deployed by military personnel to detect underground, hardened structures, submarines or hidden weapons systems. Given their potential for remote sensing, advanced clocks may become a key embedded technology for tomorrow's warfighter."

Warfighter capabilities

The early applications of quantum computing, while not embedded on mobile platforms, are expected to enhance warfighter capabilities significantly.

[Photo caption] Jim Clark, director of quantum hardware at Intel Corp. in Santa Clara, Calif., shows one of the company's quantum processors.

"There is a high likelihood quantum computing will impact ISR [intelligence, surveillance and reconnaissance], solving logistics problems more quickly. But so much of this is in the basic research stage. While we know the types of problems and the general application space, optimization problems will be some of the first where we will see advantages from quantum computing," says Sara Gamble, quantum information sciences program manager at ARL.

Biercuk says he agrees: "We're not really sure there is a role for quantum computing in embedded computing just yet. Quantum computers are right now very large systems embedded in mainframes, with access by the cloud. You can envision embedded computing accessing quantum computing via the cloud, but they are not likely to be very small, agile processors you would embed in a SWaP-constrained environment."

"But there are many aspects of quantum technology beyond quantum computing; the combination of quantum sensors could allow much better detection in the field," Biercuk continues. "The biggest potential impact comes in the area of GPS denial, which has become one of the biggest risk factors identified in every blueprint around the world. Quantum technology plays directly into this to perform dead-reckoning navigation in GPS-denial areas."

DARPA's Curcic also says the full power of quantum computing is still decades away, but she believes ONISQ has the potential to help speed its development.

"The two main approaches industry is using are superconducting quantum computing and trapped ions. We use both of those, plus cold atoms [Rydberg atoms]. We are very excited about ONISQ and seeing if we can get anything useful over classical computing. Four teams are doing hardware development with those three approaches," she says.

"Because these are noisy systems, it's very difficult to determine if there will be any advantages. The hope is we can address the optimization problem faster than today, which is what we're working on with ONISQ. Optimization problems are everywhere, so even a small improvement would be valuable."

Beyond todays capabilities

As to how quantum computing and AI may impact future warfare, especially through HPEC, she adds: "I have no doubt quantum computing will be revolutionary and we'll be able to do things beyond today's capabilities. The possibilities are pretty much endless, but what they are is not crystal clear at this point. It's very difficult, with great certainty, to predict what quantum computing will be able to do. We'll just have to build and try. That's why today is such an exciting time."

Curtiss-Wright's Carter says he believes quantum computing and AI will be closely linked with HPEC in the future, once current limitations on both are resolved.

"AI itself is based on a lot of math being done in parallel for probability answers, similar to modeling the neurons in the brain: highly interconnected nodes and interdependent math calculations. Imagine a small device trying to recognize handwriting," Carter says. "You run every pixel of that through lots and lots of math, combining and mixing, cutting some, amplifying others, until you get a 98 percent answer at the other end. Quantum computing could help with that, and researchers are looking at how you would do that, using a different level of parallel math."
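The "combining and mixing, cutting some, amplifying others" Carter describes is, concretely, weighted sums followed by a normalization into probabilities. The toy classifier below runs a flattened image (here just four made-up pixel values) through one dense layer plus a softmax; all weights are invented for illustration, not taken from any real handwriting model.

```python
import math

def softmax(scores):
    """Turn raw class scores into probabilities that sum to 1."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def classify(pixels, weights, biases):
    """One dense layer: each output class mixes, amplifies, and cuts
    pixel evidence via its own row of weights."""
    scores = [sum(w * p for w, p in zip(row, pixels)) + b
              for row, b in zip(weights, biases)]
    return softmax(scores)

pixels = [0.0, 0.9, 0.8, 0.1]               # hypothetical 2x2 "image"
weights = [[1.0, -1.0, -1.0, 1.0],          # detector for class 0
           [-1.0, 1.0, 1.0, -1.0]]          # detector for class 1
probs = classify(pixels, weights, [0.0, 0.0])
print(probs)  # class 1 dominates for this input
```

A real network stacks many such layers and learns the weights from data; the quantum-computing question Carter raises is whether parts of this massively parallel arithmetic could be done differently on quantum hardware.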

"How quantum computing will be applied to HPEC will be the big trick: how to get that deployed. Imagine we're a SIGINT [signals intelligence] platform, land, air or sea. There are a lot of challenges, such as picking the right signal out of the air, which is not particularly easy," Carter continues. "Once you achieve pattern recognition, you want to do code breaking to get that encrypted traffic immediately. Getting that on a deployed platform could be useful; otherwise you bring your data back to a quantum computer in a building, but that means you don't get the results immediately."

The technology research underway today is expected to show progress toward making quantum computing more applicable to military needs, but it is unlikely to produce major results quickly, especially in the area of HPEC.

"Trapped ions and superconducting circuits still require a lot of infrastructure to make them work. Some teams are working on that problem, but the systems still remain room-sized. The idea of quantum computing being like an integrated circuit you just put on a circuit board: we're a very long way from that," Biercuk says. "The systems are getting smaller, more compact, but there is a very long way to go to deployable, embeddable systems. Position, navigation and timing systems are being reduced and can be easily deployed on aircraft. That's probably where the technology will remain in the next 20 years; but, eventually, with new technology development, quantum computing may be reduced to more mobile sizes."

"The next 10 years are about achieving quantum advantage with the systems available now or iterations of them. Despite the acceleration we have seen, there are things that are just hard and require a lot of creativity," Biercuk continues. "We're shrinking the hardware, but that hardware still may not be relevant to any deployable system. In 20 years, we may have machines that can do the work required, but in that time we may only be able to shrink them to a size that can fit on an aircraft carrier: local code-breaking engines. To miniaturize this technology to put it on, say, a body-carried system, we just don't have any technology basis to claim we will get there even in 20 years. That's open to creativity and discovery."

Even with all of the research underway worldwide, one question remains dominant.

"The general challenge is it is not clear what we will use quantum computing for," notes Rad Balu, a computer scientist in ARL's Computational & Informational Sciences Directorate.

Continued here:
The future of artificial intelligence and quantum computing - Military & Aerospace Electronics