New EU Consortium shaping the future of Quantum Computing USA – PRNewswire

Europe has long excelled in academic research, but over the past few decades it has been slow to commercialize research projects compared to its international competition. This is starting to change with quantum technologies. In one of the largest efforts in Europe and worldwide, Germany announced €2 billion in funding for quantum programs in June 2020, of which €120 million is being invested in this current round of research grants.

Today, IQM announced that a quantum project consortium including Europe's leading startups (ParityQC, IQM), industry leaders (Infineon Technologies), research centers (Forschungszentrum Jülich), supercomputing centers (Leibniz Supercomputing Centre), and academia (Freie Universität Berlin) has been awarded €12.4 million from the German Federal Ministry of Education and Research (BMBF) (announcement in German).

The scope of the project is to accelerate commercialization through an innovative co-design concept. It focuses on application-specific quantum processors, which have the potential to create a fast lane to quantum advantage. The digital-analog concept used to operate the processors will further lay the foundation for commercially viable quantum computers. The project will run for four years and aims to develop a 54-qubit quantum processor.

The project is intended to support the European FET Flagship project EU OpenSuperQ, announced in 2018, which aims to design, build, and operate a quantum information processing system of up to 100 qubits. By deploying digital-analog quantum computing, this consortium adds a new angle to the OpenSuperQ project and widens its scope. With efforts from Munich, Berlin, and Jülich, as well as ParityQC from Austria, the project builds bridges and integrates seamlessly into the European quantum landscape.

"The grant from the Federal Ministry of Education and Research of Germany is a huge recognition of our unique co-design approach for quantum computers. Last year, when we established our office in Munich, this was one of our key objectives. The concept allows us to become a system integrator for full-stack quantum computers by bringing together all the relevant players. As Europe's leading startup in quantum technologies, this gives us confidence to further invest in Germany and other European countries," said Dr. Jan Goetz, CEO of IQM Quantum Computers.

As a European technology leader, Germany is taking several steps to lead the quantum technology race. An important part of such leadership is bringing together European startups, industry, research centers, and academic partners. This project will give the quantum landscape in Germany an accelerated push and will create a vibrant quantum ecosystem in the region for the future.

Additional Quotes:

"DAQC is an important project for Germany and Europe. It enables us to take a leading role in the area of quantum technologies. It also allows us to bring quantum computing into one of the prime academic supercomputing centres to more effectively work on the important integration of high-performance computing and quantum computing. We are looking forward to a successful collaboration," said Prof. Dr. Martin Schulz, Member of the Board of Directors, Leibniz Supercomputing Centre (LRZ).

"The path towards scalable and fully programmable quantum computing lies in the parallelizability of gates and in building with reduced complexity in order to ensure manageable qubit control. Our ParityQC architecture is the blueprint for a fully parallelizable quantum computer, which comes with the associated ParityOS operating system. With the extraordinary members of the DAQC consortium, this will allow us to tackle the most pressing and complex industry-relevant optimization problems," said Magdalena Hauser & Wolfgang Lechner, CEOs & Co-founders, ParityQC.

"We are looking forward to exploring and realizing a tight connection between hardware and applications, and having DAQC quantum computers as a compatible alternative within the OpenSuperQ laboratory. Collaborations like this across different states, and including both public and private partners, have the right momentum to move quantum computing in Germany forward," said Prof. Frank Wilhelm-Mauch, Director, Institute for Quantum Computing Analytics, Forschungszentrum Jülich.

"At Infineon, we are looking forward to collaborating with top-class scientists and leading start-ups in the field of quantum computing in Europe. We must act now if we in Germany and Europe do not want to become solely dependent on American or Asian know-how in this future technology area. We are very glad to be part of this highly innovative project and happy to contribute with our expertise in scaling and manufacturing processes," said Dr. Sebastian Luber, Senior Director Technology & Innovation, Infineon Technologies AG.

"This is a hugely exciting project. It is a chance for Europe and Germany to catch up in the development of superconducting quantum computers. I am looking forward to adventures in understanding how such machines can be certified in their precise functioning," said Prof. Jens Eisert, Professor of Quantum Physics, Freie Universität Berlin.

About IQM Quantum Computers:

IQM is the European leader in superconducting quantum computers, headquartered in Espoo, Finland. Since its inception in 2018, IQM has grown to more than 80 employees and has established a subsidiary in Munich, Germany, to lead the co-design approach. IQM delivers on-premises quantum computers for research laboratories and supercomputing centers and provides complete access to its hardware. For industrial customers, IQM delivers quantum advantage through a unique application-specific co-design approach. IQM has raised €71 million from VC firms and public grants, and is building Finland's first quantum computer.

For more information, visit http://www.meetiqm.com.

Registered offices:

IQM Finland Oy, Keilaranta 19, 02150 Espoo, Finland, www.meetiqm.com

IQM Germany GmbH, Nymphenburgerstr. 86, 80636 München, Germany

Media Contact: Raghunath Koduvayur, Head of Marketing and Communications, [emailprotected], +358504876509

Photo - https://mma.prnewswire.com/media/1437806/IQM_Quantum_Computers_Founders.jpg Photo - https://mma.prnewswire.com/media/1437807/IQM_Quantum_computer_design.jpg Logo - https://mma.prnewswire.com/media/1121497/IQM_Logo.jpg

SOURCE IQM Finland Oy

http://meetiqm.com/contact/


Quantum Isn't Armageddon; But Your Horse Has Already Left the Barn – PaymentsJournal

It is true that adversaries are collecting our encrypted data today so they can decrypt it later. In essence, anything sent using PKI (Public Key Infrastructure) today may very well be decrypted when quantum computing becomes available. Our recent report identifies the risk to account numbers and other long-tail data (data that still has high value five years or more into the future). Data you send today using traditional PKI is the horse that left the barn.

But this article describes a scary scenario in which an adversary's quantum computer hacks the US military's communications and uses that advantage to sink the US fleet. That is highly unlikely as long as government agencies follow orders. The US government specifies that AES-128 be used for secret (unclassified) information and AES-256 for top secret (classified) information. While AES-128 can be cracked using quantum computers, one estimate suggests that would take six months of computing time, which would be very expensive. Most estimates indicate that cracking AES-256 would take hundreds of years, and the military is already planning an even safer alternative; it just isn't yet in production (that I am aware of):
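The reason AES-256 holds up so much better is easy to quantify. Grover's algorithm, the best known quantum attack on symmetric keys, reduces an exhaustive search over an n-bit keyspace from roughly 2^n classical guesses to roughly 2^(n/2) quantum oracle queries. A minimal sketch of that arithmetic (the function names are illustrative, not from any cited source):

```python
def brute_force_guesses(key_bits: int) -> int:
    # Classical exhaustive key search: up to 2^n trials.
    return 2 ** key_bits

def grover_queries(key_bits: int) -> int:
    # Grover's algorithm needs on the order of sqrt(2^n) = 2^(n/2) queries.
    return 2 ** (key_bits // 2)

for bits in (128, 256):
    print(f"AES-{bits}: classical ~2^{bits} guesses, "
          f"Grover ~2^{bits // 2} quantum queries")
```

This back-of-the-envelope view matches the article's point: AES-128's effective 2^64 quantum work factor is expensive but conceivable, while AES-256 retains about 2^128 effective security even against Grover, which is why it is treated as safe for long-tail data.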

Arthur Herman conducted two formidable studies on what a single, successful quantum computing attack would do to both our banking systems and a major cryptocurrency. A single attack on the banking system by a quantum computer would take down Fedwire and cause $2 trillion of damage in a very short period of time. A similar attack on a cryptocurrency like bitcoin would cause a 90 percent drop in price and would start a three-year recession in the United States. Both studies were backed up by econometric models using over 18,000 data points to predict these cascading failures.

Another disastrous effect could be that an attacker with a CRQC (cryptographically relevant quantum computer) could take control of any system that relies on standard PKI. By hacking communications, the attacker could disrupt data flows and take control of a device, crashing it into the ground or even using it against its owner. Think of the number of autonomous vehicles we use in both civilian and military settings: passenger cars, military drones, ships, planes, and robots could all be hacked by a CRQC and shut down or made to perform activities never intended by their current users or owners.

Overview by Tim Sloane, VP, Payments Innovation at Mercator Advisory Group


PsiQuantum’s Path to 1 Million Qubits by the Middle of the Decade – HPCwire

PsiQuantum, founded in 2016 by four researchers with roots at Bristol University, Stanford University, and York University, is one of the few quantum computing startups that has kept a moderately low PR profile. (That's if you disregard the roughly $700 million in funding it has attracted.) The main reason is that PsiQuantum has eschewed the clamorous public chase for NISQ (noisy intermediate-scale quantum) computers and set out to develop a million-qubit system the company says will deliver big gains on big problems as soon as it arrives.

When will that be?

PsiQuantum says it will have all the manufacturing processes in place by the middle of the decade, and it's working closely with GlobalFoundries (GF) to turn its vision into reality. The generous size of its funding suggests many believe it will succeed. PsiQuantum is betting on a photonics-based approach called fusion-based quantum computing (paper) that relies mostly on well-understood optical technology but requires extremely precise manufacturing tolerances to scale up. It also relies on managing individual photons, something that has proven difficult for others.

Here's the company's basic contention:

Success in quantum computing will require large, fault-tolerant systems, and the current preoccupation with NISQ computers is an interesting but ultimately mistaken path. The most effective and fastest route to practical quantum computing will require leveraging (and innovating on) existing semiconductor manufacturing processes and networking thousands of quantum chips together to reach the million-qubit threshold that's widely regarded as necessary to run game-changing applications in chemistry, banking, and other sectors.

It's not that incrementalism is bad. In fact, it's necessary. But it's not well served when focused on delivering NISQ systems, argues Peter Shadbolt, one of PsiQuantum's founders and its current chief scientific officer.

"Conventional supercomputers are already really good. You've got to do some kind of step change; you can't increment your way [forward], and especially you can't increment with five qubits, 10 qubits, 20 qubits, 50 qubits to a million. That is not a good strategy. But it's also not true to say that we're planning to leap from zero to a million," said Shadbolt. "We have a whole chain of incrementally larger and larger systems that we're building along the way. Those allow us to validate the control electronics, the systems integration, the cryogenics, the networking, etc. But we're not spending time and energy trying to dress those up as something that they're not. We're not having to take those things and try to desperately extract computational value from something that doesn't have any computational value. We're able to use those intermediate systems for our own learnings and for our own development."

That's a much different approach from that of the majority of quantum computing hopefuls. Shadbolt suggests the broad message about the need to push beyond NISQ dogma is starting to take hold.

"There is a change happening now, which is that people are starting to program for error-corrected quantum computers, as opposed to programming for NISQ computers. That's a welcome change, and it's happening across the whole space. If you're programming for NISQ computers, you very rapidly get deeply entangled (if you'll forgive the pun) with the hardware. You start looking under the hood, and you start trying to find shortcuts to deal with the fact that you have so few gates at your disposal. So programming NISQ computers is a fascinating, intellectually stimulating activity, I've done it myself, but it rapidly becomes siloed and you have to pick a winner," said Shadbolt.

"With fault tolerance, once you accept that you're going to need error correction, you can start programming in a fault-tolerant gate set, which is hardware agnostic and much more straightforward to deal with. There are also some surprising characteristics: the optimizations you make to algorithms in a fault-tolerant regime are in many cases the diametric opposite of the optimizations you would make in the NISQ regime. It really takes a different approach, but it's very welcome that the whole industry is moving in that direction and spending less time on these kinds of myopic, narrow efforts," he said.

That sounds a bit harsh. PsiQuantum is no doubt benefiting from the manifold efforts of the young quantum computing ecosystem to tout advances and build traction by promoting NISQ use cases. There's an old business axiom that a little hype is often a necessary lubricant to accelerate the development of young industries; quantum computing certainly has its share. A bigger question is whether PsiQuantum will beat rivals to the end game. IBM has laid out a detailed roadmap and said 2023 is when it will start delivering quantum advantage, using a 1,000-qubit system, with plans for eventual million-qubit systems. Intel has trumpeted its CMOS strength for scaling up manufacturing of its quantum dot qubits. D-Wave has been selling its quantum annealing systems to commercial and government customers for years.

It's really not yet clear which of the qubit technologies (semiconductor-based superconducting, trapped ions, neutral atoms, photonics, or something else) will prevail, and for which applications. What's not ambiguous is PsiQuantum's go-big-or-go-home strategy. Its photonics approach, argues the company, has distinct advantages in manufacturability and scalability, operating environment (less frigid), ease of networking, and error correction. Shadbolt recently talked with HPCwire about the company's approach, technology, and progress.

What is fusion-based quantum computing?

Broadly, PsiQuantum uses a form of linear optical quantum computing in which individual photons are used as qubits. Over the past year and a half, the previously stealthy PsiQuantum has issued several papers describing the approach while keeping many details close to the vest (papers are listed at the end of the article). The computational flow is to generate single photons and entangle them. PsiQuantum uses dual-rail encoding for photons: the entangled photons are the qubits and are grouped into what PsiQuantum calls resource states. Fusion measurements (more below) act as gates. Shadbolt says the operations can be mapped to a standard gate set to achieve universal, error-corrected quantum computing.
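As a toy illustration of dual-rail encoding (a hedged sketch of the textbook idea, not PsiQuantum's actual design): a single photon shared between two waveguide modes behaves as a qubit, and a 50:50 beam splitter acts on that qubit like a single-qubit rotation.

```python
import math

# Dual-rail encoding: one photon shared across two waveguide modes.
# |0> = photon in mode A, |1> = photon in mode B.
ket0 = (1.0, 0.0)

def beam_splitter(state):
    """Idealized 50:50 beam splitter acting on the two rails
    (a Hadamard-like 2x2 unitary, one common textbook convention)."""
    a, b = state
    s = 1.0 / math.sqrt(2.0)
    return (s * (a + b), s * (a - b))

state = beam_splitter(ket0)          # equal superposition of the rails
probs = [amp ** 2 for amp in state]  # detection probability per output port
print(probs)                         # ~[0.5, 0.5]

# Applying the splitter twice returns the photon to mode A: the
# single-photon interference that linear optical computing builds on.
back = beam_splitter(state)
print(back)
```

The point of the sketch is that passive optical elements alone rotate dual-rail qubits; the hard parts, which the article goes on to describe, are generating the single photons and entangling them probabilistically.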

On-chip components carry out the process. It all sounds quite exotic, in part because it differs from the more widely used matter-based qubit technologies. The figure below, taken from the PsiQuantum paper "Fusion-based quantum computation" issued about a year ago, roughly describes the process.

Digging into the details is best done by reading the papers, and the company has archived videos exploring its approach on its website. The video below is a good brief summation by Mercedes Gimeno-Segovia, vice president of quantum architecture at PsiQuantum.

Shadbolt also briefly described fusion-based quantum computation (FBQC).

"Once you've got single photons, you need to build what we refer to as seed states. Those are pretty small entangled states and can be constructed, again, using linear optics. So you take some single photons and send them into an interferometer, and together with single-photon detection, you can probabilistically generate small entangled states. You can then multiplex those, and basically the task is to get as fast as possible to a large enough, complex enough, appropriately structured resource state which is ready to be acted upon by a fusion network. That's it. You want to kill the photon as fast as possible. You don't want photons living for a long time if you can avoid it. That's pretty much it," said Shadbolt.

"The fusion operators are the smallest, simplest piece of the machine. The multiplexed single-photon sources are the biggest, most expensive piece. Everything in the middle is kind of the secret sauce of our architecture; some of that we've put out in that paper, and you can see kind of how that works," he said. (At the risk of overkill, another brief description of the system from PsiQuantum is presented at the end of the article.)

One important FBQC advantage, says PsiQuantum, is that the shallow depth of optical circuits makes error correction easier. The small entangled states fueling the computation are referred to as resource states. Importantly, their size is independent of the code distance used or the computation being performed. This allows them to be generated by a constant number of operations. Since the resource states are measured immediately after they are created, the total depth of operations is also constant. As a result, errors in the resource states are bounded, which is important for fault tolerance.

Some of the differences between PsiQuantum's FBQC design and the more familiar MBQC (measurement-based quantum computing) paradigm are shown below.

Another advantage is the operating environment.

"Nothing about photons themselves requires cryogenic operation. You can do very high-fidelity manipulation and generation of qubits at room temperature, and in fact, you can even detect single photons at room temperature just fine. But the efficiency of room-temperature single-photon detectors is not good enough for fault tolerance. These room-temperature detectors are based on pretty complex semiconductor devices, avalanche photodiodes, and there's no physical reason why you couldn't push those to the necessary efficiency, but it looks really difficult [and] people have been trying for a very long time," said Shadbolt.

"We use a superconducting single-photon detector, which can achieve the necessary efficiencies without a ton of development. It's worth noting those detectors run in the ballpark of 4 kelvin, so liquid-helium temperature, which is still very cold, but nowhere near as cold as the milli-kelvin temperatures required for superconducting qubits or some of the competing technologies," said Shadbolt.

This has important implications for control-circuit placement, as well as for the reduced power needed to maintain the 4-kelvin environment.

There's a lot to absorb here, and it's best done directly from the papers. PsiQuantum, like many other quantum startups, was founded by researchers who were already digging into the quantum computing space, and they've shown that PsiQuantum's FBQC flavor of linear optical quantum computing will work. While at Bristol, Shadbolt was involved in the first demonstration of running a variational quantum eigensolver (VQE) on a photonic chip.

The biggest challenges for PsiQuantum, he suggests, are developing manufacturing techniques and a system architecture around well-known optical technology. The company argues that having a Tier-1 fab partner such as GlobalFoundries is decisive.

"You can go into infinite detail on the architecture and how all the bits and pieces go together. But the point of optical quantum computing is that the network of components is pretty complicated (all sorts of modules and structures and multiplexing strategies and resource-state generation schemes and interferometers, and so on) but they're all just made out of beam splitters, and switches, and single-photon sources and detectors. It's kind of like a conventional CPU: you can go in with a microscope and examine the structure of the cache and the ALU and whatever, but underneath it's all just transistors. It's the same kind of story here. The limiting factor in our development is the semiconductor process enablement. The thesis has always been that if you tried to build a quantum computer anywhere other than a high-volume semiconductor manufacturing line, your quantum computer isn't going to work," he said.

"Any quantum computer needs millions of qubits. Millions of qubits don't fit on a single chip. So you're talking about heaps of chips, probably billions of components realistically, and they all need to work, and they all need to work better than the state of the art. That brings us to the progress, which is, again, rearranging those various components into ever more efficient and complex networks, in pretty close analogy with CPU architecture. It's a very key part of our IP, but it's not rate-limiting, and it's not terribly expensive to change the network of components on the chip once we've got the manufacturing process. We're continuously moving the needle on that architecture development, and we've improved these architectures in terms of their tolerance to loss by more than 150x, [actually] well beyond that. We've reduced the size of the machine, purely through architectural improvements, by many, many orders of magnitude."

"The big, expensive, slow pieces of the development are in being able to build high-quality components at GlobalFoundries in New York. What we've already done there is to put single-photon sources and superconducting-nanowire single-photon detectors into that manufacturing process engine. We can build 300-millimeter wafers with tens of thousands of components on the wafer, including a full silicon photonics PDK (process design kit) and a very high-performing single-photon detector. That's real progress that brings us closer to being able to build a quantum computer, because that lets us build millions to billions of components."

Shadbolt says real systems will quickly follow development of the manufacturing process. PsiQuantum, like everyone in the quantum computing community, is collaborating closely with potential users. Roughly a week ago, it issued a joint paper with Mercedes-Benz discussing quantum-computer simulation of Li-ion chemistry. If the PsiQuantum-GlobalFoundries process is ready around 2025, can a million-qubit system (about 100 logical qubits) be far behind?

Shadbolt would only say that things will happen quickly once the process has been fully developed. He noted there are three ways to make money with a quantum computer: sell machines, sell time, and sell solutions that come from the machine. "I think we're exploring all of the above," he said.

"Our customers, a growing list at this point that includes pharmaceutical companies, car companies, materials companies, and big banks, are coming to us to understand what a quantum computer can do for them. To understand that, what we are doing, principally, is fault-tolerant resource counting," said Shadbolt. "So that means we're taking the algorithm, or taking the problem the customer has, working with their technical teams to look under the hood and understand the technical requirements of solving that problem. We are turning that into the appropriate quantum algorithms and subroutines. We're compiling that for the fault-tolerant gate set that will run on top of that fusion network, which, by the way, is a completely vanilla textbook fault-tolerant gate set."

Stay tuned.

PsiQuantum Papers

Fusion-based quantum computation, https://arxiv.org/abs/2101.09310

Creation of Entangled Photonic States Using Linear Optics, https://arxiv.org/abs/2106.13825

Interleaving: Modular architectures for fault-tolerant photonic quantum computing, https://arxiv.org/abs/2103.08612

Description of PsiQuantum's Fusion-Based System from the Interleaving Paper

Useful fault-tolerant quantum computers require very large numbers of physical qubits. Quantum computers are often designed as arrays of static qubits executing gates and measurements. Photonic qubits require a different approach. In photonic fusion-based quantum computing (FBQC), the main hardware components are resource-state generators (RSGs) and fusion devices connected via waveguides and switches. RSGs produce small entangled states of a few photonic qubits, whereas fusion devices perform entangling measurements between different resource states, thereby executing computations. In addition, low-loss photonic delays such as optical fiber can be used as fixed-time quantum memories simultaneously storing thousands of photonic qubits.

Here, we present a modular architecture for FBQC in which these components are combined to form interleaving modules consisting of one RSG with its associated fusion devices and a few fiber delays. Exploiting the multiplicative power of delays, each module can add thousands of physical qubits to the computational Hilbert space. Networks of modules are universal fault-tolerant quantum computers, which we demonstrate using surface codes and lattice surgery as a guiding example. Our numerical analysis shows that in a network of modules containing 1-km-long fiber delays, each RSG can generate four logical distance-35 surface-code qubits while tolerating photon loss rates above 2% in addition to the fiber-delay loss. We illustrate how the combination of interleaving with further uses of non-local fiber connections can reduce the cost of logical operations and facilitate the implementation of unconventional geometries such as periodic boundaries or stellated surface codes. Interleaving applies beyond purely optical architectures, and can also turn many small disconnected matter-qubit devices with transduction to photons into a large-scale quantum computer.
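The "multiplicative power of delays" described above is easy to estimate. Assuming a typical silica-fiber refractive index of about 1.46 and, purely for illustration, a 1 GHz photon repetition rate (a figure assumed here, not taken from the paper), a 1-km delay line holds thousands of photonic qubits in flight at once:

```python
# Back-of-the-envelope qubit storage in a fiber delay line.
# Assumptions (illustrative): refractive index 1.46, 1 GHz pulse rate.
C_VACUUM = 299_792_458.0     # speed of light in vacuum, m/s
N_FIBER = 1.46               # typical refractive index of silica fiber
LENGTH_M = 1_000.0           # 1-km delay line, as in the interleaving paper

delay_s = LENGTH_M * N_FIBER / C_VACUUM      # ~4.9 microseconds of storage
pulse_rate_hz = 1e9                          # assumed photon repetition rate
qubits_in_flight = delay_s * pulse_rate_hz   # photons stored simultaneously

print(f"delay: {delay_s * 1e6:.2f} us, "
      f"~{qubits_in_flight:.0f} qubits in flight")
```

Under these assumptions each kilometer of fiber buffers on the order of 5,000 photons, consistent with the paper's claim that low-loss delays can simultaneously store thousands of photonic qubits.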

Slides/Figures from various PsiQuantum papers and public presentations


The 4 biggest science breakthroughs that Gen Z could live to see – The Next Web

The only difference between science fiction and science is patience. Yesterday's mainframes are today's smartphones, and today's neural networks will be tomorrow's androids. But long before any technology becomes reality, someone has to dream it into existence.

The worlds of science and technology are constantly in flux. It's impossible to tell what the future will bring. However, we can make some educated guesses based on recent breakthroughs in the fields of nuclear physics, quantum computing, robotics, artificial intelligence, and Facebook's name change.

Let's set our time machines to January 28, 2100, to take an imaginary gander at the four most amazing science and technology breakthroughs the sort-of-far future has to offer.

This could very well be the most important technological breakthrough in human history.

The premise is simple: tiny machines that function at the cellular level capable of performing tissue repairs, destroying intruders, and delivering targeted nano-medications.

And this wouldn't necessarily mean filling your bloodstream with trillions of microscopic hunks of metal and silicon. There's plenty of reason to believe scientists could take today's biological robots and turn them into artificial intelligence agents capable of executing code functions inside our bodies.

Imagine an AI swarm controlled by a bespoke neural network attached to our brain-computer interfaces with the sole purpose of optimizing our biological functions.

We might not be able to solve immortality by 2100, but medical nanobots could go a long way towards bridging the gap.

Another technology that's sure to save innumerable human lives is fusion power. Luckily, we're on the verge of solving that one already (at least in a rudimentary, proof-of-concept kind of way). With any luck, by the time Gen Z's grandkids are old enough to drive, we'll have advanced the technology to the point of abundance.

And that's when we can finally start solving humanity's problems.

The big idea here is that we'll come close to perfecting fusion power in the future and, because of that, we'll be able to use quantum computers to optimize civilization.

Fusion could potentially be a limitless form of power, and it's theoretically feasible that we could eventually scale its energy-producing capabilities to such a degree that energy would be as ubiquitous for private and commercial use as air is.

Under such a paradigm, we can imagine a race to the top for scientific endeavor, the ultimate goal of which would be to produce a utopian society.

With near-infinite energy freely available, there would be little incentive to fight over resources and every incentive to optimize our existence.

And thats where quantum computers come in. If we can make classical algorithms learn to drive cars by building binary supercomputers, imagine what we could do with quantum supercomputing clusters harnessing the unbridled energy of entire stars.

We could assign algorithms to every living creature in the known universe and optimize for their existence. In essence, we could potentially solve the traveling salesman problem at the multiverse scale.

Admittedly, warp drives are a glamour technology. Technically speaking, with Mars so nearby, we don't really have to travel beyond our own solar system.

But it's well-documented that humanity has a need for speed. And if we ever have any intention of seeing stars other than Sol up close, we're going to need spaceships that can travel really, really fast.

The big problem here is that the universe doesn't appear to allow anything to travel faster than light. And that's pretty slow. Even at light speed, it would take over four years to reach the closest star beyond our sun. In galactic terms, that's like spending 1/20th of your life walking to the neighbor's house.

Warp drives could solve this. Instead of going faster, we could theoretically exploit the wackiness of the universe to go farther in a given amount of time without increasing speed.

This involves shifting through warp bubbles in space with exotic temporal properties, but in essence it's as simple as Einstein's observation that time works a bit differently at the edge of a black hole.

In the modern era, physicists are excited over some interesting equations and simulations that are starting to make the idea of warp drives seem less like science fiction and more like science.

An added benefit of the advent of the warp drive is that it would exponentially increase the odds of humans discovering alien life.

If aliens aren't right next door, then maybe they're a few blocks over. If we can start firing probes beyond non-warp ranges by 2100, who knows what our long-range sensors will be able to detect?

Don't laugh. It's understandable if you don't think the metaverse belongs on this list. After all, it's just a bunch of cartoon avatars and bad graphics that you need a VR headset for, right?

But the metaverse of 2100 will be something different entirely. In 2022, Spotify tries to figure out what song you want to hear based on the music you've listened to in the past. In 2100, your brain-embedded AI assistant will know what song you want to hear because it has a direct connection to the area of your mind that processes sound, memory, and emotion.

The ideal metaverse would be a bespoke environment that's only indistinguishable from reality in its utopianism. In other words, you'll only know it's fake because you can control the metaverse.

While it's obvious that jacking into the Matrix could pose a multitude of risks, the ability to take a vacation from reality could have positive implications ranging from treating depression to giving people with extremely low quality of life a reason to want to continue living.

The ultimate freedom is choosing your own reality. And it's a safe bet that whoever owns the server it runs on is who's going to be in charge of the future.

See more here:
The 4 biggest science breakthroughs that Gen Z could live to see - The Next Web

Why Is Silicon Valley Still Waiting for the Next Big Thing? – The New York Times

In the fall of 2019, Google told the world it had reached quantum supremacy.

It was a significant scientific milestone that some compared to the first flight at Kitty Hawk. Harnessing the mysterious powers of quantum mechanics, Google had built a computer that needed only three minutes and 20 seconds to perform a calculation that normal computers couldn't complete in 10,000 years.
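For a sense of scale, the claimed gap works out to a speedup of more than a billion (a rough sketch; the 10,000-year figure is Google's own estimate):

```python
# 3 minutes 20 seconds versus 10,000 years, per the 2019 claim.
SECONDS_PER_YEAR = 3.156e7

quantum_s = 3 * 60 + 20                  # 200 seconds
classical_s = 10_000 * SECONDS_PER_YEAR  # ~3.2e11 seconds

speedup = classical_s / quantum_s
print(f"Claimed speedup: about {speedup:.1e}x")  # ~1.6e9, over a billion
```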

But more than two years after Google's announcement, the world is still waiting for a quantum computer that actually does something useful. And it will most likely wait much longer. The world is also waiting for self-driving cars, flying cars, advanced artificial intelligence and brain implants that will let you control your computing devices using nothing but your thoughts.

Silicon Valley's hype machine has long been accused of churning ahead of reality. But in recent years, the tech industry's critics have noticed that its biggest promises, the ideas that really could change the world, seem further and further on the horizon. The great wealth generated by the industry in recent years has generally been thanks to ideas, like the iPhone and mobile apps, that arrived years ago.

Have the big thinkers of tech lost their mojo?

The answer, those big thinkers are quick to respond, is absolutely not. But the projects they are tackling are far more difficult than building a new app or disrupting another aging industry. And if you look around, the tools that have helped you cope with almost two years of a pandemic (the home computers, the videoconferencing services and Wi-Fi, even the technology that aided researchers in the development of vaccines) have shown the industry hasn't exactly lost a step.

"Imagine the economic impact of the pandemic had there not been the infrastructure, the hardware and the software, that allowed so many white-collar workers to work from home and so many other parts of the economy to be conducted in a digitally mediated way," said Margaret O'Mara, a professor at the University of Washington who specializes in the history of Silicon Valley.

As for the next big thing, the big thinkers say, give it time. Take quantum computing. Jake Taylor, who oversaw quantum computing efforts for the White House and is now chief science officer at the quantum start-up Riverlane, said building a quantum computer might be the most difficult task ever undertaken. This is a machine that defies the physics of everyday life.

A quantum computer relies on the strange ways that some objects behave at the subatomic level or when exposed to extreme cold, like metal chilled to nearly 460 degrees below zero Fahrenheit. If scientists merely try to read information from these quantum systems, they tend to break.
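That "nearly 460 degrees below zero" is a Fahrenheit figure, and converting it shows how close to absolute zero these machines run (a quick sketch; the refrigerator figure below is an illustrative assumption, not from the article):

```python
# Kelvin counts degrees above absolute zero (-459.67 degrees F),
# with Fahrenheit degrees 5/9 the size of a kelvin.
def fahrenheit_to_kelvin(f: float) -> float:
    return (f + 459.67) * 5.0 / 9.0

# Absolute zero itself, the coldest temperature physics allows:
print(f"{fahrenheit_to_kelvin(-459.67):.2f} K")  # 0.00 K

# A chilled qubit plate sits only hundredths of a kelvin above it:
print(f"{fahrenheit_to_kelvin(-459.6):.3f} K")   # 0.039 K
```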

While building a quantum computer, Dr. Taylor said, you are constantly working against the fundamental tendency of nature.

The most important tech advances of the past few decades (the microchip, the internet, the mouse-driven computer, the smartphone) were not defying physics. And they were allowed to gestate for years, even decades, inside government agencies and corporate research labs before ultimately reaching mass adoption.

"The age of mobile and cloud computing has created so many new business opportunities," Dr. O'Mara said. "But now there are trickier problems."

Still, the loudest voices in Silicon Valley often discuss those trickier problems as if they were just another smartphone app. That can inflate expectations.

"People who aren't experts, who don't understand the challenges, may have been misled by the hype," said Raquel Urtasun, a University of Toronto professor who helped oversee the development of self-driving cars at Uber and is now chief executive of the self-driving start-up Waabi.

Technologies like self-driving cars and artificial intelligence do not face the same physical obstacles as quantum computing. But just as researchers do not yet know how to build a viable quantum computer, they do not yet know how to design a car that can safely drive itself in any situation or a machine that can do anything the human brain can do.

Even a technology like augmented reality eyeglasses, which can layer digital images onto what you see in the real world, will require years of additional research and engineering before it is perfected.

Andrew Bosworth, vice president at Meta, formerly Facebook, said that building these lightweight eyeglasses was akin to creating the first mouse-driven personal computers in the 1970s (the mouse itself was invented in 1964). Companies like Meta must design an entirely new way of using computers, before stuffing all its pieces into a tiny package.

Over the past two decades, companies like Facebook have built and deployed new technologies at a speed that never seemed possible before. But as Mr. Bosworth said, these were predominantly software technologies built solely with bits, pieces of digital information.

Building new kinds of hardware, working with physical atoms, is a far more difficult task. "As an industry, we have almost forgotten what this is like," Mr. Bosworth said, calling the creation of augmented reality glasses a once-in-a-lifetime project.

Technologists like Mr. Bosworth believe they will eventually overcome those obstacles, and they are more open about how difficult it will be. But that's not always the case. And when an industry has seeped into every part of daily life, it can be hard to separate hand-waving from realism, especially when it is huge companies like Google and well-known personalities like Elon Musk drawing that attention.

Many in Silicon Valley believe that hand-waving is an important part of pushing technologies into the mainstream. The hype helps attract the money and the talent and the belief needed to build the technology.

"If the outcome is desirable and it is technically possible, then it's OK if we're off by three years or five years or whatever," said Aaron Levie, chief executive of the Silicon Valley company Box. You want entrepreneurs to be optimistic, to have a little bit of that Steve Jobs "reality-distortion field," which helped to persuade people to buy into his big ideas.

The hype is also a way for entrepreneurs to generate interest among the public. Even if new technologies can be built, there is no guarantee that people and businesses will want them and adopt them and pay for them. They need coaxing. And maybe more patience than most people inside and outside the tech industry will admit.

"When we hear about a new technology, it takes less than 10 minutes for our brains to imagine what it can do. We instantly compress all of the compounding infrastructure and innovation needed to get to that point," Mr. Levie said. "That is the cognitive dissonance we are dealing with."

Read the original post:
Why Is Silicon Valley Still Waiting for the Next Big Thing? - The New York Times

Quantum Computing Threatens Everything Could it be Worse Than the Apocalypse? – Entrepreneur

Opinions expressed by Entrepreneur contributors are their own.

A quantum computer is a machine that uses the laws of quantum theory to solve problems made harder by Moore's law (the number of transistors in a dense integrated circuit doubles about every two years). One example is factoring large numbers. Traditional computers are limited to logical circuits with several tens of transistors, while the number of transistors in a quantum processor may be on the order of one to two million. This means these computers will have exponential power, solving problems that traditional computation can't even identify or create solutions for.
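To see why factoring is the canonical hard problem here, a toy sketch of the brute-force classical approach (trial division; real attacks use smarter algorithms, but the cost still blows up as the number grows):

```python
import math

def trial_division(n: int) -> tuple[int, int]:
    """Return a nontrivial factorisation (p, q) of a semiprime n
    by testing divisors up to sqrt(n)."""
    for p in range(2, math.isqrt(n) + 1):
        if n % p == 0:
            return p, n // p
    raise ValueError("n is prime")

print(trial_division(15))                 # (3, 5), instant
# A product of two 7-digit primes already takes ~1e6 divisions;
# doubling the digit count squares the work, and so on.
print(trial_division(1000003 * 1000033))  # (1000003, 1000033)
```

A quantum computer running Shor's algorithm would, in principle, factor such numbers in time that grows only polynomially with their length, which is what makes the threat to encryption concrete.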

In the near future, quantum computers will be so advanced that they will have the capability to simulate very complicated systems. This could be used for simulations in physics, aerospace engineering, cybersecurity and much more. However, once this computer is built, it has the potential to unravel data encryption protocols. It could also potentially compromise air gaps due to its ability to scan vast distances for nearby networked devices or applications that are open. This means that it can become even simpler for external hackers. They may already have access to your computer or computer system via other avenues, like vulnerabilities in web browsers. They could find it much easier because you're not locking up all the doors.

Quantum computers point to a radically new understanding of computing. An understanding that could eventually be used to unlock problems now thought completely intractable. For now, the field seems ripe with potential. Scientists working on quantum computing call it one of the most interesting theoretical tools in artificial intelligence. Think of it as an incredibly powerful calculator programmed with deep domain expertise. Quantum computers promise answers to all sorts of mathematical, scientific and medical questions humans would never have the guts to tackle otherwise. They promise profound breakthroughs in imaging that will rival even experimental intracellular MRI scans; they may help crack wide-ranging databases that are currently unbreakable, or they might pick up scant details like geological signatures warning us about tsunamis long before they happen.

Quantum computers can theoretically be programmed to solve any complex computational problem. But, the act of programming the computer is so expensive and inflexible that someone would need to program it with all possible solutions. Quantum computers threaten everything. The worst part is that security experts can't ever say for sure what you can do to protect against their programming capabilities. They do know, however, that it's possible to reprogram them just as we would with a normal computer. It's just that the task is so complex and difficult that programming would be such a high-level security risk, it might as well never exist.

What does this all mean? It means we need to develop some sort of encryption technology on our smaller devices so not even those who hold all the world's data can see or access it. Quantum computers work differently than traditional computers. That gives the maker of a quantum computer more control than with a conventional computer. They can do things like reverse time and process large data with greater speed. The manufacturer will program the machine before release, which also comes with certain risks. If they change their mind and reprogram it per client needs, they put themselves at risk for security breaches. The catch is that the cryptography keys are only secure if you keep them secret. The slightest leak, say a pinhole camera across the table from something like a quantum computer, or a phone call or email intercepted while being decrypted, would enable an adversary to not just unscramble your message but steal your keys. The threat posed by quantum computing has been speculated about since before it was even technologically feasible to build a quantum computer. But now that we're nearly there, the situation might be even more dire than you can imagine.


As quantum computers allow for more efficient algorithms, the dangers of hacking increase. Such security risks have been a top priority at Google. They have high expectations for the approach they will take to create their future quantum machine. In the meantime, DARPA (Defense Advanced Research Projects Agency) has set out grand challenges for computer science with a hefty $2 million prize. DARPA's goal is to keep U.S. cyber strength relevant amid the rapid decline of Moore's Law and potential loss of global technological leadership. If quantum computers proliferate, they will threaten everything: not just bank records and medical documents, but everything. They represent a security leak so fundamental that it could be worse than the apocalypse. The quantum computer poses a possible threat to the infrastructure of the United States. Yet the American authorities do not have enough measures in place to stop this type of danger. One way that they can defend themselves is by inventing new safety standards that work with current technologies.

Whenever quantum computing matures, however, it will present a vigorous challenge. Computer scientists will need to develop the protocols and protections necessary to ensure security for this emerging technology. If these precautions are not taken, quantum computing could lead to disastrous outcomes in cybersecurity. A protocol needs to be developed to provide security for quantum computers. Hackers will be able to access and disrupt live systems, which creates an urgent need for advances in cybersecurity. These new systems can't just implement existing protection protocols because they're not fully developed yet. The cost of research and development is high, and the profits once the product is finished are relatively low.

Quantum computing is a hot topic at this moment in time that will impact society in ways we can't even predict if we don't acknowledge its significance now. Most computers today work in accordance with digital signals. If someone tries to hack the computer, it will change that digital signal into another form or cancel it out, which can be easily noticed. However, quantum computers use quantum bits for calculations. They are tied together in a way that makes them so sensitive to changes in information that they are exponentially more vulnerable to hacks than digital computers. If someone manages to hack a quantum computer (though not yet possible), it would have serious implications for maintaining our safety standards.


If the leaked NSA documents are to be believed, then we may be in for a rude awakening when quantum computers become technologically feasible. These machines will be able to perform calculations in far less time than any conventional computer and render our current encryptions ineffectual. The leaks claim that in 30 years, two medium-sized quantum computers would be able to break even the security of the RSA cryptosystem, which is currently set at 2048 bits.
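A toy sketch of why efficient factoring would gut RSA: anyone who can factor the public modulus can rederive the private key. The numbers below are deliberately tiny textbook values, nothing like real 2048-bit keys:

```python
import math

p, q = 61, 53                  # secret primes (toy-sized)
n, e = p * q, 17               # public key: modulus 3233, exponent 17
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)            # private exponent (modular inverse)

msg = 42
cipher = pow(msg, e, n)        # anyone can encrypt with the public key

# Attacker: factor n (instant at this size; Shor's algorithm would
# make it feasible at real key sizes), then rebuild the private key.
for fp in range(2, math.isqrt(n) + 1):
    if n % fp == 0:
        fq = n // fp
        break
d_recovered = pow(e, -1, (fp - 1) * (fq - 1))
print(pow(cipher, d_recovered, n))   # 42, plaintext recovered
```

The entire security of the scheme rests on factoring n being impractical, which is exactly the assumption a large quantum computer would break.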

Any business that relies on modern cryptography is at risk of being hacked in the near future. But what can companies do to protect themselves? As it turns out, there are some pretty straightforward solutions with which firms can preserve (or improve) security amid all this hullabaloo with quantum computing. The authors recommend investing in technologies like Bitcoin, the blockchain and TLS (Transport Layer Security).

In simple terms, quantum computers process information differently from today's digital computers. This is because of their ability to have bits which sit in more than one state simultaneously, meaning they can perform many calculations at a time. In a future dominated by quantum computing, all regular computing will be made virtually obsolete. Hackers will be able to access the deepest secrets of companies without needing a password. To avoid this fate, companies need to embrace encryption techniques that guard against quantum technology, but they cannot afford to stop innovating too drastically.
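The "more than one state simultaneously" idea can be sketched with plain arithmetic: a Hadamard operation gives a qubit equal amplitudes for 0 and 1, and describing n such qubits takes 2**n amplitudes (a simplified pure-Python sketch, not a real simulator):

```python
import math

# One qubit after a Hadamard operation: equal amplitudes for the
# 0 and 1 outcomes, i.e. both states at once until measured.
inv_sqrt2 = 1 / math.sqrt(2)
state = [inv_sqrt2, inv_sqrt2]

# Squared amplitudes give the measurement probabilities.
probs = [round(a * a, 10) for a in state]
print(probs)                     # [0.5, 0.5]

def kron(a, b):
    """Tensor product of two amplitude vectors."""
    return [x * y for x in a for y in b]

# Describing n qubits takes 2**n amplitudes; here, 3 qubits -> 8.
three_qubits = kron(kron(state, state), state)
print(len(three_qubits))         # 8
```

That exponential growth in the state description is the loose sense in which a quantum machine works on "many calculations at a time."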

The looming potential threat of quantum computing should be taken seriously, but this doesn't mean you should panic. The best way to protect yourself is to plan ahead and think about possible solutions. Incorporating elements of quantum cryptography may not always be possible for every client because of the cost. But, it could help secure an important client who cannot risk future interference in their sensitive operations.


Excerpt from:
Quantum Computing Threatens Everything Could it be Worse Than the Apocalypse? - Entrepreneur

Quantum computers are on the path to solving bigger problems for BMW, LG and others – CNET

Marissa Giustina, a researcher with Google's quantum computer lab, draws a diagram showing "quantum supremacy" as only an early step on a path of quantum computer progress.

After years of development, quantum computers reached a level of sophistication in 2021 that emboldened commercial customers to begin dabbling with the radical new machines. Next year, the business world may be ready to embrace them more enthusiastically.

BMW is among the manufacturing giants that see the promise of the machines, which capitalize on the physics of the ultrasmall to soar over some limits of conventional computers. Earlier this month, the German auto giant chose four winners in a contest it hosted with Amazon to spotlight ways the new technology could help the automaker.

The carmaker found quantum computers have potential to optimize the placement of sensors on cars, predict metal deformation patterns and employ AI in quality checks.

"We at the BMW Group are convinced that future technologies such as quantum computing have the potential to make our products more desirable and sustainable," Peter Lehnert, who leads BMW's research group, said in a statement.

BMW isn't alone in its determination to evaluate the practical application of quantum computers. Aerospace giant Airbus, financial services company PayPal and consumer electronics maker LG Electronics are among the commercial businesses looking to use the machines to refine materials science, streamline logistics and monitor payments.

For years, researchers worked on quantum computers as more or less conceptual projects that take advantage of qubits, data processing elements that can hold more than the two states that are handled by transistors found in conventional computers. Even as they improved, quantum computers were best suited for research projects, some as basic as figuring out how to program the exotic machines. But at the current rate of progress, they'll soon become powerful enough to tackle computing jobs out of reach of conventional computers.

Like cloud computing before it, quantum computing will be a service that most corporations rent from other companies. The rigs require constant attention and are notoriously fiddly. Though more work is required to tap their full potential, quantum computers are becoming more and more stable, a development that's helping corporations overcome initial hesitance.

Georges-Olivier Reymond, chief executive of startup Pasqal, says the progress is turning around skeptics who previously viewed quantum computing as a fantasy. A few years ago, employees at large corporations would roll their eyes when he brought up the subject, but that's changed, Reymond says.

"Now each time I talk to them I have a positive answer," Reymond said. "They are ready to engage."

One new customer is European defense contractor Thales, which is interested in quantum computing applications in sensors and communications. "Pasqal's quantum processors can efficiently address large size problems that are completely out of reach of classical computing systems," Thales Chief Technology Officer Bernhard Quendt said in a statement.

Of course, quantum computing is still a tiny fraction of the traditional computing market, but it's growing fast. About $490 million was spent on quantum computers, software and services in 2021, Hyperion Research analyst Bob Sorensen said at the Q2B conference held by quantum computing software company QC Ware in December. He expects spending to grow by 22% to $597 million in 2022 and at an average of 26% a year through 2024. By comparison, spending on conventional computing is expected to rise 4% in 2021 to $3.8 trillion, Gartner analysts predict.
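Those projections are easy to sanity-check (a sketch reproducing the quoted figures; the dollar off-by-one against the article's $597 million is rounding):

```python
# Hyperion Research figures: $490M in 2021, +22% in 2022,
# then ~26% a year through 2024.
spend = 490e6
spend *= 1.22
print(f"2022: ${spend / 1e6:.0f}M")     # $598M, matching the article

for year in (2023, 2024):
    spend *= 1.26
    print(f"{year}: ${spend / 1e6:.0f}M")
```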

The growing commercial activity is notable given that using a quantum computer costs $3,000 to $5,000 per hour, according to Jean-Francois Bobier, an analyst at Boston Consulting Group. A conventional, high-performance computer hosted on a cloud service costs a half penny for the same amount of time.
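Spelling out that price gap (assuming Bobier's hourly figures and the half-penny cloud rate from the article):

```python
# Quantum hours versus a conventional cloud machine's half penny.
conventional_per_hour = 0.005
for quantum_per_hour in (3000, 5000):
    ratio = quantum_per_hour / conventional_per_hour
    print(f"${quantum_per_hour}/hr is {ratio:,.0f}x the conventional cost")
# $3000/hr is 600,000x the conventional cost
# $5000/hr is 1,000,000x the conventional cost
```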

Analysts say the real spending on quantum computing will start when the industry tackles error correction, a solution to the vexing problem of easily perturbed qubits that derail calculations. The fidelity of a single computing step on the most advanced machines is around 99.9%, leaving a degree of flakiness that makes a raw quantum computing calculation unreliable. As a result, quantum computers have to run the same calculation many times to provide confidence that the answer is correct.
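Assuming independent errors, that per-step fidelity compounds, which is why so many repeated runs are needed (a simplified sketch; real error models are messier):

```python
# Chance that a circuit of n steps runs error-free at 99.9%
# fidelity per step: 0.999 ** n, assuming independent errors.
fidelity = 0.999

for n in (100, 1000, 5000):
    p_ok = fidelity ** n
    print(f"{n} steps: {p_ok:.1%} chance of an error-free run")
# 100 steps: 90.5%
# 1000 steps: 36.8%
# 5000 steps: 0.7%
```

At a thousand steps, fewer than four runs in ten finish cleanly, which is the flakiness the article describes.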

Once error correction is mature, the revenue generated through quantum computing will explode, according to Boston Consulting Group. With today's machines, that value will likely total between $5 billion and $10 billion by 2025, according to the consultancy's estimates. Once error corrected machines arrive, the total could leap forward to hit $450 billion to $850 billion by 2040.

Software and services that hide the complexity of quantum computers also will boost usage. IonQ CEO Peter Chapman predicts that in 2022, developers will be able to easily train their AI models with quantum computers. "You don't need to know anything about quantum," Chapman said. "You just give it the data set and it spits back a model."


Quantum computers today are more of a luxury than a necessity. But with their potential to transform materials science, shipping, financial services and product design, it's not a surprise companies like BMW are investing. The automaker stands to benefit from knowing better how materials will deform in a crash or training its vehicles' vision AI faster. Though quantum computers might not produce a payoff this year or next, there's a cost to missing out on the technology once it matures.

Link:
Quantum computers are on the path to solving bigger problems for BMW, LG and others - CNET

BMW and AWS Announce Winners of Quantum Computing …

The BMW Group, in collaboration with AWS, last July called on the global quantum computing community to develop innovative quantum algorithms for four specific industrial challenges and test them on real quantum computing technologies. About 70 teams participated in the challenge and a winning team was selected for the four areas:

1. Sensor positions for automated driving functions: Accenture
Accenture's winning team tackled the problem of optimising the positioning of sensors for highly automated driving functions.

2. Simulation of material deformations: Qu&Co
The jury concluded that the quantum computing start-up Qu&Co stood out with its approach to solving partial differential equations in the field of numerical simulation.

3. Configuration optimisation of pre-series vehicles: 1QBit and NTT
The winning team from 1QBit and NTT came out on top with hybrid algorithms for solving satisfiability problems in propositional logic for optimising equipment configuration.

4. Automated quality analyses: QC Ware
The QC Ware team stood out with its approach, drawn from the field of machine learning, that can be used in image recognition in the area of quality analysis.

The BMW Group worked closely with the Amazon Quantum Solutions Lab Professional Services team, an expert group of professionals, throughout the challenge, right up to the moment when the winners were determined. AWS also provided credits for the use of Amazon Braket, enabling the development and testing of the submitted quantum algorithms. Amazon Braket provides a development environment to explore and create quantum algorithms, test them on quantum circuit simulators and run them on different quantum hardware technologies.

The jury that oversaw the challenge and ultimately decided on the winning teams also included professors from the Technical University of Munich (TUM) as well as representatives of the BMW Group and AWS. TUM is an important partner for the BMW Group for research in the field of quantum computing. The BMW Group announced the establishment of the Quantum Algorithms and Applications endowed chair at TUM back in June of this year. Algorithms close to specific use cases along the industrial value chain are being researched at the chair. The BMW Group is providing 5.1 million euros over a period of six years to fund the professorship, staff and equipment at TUM.

Quantum computing is one of the most promising future technologies in the automotive sector. It has enormous potential for research into materials, for complex optimisation problems and for the future of automated driving. The Quantum Computing Challenge once again underlines the BMW Group's leading-edge role in building a quantum ecosystem. As recently as June, the company was a founding member, along with nine other large corporations, of the Quantum Technology and Application Consortium (QUTAC). This aims to specifically accelerate the development of the technology in Germany and Europe. In November this year, the BMW Group and RWTH Aachen University jointly announced the establishment of the Quantum Information Systems endowed chair, where software and industrialisation competencies will be created to realise a quantum advantage in the medium term.

Dr Peter Lehnert, Vice President BMW Group Research and New Technologies, Digital Car: "We at the BMW Group are convinced that future technologies such as quantum computing have the potential to make our products more desirable and sustainable. We have succeeded in reaching the global quantum computing community with our crowd-innovation approach and enthusing them about automotive use cases. We look forward to continuing to work with the winners."

The BMW Group received around 70 submissions from all over the world from different areas such as international and national research groups, the start-up scene and established companies. The exceptionally high quality of the submissions enables new perspectives and offers potential for innovative approaches to solutions, such as the development and refinement of new algorithms. The expert jury took into account criteria such as comprehensibility, feasibility, scalability, innovation and benefit for the BMW Group when evaluating the submitted solutions.

All 15 finalists set themselves apart with their high innovation potential and have therefore been shortlisted for future projects. The journey continues straight away for the four winners: they immediately gain the BMW Group as a customer and will be involved in the further development of the pilot projects. The company looks forward to working with these four winners.

The BMW Group Quantum Computing Challenge is structured around the Supplierthon methodology, which is the BMW Group's future-oriented supplier scouting method. It marks the company's first global crowd-innovation initiative on this scale. The crowd innovation approach enables innovative solutions to be found within a very short time and to be validated in cooperation with the specialist departments. The challenge also gave the BMW Group invaluable insights into the status quo of the global quantum ecosystem. This knowledge is crucial in determining the future direction of research on the future technology and the long-term establishment of the market for quantum computing. The successful challenge along with the extremely promising submissions encourage the company to continue to look to the crowd innovation approach in the future.

See original here:
BMW and AWS Announce Winners of Quantum Computing ...

Neural’s best quantum computing and physics stories from 2021 – The Next Web

2021 will be remembered for a lot of things, but when it's all said and done we think it'll eventually get called the year quantum computing finally came into focus.

That's not to say useful quantum computers have actually arrived yet. They're still somewhere between a couple of years and a couple of centuries away. Sorry for being so vague, but when you're dealing with quantum physics there aren't yet many guarantees.

This is because physics is an incredibly complex and challenging field of study. And the difficulty gets cranked up exponentially when you start adding "theoretical" and "quantum" to the research.

We're talking about physics at the very edge of reason. Like, for example, imagining a quantum-powered artificial intelligence capable of taking on the Four Horsemen of the Apocalypse.

That might sound pretty wacky, but this story explains why it's not quite as out there as you might think.

But let's go even further. Let's go past the edge of reason and into the realm of speculative science. Earlier this year we wondered what would happen if physicists could actually prove that reality as we know it isn't real.

Per that article:

Theoretically, if we could zoom in past the muons and leptons and keep going deeper and deeper, we could reach a point where all objects in the universe are indistinguishable from each other because, at the quantum level, everything that exists is just a sea of nearly-identical subparticulate entities.

This version of reality would render the concepts of space and time pointless. Time would only exist as a construct by which we give meaning to our own observations. And those observations would merely be the classical side-effects of existing in a quantum universe.

So, in the grand scheme of things, it's possible that our reality is little more than a fleeting, purposeless arrangement of molecules. Everything that encompasses our entire universe may be nothing more than a brief hallucination caused by a quantum vibration.

Nothing makes you feel special like trying to conceive of yourself as a few seasoning particles in an infinite soup of gooey submolecules.

If having an existential quantum identity crisis isn't your thing, we also covered a lot of cool stuff that doesn't require you to stop seeing yourself as an individual stack of materials.

Does anyone remember the time China said it had built a quantum computer a million times more powerful than Google's? We don't believe it. But that's the claim the researchers made. You can read more about that here.

Oh, and that Google quantum system the Chinese researchers referenced? Yeah, it turns out it wasn't exactly the massive upgrade over classical supercomputers it was chalked up to be either.

But, of course, we forgive Google for its marketing faux pas. And that's because, hands down, the biggest story of the year for quantum computers was the time crystal breakthrough.

As we wrote at the time:

If Google's actually created time crystals, it could accelerate the timeline for quantum computing breakthroughs from "maybe never" to "maybe within a few decades."

At the far-fetched, super-optimistic end of things we could see the creation of a working warp drive in our lifetimes. Imagine taking a trip to Mars or the edge of our solar system, and being back home on Earth in time to catch the evening news.

And, even on the conservative end with more realistic expectations, it's not hard to imagine quantum computing-based chemical and drug discovery leading to universally effective cancer treatments.

Talk about a eureka moment!

But there were even bigger things in the world of quantum physics than just advancing computer technology.

Scientists from the University of Sussex determined that black holes emanate a specific kind of quantum pressure that could lend some credence to multiple universe theories.

Basically, we can't explain where the pressure comes from. Could this be blowback from white holes swallowing up energy and matter in a dark, doppelganger universe that exists parallel to our own? Nobody knows! You can read more here though.

Still, there were even bigger philosophical questions in play over the course of 2021 when it came to interpreting physics research.

Are we incapable of finding evidence for God because we're actually gods in our own right? That might sound like philosophy, but there are some pretty radical physics interpretations behind that assertion.

And, if we are gods, can we stop time? Turns out, whether we're just squishy mortal meatbags or actual deities, we actually can!

Alright. If none of those stories impress you, we've saved this one for last. If being a god, inventing time crystals, or even stopping time doesn't float your boat, how about immortality? And not just regular boring immortality, but quantum immortality.

It's probably not probable, and adding the word quantum to something doesn't necessarily make it cooler, but anything's possible in an infinite universe. Plus, the underlying theories involving massive-scale entanglement are incredible; read more here.

Seldom a day goes by when something incredible isn't happening in the world of physics research. But that's nothing compared to the magic we've yet to uncover out there in this fabulous universe we live in.

Luckily for you, Neural will be back in 2022 to help make sense of it all. Stick with us for the most compelling, wild, and deep reporting on the quantum world this side of the non-fiction realm.

Excerpt from:
Neural's best quantum computing and physics stories from 2021 - The Next Web

Research Opens the Door to Fully Light-Based Quantum Computing – Tom’s Hardware

A team of researchers with Japan's NTT Corporation, the University of Tokyo, and the RIKEN research center have announced the development of a fully photonics-based approach to quantum computing. Taking advantage of the quantum properties of squeezed light sources, the researchers expect their work to pave the road towards faster and easier deployments of quantum computing systems, avoiding many practical and scaling pitfalls of other approaches. Furthermore, the team is confident their research can lead towards the development of rack-sized, large-scale quantum computing systems that are mostly maintenance-free.

The light-based approach in itself brings many advantages compared to traditional quantum computing architectures, which can be based on a number of approaches (trapped ions, silicon quantum dots, and topological superconductors, just to name a few). However, all of these approaches are somewhat limited from a physics perspective: they all need to employ electronic circuits, which leads to Ohmic heating (the waste heat that results from electrical signals' trips through resistive semiconductor wiring). At the same time, photonics enable tremendous improvements in latency due to data traveling at the speed of light.

Photonics-based quantum computing takes advantage of emerging quantum properties in light. The technical term here is squeezing: the more squeezed a light source is, the more quantum behavior it demonstrates. While a minimum squeezing level of over 65% was previously thought required to unlock the necessary quantum properties, the researchers achieved a higher, 75% factor in their experiments. In practical terms, their quantum system unlocks a frequency band higher than 6 THz, thus taking advantage of the benefits of photonics for quantum computing without decreasing the available bandwidth to unusable levels.

The researchers thus expect their photonics-based quantum design to enable easier deployments: there's no need for the exotic temperature controls (essentially sub-zero freezers) that are usually required to maintain quantum coherence on other systems. Scaling is also made easier and simplified: there's no need to increase the number of qubits by interlinking several smaller, coherent quantum computing units. Instead, the number of qubits (and thus the performance of the system) can be increased by continuously dividing light into "time segments" and encoding different information in each of these segments. According to the team, this method allows them to "easily increase the number of qubits on the time axis without increasing the size of the equipment."

All of these elements combined allow for a reduction in required raw materials while doing away with the complexity of maintaining communication and quantum coherence between multiple, small quantum computing units. The researchers will now focus on actually building the photonics-based quantum computer. Considering how they estimate their design can scale up towards "millions of qubits," their contributions could enable a revolutionary jump in quantum computation that skips the expected "long road ahead" for useful qubit counts to be achieved.

Original post:
Research Opens the Door to Fully Light-Based Quantum Computing - Tom's Hardware

Columbia Research Sends Letter to Senator Chuck Schumer and Others – Columbia University

This week, Jeannette Wing, the executive vice president of research at Columbia, reached out to Senators Kirsten Gillibrand and Chuck Schumer and Representatives Adriano Espaillat, Mondaire Jones, and Jerrold Nadler to express the university's support of the Build Back Better bill, which provides "critical new funding to revitalize and strengthen our nation's scientific research enterprise."

In the letters, she wrote that "it is particularly encouraging that the bill also identifies both urgent and emerging areas of research as funding priorities, such as climate science, biotechnology, artificial intelligence, and quantum computing. These areas of research demand our immediate attention to improve our collective health and prosperity, and to ensure our role as the world's innovation leader."

Wing urged the delegates to support the following provisions and investments included in the House bill:

House Committee on Science, Space, and Technology

House Committee on Energy and Commerce

House Committee on Agriculture

See the rest here:
Columbia Research Sends Letter to Senator Chuck Schumer and Others - Columbia University

QCE21 Home IEEE Quantum Week

IEEE Quantum Week, the IEEE International Conference on Quantum Computing and Engineering (QCE), is bridging the gap between the science of quantum computing and the development of an industry surrounding it. As such, this event brings a perspective to the quantum industry different from academic or business conferences. IEEE Quantum Week is a multidisciplinary quantum computing and engineering venue that gives attendees the unique opportunity to discuss challenges and opportunities with quantum researchers, scientists, engineers, entrepreneurs, developers, students, practitioners, educators, programmers, and newcomers.

IEEE Quantum Week 2021 received outstanding contributions from the international quantum community, forming an exceptional program with exciting exhibits featuring technologies from quantum companies, start-ups and research labs. QCE21, the second IEEE International Conference on Quantum Computing and Engineering, provides over 300 hours of quantum and engineering programming featuring 10 world-class keynote speakers, 19 workforce-building tutorials, 23 community-building workshops, 48 technical papers, 30 innovative posters, 18 stimulating panels, and Birds-of-a-Feather sessions. The QCE21 program is structured into 10 parallel tracks over six days, October 17-22, 2021, and is available on-demand for registered participants until the end of the year.

The QCE conference grew out of the IEEE Future Directions Quantum Initiative in 2019 and held its inaugural IEEE Quantum Week event in October 2020. IEEE Quantum Week 2020 was a tremendous success with over 800 attendees from 45 countries and 270+ hours of quantum computing and engineering programming in nine parallel tracks over five days.

With your contributions and your participation, together we are building a premier meeting of quantum minds to help advance the fields of quantum computing and engineering. As a virtual event, Quantum Week provides ample opportunities to network with your peers and explore partnerships with industry, government, and academia. Quantum Week 2021 aims to bring together quantum professionals, researchers, educators, entrepreneurs, champions and enthusiasts to exchange and share their experiences, challenges, research results, innovations, applications, pathways and enthusiasm on all aspects of quantum computing and engineering.

IEEE Quantum Week aims to showcase quantum research, practice, applications, education, and training including programming systems, software engineering methods & tools, algorithms, benchmarks & performance metrics, hardware engineering, architectures, & topologies, software infrastructure, hybrid quantum-classical computing, architectures and algorithms, as well as many applications including simulation of chemical, physical and biological systems, optimization problems, techniques and solutions, and quantum machine learning.

Link:
QCE21 Home IEEE Quantum Week

IBM and ExxonMobil are building quantum algorithms to solve this giant computing problem – ZDNet

Research teams from energy giant ExxonMobil and IBM have been working together to find quantum solutions to one of the most complex problems of our time: managing the tens of thousands of merchant ships crossing the oceans to deliver the goods that we use every day.

The scientists lifted the lid on the progress that they have made so far and presented the different strategies that they have been using to model maritime routing on existing quantum devices, with the ultimate goal of optimizing the management of fleets.

ExxonMobil was the first energy company to join IBM's Quantum Network in 2019, and has expressed a keen interest in using the technology to explore various applications, ranging from the simulation of new materials to solving optimization problems.

SEE: Research: Why Industrial IoT deployments are on the rise (TechRepublic Premium)

Now, it appears that part of the energy company's work was dedicated to tapping quantum capabilities to calculate journeys that minimize the distance and time traveled by merchant ships across the globe.

On a worldwide scale, the equation is immense; intractable, in fact, for classical computers. About 90% of world trade relies on maritime shipping, with more than 50,000 ships, themselves carrying up to 200,000 containers each, moving around every day to transport goods with a total value of $14 trillion.

The more the number of ships and journeys increases, the bigger the problem becomes. As IBM and ExxonMobil's teams put it in a blog post detailing their research: "Logistically speaking, this isn't the 'traveling salesperson problem.'"

While this type of exponentially growing problem can only be solved with simplifications and approximations on classical computers, the challenge is well-suited to quantum technologies. Quantum computers can effectively leverage superposition, the special dual state taken on by quantum bits, or qubits, to run many calculations at once; meaning that even the largest problems could be resolved in much less time than is possible on a classical computer.

"We wanted to see whether quantum computers could transform how we solve such complex optimization problems and provide more accurate solutions in less computational times," said the researchers.

Although the theory behind the potential of quantum computing is well-established, it remains to be found how quantum devices can be used in practice to solve a real-world problem such as the global routing of merchant ships. In mathematical terms, this means finding the right quantum algorithms that could be used to most effectively model the industry's routing problems, on current or near-term devices.

To do so, IBM and ExxonMobil's teams started with widely-used mathematical representations of the problem, which account for factors such as the routes traveled, the potential movements between port locations and the order in which each location is visited on a particular route. There are many existing ways to formulate the equation, one of which is called the quadratic unconstrained binary optimization (QUBO) technique, and which is often used in classical computer science.
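To make the QUBO idea concrete, here is a minimal brute-force solver sketch. The 3-variable matrix Q below is a made-up toy instance, not one from the IBM/ExxonMobil paper, and exhaustive search is only feasible for tiny problems; the point is just to show what "quadratic unconstrained binary optimization" means:

```python
import itertools

def solve_qubo(Q):
    """Minimize x^T Q x over binary vectors x by exhaustive search.

    Q is an n x n matrix (list of lists). Enumerating all 2**n
    assignments is only practical for small n, which is exactly why
    larger instances motivate heuristic or quantum solvers.
    """
    n = len(Q)
    best_x, best_val = None, float("inf")
    for bits in itertools.product([0, 1], repeat=n):
        val = sum(Q[i][j] * bits[i] * bits[j]
                  for i in range(n) for j in range(n))
        if val < best_val:
            best_x, best_val = bits, val
    return best_x, best_val

# Toy instance: diagonal entries reward selecting a variable,
# off-diagonal entries penalize selecting adjacent variables together.
Q = [[-1, 2, 0],
     [0, -1, 2],
     [0, 0, -1]]
x, val = solve_qubo(Q)  # x = (1, 0, 1), val = -2
```

Routing constraints (each port visited once, vehicle capacities, and so on) are folded into Q as penalty terms, which is how an inherently constrained problem becomes "unconstrained" in the QUBO sense.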

The next question was to find out whether well-known models like QUBO can be solved with quantum algorithms and if so, which solvers work better. Using IBM's Qiskit optimization module, which was released last year to assist developers in building quantum optimization algorithms, the team tested various quantum algorithms labeled with unbeatably exotic names: the Variational Quantum Eigensolver (VQE), the Quantum Approximate Optimization Algorithm (QAOA), and Alternating Direction Method of Multipliers (ADMM) solvers.

After running the algorithms on a simulated quantum device, the researchers found that models like QUBO could effectively be solved by quantum algorithms, and that depending on the size of the problem, some solvers showed better results than others.

In another promising finding, the team said that the experiment showed that some degree of inexactness in solving QUBOs is tolerable. "This is a promising feature to handle the inherent noise affecting the quantum algorithms on real devices," said the researchers.

SEE: BMW explores quantum computing to boost supply chain efficiencies

Of course, while the results suggest that quantum algorithms could provide real-world value, the research was carried out on devices that are still technically limited, and the experiments can only remain small-scale. The idea, however, is to develop working algorithms now, to be ready to harness the power of a fully fledged quantum computer when the technology develops.

"As a result of our joint research, ExxonMobil now has a greater understanding of the modelling possibilities, quantum solvers available, and potential alternatives for routing problems in any industry," said the researchers.

What applies to merchant ships, in effect, can also work in other settings. Routing problems are not inherent to the shipping industry, and the scientists confirmed that their findings could easily be transferred to any vehicle optimization problem that has time constraints, such as goods delivery, ride-sharing services or urban waste management.

In fact, ExxonMobil is not the first company to look at ways to use quantum computing techniques to solve optimization problems. Electronics manufacturer OTI Lumionics, for example, has been using QUBO representations to find the most optimal simulation of next-generation OLED materials. Instead of using gate-based quantum computers to run the problem, however, the company has been developing quantum-inspired algorithms to solve calculations on classical Microsoft Azure hardware,with encouraging results.

The mathematical formulas and solution algorithms are described in detail in the research paper, and the ExxonMobil/IBM team stressed that their use is not restricted. The researchers encouraged their colleagues to reproduce their findings to advance the global field of quantum solvers.

Here is the original post:
IBM and ExxonMobil are building quantum algorithms to solve this giant computing problem - ZDNet

Q-CTRL: machine learning technique to pinpoint quantum errors – News – The University of Sydney

Professor Michael Biercuk is CEO of quantum tech startup Q-CTRL.

Researchers at the University of Sydney and quantum control startup Q-CTRL have announced a way to identify sources of error in quantum computers through machine learning, providing hardware developers the ability to pinpoint performance degradation with unprecedented accuracy and accelerate paths to useful quantum computers.

A joint scientific paper detailing the research, titled "Quantum Oscillator Noise Spectroscopy via Displaced Cat States," has been published in Physical Review Letters, the world's premier physical science research journal and flagship publication of the American Physical Society (APS Physics).

Focused on reducing errors caused by environmental noise - the Achilles heel of quantum computing - the University of Sydney team developed a technique to detect the tiniest deviations from the precise conditions needed to execute quantum algorithms using trapped ion and superconducting quantum computing hardware. These are the core technologies used by world-leading industrial quantum computing efforts at IBM, Google, Honeywell, IonQ, and others.

The University team is based at the Quantum Control Laboratory led by Professor Michael Biercuk in the Sydney Nanoscience Hub.

To pinpoint the source of the measured deviations, Q-CTRL scientists developed a new way to process the measurement results using custom machine-learning algorithms. In combination with Q-CTRL's existing quantum control techniques, the researchers were also able to minimise the impact of background interference in the process. This allowed easy discrimination between real noise sources that could be fixed and phantom artefacts of the measurements themselves.

"Combining cutting-edge experimental techniques with machine learning has demonstrated huge advantages in the development of quantum computers," said Dr Cornelius Hempel of ETH Zurich, who conducted the research while at the University of Sydney. "The Q-CTRL team was able to rapidly develop a professionally engineered machine learning solution that allowed us to make sense of our data and provide a new way to see the problems in the hardware and address them."

Q-CTRL CEO Professor Biercuk said: "The ability to identify and suppress sources of performance degradation in quantum hardware is critical to both basic research and industrial efforts building quantum sensors and quantum computers."

"Quantum control, augmented by machine learning, has shown a pathway to make these systems practically useful and dramatically accelerate R&D timelines," he said.

"The published results in a prestigious, peer-reviewed journal validate the benefit of ongoing cooperation between foundational scientific research in a university laboratory and deep-tech startups. We're thrilled to be pushing the field forward through our collaboration."

Q-CTRL was spun-out of the University of Sydney by Professor Michael Biercuk from the School of Physics. The startup builds quantum control infrastructure software for quantum technology end-users and R&D professionals across all applications.

Q-CTRL has assembled the world's foremost team of expert quantum-control engineers, providing solutions to many of the most advanced quantum computing and sensing teams globally. Q-CTRL is funded by SquarePeg Capital, Sierra Ventures, Sequoia Capital China, Data Collective, Horizons Ventures, Main Sequence Ventures and In-Q-Tel. Q-CTRL has international headquarters in Sydney, Los Angeles, and Berlin.

The rest is here:
Q-CTRL: machine learning technique to pinpoint quantum errors - News - The University of Sydney

The global quantum computing race has begun. What will it take to win it? – ZDNet

The UK is now facing a huge challenge: after having secured a top spot in the quantum race, retaining the country's status is going to require some serious stepping up.

National quantum programs and decade-long quantum strategies are increasingly being announced by governments around the world. And as countries unlock billions-worth of budgets, it is becoming clear that a furious competition is gradually unrolling. Nations want to make sure that they are the place-to-be when quantum technologies start showing some real-world value and the UK, for one, is keen to prove that it is a quantum hotspot in the making.

"We have a very successful program that is widely admired and emulated around the world," said Peter Knight, who sits on the strategic advisory for the UK's national quantum technology program (NQTP), as he provided a virtual update on the NQTP's performance so far.

Speaking at an online conference last month, Knight seemed confident. The UK, said the expert, in line with the objectives laid out in the program, is on track to become "the go-to place" for new quantum companies to start, and for established businesses to base all manners of innovative quantum activities.

SEE: Hiring Kit: Computer Hardware Engineer (TechRepublic Premium)

The UK is just over halfway through the NQTP, which saw its second five-year phase kick off at the end of 2019, and at the same time hit an impressive milestone of £1 billion ($1.37 billion) in combined investment. This, the government claims, is letting the UK keep pace with competitors who are also taking an interest in quantum, namely the US and China.

There is no doubt that the country has made strides in the field of quantum since the start of the NQTP. New ground-breaking research papers are popping up on a regular basis, and so are news reports of rounds of funding from promising quantum startups.

But with still just under half of the national quantum program to carry out, and despite the huge sums already invested, the UK is now facing a bigger challenge yet: after having chased a top spot in the quantum race, retaining the country's status in the face of ferocious competition is going to require some serious stepping up.

Clearly playing in favor of the UK is the country's early involvement in the field. The NQTP was announced as early as 2013, and started operating in 2014, with an initial £270 million ($370 million) budget. The vision laid out in the program includes creating a "quantum-enabled economy", in which the technology would significantly contribute to the UK's economy and attract both strong investment and global talent.

"The national program was one of the first to kick off," Andrew Fearnside, senior associate specializing in quantum technologies at intellectual property firm Mewburn Ellis, tells ZDNet. "There are increasingly more national programs emerging in other countries, but they are a good few years behind us. The fact that there has been this sustained and productive long-term government initiative is definitely attractive."

The EU's Quantum Technologies Flagship, in effect, only launched in 2018; some countries within the bloc, like France, started their own quantum roadmaps on top of the European initiative even later. Similarly, the National Quantum Initiative Act was signed into law by the Trump administration, but that was also in 2018, years into the UK's national quantum technology program.

Since it launched in 2014, there has been abundant evidence of the academic successes of the initial phase of the NQTP. In Birmingham, the Quantum Sensing Hub is developing new types of quantum-based magnetic sensors that could help diagnose brain and heart conditions, while the Quantum Metrology Institute leads the development of quantum atomic clocks. There are up to 160 research groups and universities registered across the UK with programs that are linked to quantum technologies, working on projects ranging from the design of quantum algorithms to the creation of new standards and verification methods.

A much harder challenge, however, is to transform this strong scientific foundation into business value; and as soon as the UK government announced the second phase of the NQTP at the end of 2019, a clear message emerged: quantum technology needed to come out of the lab, thanks to increased private sector investment that would accelerate commercialization.

Some key initiatives followed. A national quantum computing center was established for academics to work alongside commercial partners such as financial services company Standard Chartered, "possibly with an eye on financial optimization problems," notes Fearnside, given the business's established interest in leveraging quantum technologies. A £10 million ($13 million) "Discovery" program also launched a few months ago, bringing together five quantum computing companies, three universities and the UK's national physical laboratory, all for the purpose of making quantum work for businesses.

The government's efforts have been, to an extent, rewarded. The quantum startup ecosystem is thriving in the UK, with companies like Riverlane or Cambridge Quantum Computing completing strong rounds of private financing. In total, up to 204 quantum-related businesses have been listed so far in the country.

But despite these encouraging results, the UK is still faced with a big problem. Bringing university-born innovation to the real world has always been a national challenge, and quantum is no exception. A 2018 report from the Science and Technology Committee, in fact, gave an early warning of the stumbling blocks that the NQTP might run into, and stressed the need for improved awareness across industry of the potential of quantum technologies.

The committee urged the government to start conveying the near-term benefits that quantum could provide to businesses, something that, according to the report, CEOs and company chairs in North America worryingly seem to grasp a whole lot better.

It's been three years since the report was published, and things haven't changed much. Speaking at the same forum as the NQTP's Peter Knight, Ian West, a partner at consultancy firm KPMG, said that there remained a huge barrier to the widespread take-up of quantum technologies in the UK. "Some of our clients feel they don't understand the technology, or feel it's one for the academics only," he argued.

"We need that demand from businesses who will be the ultimate users of quantum technologies, to encourage more investment," West added. "We need to do much more to explain the near-term and medium-term use cases for business applications of quantum technologies."

SEE: BMW explores quantum computing to boost supply chain efficiencies

Without sufficient understanding of the technology, funding problems inevitably come. The difficulty of securing private money for quantum stands in stark contrast to the situation across the Atlantic, where investors have historically done a better job of spotting and growing successful technology companies. Add the deep pockets of tech giants such as Google, IBM or Microsoft, which are all pouring money into quantum research, and it is easy to see why North America might have better prospects when it comes to winning the quantum game.

In the worst of cases, this has led to US technology hubs hoovering up some of the best quantum brains in the UK. In 2019, for example, PsiQ, a promising startup that was founded at the University of Bristol with the objective of producing a commercial quantum computer, re-located to Silicon Valley. The move was reported to be partly motivated by a lack of access to capital in Europe. It was a smart decision: according to the company's latest update, PsiQ has now raised $215 million (£156 million) in VC funding.

Pointing to the example of PsiQ, Simon King, partner and deep tech investor at VC firm Octopus Ventures, explains that to compete against the US, the UK needs to up its game when it comes to assessing the startups that show promise, and making sure that they are injected with adequate cash.

"The US remains the biggest competitor, with a big concentration of universities and academics and the pedigree and culture of commercializing university research," King tells ZDNet. "Things are definitely moving in the right direction, but the UK and Europe still lag behind the US, where there is a deeper pool of capital and there are more investors willing to invest in game-changing, but long-term technology like quantum."

US-based private investors are only likely to increase funding for the quantum ecosystem in the coming years, and significant amounts of public money will be backing the technology too. The National Quantum Initiative Act that was signed in 2018 came with $1.2 billion (£870 million) to be invested in quantum information science over the next five years; as more quantum companies flourish, the budget can be expected to expand even further.

Competition will be coming from other parts of the world as well. In addition to the European Commission's €1 billion ($1.20 billion) quantum flagship, EU countries are also spending liberally on the technology. Germany, in particular, has launched a €2 billion ($2.4 billion) funding program for the promotion of quantum technologies in the country, surpassing by far many of its competitors; but France, the Netherlands, and Switzerland are all increasingly trying to establish themselves as hubs for quantum startups and researchers.

SEE: Less is more: IBM achieves quantum computing simulation for new materials with fewer qubits

Little data is available to measure the scope of the commercialization of quantum technology in China, but the country has made no secret of its desire to secure a spot in the quantum race, too. The Chinese government has ramped up its spending on research and development, and the impact of that investment has already shown in the country achieving some significant scientific breakthroughs in the field.

In the midst of this ever-more competitive landscape, whether the UK can effectively distinguish itself as the "go-to place" for quantum technologies remains to be seen. One thing is for certain: the country has laid some very strong groundwork to compete. "The UK has some genuinely world-class universities with some really brilliant academics, so while the objective is certainly ambitious, it's not out of the question," argues King.

But even top-notch researchers and some of the most exciting quantum startups might not cut it. The UK has positioned itself well from an early stage in the quantum race, but becoming a frontrunner was only one part of the job. Preserving the country's position for the coming years might prove to be the hardest challenge yet.

Follow this link:
The global quantum computing race has begun. What will it take to win it? - ZDNet

Kangaroo Court: Quantum Computing Thinking on the Future – JD Supra

The promise of quantum computers is that certain computational tasks might be executed exponentially faster on a quantum processor than on a classical processor.

Quantum computing is a beautiful fusion of quantum physics with computer science. It incorporates some of the most stunning ideas of physics from the twentieth century into an entirely new way of thinking about computation. Quantum computers have the potential to resolve problems of a high complexity and magnitude across many different industries and applications, including finance, transportation, chemicals, and cybersecurity; solving the impossible in a few hours of computing time.

Quantum computing is often in the news: China teleported a qubit from earth to a satellite; Shor's algorithm has put our current encryption methods at risk; quantum key distribution will make encryption safe again; Grover's algorithm will speed up data searches. But what does all this really mean? How does it all work?

Today's computers operate in a very straightforward fashion: they manipulate a limited set of data with an algorithm and give you an answer. Quantum computers are more complicated. After multiple units of data are input into qubits, the qubits are manipulated to interact with other qubits, allowing for several calculations to be done simultaneously. That's where quantum computers are a lot faster than today's machines.

Quantum computers have four fundamental capabilities that differentiate them from today's classical computers:

All computations involve inputting data, manipulating it according to certain rules, and then outputting the final answer. For classical computations, the bit is the basic unit of data. For quantum computation, this unit is the quantum bit, usually shortened to qubit.

The basic unit of quantum computing is a qubit. A classical bit is either 0 or 1. If it's 0 and we measure it, we get 0. If it's 1 and we measure it, we get 1. In both cases the bit remains unchanged. The standard example is an electrical switch that can be either on or off. The situation is totally different for qubits. Qubits are volatile. A qubit can be in one of an infinite number of states, a superposition of both 0 and 1, but when we measure it, as in the classical case, we just get one of two values, either 0 or 1. And unlike the classical case, the act of measurement changes the qubit. Qubits can also become entangled: when we make a measurement of one of them, it affects the state of the other. What's more, qubits interact with other qubits, and these interactions are what make it possible to conduct multiple calculations at once.
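The single-qubit measurement behavior described above can be sketched in a few lines of ordinary code. This toy simulation assumes only the standard Born rule (the probability of reading 0 is the squared magnitude of the qubit's 0-amplitude); it is an illustration, not a real quantum simulator:

```python
import random

def measure(alpha, beta):
    """Measure a qubit in state alpha|0> + beta|1>.

    Returns 0 with probability |alpha|^2 and 1 with probability
    |beta|^2. After a real measurement the superposition is gone;
    only the observed value remains.
    """
    return 0 if random.random() < abs(alpha) ** 2 else 1

# A classical-like qubit: all amplitude on |1>, so it always reads 1.
assert all(measure(0, 1) == 1 for _ in range(100))

# An equal superposition: amplitudes 1/sqrt(2), so each outcome
# appears about half the time over many trials.
amp = 2 ** -0.5
zeros = sum(measure(amp, amp) == 0 for _ in range(10_000))
```

Entanglement has no such per-qubit shortcut: correlated outcomes across qubits cannot be reproduced by sampling each qubit independently, which is one way to see why simulating n entangled qubits classically means tracking all 2**n amplitudes at once.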

Nobody really knows quite how or why entanglement works. It even baffled Einstein, who famously described it as "spooky action at a distance." But it's key to the power of quantum computers. In a conventional computer, doubling the number of bits doubles its processing power. But thanks to entanglement, adding extra qubits to a quantum machine produces an exponential increase in its number-crunching ability.

These three things, superposition, measurement, and entanglement, are the key quantum mechanical ideas. Controlling these interactions, however, is very complicated. The volatility of qubits can cause inputs to be lost or altered, which can throw off the accuracy of results. And creating a computer of meaningful scale would require hundreds of thousands, or even millions, of qubits to be connected coherently. The few quantum computers that exist today can handle nowhere near that number. But the good news is we're getting very, very close.

Quantum computing and classical computing are not two distinct disciplines. Quantum computing is the more fundamental form of computing: anything that can be computed classically can be computed on a quantum computer. The qubit, not the bit, is the basic unit of computation. Computation, in its essence, really means quantum computing. A qubit can be represented by the spin of an electron or the polarization of a photon.

In 2019 Google reported achieving quantum supremacy using a processor with programmable superconducting qubits to create quantum states on 53 qubits, corresponding to a computational state-space of dimension 2^53 (about 10^16). This incredible achievement was slightly short of their goal of creating quantum states on 72 qubits. What is so special about these numbers? Classical computers can simulate quantum computers if the quantum computer doesn't have too many qubits, but as the number of qubits increases we reach the point where that is no longer possible.

There are 8 possible three-bit combinations: 000, 001, 010, 011, 100, 101, 110, 111. The number 8 comes from 2^3: there are two choices for the first bit, two for the second, and two for the third, and we multiply these three 2s together. If instead of bits we switch to qubits, each of these 8 three-bit strings is associated with a basis vector, so the vector space is 8-dimensional. If we have 72 qubits, the number of basis elements is 2^72. This is about 4,700,000,000,000,000,000,000. It is a large number and is considered to be the point at which classical computers can no longer simulate quantum computers. Once quantum computers have more than 72 or so qubits we truly enter the age of quantum supremacy, when quantum computers can do computations that are beyond the ability of any classical computer.

To provide a little more perspective, let's consider a machine with 300 qubits. This doesn't seem an unreasonable number in the not-too-distant future. But 2^300 is an enormous number. It's more than the number of elementary particles in the known universe. A computation using 300 qubits would be working with 2^300 basis elements.
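The counting argument above can be checked directly; this short sketch simply computes 2^n for the qubit counts discussed.

```python
# The state space of n qubits has 2**n basis elements, one per n-bit string.
def state_space_dim(n_qubits: int) -> int:
    return 2 ** n_qubits

print(state_space_dim(3))    # the 8 three-bit strings 000 ... 111
print(state_space_dim(72))   # roughly 4.7 x 10^21, near the supremacy threshold
print(state_space_dim(300))  # exceeds the count of elementary particles in the known universe
```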

Some calculations required for the effective simulation of real-life scenarios are simply beyond the capability of classical computers, what are known as intractable problems. Quantum computers, with their huge computational power, are ideally suited to solving these problems. Indeed, some problems, like factoring, are hard on a classical computer but easy on a quantum computer. This creates a world of opportunities, across almost every aspect of modern life.

Healthcare: classical computers are limited in terms of size and complexity of molecules they can simulate and compare (an essential process of early drug development). Quantum computers will allow much larger molecules to be simulated. At the same time, researchers will be able to model and simulate interactions between drugs and all 20,000+ proteins encoded in the human genome, leading to greater advancements in pharmacology.

Finance: one potential application is algorithmic trading, in which complex algorithms automatically trigger share dealings based on a wide variety of market variables. The advantages, especially for high-volume transactions, are significant. Another application is fraud detection. Like diagnostics in healthcare, fraud detection relies on pattern recognition. Quantum computers could deliver a significant improvement in machine learning capabilities, dramatically reducing the time taken to train a neural network and improving the detection rate.

Logistics: Improved data analysis and modelling will enable a wide range of industries to optimize workflows associated with transport, logistics and supply-chain management. The calculation and recalculation of optimal routes could impact on applications as diverse as traffic management, fleet operations, air traffic control, freight and distribution.

It is, of course, impossible to predict the long-term impact of quantum computing with any accuracy. Quantum computing is now in its infancy, and the comparison to the first computers seems apt. The machines that have been constructed so far tend to be large and not very powerful, and they often involve superconductors that need to be cooled to extremely low temperatures. To minimize the interaction of quantum computers with the environment, they are always protected from light and heat. They are shielded against electromagnetic radiation, and they are cooled. One thing that can happen in cold places is that certain materials become superconductors, losing all electrical resistance, and superconductors have quantum properties that can be exploited.

Many countries are experimenting with small quantum networks using optical fiber. There is the potential of connecting these via satellite to form a worldwide quantum network. This work is of great interest to financial institutions. One early impressive result involves a Chinese satellite devoted to quantum experiments. It's named Micius, after a Chinese philosopher who did work in optics. A team in China connected to a team in Austria, the first time that intercontinental quantum key distribution (QKD) had been achieved. Once the connection was secured, the teams sent pictures to one another. The Chinese team sent the Austrians a picture of Micius, and the Austrians sent a picture of Schrödinger to the Chinese.

To actually make practical quantum computers you need to solve a number of problems, the most serious being decoherence: your qubit interacting with something in the environment that is not part of the computation. You need to set a qubit to an initial state and keep it in that state until you need to use it, but a qubit's quantum state is extremely fragile. The slightest vibration or change in temperature, disturbances known as noise in quantum-speak, can cause qubits to tumble out of superposition before their job has been properly done. That's why researchers do their best to protect qubits from the outside world in supercooled fridges and vacuum chambers.

Alan Turing is one of the fathers of the theory of computation. In his landmark paper of 1936 he carefully thought about computation. He considered what humans did as they performed computations and broke it down to its most elemental level. He showed that a simple theoretical machine, which we now call a Turing machine, could carry out any algorithm. But remember, Turing was analyzing computation based on what humans do. With quantum computation the focus changes from how humans compute to how the universe computes. Therefore, we should think of quantum computation as not a new type of computation but as the discovery of the true nature of computation.


Securing the DNS in a Post-Quantum World: New DNSSEC Algorithms on the Horizon – CircleID

This is the fourth in a multi-part series on cryptography and the Domain Name System (DNS).

One of the "key" questions cryptographers have been asking for the past decade or more is what to do about the potential future development of a large-scale quantum computer.

If theory holds, a quantum computer could break established public-key algorithms including RSA and elliptic curve cryptography (ECC), building on Peter Shor's groundbreaking result from 1994.

This prospect has motivated research into new so-called "post-quantum" algorithms that are less vulnerable to quantum computing advances. These algorithms, once standardized, may well be added into the Domain Name System Security Extensions (DNSSEC) thus also adding another dimension to a cryptographer's perspective on the DNS.

(Caveat: Once again, the concepts I'm discussing in this post are topics we're studying in our long-term research program as we evaluate potential future applications of technology. They do not necessarily represent Verisign's plans or position on possible new products or services.)

The National Institute of Standards and Technology (NIST) started a Post-Quantum Cryptography project in 2016 to "specify one or more additional unclassified, publicly disclosed digital signature, public-key encryption, and key-establishment algorithms that are capable of protecting sensitive government information well into the foreseeable future, including after the advent of quantum computers."

Security protocols that NIST is targeting for these algorithms, according to its 2019 status report (Section 2.2.1), include: "Transport Layer Security (TLS), Secure Shell (SSH), Internet Key Exchange (IKE), Internet Protocol Security (IPsec), and Domain Name System Security Extensions (DNSSEC)."

The project is now in its third round, with seven finalists, including three digital signature algorithms, and eight alternates.

NIST's project timeline anticipates that the draft standards for the new post-quantum algorithms will be available between 2022 and 2024.

It will likely take several additional years for standards bodies such as the Internet Engineering Task Force (IETF) to incorporate the new algorithms into security protocols. Broad deployments of the upgraded protocols will likely take several years more.

Post-quantum algorithms can therefore be considered a long-term issue, not a near-term one. However, as with other long-term research, it's appropriate to draw attention to factors that need to be taken into account well ahead of time.

The three candidate digital signature algorithms in NIST's third round have one common characteristic: all of them have a key size or signature size (or both) that is much larger than for current algorithms.

Key and signature sizes are important operational considerations for DNSSEC because most of the DNS traffic exchanged with authoritative data servers is sent and received via the User Datagram Protocol (UDP), which has a limited response size.

Response size concerns were evident during the expansion of the root zone signing key (ZSK) from 1024-bit to 2048-bit RSA in 2016, and in the rollover of the root key signing key (KSK) in 2018. In the latter case, although the signature and key sizes didn't change, total response size was still an issue because responses during the rollover sometimes carried as many as four keys rather than the usual two.

Thanks to careful design and implementation, response sizes during these transitions generally stayed within typical UDP limits. Equally important, response sizes also appeared to have stayed within the Maximum Transmission Unit (MTU) of most networks involved, thereby also avoiding the risk of packet fragmentation. (You can check how well your network handles various DNSSEC response sizes with this tool developed by Verisign Labs.)

The larger sizes associated with certain post-quantum algorithms do not appear to be a significant issue either for TLS, according to one benchmarking study, or for public-key infrastructures, according to another report. However, a recently published study of post-quantum algorithms and DNSSEC observes that "DNSSEC is particularly challenging to transition" to the new algorithms.

Verisign Labs offers the following observations about DNSSEC-related queries that may help researchers to model DNSSEC impact:

A typical resolver that implements both DNSSEC validation and qname minimization will send a combination of queries to Verisign's root and top-level domain (TLD) servers.

Because the resolver is a validating resolver, these queries will all have the "DNSSEC OK" bit set, indicating that the resolver wants the DNSSEC signatures on the records.

The content of typical responses by Verisign's root and TLD servers to these queries is given in Table 1 below. (In the table, the final two labels of the domain name of interest are the TLD and the second-level domain (SLD); record types involved include A, Name Server (NS), and DNSKEY.)

For an A or NS query, the typical response, when the domain of interest exists, includes a referral to another name server. If the domain supports DNSSEC, the response also includes a set of Delegation Signer (DS) records providing the hashes of each of the referred zone's KSKs, the next link in the DNSSEC trust chain. When the domain of interest doesn't exist, the response includes one or more Next Secure (NSEC) or Next Secure 3 (NSEC3) records.

Researchers can estimate the effect of post-quantum algorithms on response size by replacing the sizes of the various RSA keys and signatures with those for their post-quantum counterparts. As discussed above, it is important to keep in mind that the number of keys returned may be larger during key rollovers.
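As a rough sketch of that modeling exercise, the Python below compares an assumed RSA signature size with an assumed post-quantum signature size against a common UDP payload limit. Every byte count here is an illustrative assumption, not a measurement of any actual zone or algorithm deployment.

```python
# Illustrative assumptions only: real responses vary by zone and record set.
RSA_SIG = 256      # assumed size (bytes) of one RSA-2048 signature
PQ_SIG = 2420      # assumed size of one lattice-based post-quantum signature
UDP_LIMIT = 1232   # a commonly configured EDNS0 UDP payload limit (bytes)

def response_size(base: int, n_sigs: int, sig_size: int) -> int:
    """Size of a signed response: unsigned records plus its signatures."""
    return base + n_sigs * sig_size

BASE = 400  # assumed size of the unsigned portion of a referral response
for name, sig in (("RSA", RSA_SIG), ("post-quantum", PQ_SIG)):
    total = response_size(BASE, 1, sig)
    verdict = "fits in UDP" if total <= UDP_LIMIT else "likely truncated"
    print(f"{name}: {total} bytes, {verdict}")
```

As the text notes, the number of keys carried can grow during a rollover, so a realistic model should also vary the number of keys and signatures per response.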

Most of the queries from qname-minimizing, validating resolvers to the root and TLD name servers will be for A or NS records (the choice depends on the implementation of qname minimization, and has recently trended toward A). The signature size for a post-quantum algorithm, which affects all DNSSEC-related responses, will therefore generally have a much larger impact on average response size than will the key size, which affects only the DNSKEY responses.
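That weighting can be sketched as a simple average; the fractions and byte counts below are assumptions chosen only to illustrate why signature size dominates average response size.

```python
# Hedged sketch: average response size as a weighted mix of DNSKEY responses
# (which carry keys plus a signature) and all other signed responses
# (which carry only signatures). All numbers are illustrative assumptions.

def avg_response_size(sig_size, key_size, dnskey_fraction,
                      base=400, keys=2, sigs=1):
    dnskey_resp = base + keys * key_size + sigs * sig_size
    other_resp = base + sigs * sig_size
    return dnskey_fraction * dnskey_resp + (1 - dnskey_fraction) * other_resp

# If only ~1% of queries are DNSKEY queries, a 10x larger key barely moves
# the average, while a 10x larger signature inflates nearly every response.
baseline = avg_response_size(sig_size=256, key_size=256, dnskey_fraction=0.01)
bigger_keys = avg_response_size(sig_size=256, key_size=2560, dnskey_fraction=0.01)
bigger_sigs = avg_response_size(sig_size=2560, key_size=256, dnskey_fraction=0.01)
print(baseline, bigger_keys, bigger_sigs)
```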

Post-quantum algorithms are among the newest developments in cryptography. They add another dimension to a cryptographer's perspective on the DNS because of the possibility that these algorithms, or other variants, may be added to DNSSEC in the long term.

In my next post, I'll make the case for why the oldest post-quantum algorithm, hash-based signatures, could be a particularly good match for DNSSEC. I'll also share the results of some research at Verisign Labs into how the large signature sizes of hash-based signatures could potentially be overcome.



The search for dark matter gets a speed boost from quantum technology – The Conversation US

Nearly a century after dark matter was first proposed to explain the motion of galaxy clusters, physicists still have no idea what it's made of.

Researchers around the world have built dozens of detectors in hopes of discovering dark matter. As a graduate student, I helped design and operate one of these detectors, aptly named HAYSTAC. But despite decades of experimental effort, scientists have yet to identify the dark matter particle.

Now, the search for dark matter has received an unlikely assist from technology used in quantum computing research. In a new paper published in the journal Nature, my colleagues on the HAYSTAC team and I describe how we used a bit of quantum trickery to double the rate at which our detector can search for dark matter. Our result adds a much-needed speed boost to the hunt for this mysterious particle.

There is compelling evidence from astrophysics and cosmology that an unknown substance called dark matter constitutes more than 80% of the matter in the universe. Theoretical physicists have proposed dozens of new fundamental particles that could explain dark matter. But to determine which, if any, of these theories is correct, researchers need to build different detectors to test each one.

One prominent theory proposes that dark matter is made of as-yet-hypothetical particles called axions that collectively behave like an invisible wave oscillating at a very specific frequency through the cosmos. Axion detectors, including HAYSTAC, work something like radio receivers, but instead of converting radio waves to sound waves, they aim to convert axion waves into electromagnetic waves. Specifically, axion detectors measure two quantities called electromagnetic field quadratures. These quadratures are two distinct kinds of oscillation in the electromagnetic wave that would be produced if axions exist.

The main challenge in the search for axions is that nobody knows the frequency of the hypothetical axion wave. Imagine you're in an unfamiliar city searching for a particular radio station by working your way through the FM band one frequency at a time. Axion hunters do much the same thing: they tune their detectors over a wide range of frequencies in discrete steps. Each step can cover only a very small range of possible axion frequencies. This small range is the bandwidth of the detector.

Tuning a radio typically involves pausing for a few seconds at each step to see if you've found the station you're looking for. That's harder if the signal is weak and there's a lot of static. An axion signal in even the most sensitive detectors would be extraordinarily faint compared with static from random electromagnetic fluctuations, which physicists call noise. The more noise there is, the longer the detector must sit at each tuning step to listen for an axion signal.

Unfortunately, researchers can't count on picking up the axion broadcast after a few dozen turns of the radio dial. An FM radio tunes from only 88 to 108 megahertz (one megahertz is one million hertz). The axion frequency, by contrast, may be anywhere between 300 hertz and 300 billion hertz. At the rate today's detectors are going, finding the axion or proving that it doesn't exist could take more than 10,000 years.
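The arithmetic behind an estimate like that can be sketched as follows. The bandwidth and dwell time below are invented round numbers for illustration, not HAYSTAC's actual operating parameters, so the resulting figure differs from the 10,000-year estimate above.

```python
SECONDS_PER_YEAR = 3600 * 24 * 365

def scan_years(range_hz: float, bandwidth_hz: float, dwell_s: float) -> float:
    """Years to sweep a frequency range in steps of one bandwidth,
    pausing dwell_s seconds at each step to listen for a signal."""
    steps = range_hz / bandwidth_hz
    return steps * dwell_s / SECONDS_PER_YEAR

RANGE_HZ = 300e9 - 300   # the possible axion window cited above
BANDWIDTH_HZ = 1e3       # assumed bandwidth per tuning step
DWELL_S = 60.0           # assumed integration time per step

print(f"{scan_years(RANGE_HZ, BANDWIDTH_HZ, DWELL_S):.0f} years at this rate")
# Doubling the bandwidth halves the number of steps, and thus the scan time:
print(f"{scan_years(RANGE_HZ, 2 * BANDWIDTH_HZ, DWELL_S):.0f} years with doubled bandwidth")
```

This is why widening the per-step bandwidth, as the squeezing result described below does, translates directly into a faster search.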

On the HAYSTAC team, we don't have that kind of patience. So in 2012 we set out to speed up the axion search by doing everything possible to reduce noise. But by 2017 we found ourselves running up against a fundamental minimum noise limit because of a law of quantum physics known as the uncertainty principle.

The uncertainty principle states that it is impossible to know the exact values of certain physical quantities simultaneously; for instance, you can't know both the position and the momentum of a particle at the same time. Recall that axion detectors search for the axion by measuring two quadratures, those specific kinds of electromagnetic field oscillations. The uncertainty principle prohibits precise knowledge of both quadratures by adding a minimum amount of noise to the quadrature oscillations.

In conventional axion detectors, the quantum noise from the uncertainty principle obscures both quadratures equally. This noise can't be eliminated, but with the right tools it can be controlled. Our team worked out a way to shuffle around the quantum noise in the HAYSTAC detector, reducing its effect on one quadrature while increasing its effect on the other. This noise manipulation technique is called quantum squeezing.

In an effort led by graduate students Kelly Backes and Dan Palken, the HAYSTAC team took on the challenge of implementing squeezing in our detector, using superconducting circuit technology borrowed from quantum computing research. General-purpose quantum computers remain a long way off, but our new paper shows that this squeezing technology can immediately speed up the search for dark matter.

Our team succeeded in squeezing the noise in the HAYSTAC detector. But how did we use this to speed up the axion search?

Quantum squeezing doesn't reduce the noise uniformly across the axion detector bandwidth. Instead, it has the largest effect at the edges. Imagine you tune your radio to 88.3 megahertz, but the station you want is actually at 88.1. With quantum squeezing, you would be able to hear your favorite song playing one station away.

In the world of radio broadcasting this would be a recipe for disaster, because different stations would interfere with one another. But with only one dark matter signal to look for, a wider bandwidth allows physicists to search faster by covering more frequencies at once. In our latest result we used squeezing to double the bandwidth of HAYSTAC, allowing us to search for axions twice as fast as we could before.

Quantum squeezing alone isn't enough to scan through every possible axion frequency in a reasonable time. But doubling the scan rate is a big step in the right direction, and we believe further improvements to our quantum squeezing system may enable us to scan 10 times faster.

Nobody knows whether axions exist or whether they will resolve the mystery of dark matter; but thanks to this unexpected application of quantum technology, we're one step closer to answering these questions.


ADU Professor Receives Us Patent for a First-of-Its-Kind Hybrid Device Set To Advance the Field of Quantum Computing – Al-Bawaba

Abu Dhabi University's (ADU) Associate Professor of Electrical Engineering in the College of Engineering (CoE), Dr. Montasir Qasymeh, has received a U.S. patent, registered under No. 10,824,048 B2, for a first-of-its-kind device that will be capable of connecting superconducting quantum computers over significant distances.

Superconducting quantum computers are the extraordinary computers of the future that will surpass all current ones and achieve ultrasensitive sensing and unattackable quantum communication networks. Unlike today's conventional computers, quantum computers can process huge amounts of data and perform computations in powerful new ways that were never possible before. Potential applications of quantum computing include accelerating innovations in artificial intelligence and machine learning and tackling future cybersecurity challenges.

Dr. Qasymeh's device is composed of graphene, a substance that has been hailed as a miracle material due to its electrical properties and the fact that it is the world's thinnest and second-strongest material. Graphene has already transformed the technology sector and is being applied today in laptops, smartphones and headphones. Dr. Qasymeh has been working with graphene for the past seven years and has numerous publications studying this substance. The device converts a quantum microwave signal containing data into a laser beam using properly designed graphene layers that are electrically connected and subjected to a laser pump.

Dr. Montasir Qasymeh said: "I am humbled and honored to be granted this U.S. patent. This invention will advance the field of quantum computing in the UAE, taking us one step further towards the quantum age."

He also added: "The coming era is an era of knowledge wealth that brings with it the opportunity to advance all of humankind. I would like to express my sincerest gratitude to Abu Dhabi University for supporting this project and providing my team with access to its purpose-built academic facilities. I am proud and grateful for Abu Dhabi University's continued investment in research."

Dr. Hamdi Sheibani, Dean of the College of Engineering at ADU, commented: "We are extremely proud of yet another accomplishment from Dr. Montasir Qasymeh. This U.S. patent for one of our professors is evidence of ADU's culture of innovation and our continued commitment to the UAE Government's National Agenda to diversify our economy and strengthen our research and innovation sector. The College of Engineering at Abu Dhabi University is committed to supporting educators who serve as role models and mentors to their students and peers by leading by example through their teachings and projects."

The project was developed with the funding of two important grants: the ADEK Award for Research Excellence grant from the Ministry of Education, which was awarded for the research proposal "Graphene-Based Modulator for Passive Transmission and White Light Communications"; and the Takamul grant from the Department of Economic Development, which was awarded for patent filing.

Dr. Qasymeh received a Ph.D. degree in electrical engineering from Dalhousie University in Halifax, Canada in 2010. From 2010 to 2011, he was a Mitacs Elevate Postdoctoral Fellow at the Microwave Photonics Research Laboratory, University of Ottawa, Canada. He joined Abu Dhabi University in 2011, where he continues to teach. With over 10 years of experience in the education and research industry, he has published more than 40 articles in reputed refereed journals and international conferences and has led on 4 U.S. patents (1 issued and 3 pending). He has attracted a significant amount of research funding (approximately AED 1.8 million), including 2 ADEK awards for research excellence.

During his tenure with Abu Dhabi University, Dr. Qasymeh has taught more than 17 different undergraduate and graduate courses. He is an active member of several national and international scientific committees and a senior member of the Institute of Electrical and Electronics Engineers (IEEE), the world's largest technical professional organization dedicated to advancing technology. He is currently working on topics that include novel terahertz waveguides, room-temperature quantum devices and ultrafast modulators.


Quantum computing research helps IBM win top spot in patent race – CNET

An IBM patent shows a hexagonal array of qubits in a quantum computer, arranged to minimize problems controlling the finicky data processing elements.

IBM secured 9,130 US patents in 2020, more than any other company as measured by an annual ranking, and this year quantum computing showed up as part of Big Blue's research effort. The company wouldn't disclose how many of the patents were related to quantum computing -- certainly fewer than the 2,300 it received for artificial intelligence work and 3,000 for cloud computing -- but it's clear the company sees them as key to the future of computing.

The IFI Claims patent monitoring service compiles the list annually, and IBM is a fixture at the top. The IBM Research division, with labs around the globe, has for decades invested in projects that are far away from commercialization. Even though the work doesn't always pay dividends, it's produced Nobel prizes and led to entire industries like hard drives, computer memory and database software.


"A lot of the work we do in R&D really is not just about the number of patents, but a way of thinking," Jerry Chow, director of quantum hardware system development, said in an exclusive interview. "New ideas come out of it."

IFI's US patent list is dominated by computer technology companies. Second place went to Samsung with 6,415 patents, followed by Canon with 3,225, Microsoft with 2,905 and Intel with 2,867. Next on the list are Taiwan Semiconductor Manufacturing Corp., LG, Apple, Huawei and Qualcomm. The first non-computing company is Toyota, in 14th place.

Internationally, IBM ranked second to Samsung in patents for 2020, and industrial companies Bosch and General Electric cracked the top 10. Many patents are duplicative internationally since it's possible to file for a single patent in 153 countries.

Quantum computing holds the potential to tackle computing problems out of reach of conventional computers. During a time when it's getting harder to improve ordinary microprocessors, quantum computers could pioneer new high-tech materials for solar panels and batteries, improve chemical processes, speed up package delivery, make factories more efficient and lower financial risks for investors.

Industrywide, quantum computing is a top research priority, with dozens of companies investing millions of dollars even though most don't expect a payoff for years. The US government is bolstering that effort with a massive multilab research effort. It's even become a headline event at this year's CES, a conference that more typically focuses on new TVs, laptops and other consumer products.

"Tactical and strategic funding is critical" to quantum computing's success, said Hyperion Research analyst Bob Sorensen. That's because, unlike more mature technologies, there's not yet any virtuous cycle where profits from today's quantum computing products and services fund the development of tomorrow's more capable successors.

IBM has taken a strong early position in quantum computing, but it's too early to pick winners in the market, Sorensen added.

The long-term goal is what's called a fault tolerant quantum computer, one that uses error correction to keep calculations humming even when individual qubits, the data processing element at the heart of quantum computers, are perturbed. In the nearer term, some customers like financial services giant JPMorgan Chase, carmaker Daimler and aerospace company Airbus are investing in quantum computing work today with the hope that it'll pay off later.

Quantum computing is complicated to say the least, but a few patents illustrate what's going on in IBM's labs.

Patent No. 10,622,536 governs different lattices in which IBM lays out its qubits. Today's 27-qubit "Falcon" quantum computers use this approach, as do the newer 65-qubit "Hummingbird" machines and the much more powerful 1,121-qubit "Condor" systems due in 2023.

A close-up view of an IBM quantum computer. The processor is in the silver-colored cylinder.

IBM's lattices are designed to minimize "crosstalk," in which a control signal for one qubit ends up influencing others, too. That's key to IBM's ability to manufacture working quantum processors and will become more important as qubit counts increase, letting quantum computers tackle harder problems and incorporate error correction, Chow said.

Patent No. 10,810,665 governs a higher-level quantum computing application for assessing risk -- a key part of financial services companies figuring out how to invest money. The more complex the options being judged, the slower the computation, but the IBM approach still outpaces classical computers.

Patent No. 10,599,989 describes a way of speeding up some molecular simulations, a key potential promise of quantum computers, by finding symmetries in molecules that can reduce computational complexity.

More comprehensible is patent No. 10,614,370, which describes quantum computing as a service. Because quantum computers typically must be supercooled to within a hair's breadth of absolute zero to avoid perturbing the qubits, and require spools of complicated wiring, most quantum computing customers are likely to tap into online services from companies like IBM, Google, Amazon and Microsoft that offer access to their own carefully managed machines.
