
Category Archives: Quantum Computing

Life, the universe and everything: Physics seeks the future – The Economist

Posted: August 28, 2021 at 11:49 am

Aug 25th 2021

A WISE PROVERB suggests not putting all your eggs in one basket. Over recent decades, however, physicists have failed to follow that wisdom. The 20th century, and indeed the 19th before it, were periods of triumph for them. They transformed understanding of the material universe and thus people's ability to manipulate the world around them. Modernity could not exist without the knowledge won by physicists over those two centuries.


In exchange, the world has given them expensive toys to play with. The most recent of these, the Large Hadron Collider (LHC), which occupies a 27km-circumference tunnel near Geneva and cost $6bn, opened for business in 2008. It quickly found a long-predicted elementary particle, the Higgs boson, that was a hangover from calculations done in the 1960s. It then embarked on its real purpose, to search for a phenomenon called Supersymmetry.

This theory, devised in the 1970s and known as Susy for short, is the all-containing basket into which particle physics's eggs have until recently been placed. Of itself, it would eliminate many arbitrary mathematical assumptions needed for the proper working of what is known as the Standard Model of particle physics. But it is also the vanguard of a deeper hypothesis, string theory, which is intended to synthesise the Standard Model with Einstein's general theory of relativity. Einstein's theory explains gravity. The Standard Model explains the other three fundamental forces (electromagnetism and the weak and strong nuclear forces) and their associated particles. Both describe their particular provinces of reality well. But they do not connect together. String theory would connect them, and thus provide a so-called theory of everything.

String theory proposes that the universe is composed of minuscule objects which vibrate in the manner of the strings of a musical instrument. Like such strings, they have resonant frequencies and harmonics. These various vibrational modes, string theorists contend, correspond to various fundamental particles. Such particles include all of those already observed as part of the Standard Model; the further particles predicted by Susy, which posits that the Standard Model's mathematical fragility will go away if each of that model's particles has a heavier supersymmetric partner particle, or sparticle; and also particles called gravitons, which are needed to tie the force of gravity into any unified theory, but are not predicted by relativity.

But, no Susy, no string theory. And, 13 years after the LHC opened, no sparticles have shown up. Even two as-yet-unexplained results announced earlier this year (one from the LHC and one from a smaller machine) offer no evidence directly supporting Susy. Many physicists thus worry they have been on a wild-goose chase.

They have good reason to be nervous. String theory already comes with a disturbing conceptual price tag: that of adding six (or in one version seven) extra dimensions to the universe, over and above the four familiar ones (three of space and one of time). It also describes about 10^500 possible universes, only one of which matches the universe in which human beings live. Accepting all that is challenging enough. Without Susy, though, string theory goes bananas. The number of dimensions balloons to 26. The theory also loses the ability to describe most of the Standard Model's particles. And it implies the existence of weird stuff such as particles called tachyons that move faster than light and are thus incompatible with the theory of relativity. Without Susy, string theory thus looks pretty much dead as a theory of everything. Which, if true, clears the field for non-string theories of everything.

The names of many of these do, it must be conceded, torture the English language. They include causal dynamical triangulation, asymptotically safe gravity, loop quantum gravity and the amplituhedron formulation of quantum theory. But at the moment the bookies' favourite for unifying relativity and the Standard Model is something called entropic gravity.

Entropy is a measure of a system's disorder. Famously, the second law of thermodynamics asserts that it increases with time (ie, things have a tendency to get messier as they get older). What that has to do with a theory of gravity, let alone of everything, is not, perhaps, immediately obvious. But the link is black holes. These are objects which have such strong gravitational fields that even light cannot escape from them. They are predicted by the mathematics of general relativity. And even though Einstein remained sceptical about their actual existence until the day he died in 1955, subsequent observations have shown that they are indeed real. But they are not black.

In 1974 Stephen Hawking, of Cambridge University, showed that quantum effects at a black hole's boundary allow it to radiate particles, especially photons, which are the particles of electromagnetic radiation, including light. This has peculiar consequences. Photons carry radiant heat, so something which emits them has a temperature. And, from its temperature and mass, it is possible to calculate a black hole's entropy. This matters because, when all these variables are plugged into the first law of thermodynamics, which states that energy can be neither created nor destroyed, only transformed from one form (say, heat) into another (say, mechanical work), what pops out are Einstein's equations of general relativity.
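
For readers who want the formulae behind that chain of reasoning (standard textbook results, not taken from the article itself), the Hawking temperature and the Bekenstein-Hawking entropy of a black hole of mass M are:

```latex
T_{\mathrm{H}} = \frac{\hbar c^{3}}{8\pi G M k_{\mathrm{B}}}, \qquad
S_{\mathrm{BH}} = \frac{k_{\mathrm{B}} c^{3} A}{4 G \hbar}
               = \frac{4\pi k_{\mathrm{B}} G M^{2}}{\hbar c},
\quad \text{where } A = \frac{16\pi G^{2} M^{2}}{c^{4}}
\text{ is the area of the event horizon.}
```

The temperature falls, and the entropy grows, as the black hole gets heavier; it is this link between a gravitational quantity (horizon area) and a thermodynamic one (entropy) that the entropic-gravity programme builds on.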

That relationship was discovered in 2010 by Erik Verlinde of the University of Amsterdam. It has serious implications. The laws of thermodynamics rely on statistical mechanics. They involve properties (temperature, entropy and so on) which emerge from probabilistic descriptions of the behaviour of the underlying particles involved. These are also the particles described by quantum mechanics, the mathematical theory which underpins the Standard Model. That Einstein's equations can be rewritten thermodynamically implies that space and time are also emergent properties of this deeper microscopic picture. The existing forms of quantum mechanics and relativity thus do indeed both seem derivable in principle from some deeper theory that describes the underlying fabric of the universe.

String theory is not so derivable. Strings are not fundamental enough entities. But entropic gravity claims to describe the very nature of space and time (or, to use Einsteinian terminology, spacetime). It asserts this is woven from filaments of quantum entanglement linking every particle in the cosmos.

The idea of quantum entanglement, another phenomenon pooh-poohed by Einstein that turned out to be true, goes back to 1935. It is that the properties of two or more objects can be correlated (entangled) in a way which means they cannot be described independently. This leads to weird effects. In particular, it means that two entangled particles can appear to influence each other's behaviour instantaneously even when they are far apart. Einstein dubbed this "spooky action at a distance", because it seems to violate the premise of relativity theory that the universe has a speed limit: the speed of light.

As with black holes, Einstein did not live long enough to see himself proved wrong. Experiments have nevertheless shown he was. Entanglement is real, and does not violate relativity because although the influence of one particle on another can be instantaneous there is no way to use the effect to pass information faster than light-speed. And, in the past five years, Brian Swingle of Harvard University and Sean Carroll of the California Institute of Technology have begun building models of what Dr Verlinde's ideas might mean in practice, using ideas from quantum information theory. Their approach employs bits of quantum information (so-called qubits) to stand in for the entangled particles. The result is a simple but informative analogue of spacetime.

Qubits, the quantum equivalent of classical bits (the ones and zeros on which regular computing is built), will be familiar to those who follow the field of quantum computing. They are the basis of quantum information theory. Two properties distinguish qubits from the regular sort. First, they can be placed in a state of superposition, representing both a one and a zero at the same time. Second, several qubits can become entangled. Together, these properties let quantum computers accomplish feats such as performing multiple calculations at once, or completing certain classes of calculation in a sensible amount of time, that are difficult or impossible for a regular computer.
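
To make those two properties concrete, here is a minimal numerical sketch (an illustration for readers, not part of the researchers' models): a qubit's state is just a pair of complex amplitudes, a superposition is produced by a Hadamard gate, and entanglement by a CNOT gate acting on two qubits.

```python
import numpy as np

# Basis states |0> and |1> as amplitude vectors.
zero = np.array([1, 0], dtype=complex)
one = np.array([0, 1], dtype=complex)

# The Hadamard gate puts a single qubit into an equal superposition of 0 and 1.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)
superposed = H @ zero                      # amplitudes 1/sqrt(2) on both |0> and |1>

# The CNOT gate flips the second qubit only when the first is 1; applied to a
# superposed control qubit it produces an entangled (Bell) state.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
bell = CNOT @ np.kron(superposed, zero)    # amplitudes 1/sqrt(2) on |00> and |11>

# Measurement probabilities: the Bell state yields 00 or 11, never 01 or 10,
# which is exactly the "cannot be described independently" correlation.
print(np.abs(bell) ** 2)                   # [0.5 0.  0.  0.5]
```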

And because of their entanglement qubits can also, according to Dr Swingle and Dr Carroll, be used as stand-ins for how reality works. More closely entangled qubits represent particles at points in spacetime that are closer together. So far, quantum computers being a work in progress, this modelling can be done only with mathematical representations of qubits. These do, though, seem to obey the equations of general relativity. That supports entropic-gravity theory's claims.

All of this modelling puts entropic gravity in pole position to replace strings as the long-sought theory of everything. But the idea that spacetime is an emergent property of the universe rather than being fundamental to it has a disturbing consequence. It blurs the nature of causality.

In the picture built by entropic gravity, spacetime is a superposition of multiple states. It is this which muddies causality. The branch of maths that best describes spacetime is a form of geometry that has four axes at right angles to each other instead of the more familiar three. The fourth represents time, so, like the position of objects, the order of events in spacetime is determined geometrically. If different geometric arrangements are superposed, as entropic gravity requires, it can therefore sometimes happen that the statements "A causes B" and "B causes A" are both true.
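
The geometry in question is Minkowski geometry. As a standard aside (not spelled out in the article), the causal order of two events is fixed by the sign of the spacetime interval between them:

```latex
\Delta s^{2} = -c^{2}\,\Delta t^{2} + \Delta x^{2} + \Delta y^{2} + \Delta z^{2}
```

A negative interval (timelike separation) means one event can influence the other; a positive one (spacelike separation) means neither can. Superposing different geometries therefore means superposing different answers to which event can cause which.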

This is not mere speculation. In 2016 Giulia Rubino of the University of Bristol, in England, constructed an experiment involving polarised photons and prisms which achieved exactly that. This spells trouble for those who have old-fashioned notions about causality's nature.

However, Lucien Hardy of the Perimeter Institute, in Canada, has discovered a way to reformulate the laws of quantum mechanics to get around this. In his view, causality as commonly perceived is like data compression in computing: it is a concept that gives you more bang for your buck. With a little bit of information about the present, causality can infer a lot about the future, compressing the amount of information needed to capture the details of a physical system in time.

But causality, Dr Hardy thinks, may not be the only way to describe such correlations. Instead, he has invented a general method for building descriptions of the patterns in correlations from scratch. This method, which he calls the causaloid framework, tends to reproduce causality but it does not assume it, and he has used it to reformulate both quantum theory (in 2005) and general relativity (in 2016). Causaloid maths is not a theory of everything. But there is a good chance that if and when such a theory is found, causaloid principles will be needed to describe it, just as general relativity needed a geometry of four dimensions to describe spacetime.

Entropic gravity has, then, a lot of heavy-duty conceptual work to back it up. But it is not the only candidate to replace string theory. Others jostling for attention include an old competitor called loop quantum gravity, originally proposed in 1994 by Carlo Rovelli, then at the University of Pittsburgh, and Lee Smolin, of the Perimeter Institute. This, and causal dynamical triangulation, a more recent but similar idea, suggest that spacetime is not the smooth fabric asserted by general relativity, but, rather, has a structureeither elementary loops or triangles, according to which of the two theories you support.

A third option, asymptotically safe gravity, goes back still further, to 1976. It was suggested by Steven Weinberg, one of the Standard Model's chief architects. A natural way to develop a theory of quantum gravity is to add gravitons to the model. Unfortunately, this approach got nowhere, because when the interactions of these putative particles were calculated at higher energies, the maths seemed to become nonsensical. However, Weinberg, who died in July, argued that this apparent breakdown would go away (in maths speak, the calculations would be "asymptotically safe") if sufficiently powerful machines were used to do the calculating. And, with the recent advent of supercomputers of such power, it looks, from early results, as if he might have been right.

One of the most intriguing competitors of entropic gravity, though, is the amplituhedron formulation of quantum theory. This was introduced in 2013 by Nima Arkani-Hamed of the Institute for Advanced Study in Princeton and Jaroslav Trnka of the University of California, Davis. They have found a class of geometric structures dubbed amplituhedrons, each of which encodes the details of a possible quantum interaction. These, in turn, are facets of a master amplituhedron that encodes every possible type of physical process. It is thus possible to reformulate all of quantum theory in terms of the amplituhedron.

Most attempts at a theory of everything try to fit gravity, which Einstein describes geometrically, into quantum theory, which does not rely on geometry in this way. The amplituhedron approach does the opposite, by suggesting that quantum theory is actually deeply geometric after all. Better yet, the amplituhedron is not founded on notions of spacetime, or even statistical mechanics. Instead, these ideas emerge naturally from it. So, while the amplituhedron approach does not as yet offer a full theory of quantum gravity, it has opened up an intriguing path that may lead to one.

That space, time and even causality are emergent rather than fundamental properties of the cosmos is a radical idea. But this is the point. General relativity and quantum mechanics, the physics revolutions of the 20th century, were viewed as profound precisely because they overthrew common sense. To accept relativity meant abandoning a universal notion of time and space. To take quantum mechanics seriously meant getting comfortable with ideas like entanglement and superposition. Embracing entropic gravity or its alternatives will require similar feats of the imagination.

No theory, though, is worth a damn without data. That, after all, is the problem with Supersymmetry. Work like Dr Rubino's points the way. But something out of a particle-physics laboratory would also be welcome. And, though their meaning is obscure, the past few months have indeed seen two experimentally induced cracks in the Standard Model.

On March 23rd a team from CERN, the organisation that runs the LHC, reported an unexpected difference in behaviour between electrons and their heavier cousins, muons. These particles differ from one another in no known properties but their masses, so the Standard Model predicts that when other particles decay into them, the two should each be produced in equal numbers. But this appears not to be true. Interim results from the LHC suggest that a type of particle called a B-meson is more likely to decay into an electron than a muon. That suggests an as-yet-undescribed fundamental force is missing from the Standard Model. Then, on April 7th, Fermilab, America's biggest particle-physics facility, announced the interim results of its own muon experiment, Muon g-2.

In the quantum world, there is no such thing as a perfect vacuum. Instead, a froth of particles constantly pops in and out of existence everywhere in spacetime. These are virtual, rather than real, particles: that is, they are transient fluctuations which emerge straight out of quantum uncertainty. But, although they are short-lived, during the brief periods of their existence they still have time to interact with more permanent sorts of matter. They are, for example, the source of the black-hole radiation predicted by Hawking.

The strengths of their interactions with types of matter more conventional than black holes are predicted by the Standard Model, and to test these predictions, Muon g-2 shoots muons in circles around a powerful superconducting magnetic-storage ring. The quantum froth changes the way the muons wobble, which detectors can pick up with incredible precision. The Muon g-2 experiment suggests that the interactions causing these wobbles are slightly stronger than the Standard Model predicts. If confirmed, this would mean the model is missing one or more elementary particles.

There is a slim chance that these are the absent sparticles. If so, it is the supporters of supersymmetry who will have the last laugh. But nothing points in this direction and, having failed thus far to stand their ideas up, they are keeping sensibly quiet.

Whatever the causes of these two results, they do show that there is something out there which established explanations cannot account for. Similarly unexplained anomalies were starting points for both quantum theory and relativity. It looks possible, therefore, that what has seemed one of physicss darkest periods is about to brighten into a new morning.

This article appeared in the Science & technology section of the print edition under the headline "Bye, bye, little Susy"


Reality Might Be a Simulation, Scientists Think It's Possible to Find out for Sure – Road to VR

Posted: at 11:48 am

If you're interested in VR, you've probably thought at least once or twice about the simulation hypothesis: the idea that we might actually already be living in a virtual reality world. Many people are passingly familiar with the idea, especially thanks to films like The Matrix, and it's been a topic among philosophers, in some form or another, for perhaps more than a millennium. But did you know that scientists actually think it may be possible to experimentally verify if we're living in a simulation?

The simulation hypothesis was boiled down into a useful thought experiment by University of Oxford philosopher Nick Bostrom in a 2003 paper titled "Are You Living in a Computer Simulation?", which was published in the peer-reviewed Philosophical Quarterly journal.

In the paper, Bostrom explores the idea that, given existing trends in computing power, a far-future posthuman civilization will likely wield immense computing power, enough to be easily capable of running simulations of billions of universes just like ours. He raises the question: if we think humanity will one day be capable of simulating billions of universes, isn't it likely that we're already living in one of those billions of simulations rather than being real ourselves?

It's an intriguing formulation of the simulation hypothesis that's frankly quite difficult to argue against. Bostrom's paper has spurred serious discussion about the topic; it's been cited by more than 1,000 other academic papers since its publication.

Beyond philosophers, scientists have taken the simulation hypothesis seriously too, especially in the mysterious realm of quantum physics. Several papers have hypothesized ways of actually testing if our reality is a simulation.

In the 2012 paper "Constraints on the Universe as a Numerical Simulation", published in the peer-reviewed European Physical Journal A, physicists Silas R. Beane, Zohreh Davoudi, and Martin J. Savage write that recent developments in simulating quantum interactions point toward a future where a full-fledged universe simulation is possible, which suggests that experimental searches for evidence that our universe is, in fact, a simulation are both interesting and logical.

According to the authors, quantum computing looks like a reasonable foundation for simulating an entire universe. But like any program, a simulated universe will have some fundamental limitations of precision. If our reality is based on a quantum computing simulation, the authors argue, we should be able to predict some of those fundamental limitations and then go searching for them in nature.

Specifically, the authors say they're looking at the possibility that the simulations "[...] employ an underlying cubic lattice structure", which is foundationally similar to small-scale quantum computing-based simulations that humanity is capable of running today. If we could observe limitations in our reality that are consistent with an underlying lattice structure for space-time, instead of a continuous space-time, the authors say it could be evidence that our universe is indeed a simulation.

The authors leave us with a tantalizing conclusion: that it may be impossible for a simulation to be fully hidden from its subjects.

"[...] assuming that the universe is finite and therefore the resources of potential simulators are finite, then a volume containing a simulation will be finite and a lattice spacing must be non-zero, and therefore in principle there always remains the possibility for the simulated to discover the simulators."

In the 2017 paper "On Testing the Simulation Theory", published in the peer-reviewed International Journal of Quantum Foundations, authors Tom Campbell, Houman Owhadi, Joe Sauvageau, and David Watkinson start with a similar premise to the above conclusion: that a simulated universe likely operates with finite resources. If that's the case, they argue, we should be looking for evidence that the behavior of our universe is consistent with a simulation optimized for computing performance.

The paper introduces a concept that will be familiar to game developers: as a matter of optimization, to run a game with finite computing power, games only render what the player can see at any given moment. Anything more would be a waste that would drastically slow down the game.

The authors point out that physicists are already aware of a feature of the universe that seems suspiciously similar to rendering a game only where the player is looking. That would be the so-called Wave Function Collapse, in which fundamental particles appear to act as wave functions up until the point that they are observed, at which point their wave characteristics collapse into predictable particle interactions.

The paper lays out a number of specific variations of the perplexing Double-slit Experiment that are designed to isolate the precise role of the observer in determining the experimental outcome. The ultimate goal of the experiments is to look for a situation in which the universe would change its behavior in order to avoid creating a paradox. If this was observed, the authors argue, it would be an indicator of a VR engine [simulated universe] reacting to the intent of the experiment.

Further the authors suggest that finding a conflict between likely requirements of any such simulation (logical consistency and avoidance of detection) could reveal observations consistent with a simulated universe.

Two strategies can be followed to test the simulation theory: (1) Test the moment of rendering; (2) Exploit conflicting requirement of logical consistency preservation and detection avoidance to force the VR rendering engine to create discontinuities in its rendering or produce a measurable signature event within our reality that indicates that our reality must be simulated, the authors write.


Deloitte’s quantum computing leader on the technology’s healthcare future – Healthcare IT News

Posted: August 22, 2021 at 3:44 pm

Quantum computing has enormous potential in healthcare and has started to impact the industry in various ways.

For example, quantum computing offers the ability to track and diagnose disease. Using sensors, quantum technology has the ability to track the progress of cancer treatments and diagnose and monitor such degenerative diseases as multiple sclerosis.

The tech also can help modernize supply chains. Quantum technology can solve routing issues in real time using live data such as weather and traffic updates to help determine the most efficient method of delivery. This would have been particularly helpful during the pandemic since many states had issues with vaccine deliveries.

Elsewhere, quantum technology can impact early-stage drug discovery. Pharmaceuticals can take a decade or longer to bring to market. Quantum computing could lower the costs and reduce the time.

"In the simplest terms, quantum computing harnesses the mysterious properties of quantum mechanics to solve problems using individual atoms and subatomic particles," explained Scott Buchholz, emerging technology research director and government and public services CTO at Deloitte Consulting. "Quantum computers can be thought of as akin to supercomputers.

"However, today's supercomputers solve problems by performing trillions of math calculations very quickly to predict the weather, study air flow over wings, etc.," he continued. "Quantum computers work very differently they perform calculations all at once, limited by the number of qubitsof information that they currently hold."

Because of how differently they work, they aren't well suited for all problems, but they're a fit for certain types of problems, such as molecular simulation, optimization and machine learning.

"What's important to note is that today's most advanced quantum computers still aren't especially powerful," Buchholz noted.

"Many calculations they currently can do can be performed on a laptop computer. However, if quantum computers continue to scale exponentially that is, the number of qubitsthey use for computation continues to double every year or so they will become dramatically more powerful in years to come.

"Because quantum computers can simulate atoms and other molecules much better than classical computers, researchers are investigating the future feasibility of doing drug discovery, target protein matching, calculating protein folding and more," he continued.

"That is, during the drug discovery process, they could be useful to dramatically reduce the time required to sort through existing databases of molecules to look for targets, identify potential new drugs with novel properties, identify potential new targets and more."

Researchers also are investigating the possibility of simulating or optimizing manufacturing processes for molecules, which potentially could help make scaling up manufacturing easier over time. While these advances won't eliminate the lengthy testing process, they may well accelerate the initial discovery process for interesting molecules.

"Quantum computing may also directly and indirectly lead to the ability to diagnose disease," Buchholz said. "Given future machines' ability to sort through complex problems quickly, they may be able to accelerate the processing of some of the techniques that are being developed today, say those that are designed to identify harmful genetic mutations or combinations.

"Indirectly, some of the materials that were investigated for quantum computers turned out to be better as sensors," he added. "Researchers are investigating quantum-based technologies to make smaller, more sensitive, lower-power sensors. In the future, these sensors and exotic materials may be combined in clever ways to help with disease identification and diagnosis."

Quantum computers will improve the ability to optimize logistics and routing, potentially easing bottlenecks in supply chains or identifying areas of improvement, Buchholz said.

Perhaps more interestingly, due to their ability to simulate molecular interactions, researchers are looking at their ability to optimize manufacturing processes to be quicker, use less energy and produce less waste, he added. That could lead to alternative manufacturing techniques that could simplify healthcare supply chains, he noted.

"Ultimately, the promise of quantum computers is to make some things faster like optimization and machine learning and make some things practical like large scale molecular and process simulation," he said.

"While the technology to solve the 'at scale' problems is still several years in the future, researchers currently are working hard today to put the foundations in place to tackle these problems as the hardware capacity of quantum computers advances.

"Should the hardware researchers achieve some of the sought after scalability breakthroughs, that promise could accelerate," he concluded.



Urgent Warning Issued Over The Future Of Bitcoin Even As The Crypto Market Price Smashes Past $2 Trillion – Forbes

Posted: at 3:44 pm

Bitcoin and cryptocurrencies have seen a huge resurgence over the last year following the brutal so-called crypto winter that began in 2018.

The bitcoin price has this year climbed to never-before-seen highs, topping $60,000 per bitcoin before falling back slightly. Other smaller cryptocurrencies have risen at an even faster clip than bitcoin, with many making percentage gains into the thousands.

Now, as bitcoin and cryptocurrencies begin to carve out a place among traditional assets in investor portfolios, technologists have warned that advances in quantum computing could mean the encryption that underpins bitcoin is "fundamentally" undermined as soon as 2026, unless the software is updated.


The bitcoin price has risen many hundreds of percent over the last few years, but quantum computing could spell the end of bitcoin and cryptocurrencies unless urgent action is taken.

"Quantum computers, expected to be operational by around 2026, will easily undermine any blockchain security systems because of their power," says the founder of quantum encryption company Arqit, David Williams, speaking over the phone. Arqit is gearing up for a September SPAC listing in New York.

"There needs to be rather more urgency," Williams adds.

Quantum computing, which sees traditional computer "bits" replaced with quantum particles (qubits) that can calculate information at vastly increased speed, has been in development since the 1990s. Researchers at universities around the world are now on the verge of creating a working quantum computer, with search giant Google and scientists from the University of New South Wales in Sydney, Australia, recently making headlines with breakthroughs.

Williams, pointing to problems previously identified by the cofounder of ethereum and creator of cardano, Charles Hoskinson, warns that upgrading to post-quantum algorithms will "dramatically slow blockchains down" and calls for blockchain developers to adopt so-called quantum encryption keys.

"Blockchains are effectively fundamentally flawed if they dont address the oncoming quantum age. The grownups in the room know what's coming."

Others have also begun working on getting bitcoin and other blockchains ahead of quantum computing.

"If this isn't addressed before quantum computers pose a threat, the impact would be massive," says Duncan Jones, head of quantum cybersecurity at Cambridge Quantum Computing, speaking via email. "Attackers could create fraudulent transactions and steal currency, as well as potentially disrupting blockchain operations."

Earlier this month, Cambridge Quantum Computing, along with the Inter-American Development Bank and Tecnológico de Monterrey, identified four potential threats to blockchain networks posed by quantum computers and used a post-quantum cryptography layer to help protect them.


"Time is of the essence here," says Jones, pointing to Google chief executive Sundar Pichai's prediction that encryption could be broken in as little as five to 10 years. "It's important for decentralized networks to start this migration process soon because it requires careful planning and execution. However, I'm hopeful the core developers behind these platforms understand the issues and will be addressing them."

Recently, it's been reported that China is pulling ahead in the global quantum race, something Williams fears could undermine both traditional and crypto markets to the same degree as the 2008 global financial crisis.

"On day one, the creation of a quantum computer doesn't break everything," says Williams. "It will probably initially happen in secret and the information will slowly leak out that the cryptography has been broken. Then there will be a complete loss of confidence, similar to how the global financial crisis saw confidence in the system disintegrate."

With more than 11,000 different cryptocurrencies now listed on crypto data website CoinMarketCap and competition between bitcoin and other major cryptocurrencies reaching fever pitch, adding protection against the coming quantum revolution could be beneficial.

"If anyone one blockchain company could deliver proof it's quantum-safe it would have an advantage," says Williams.


Energy Department Sets $61M of Funding to Advance QIS Research – MeriTalk

Posted: at 3:44 pm

The U.S. Department of Energy (DOE) has announced $61 million in funding for infrastructure and research projects to advance quantum information science (QIS).

Specifically, the DOE is supplying $25 million in funding for creating quantum internet testbeds, which will advance foundational building blocks including devices, protocols, technology, and techniques for quantum error correction at the internet scale.

The DOE also is providing $6 million in funding for scientists to study and develop new devices to send and receive quantum network traffic and advance a continental-scale quantum internet.

Lastly, the DOE granted $30 million of funding to five DOE Nanoscale Science Research Centers to support cutting-edge infrastructure for nanoscience-based research to strengthen the United States' competitiveness in QIS and enable the development of nanotechnologies.

Harnessing the quantum world will create new forms of computers and accelerate our ability to process information and tackle complex problems like climate change, said U.S. Secretary of Energy Jennifer M. Granholm in a statement. DOE and our labs across the country are leading the way on this critical research that will strengthen our global competitiveness and help corner the markets of these growing industries that will deliver the solutions of the future.

The DOE recognized the advantages of QIS back in 2018 when it became an integral partner in the National Quantum Initiative, which became law in December 2018. Since then, the DOE Office of Science has launched a range of multidisciplinary research programs in QIS, including developing quantum computers as testbeds, designing new algorithms for quantum computing, and using quantum computing to model fundamental physics, chemistry, and materials phenomena.


This breakthrough paves the way for more powerful and compact quantum computers – Tech News Inc

Posted: at 3:44 pm


Australian engineers recently overcame a major hurdle, paving the way for the development of a new generation of more powerful and compact quantum computers.

Although impressive progress has been made in recent years in quantum computing, the simultaneous management of a large number of qubits remains a big challenge for this type of machine. In work published in the journal Science Advances, researchers from the University of New South Wales (UNSW) found a way to control millions of them at once.

Traditional computers store and process data in the form of binary bits (0 or 1). For their part, quantum machines use qubits, or quantum bits, which can exist in a simultaneous superposition of these two states, dramatically increasing computing power.

In silicon quantum processors, information is encoded in the spin of an electron (the property that makes it behave like a tiny magnet), with spin up and spin down representing ones and zeros. Control is generally achieved thanks to the magnetic field produced by wires arranged alongside the qubits. The problem: these wires take up a lot of space and also generate a lot of heat, currently limiting the number of qubits per chip to a few dozen.

The UNSW team recently developed a new approach for applying a magnetic field to a large number of qubits simultaneously. It is based on a crystal prism called a dielectric resonator, which is placed just above the silicon wafer. Microwaves directed into this prism have their wavelength reduced to less than a millimetre, creating a magnetic field that controls the spins of the qubits below.
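
The wavelength reduction comes from the crystal's high dielectric constant: inside a medium of relative permittivity εr, a microwave's wavelength shrinks by a factor of roughly √εr. The numbers below are illustrative assumptions for the sake of the sketch, not figures reported by the UNSW team.

```python
from math import sqrt

c = 3.0e8         # speed of light in vacuum, m/s
f = 12e9          # assumed drive frequency of ~12 GHz, typical for spin qubits
eps_r = 1000.0    # assumed relative permittivity of the resonator crystal at low temperature

wavelength_free_space = c / f
wavelength_in_crystal = wavelength_free_space / sqrt(eps_r)

print(f"free space: {wavelength_free_space * 1e3:.1f} mm")   # ~25 mm
print(f"in crystal: {wavelength_in_crystal * 1e3:.2f} mm")   # ~0.8 mm: sub-millimetre, as described above
```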

"Two major innovations are included here," explains Jarryd Pla, lead author of the study. "First, we don't need to use a lot of energy to get a strong magnetic field for the qubits, which means we don't produce a lot of heat. Second, the field produced turns out to be very homogeneous, so the millions of qubits on a silicon chip would all benefit from the same level of control."

So far, this field has made it possible to flip individual qubit states, and more work will be needed to achieve a superposition of the two states. According to the team, this method should eventually allow up to four million qubits to be controlled simultaneously.


IBM Partnering with University of Tokyo on Quantum Computer – Datamation

Posted: August 16, 2021 at 1:32 pm

TOKYO - IBM and the University of Tokyo have unveiled one of the most powerful quantum computers in Japan.

IBM Quantum System One is part of the Japan-IBM Quantum Partnership between the University of Tokyo and IBM to advance Japan's exploration of quantum science, business, and education, according to IBM last month.

IBM Quantum System One is now operational for researchers at both scientific institutions and businesses in Japan, with access administered by the University of Tokyo.

"IBM is committed to the growth of the global quantum ecosystem and fostering collaboration between different research communities," said Dr. Dario Gil, director, IBM Research.

The quantum computer gives users access to repeatable and predictable performance from high-quality qubits and high-precision control electronics, with quantum resources tightly coupled with classical processing, according to IBM. Users can securely run algorithms requiring repetition of quantum circuits in the cloud.

The IBM Quantum System One in Japan is the second system of its kind by IBM to be built outside the U.S. In June, IBM unveiled an IBM Quantum System One in Munich, Germany, which is administered by the Fraunhofer-Gesellschaft, a scientific research organization.

IBM's quantum efforts are intended to help advance quantum computing and develop a skilled quantum workforce worldwide.

Gil is excited to see the contributions to research that will be made by Japan's world-class academic, private sector, and government institutions.

Together, we can take major steps to accelerate scientific progress in a variety of fields, Gil said.

Teruo Fujii, president of the University of Tokyo, said that in the rapidly changing field of quantum technology, it is extremely important not only to develop quantum technology-related elements and systems, but also to foster the next generation of human resources in order to achieve advanced social implementation on a global scale.

Our university has a broad base of research talents and has been always promoting high-level quantum education from the undergraduate level. Now, we will further refine the development of the next generation of quantum native skill sets by utilizing IBM Quantum System One.

In 2020, IBM and the University of Tokyo launched the Quantum Innovation Initiative Consortium (QIIC), with the goal of strategically accelerating quantum computing research and development activities in Japan by bringing together academic talent from across the country's universities, research associations, and industry.

In the last year, IBM has also announced partnerships that include a focus on quantum information science and technology with several organizations: the Cleveland Clinic, the UK's Science and Technology Facilities Council, and the University of Illinois Urbana-Champaign.


A Simple Crystal Could Finally Give Us Large-Scale Quantum Computing, Scientists Say – ScienceAlert

Posted: at 1:32 pm

Vaccine and drug development, artificial intelligence, transport and logistics, climate science - these are all areas that stand to be transformed by the development of a full-scale quantum computer. And there has been explosive growth in quantum computing investment over the past decade.

Yet current quantum processors are relatively small in scale, with fewer than 100 qubits - the basic building blocks of a quantum computer. Bits are the smallest unit of information in computing, and the term qubits stems from "quantum bits".

While early quantum processors have been crucial for demonstrating the potential of quantum computing, realizing globally significant applications will likely require processors with upwards of a million qubits.

Our new research tackles a core problem at the heart of scaling up quantum computers: how do we go from controlling just a few qubits, to controlling millions? In research published today in Science Advances, we reveal a new technology that may offer a solution.

Quantum computers use qubits to hold and process quantum information. Unlike the bits of information in classical computers, qubits make use of the quantum properties of nature, known as "superposition" and "entanglement", to perform some calculations much faster than their classical counterparts.

Unlike a classical bit, which is represented by either 0 or 1, a qubit can exist in two states (that is, 0 and 1) at the same time. This is what we refer to as a superposition state.

Demonstrations by Google and others have shown even current, early-stage quantum computers can outperform the most powerful supercomputers on the planet for a highly specialized (albeit not particularly useful) task - reaching a milestone we call quantum supremacy.

Google's quantum computer, built from superconducting electrical circuits, had just 53 qubits and was cooled to a temperature close to -273°C in a high-tech refrigerator. This extreme temperature is needed to remove heat, which can introduce errors to the fragile qubits. While such demonstrations are important, the challenge now is to build quantum processors with many more qubits.

Major efforts are underway at UNSW Sydney to make quantum computers from the same material used in everyday computer chips: silicon. A conventional silicon chip is thumbnail-sized and packs in several billion bits, so the prospect of using this technology to build a quantum computer is compelling.

In silicon quantum processors, information is stored in individual electrons, which are trapped beneath small electrodes at the chip's surface. Specifically, the qubit is coded into the electron's spin. It can be pictured as a small compass inside the electron. The needle of the compass can point north or south, which represents the 0 and 1 states.

To set a qubit in a superposition state (both 0 and 1), an operation that occurs in all quantum computations, a control signal must be directed to the desired qubit. For qubits in silicon, this control signal is in the form of a microwave field, much like the ones used to carry phone calls over a 5G network. The microwaves interact with the electron and cause its spin (compass needle) to rotate.
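
In the standard mathematical picture (a generic sketch, not the authors' own code), driving a spin with microwaves for a given time rotates its state vector: a quarter-turn of the right duration produces the superposition described above, and a half-turn flips the qubit completely.

```python
import numpy as np

def rx(theta):
    """Rotation of a spin qubit about the x-axis by an angle theta (radians)."""
    return np.array([[np.cos(theta / 2), -1j * np.sin(theta / 2)],
                     [-1j * np.sin(theta / 2), np.cos(theta / 2)]])

spin_down = np.array([1, 0], dtype=complex)      # the "0" state

half_rotation = rx(np.pi / 2) @ spin_down        # equal superposition of 0 and 1
full_rotation = rx(np.pi) @ spin_down            # the "1" state (up to a phase)

print(np.abs(half_rotation) ** 2)                # [0.5 0.5]
print(np.abs(full_rotation) ** 2)                # ~[0. 1.]
```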

Currently, each qubit requires its own microwave control field. It is delivered to the quantum chip through a cable running from room temperature down to the bottom of the refrigerator at close to -273°C. Each cable brings heat with it, which must be removed before it reaches the quantum processor.

At around 50 qubits, which is state-of-the-art today, this is difficult but manageable. Current refrigerator technology can cope with the cable heat load. However, it represents a huge hurdle if we're to use systems with a million qubits or more.

An elegant solution to the challenge of how to deliver control signals to millions of spin qubits was proposed in the late 1990s. The idea of "global control" was simple: broadcast a single microwave control field across the entire quantum processor.

Voltage pulses can be applied locally to qubit electrodes to make the individual qubits interact with the global field (and produce superposition states).

It's much easier to generate such voltage pulses on-chip than it is to generate multiple microwave fields. The solution requires only a single control cable and removes obtrusive on-chip microwave control circuitry.

For more than two decades global control in quantum computers remained an idea. Researchers could not devise a suitable technology that could be integrated with a quantum chip and generate microwave fields at suitably low powers.

In our work we show that a component known as a dielectric resonator could finally allow this. The dielectric resonator is a small, transparent crystal which traps microwaves for a short period of time.

The trapping of microwaves, a phenomenon known as resonance, allows them to interact with the spin qubits longer and greatly reduces the power of microwaves needed to generate the control field. This was vital to operating the technology inside the refrigerator.

In our experiment, we used the dielectric resonator to generate a control field over an area that could contain up to four million qubits. The quantum chip used in this demonstration was a device with two qubits. We were able to show the microwaves produced by the crystal could flip the spin state of each one.

There is still work to be done before this technology is up to the task of controlling a million qubits. For our study, we managed to flip the state of the qubits, but not yet produce arbitrary superposition states.

Experiments are ongoing to demonstrate this critical capability. We'll also need to further study the impact of the dielectric resonator on other aspects of the quantum processor.

That said, we believe these engineering challenges will ultimately be surmountable - clearing one of the greatest hurdles to realizing a large-scale spin-based quantum computer.

Jarryd Pla, Senior Lecturer in Quantum Engineering, UNSW and Andrew Dzurak, Scientia Professor in Quantum Engineering, UNSW.

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Quantum Takes the Scenic Route in Automotive – The Next Platform

Posted: at 1:32 pm

The automotive industry has shown keen and early interest in quantum computing over the last several years, beginning most notably with Volkswagen, which rolled out a traffic simulation system with its hardware partner, D-Wave. That was in 2017, and while there have been a few other stories focused on quantum's role in everything from traffic to designing better fuel technologies, the automotive momentum for quantum seems stuck in the slow lane.

Think tanks like McKinsey, for instance, see a vibrant role for quantum in everything from pushing the EV arena forward to working optimization magic for auto warehousing, dealers, repair shops, and supply chain management. Still, much of this seems far off in terms of broad commercial integration.

For quantum to jump into the daily express lane for the automotive industry, it might take a piecemeal approach, with certain elements of vehicle design or use finding a fit on quantum systems. But there has to be cause to put forth the software and time investment (better, cheaper, faster, etc.).

One example might be in a specific aspect of vehicle design: modeling drive cycles, an aspect that directly relates to a car's efficiency and operation. This has long since been the domain of Fourier transform simulations on high performance computing systems. But this problem appears to be well-suited to gate-based quantum systems, as recently demonstrated.

Using IBM-Q quantum services, a team was able to reach Fourier-driven drive cycle modeling results faster via a 15-qubit run on the IBM-Q16 Melbourne quantum simulator, paving the way for other workloads based on the Fourier transform for quantum machines. These possible future uses range from solving PDEs used in various HPC areas to signal processing, compression, acoustics, and other areas.
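
For orientation, the quantum Fourier transform (QFT) at the heart of that approach is, up to qubit ordering and sign conventions, the same discrete Fourier transform used classically, applied to the vector of qubit amplitudes. A minimal numpy sketch of that unitary (purely illustrative, not the team's IBM-Q implementation) is below.

```python
import numpy as np

def qft_matrix(n_qubits):
    """Unitary matrix implemented by the quantum Fourier transform on n_qubits."""
    dim = 2 ** n_qubits
    omega = np.exp(2j * np.pi / dim)
    j, k = np.meshgrid(np.arange(dim), np.arange(dim))
    return omega ** (j * k) / np.sqrt(dim)

qft = qft_matrix(3)                                # 8x8 unitary acting on 3 qubits

signal = np.random.rand(8).astype(complex)
signal /= np.linalg.norm(signal)                   # amplitudes must form a unit vector

spectrum = qft @ signal                            # a (normalised) DFT of the signal
print(np.allclose(qft.conj().T @ qft, np.eye(8)))  # True: the transform is unitary
```

On a real device the same transform is decomposed into Hadamard and controlled-phase gates rather than stored as an explicit matrix, which is where the potential quantum speedup comes from.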

While their results are promising, this is still a small quantum simulator and the team observed significant noise in the process, which meant they had to create and use error correction mechanisms. This is one of the most important barriers to practical quantum computing.

Current quantum computers are known to have errors, and in the era of NISQ, it is imperative to develop methods that can achieve quantum speedups despite these errors. The study proposed a simple error correction method to estimate the probabilities consistent with the QFT without compromising the computational complexity. The method was able to recover the probabilities reasonably well.

While this quantum simulation work for Fourier transforms is promising, the team behind the results says that in transportation in particular, the scalability of quantum systems is far from ready for large-scale programs that could have real-world implications. For instance, they say, even a modest network of 1000 vehicles and 64 road sections would require 6000 qubits, which would be extremely cost prohibitive.
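
That qubit count appears to follow from a simple encoding argument (our reading of the figure, not spelled out above): assigning each vehicle one of 64 road sections takes log2(64) = 6 qubits, so

```latex
1000 \text{ vehicles} \times \log_{2} 64 = 1000 \times 6 = 6000 \text{ qubits.}
```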

Despite clear limitations, they add that we are nonetheless embarking on an exciting frontier of quantum computing that has significant implications on vehicle dynamics, transportation planning and traffic management. These could help with identifying issues quickly and rapidly determining optimal responses, which could in turn help reduce congestion, emissions and improve safety.

As for optimistic McKinsey, they see opportunities for automotive every step of the way, from component design to global supply chains. It will just take a while, a long while.

They estimate one-fifth of companies in the QC value chain provide enabling solutions. Their offerings include existing components, such as cooling units, processing tools for making qubits, and the materials that compose qubits. This area could become a potential playing field for some upstream automotive suppliers, including tier-two and tier-three vendors, which produce control units and thermal solutions that are potentially transferrable to quantum computers.

They add, Automotive suppliers will not immediately profit from large-scale-production opportunities, since QC is still in its infancy, but they will over the long term. We expect enablers to become more relevant as the QC industry matures, gains scale, and one hardware approach begins to dominate.



Quantum Computing Tech is Amazing. But What Does Business Think? – DesignNews

Posted: August 14, 2021 at 1:16 am

Recent scientific and technological breakthroughs in quantum computing hardware and software demonstrate the commercial viability of quantum computers. Specifically, Honeywell and Cambridge Quantum just announced three scientific and technical milestones that significantly move large-scale quantum computing into the commercial world.

These milestones include demonstrated real-time quantum error correction (QEC), doubling the quantum volume of Honeywell's System H1 to 1,024, and developing a new quantum algorithm that uses fewer qubits to solve optimization problems. Let's break each of these topical areas down into understandable bits of information.


Optical signal conditioning used on quantum computers.

Real-time quantum error correction (QEC) is used in quantum computing to protect the information from errors due to decoherence and other quantum noise. Quantum decoherence is the loss of coherence. Decoherence can be viewed as the loss of information from a system into the environment. Quantum coherence is needed to perform computing on quantum information encoded in quantum states.


In contrast, classical error correction employs redundancy. The simplest way to achieve redundancy is to store the information multiple times in memory and then constantly compare the information to determine if corruption has occurred.
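
A minimal sketch of that classical redundancy idea (illustrative only, not Honeywell's scheme): store each bit three times and recover it with a majority vote, which survives any single bit-flip.

```python
from collections import Counter

def encode(bit):
    """Triple-repetition encoding of one classical bit."""
    return [bit, bit, bit]

def decode(copies):
    """Majority vote: recovers the original bit despite one corrupted copy."""
    return Counter(copies).most_common(1)[0][0]

codeword = encode(1)        # [1, 1, 1]
codeword[0] ^= 1            # a single bit-flip error: [0, 1, 1]
print(decode(codeword))     # 1 -- the error is corrected
```

Quantum information cannot simply be copied this way (the no-cloning theorem), which is one reason the logical qubit described below spreads one qubit's worth of information across seven physical qubits instead.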

Another difference between classical and quantum error correction is one of continuity. In classic error correction, the bit is either a 1 or a 0, i.e., it is either flipped on or off. However, errors are continuous in the quantum state. Continuous errors can occur on a qubit, in which a qubit is partially flipped, or the phase is partially changed.

Honeywell researchers have addressed quantum error correction by creating a single logical qubit from seven of the ten physical qubits available on the H1 Model and then applying multiple rounds of QEC. Protected from the main types of errors that occur in a quantum computer, the logical qubit combats errors that accumulate during computations.

Quantum Volume (QV) is the other key metric used to gauge quantum computing performance. QV is a single number meant to encapsulate the performance of quantum computers, like a classical computer's transistor count in Moore's Law.

QV is a hardware-agnostic metric that IBM initially used to measure the performance of its quantum computers. This metric was needed since a classical computer's transistor count and a quantum computer's quantum bit count aren't comparable. Qubits decohere, forgetting their assigned quantum information in less than a millisecond. For quantum computers to be commercially viable and useful, they must have low-error, highly connected, and scalable qubits to ensure a fault-tolerant and reliable system. That is why QV now serves as a benchmark for the progress being made by quantum computers to solve real-world problems.

According to Honeywell's recent release, the System Model H1 has become the first to achieve a demonstrated quantum volume of 1,024. This QV represents a doubling of its record from just four months ago.
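
As background (IBM's convention rather than something stated in the release), quantum volume is quoted as a power of two: QV = 2^n, where n is the size of the largest "square" random circuit, n qubits wide and n layers deep, that the machine runs with sufficient fidelity. So

```latex
\mathrm{QV} = 1024 = 2^{10} \;\Rightarrow\; n = 10,
\qquad \text{while the previous record of } 512 = 2^{9} \text{ corresponded to } n = 9.
```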

The third milestone comes from Cambridge Quantum Computing, which recently merged with Honeywell: it has developed a new quantum algorithm that uses fewer qubits to solve optimization problems.

Honeywell and Cambridge Quantum Computing (CQC) have met three key quantum milestones with the Model H1 systems.

John Blyler is a Design News senior editor, covering the electronics and advanced manufacturing spaces. With a BS in Engineering Physics and an MS in Electrical Engineering, he has years of hardware-software-network systems experience as an editor and engineer within the advanced manufacturing, IoT and semiconductor industries. John has co-authored books related to system engineering and electronics for IEEE, Wiley, and Elsevier.

