by Chris Woodford. Last updated: March 9, 2018.
How can you get more and more out of less and less? The smaller computers get, the more powerful they seem to become: there's more number-crunching ability in a 21st-century cellphone than you'd have found in a room-sized, military computer 50 years ago. Yet, despite such amazing advances, there are still plenty of complex problems that are beyond the reach of even the world's most powerful computers, and there's no guarantee we'll ever be able to tackle them. One problem is that the basic switching and memory units of computers, known as transistors, are now approaching the point where they'll soon be as small as individual atoms. If we want computers that are smaller and more powerful than today's, we'll soon need to do our computing in a radically different way. Entering the realm of atoms opens up powerful new possibilities in the shape of quantum computing, with processors that could work millions of times faster than the ones we use today. Sounds amazing, but the trouble is that quantum computing is hugely more complex than traditional computing and operates in the Alice in Wonderland world of quantum physics, where the "classical," sensible, everyday laws of physics no longer apply. What is quantum computing and how does it work? Let's take a closer look!
Photo: Quantum computing means storing and processing information using individual atoms, ions, electrons, or photons. On the plus side, this opens up the possibility of faster computers, but the drawback is the greater complexity of designing computers that can operate in the weird world of quantum physics.
You probably think of a computer as a neat little gadget that sits on your lap and lets you send emails, shop online, chat to your friends, or play games, but it's much more, and much less, than that. It's more, because it's a completely general-purpose machine: you can make it do virtually anything you like. It's less, because inside it's little more than an extremely basic calculator, following a prearranged set of instructions called a program. Like the Wizard of Oz, the amazing things you see in front of you conceal some pretty mundane stuff under the covers.
Photo: This is what one transistor from a typical radio circuit board looks like. In computers, the transistors are much smaller than this and millions of them are packaged together onto microchips.
Conventional computers have two tricks that they do really well: they can store numbers in memory and they can process stored numbers with simple mathematical operations (like add and subtract). They can do more complex things by stringing together the simple operations into a series called an algorithm (multiplying can be done as a series of additions, for example). Both of a computer's key tricks, storage and processing, are accomplished using switches called transistors, which are like microscopic versions of the switches you have on your wall for turning the lights on and off. A transistor can either be on or off, just as a light can either be lit or unlit. If it's on, we can use a transistor to store a number one (1); if it's off, it stores a number zero (0). Long strings of ones and zeros can be used to store any number, letter, or symbol using a code based on binary (so computers store an upper-case letter A as 01000001 and a lower-case one as 01100001). Each of the zeros or ones is called a binary digit (or bit) and, with a string of eight bits, you can store 256 different characters (such as A-Z, a-z, 0-9, and most common symbols). Computers calculate by using circuits called logic gates, which are made from a number of transistors connected together. Logic gates compare patterns of bits, stored in temporary memories called registers, and then turn them into new patterns of bits, and that's the computer equivalent of what our human brains would call addition, subtraction, or multiplication. In physical terms, the algorithm that performs a particular calculation takes the form of an electronic circuit made from a number of logic gates, with the output from one gate feeding in as the input to the next.
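To make those two tricks concrete, here's a minimal sketch in Python (not from the original article; the function names are illustrative): characters stored as eight-bit patterns, and addition built entirely from the same simple logic gates described above, with the carry from each gate feeding into the next.

```python
def to_bits(ch):
    """Return the 8-bit binary code for a character, e.g. 'A' -> '01000001'."""
    return format(ord(ch), "08b")

def half_adder(a, b):
    """Add two single bits: XOR gives the sum bit, AND gives the carry."""
    return a ^ b, a & b

def full_adder(a, b, carry_in):
    """Chain two half adders (plus an OR) to add three bits."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, c1 | c2

def add_bits(x, y, width=8):
    """Ripple-carry addition: the output of one gate is the input to the next."""
    result_bits, carry = [], 0
    for i in range(width):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result_bits.append(s)
    return sum(bit << i for i, bit in enumerate(result_bits))

print(to_bits("A"), to_bits("a"))  # 01000001 01100001
print(add_bits(42, 17))            # 59
```

The point of the sketch is that "calculation" really is just gates rearranging bit patterns, exactly as the paragraph describes.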
The trouble with conventional computers is that they depend on conventional transistors. This might not sound like a problem if you go by the amazing progress made in electronics over the last few decades. When the transistor was invented, back in 1947, the switch it replaced (which was called the vacuum tube) was about as big as one of your thumbs. Now, a state-of-the-art microprocessor (single-chip computer) packs hundreds of millions (and up to two billion) transistors onto a chip of silicon the size of your fingernail! Chips like these, which are called integrated circuits, are an incredible feat of miniaturization. Back in the 1960s, Intel co-founder Gordon Moore realized that the power of computers doubles roughly every 18 months, and it's been doing so ever since. This apparently unshakeable trend is known as Moore's Law.
Photo: This memory chip from a typical USB stick contains an integrated circuit that can store 512 megabytes of data. That's roughly 500 million characters (536,870,912 to be exact), each of which needs eight binary digits, so we're talking about 4 billion (4,000 million) transistors in all (4,294,967,296 if you're being picky) packed into an area the size of a postage stamp!
It sounds amazing, and it is, but it misses the point. The more information you need to store, the more binary ones and zeros, and transistors, you need to do it. Since most conventional computers can only do one thing at a time, the more complex the problem you want them to solve, the more steps they'll need to take and the longer they'll need to do it. Some computing problems are so complex that they need more computing power and time than any modern machine could reasonably supply; computer scientists call those intractable problems.
As Moore's Law advances, so the number of intractable problems diminishes: computers get more powerful and we can do more with them. The trouble is, transistors are just about as small as we can make them: we're getting to the point where the laws of physics seem likely to put a stop to Moore's Law. Unfortunately, there are still hugely difficult computing problems we can't tackle because even the most powerful computers find them intractable. That's one of the reasons why people are now getting interested in quantum computing.
Things on a very small scale behave like nothing you have any direct experience about... or like anything that you have ever seen.
Richard Feynman
Quantum theory is the branch of physics that deals with the world of atoms and the smaller (subatomic) particles inside them. You might think atoms behave the same way as everything else in the world, in their own tiny little way, but that's not true: on the atomic scale, the rules change and the "classical" laws of physics we take for granted in our everyday world no longer automatically apply. As Richard P. Feynman, one of the greatest physicists of the 20th century, once put it: "Things on a very small scale behave like nothing you have any direct experience about... or like anything that you have ever seen." (Six Easy Pieces, p116.)
If you've studied light, you may already know a bit about quantum theory. You might know that a beam of light sometimes behaves as though it's made up of particles (like a steady stream of cannonballs), and sometimes as though it's waves of energy rippling through space (a bit like waves on the sea). That's called wave-particle duality, and it's one of the ideas that comes to us from quantum theory. It's hard to grasp that something can be two things at once, a particle and a wave, because it's totally alien to our everyday experience: a car is not simultaneously a bicycle and a bus. In quantum theory, however, that's just the kind of crazy thing that can happen. The most striking example of this is the baffling riddle known as Schrödinger's cat. Briefly, in the weird world of quantum theory, we can imagine a situation where something like a cat could be alive and dead at the same time!
What does all this have to do with computers? Suppose we keep on pushing Moore's Law, keep on making transistors smaller until they get to the point where they obey not the ordinary laws of physics (like old-style transistors) but the more bizarre laws of quantum mechanics. The question is whether computers designed this way can do things our conventional computers can't. If we can predict mathematically that they might be able to, can we actually make them work like that in practice?
People have been asking those questions for several decades. Among the first were IBM research physicists Rolf Landauer and Charles H. Bennett. Landauer opened the door for quantum computing in the 1960s when he proposed that information is a physical entity that could be manipulated according to the laws of physics. One important consequence of this is that computers waste energy manipulating the bits inside them (which is partly why computers use so much energy and get so hot, even though they appear to be doing not very much at all). In the 1970s, building on Landauer's work, Bennett showed how a computer could circumvent this problem by working in a "reversible" way, implying that a quantum computer could carry out massively complex computations without using massive amounts of energy. In 1981, physicist Paul Benioff from Argonne National Laboratory tried to envisage a basic machine that would work in a similar way to an ordinary computer but according to the principles of quantum physics. The following year, Richard Feynman sketched out roughly how a machine using quantum principles could carry out basic computations. A few years later, Oxford University's David Deutsch (one of the leading lights in quantum computing) outlined the theoretical basis of a quantum computer in more detail. How did these great scientists imagine that quantum computers might work?
The key features of an ordinary computer (bits, registers, logic gates, algorithms, and so on) have analogous features in a quantum computer. Instead of bits, a quantum computer has quantum bits or qubits, which work in a particularly intriguing way. Where a bit can store either a zero or a one, a qubit can store a zero, a one, both zero and one, or an infinite number of values in between, and be in multiple states (store multiple values) at the same time! If that sounds confusing, think back to light being a particle and a wave at the same time, Schrödinger's cat being alive and dead, or a car being a bicycle and a bus. A gentler way to think of the numbers qubits store is through the physics concept of superposition (where two waves add to make a third one that contains both of the originals). If you blow on something like a flute, the pipe fills up with a standing wave: a wave made up of a fundamental frequency (the basic note you're playing) and lots of overtones or harmonics (higher-frequency multiples of the fundamental). The wave inside the pipe contains all these waves simultaneously: they're added together to make a combined wave that includes them all. Qubits use superposition to represent multiple states (multiple numeric values) simultaneously in a similar way.
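One way to picture "both zero and one at once" is a toy classical simulation, a sketch, not real quantum hardware: a qubit modeled as a pair of amplitudes (alpha, beta), one weight for the zero state and one for the one state, whose squared magnitudes must add up to 1.

```python
import math

def equal_superposition():
    """A state with equal weight on |0> and |1>, like the flute's combined wave:
    both 'notes' are present in the state at the same time."""
    a = 1 / math.sqrt(2)
    return (a, a)  # (amplitude of |0>, amplitude of |1>)

def probabilities(state):
    """The squared magnitude of each amplitude is the chance of seeing
    that value if you look (measure)."""
    alpha, beta = state
    return abs(alpha) ** 2, abs(beta) ** 2

p0, p1 = probabilities(equal_superposition())
print(round(p0, 3), round(p1, 3))  # 0.5 0.5
```

The two amplitudes coexist in one state, just as the fundamental and its harmonics coexist in one standing wave; that is all "superposition" means here.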
Just as a quantum computer can store multiple numbers at once, so it can process them simultaneously. Instead of working in serial (doing a series of things one at a time in a sequence), it can work in parallel (doing multiple things at the same time). Only when you try to find out what state it's actually in at any given moment (by measuring it, in other words) does it "collapse" into one of its possible states, and that gives you the answer to your problem. Estimates suggest a quantum computer's ability to work in parallel would make it millions of times faster than any conventional computer... if only we could build it! So how would we do that?
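The "collapse" step can also be sketched classically (again, just an illustration, not a real quantum simulation): an n-qubit register in equal superposition represents all 2**n values at once, but measuring it hands you only one of them, picked at random.

```python
import random

def measure(n_qubits, rng=None):
    """Sample one basis state from the 2**n equally weighted possibilities."""
    rng = rng or random.Random()
    n_states = 2 ** n_qubits
    amplitudes = [1 / n_states ** 0.5] * n_states   # equal superposition
    probs = [abs(a) ** 2 for a in amplitudes]       # probability = |amplitude|^2
    return rng.choices(range(n_states), weights=probs)[0]

outcome = measure(3)
print(f"register collapsed to |{outcome:03b}>")  # one of |000> through |111>
```

This is also why quantum algorithms are subtle to design: the register holds every value at once, yet a naive measurement gives you just one random answer, so the trick is arranging the computation so the answer you want is the likely one.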
In reality, qubits would have to be stored by atoms, ions (atoms with too many or too few electrons), or even smaller things such as electrons and photons (energy packets), so a quantum computer would be almost like a table-top version of the kind of particle physics experiments they do at Fermilab or CERN. Now you wouldn't be racing particles round giant loops and smashing them together, but you would need mechanisms for containing atoms, ions, or subatomic particles, for putting them into certain states (so you can store information), knocking them into other states (so you can make them process information), and figuring out what their states are after particular operations have been performed.
Photo: A single atom can be trapped in an optical cavity (the space between mirrors) and controlled by precise pulses from laser beams.
In practice, there are lots of possible ways of containing atoms and changing their states using laser beams, electromagnetic fields, radio waves, and an assortment of other techniques. One method is to make qubits using quantum dots, which are nanoscopically tiny particles of semiconductors inside which individual charge carriers, electrons and holes (missing electrons), can be controlled. Another method makes qubits from what are called ion traps: you add or take away electrons from an atom to make an ion, hold it steady in a kind of laser spotlight (so it's locked in place like a nanoscopic rabbit dancing in a very bright headlight), and then flip it into different states with laser pulses. In another technique, the qubits are photons inside optical cavities (spaces between extremely tiny mirrors). Don't worry if you don't understand; not many people do. Since the entire field of quantum computing is still largely abstract and theoretical, the only thing we really need to know is that qubits are stored by atoms or other quantum-scale particles that can exist in different states and be switched between them.
Although people often assume that quantum computers must automatically be better than conventional ones, that's by no means certain. So far, just about the only thing we know for certain that a quantum computer could do better than a normal one is factorization: finding two unknown prime numbers that, when multiplied together, give a third, known number. In 1994, while working at Bell Laboratories, mathematician Peter Shor demonstrated an algorithm that a quantum computer could follow to find the "prime factors" of a large number, which would speed up the problem enormously. Shor's algorithm really excited interest in quantum computing because virtually every modern computer (and every secure, online shopping and banking website) uses public-key encryption technology based on the virtual impossibility of finding prime factors quickly (it is, in other words, essentially an "intractable" computer problem). If quantum computers could indeed factor large numbers quickly, today's online security could be rendered obsolete at a stroke. But what goes around comes around, and some researchers believe quantum technology will lead to much stronger forms of encryption. (In 2017, Chinese researchers demonstrated for the first time how quantum encryption could be used to make a very secure video call from Beijing to Vienna.)
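To see why factoring is the interesting case, here's the classical approach, plain trial division (a sketch for illustration, not how production cryptography is attacked). It handles a small number like 15 instantly, but the work grows explosively with the size of the number, which is exactly the gap Shor's quantum algorithm would close.

```python
def prime_factors(n):
    """Return the prime factors of n by trial division, smallest first."""
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:   # divide out each prime as many times as it fits
            factors.append(d)
            n //= d
        d += 1
    if n > 1:               # whatever remains is itself prime
        factors.append(n)
    return factors

print(prime_factors(15))  # [3, 5]
```

For the hundreds-of-digits numbers used in real public-key encryption, this loop would run for longer than the age of the universe; that infeasibility is what the security rests on.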
Does that mean quantum computers are better than conventional ones? Not exactly. Apart from Shor's algorithm, and a search method called Grover's algorithm, hardly any other algorithms have been discovered that would be better performed by quantum methods. Given enough time and computing power, conventional computers should still be able to solve any problem that quantum computers could solve, eventually. In other words, it remains to be proven that quantum computers are generally superior to conventional ones, especially given the difficulties of actually building them. Who knows how conventional computers might advance in the next 50 years, potentially making the idea of quantum computers irrelevant, or even absurd.
Photo: Quantum dots are probably best known as colorful nanoscale crystals, but they can also be used as qubits in quantum computers. Photo courtesy of Argonne National Laboratory.
Three decades after they were first proposed, quantum computers remain largely theoretical. Even so, there's been some encouraging progress toward realizing a quantum machine. There were two impressive breakthroughs in 2000. First, Isaac Chuang (now an MIT professor, but then working at IBM's Almaden Research Center) used five fluorine atoms to make a crude, five-qubit quantum computer. The same year, researchers at Los Alamos National Laboratory figured out how to make a seven-qubit machine using a drop of liquid. Five years later, researchers at the University of Innsbruck added an extra qubit and produced the first quantum computer that could manipulate a qubyte (eight qubits).
These were tentative but important first steps. Over the next few years, researchers announced more ambitious experiments, adding progressively greater numbers of qubits. By 2011, a pioneering Canadian company called D-Wave Systems announced in Nature that it had produced a 128-qubit machine; the announcement proved highly controversial, and there was a lot of debate over whether the company's machines had really demonstrated quantum behavior. Three years later, Google announced that it was hiring a team of academics (including University of California at Santa Barbara physicist John Martinis) to develop its own quantum computers based on D-Wave's approach. In March 2015, the Google team announced they were "a step closer to quantum computation," having developed a new way for qubits to detect and protect against errors. In 2016, MIT's Isaac Chuang and scientists from the University of Innsbruck unveiled a five-qubit, ion-trap quantum computer that could calculate the factors of 15; one day, a scaled-up version of this machine might evolve into the long-promised, fully fledged encryption buster.
There's no doubt that these are hugely important advances, and the signs are growing steadily more encouraging that quantum technology will eventually deliver a computing revolution. In December 2017, Microsoft unveiled a complete quantum development kit, including a new computer language, Q#, developed specifically for quantum applications. In early 2018, D-Wave announced plans to start rolling out quantum power to a cloud computing platform. A few weeks later, Google announced Bristlecone, a quantum processor based on a 72-qubit array, that might, one day, form the cornerstone of a quantum computer that could tackle real-world problems. All very exciting! Even so, it's early days for the whole field, and most researchers agree that we're unlikely to see practical quantum computers appearing for some years, and more likely several decades.
Read the original here:
Quantum computing: A simple introduction - Explain that Stuff