by Chris Woodford. Last updated: February 18, 2017.
How can you get more and more out of less and less? The smaller computers get, the more powerful they seem to become: there's more number-crunching ability in a 21st-century cellphone than you'd have found in a room-sized, military computer 50 years ago. Yet, despite such amazing advances, there are still plenty of complex problems that are beyond the reach of even the world's most powerful computers, and there's no guarantee we'll ever be able to tackle them. One problem is that the basic switching and memory units of computers, known as transistors, are now approaching the point where they'll soon be as small as individual atoms. If we want computers that are smaller and more powerful than today's, we'll soon need to do our computing in a radically different way. Entering the realm of atoms opens up powerful new possibilities in the shape of quantum computing, with processors that could work millions of times faster than the ones we use today. Sounds amazing, but the trouble is that quantum computing is hugely more complex than traditional computing and operates in the Alice in Wonderland world of quantum physics, where the "classical," sensible, everyday laws of physics no longer apply. What is quantum computing and how does it work? Let's take a closer look!
Photo: Quantum computing means storing and processing information using individual atoms, ions, electrons, or photons. On the plus side, this opens up the possibility of faster computers, but the drawback is the greater complexity of designing computers that can operate in the weird world of quantum physics. Photo courtesy of US Department of Energy.
You probably think of a computer as a neat little gadget that sits on your lap and lets you send emails, shop online, chat to your friends, or play games, but it's much more and much less than that. It's more, because it's a completely general-purpose machine: you can make it do virtually anything you like. It's less, because inside it's little more than an extremely basic calculator, following a prearranged set of instructions called a program. Like the Wizard of Oz, the amazing things you see in front of you conceal some pretty mundane stuff under the covers.
Photo: This is what one transistor from a typical radio circuit board looks like. In computers, the transistors are much smaller than this and millions of them are packaged together onto microchips.
Conventional computers have two tricks that they do really well: they can store numbers in memory and they can process stored numbers with simple mathematical operations (like add and subtract). They can do more complex things by stringing together the simple operations into a series called an algorithm (multiplying can be done as a series of additions, for example). Both of a computer's key tricks (storage and processing) are accomplished using switches called transistors, which are like microscopic versions of the switches you have on your wall for turning on and off the lights. A transistor can either be on or off, just as a light can either be lit or unlit. If it's on, we can use a transistor to store a number one (1); if it's off, it stores a number zero (0). Long strings of ones and zeros can be used to store any number, letter, or symbol using a code based on binary (so computers store an upper-case letter A as 01000001 and a lower-case one as 01100001). Each of the zeros or ones is called a binary digit (or bit) and, with a string of eight bits, you can store 256 different characters (such as A-Z, a-z, 0-9, and most common symbols). Computers calculate by using circuits called logic gates, which are made from a number of transistors connected together. Logic gates compare patterns of bits, stored in temporary memories called registers, and then turn them into new patterns of bits, and that's the computer equivalent of what our human brains would call addition, subtraction, or multiplication. In physical terms, the algorithm that performs a particular calculation takes the form of an electronic circuit made from a number of logic gates, with the output from one gate feeding in as the input to the next.
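The two ideas above (binary codes for characters, and logic gates combining bits) can be sketched in a few lines of Python. The gate functions here are just Boolean operators standing in for physical transistor circuits; this is an illustration, not how real hardware is written.

```python
def to_bits(ch, width=8):
    """Return the binary (ASCII) code of a character as a string of bits."""
    return format(ord(ch), f"0{width}b")

def half_adder(a, b):
    """Add two single bits: an XOR gate gives the sum, an AND gate the carry."""
    return a ^ b, a & b  # (sum bit, carry bit)

print(to_bits("A"))      # 01000001
print(to_bits("a"))      # 01100001
print(half_adder(1, 1))  # (0, 1): in binary, 1 + 1 = 10
```

Chaining half adders (plus carry handling) into a full adder circuit is exactly the "logic gates wired into an algorithm" idea described above.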
The trouble with conventional computers is that they depend on conventional transistors. This might not sound like a problem if you go by the amazing progress made in electronics over the last few decades. When the transistor was invented, back in 1947, the switch it replaced (which was called the vacuum tube) was about as big as one of your thumbs. Now, a state-of-the-art microprocessor (single-chip computer) packs hundreds of millions (and up to two billion) transistors onto a chip of silicon the size of your fingernail! Chips like these, which are called integrated circuits, are an incredible feat of miniaturization. Back in the 1960s, Intel co-founder Gordon Moore realized that the power of computers doubles roughly every 18 months, and it's been doing so ever since. This apparently unshakeable trend is known as Moore's Law.
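A quick back-of-the-envelope calculation shows why that doubling is so dramatic. Using the article's 18-month figure (the period is illustrative; the arithmetic is the point):

```python
def growth_factor(years, doubling_period=1.5):
    """How many times capacity multiplies after `years` of doubling
    every `doubling_period` years (18 months = 1.5 years)."""
    return 2 ** (years / doubling_period)

# 30 years of doubling every 18 months is 20 doublings:
print(round(growth_factor(30)))  # 1048576, roughly a million-fold increase
```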
Photo: This memory chip from a typical USB stick contains an integrated circuit that can store 512 megabytes of data. That's roughly 500 million characters (536,870,912 to be exact), each of which needs eight binary digitsso we're talking about 4 billion (4,000 million) transistors in all (4,294,967,296 if you're being picky) packed into an area the size of a postage stamp!
It sounds amazing, and it is, but it misses the point. The more information you need to store, the more binary ones and zeros (and transistors) you need to do it. Since most conventional computers can only do one thing at a time, the more complex the problem you want them to solve, the more steps they'll need to take and the longer they'll need to do it. Some computing problems are so complex that they need more computing power and time than any modern machine could reasonably supply; computer scientists call those intractable problems.
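To see why intractability is about the problem rather than the machine, consider a brute-force search over n binary choices, which takes 2**n steps. The step counts below grow far faster than any hardware trend:

```python
def brute_force_steps(n_bits):
    """Steps needed to try every combination of n_bits binary choices."""
    return 2 ** n_bits

print(brute_force_steps(10))   # 1024: trivial
print(brute_force_steps(100))  # about 1.3e30: far beyond any real machine
# At 300 bits, the step count exceeds the estimated number of atoms
# in the observable universe.
```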
As Moore's Law advances, so the number of intractable problems diminishes: computers get more powerful and we can do more with them. The trouble is, transistors are just about as small as we can make them: we're getting to the point where the laws of physics seem likely to put a stop to Moore's Law. Unfortunately, there are still hugely difficult computing problems we can't tackle because even the most powerful computers find them intractable. That's one of the reasons why people are now getting interested in quantum computing.
Quantum theory is the branch of physics that deals with the world of atoms and the smaller (subatomic) particles inside them. You might think atoms behave the same way as everything else in the world, in their own tiny little way, but that's not true: on the atomic scale, the rules change and the "classical" laws of physics we take for granted in our everyday world no longer automatically apply. As Richard P. Feynman, one of the greatest physicists of the 20th century, once put it: "Things on a very small scale behave like nothing you have any direct experience about... or like anything that you have ever seen." (Six Easy Pieces, p116.)
If you've studied light, you may already know a bit about quantum theory. You might know that a beam of light sometimes behaves as though it's made up of particles (like a steady stream of cannonballs), and sometimes as though it's waves of energy rippling through space (a bit like waves on the sea). That's called wave-particle duality and it's one of the ideas that comes to us from quantum theory. It's hard to grasp that something can be two things at once (a particle and a wave) because it's totally alien to our everyday experience: a car is not simultaneously a bicycle and a bus. In quantum theory, however, that's just the kind of crazy thing that can happen. The most striking example of this is the baffling riddle known as Schrödinger's cat. Briefly, in the weird world of quantum theory, we can imagine a situation where something like a cat could be alive and dead at the same time!
What does all this have to do with computers? Suppose we keep on pushing Moore's Law: keep on making transistors smaller until they get to the point where they obey not the ordinary laws of physics (like old-style transistors) but the more bizarre laws of quantum mechanics. The question is whether computers designed this way can do things our conventional computers can't. If we can predict mathematically that they might be able to, can we actually make them work like that in practice?
People have been asking those questions for several decades. Among the first were IBM research physicists Rolf Landauer and Charles H. Bennett. Landauer opened the door for quantum computing in the 1960s when he proposed that information is a physical entity that could be manipulated according to the laws of physics. One important consequence of this is that computers waste energy manipulating the bits inside them (which is partly why computers use so much energy and get so hot, even though they appear to be doing not very much at all). In the 1970s, building on Landauer's work, Bennett showed how a computer could circumvent this problem by working in a "reversible" way, implying that a quantum computer could carry out massively complex computations without using massive amounts of energy. In 1981, physicist Paul Benioff from Argonne National Laboratory tried to envisage a basic machine that would work in a similar way to an ordinary computer but according to the principles of quantum physics. The following year, Richard Feynman sketched out roughly how a machine using quantum principles could carry out basic computations. A few years later, Oxford University's David Deutsch (one of the leading lights in quantum computing) outlined the theoretical basis of a quantum computer in more detail. How did these great scientists imagine that quantum computers might work?
The key features of an ordinary computer (bits, registers, logic gates, algorithms, and so on) have analogous features in a quantum computer. Instead of bits, a quantum computer has quantum bits or qubits, which work in a particularly intriguing way. Where a bit can store either a zero or a one, a qubit can store a zero, a one, both zero and one, or an infinite number of values in between, and be in multiple states (store multiple values) at the same time! If that sounds confusing, think back to light being a particle and a wave at the same time, Schrödinger's cat being alive and dead, or a car being a bicycle and a bus. A gentler way to think of the numbers qubits store is through the physics concept of superposition (where two waves add to make a third one that contains both of the originals). If you blow on something like a flute, the pipe fills up with a standing wave: a wave made up of a fundamental frequency (the basic note you're playing) and lots of overtones or harmonics (higher-frequency multiples of the fundamental). The wave inside the pipe contains all these waves simultaneously: they're added together to make a combined wave that includes them all. Qubits use superposition to represent multiple states (multiple numeric values) simultaneously in a similar way.
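In standard quantum notation, a single qubit's superposition is written as two amplitudes, one for the 0 state and one for the 1 state, whose squared magnitudes must sum to one. A minimal classical sketch of that bookkeeping (not any particular quantum library's API):

```python
import math

def measurement_probabilities(alpha, beta):
    """Probability of reading 0 or 1 when a qubit with amplitudes
    (alpha, beta) is measured; |alpha|^2 + |beta|^2 must equal 1."""
    return abs(alpha) ** 2, abs(beta) ** 2

# An equal superposition: "both zero and one at once".
alpha = beta = 1 / math.sqrt(2)
p0, p1 = measurement_probabilities(alpha, beta)
print(round(p0, 3), round(p1, 3))  # 0.5 0.5
```

Any pair of amplitudes between these extremes is allowed, which is the "infinite number of values in between" mentioned above.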
Just as a quantum computer can store multiple numbers at once, so it can process them simultaneously. Instead of working in serial (doing a series of things one at a time in a sequence), it can work in parallel (doing multiple things at the same time). Only when you try to find out what state it's actually in at any given moment (by measuring it, in other words) does it "collapse" into one of its possible states, and that gives you the answer to your problem. Estimates suggest a quantum computer's ability to work in parallel would make it millions of times faster than any conventional computer... if only we could build it! So how would we do that?
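The "collapse" on measurement can be simulated classically for a single qubit: each measurement yields 0 or 1 at random, with probabilities set by the amplitudes. A sketch (a toy simulation, not a real quantum computation):

```python
import math
import random

def measure(alpha, beta):
    """Collapse the state: return 0 with probability |alpha|^2, else 1."""
    return 0 if random.random() < abs(alpha) ** 2 else 1

random.seed(42)  # fixed seed so the demo is repeatable
alpha = beta = 1 / math.sqrt(2)  # equal superposition
counts = [measure(alpha, beta) for _ in range(1000)]
print(sum(counts))  # close to 500: about half the measurements read 1
```

Note that each individual measurement gives only one definite bit; the parallelism of a real quantum algorithm lies in steering the amplitudes so that the state you collapse into is likely to be the answer you want.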
In reality, qubits would have to be stored by atoms, ions (atoms with too many or too few electrons) or even smaller things such as electrons and photons (energy packets), so a quantum computer would be almost like a table-top version of the kind of particle physics experiments they do at Fermilab or CERN! Now you wouldn't be racing particles round giant loops and smashing them together, but you would need mechanisms for containing atoms, ions, or subatomic particles, for putting them into certain states (so you can store information), knocking them into other states (so you can make them process information), and figuring out what their states are after particular operations have been performed.
Photo: A single atom can be trapped in an optical cavity (the space between mirrors) and controlled by precise pulses from laser beams.
In practice, there are lots of possible ways of containing atoms and changing their states using laser beams, electromagnetic fields, radio waves, and an assortment of other techniques. One method is to make qubits using quantum dots, which are nanoscopically tiny particles of semiconductors inside which individual charge carriers, electrons and holes (missing electrons), can be controlled. Another method makes qubits from what are called ion traps: you add or take away electrons from an atom to make an ion, hold it steady in a kind of laser spotlight (so it's locked in place like a nanoscopic rabbit dancing in a very bright headlight), and then flip it into different states with laser pulses. In another technique, the qubits are photons inside optical cavities (spaces between extremely tiny mirrors). Don't worry if you don't understand; not many people do! Since the entire field of quantum computing is still largely abstract and theoretical, the only thing we really need to know is that qubits are stored by atoms or other quantum-scale particles that can exist in different states and be switched between them.
Although people often assume that quantum computers must automatically be better than conventional ones, that's by no means certain. So far, just about the only thing we know for certain that a quantum computer could do better than a normal one is factorisation: finding two unknown prime numbers that, when multiplied together, give a third, known number. In 1994, while working at Bell Laboratories, mathematician Peter Shor demonstrated an algorithm that a quantum computer could follow to find the "prime factors" of a large number, which would speed up the problem enormously. Shor's algorithm really excited interest in quantum computing because virtually every modern computer (and every secure, online shopping and banking website) uses public-key encryption technology based on the virtual impossibility of finding prime factors quickly (it is, in other words, essentially an "intractable" computer problem). If quantum computers could indeed factor large numbers quickly, today's online security could be rendered obsolete at a stroke.
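To see what Shor's algorithm is up against, here is the classical approach it would replace: factoring by trial division. This is easy for toy numbers, but the work grows with the square root of the number, which is hopeless for the hundreds-of-digit moduli used in public-key encryption; Shor's quantum algorithm would do the same job exponentially faster.

```python
def prime_factors(n):
    """Return the prime factors of n by trial division."""
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

print(prime_factors(15))    # [3, 5]
print(prime_factors(3233))  # [53, 61], a toy RSA-style modulus
```

Multiplying 53 by 61 is instant; recovering 53 and 61 from 3233 is the hard direction, and that asymmetry is exactly what public-key encryption relies on.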
Does that mean quantum computers are better than conventional ones? Not exactly. Apart from Shor's algorithm, and a search method called Grover's algorithm, hardly any other algorithms have been discovered that would be better performed by quantum methods. Given enough time and computing power, conventional computers should still be able to solve any problem that quantum computers could solve, eventually. In other words, it remains to be proven that quantum computers are generally superior to conventional ones, especially given the difficulties of actually building them. Who knows how conventional computers might advance in the next 50 years, potentially making the idea of quantum computers irrelevant, or even absurd.
Photo: Quantum dots are probably best known as colorful nanoscale crystals, but they can also be used as qubits in quantum computers. Photo courtesy of Argonne National Laboratory.
Three decades after they were first proposed, quantum computers remain largely theoretical. Even so, there's been some encouraging progress toward realizing a quantum machine. There were two impressive breakthroughs in 2000. First, Isaac Chuang (now an MIT professor, but then working at IBM's Almaden Research Center) used five fluorine atoms to make a crude, five-qubit quantum computer. The same year, researchers at Los Alamos National Laboratory figured out how to make a seven-qubit machine using a drop of liquid. Five years later, researchers at the University of Innsbruck added an extra qubit and produced the first quantum computer that could manipulate a qubyte (eight qubits).
These were tentative but important first steps. Over the next few years, researchers announced more ambitious experiments, adding progressively greater numbers of qubits. By 2011, a pioneering Canadian company called D-Wave Systems announced in Nature that it had produced a 128-qubit machine. Three years later, Google announced that it was hiring a team of academics (including University of California at Santa Barbara physicist John Martinis) to develop its own quantum computers based on D-Wave's approach. In March 2015, the Google team announced they were "a step closer to quantum computation," having developed a new way for qubits to detect and protect against errors. In 2016, MIT's Isaac Chuang and scientists from the University of Innsbruck unveiled a five-qubit, ion-trap quantum computer that could calculate the factors of 15; one day, a scaled-up version of this machine might evolve into the long-promised, fully fledged encryption buster! There's no doubt that these are hugely important advances. Even so, it's very early days for the whole field, and most researchers agree that we're unlikely to see practical quantum computers appearing for many years, perhaps even decades.
View original post here:
Quantum computing: A simple introduction - Explain that Stuff