In the mid-twentieth century, particle physicists were peering deeper into the history and makeup of the universe than ever before. Over time, their calculations became too complex to fit on a blackboard or to farm out to armies of human computers doing calculations by hand.
To deal with this, they developed some of the world's earliest electronic computers.
Physics has played an important role in the history of computing. The transistor, the switch that controls the flow of electrical signals within a computer, was invented by a group of physicists at Bell Labs. The incredible computational demands of particle physics and astrophysics experiments have consistently pushed the boundaries of what is possible. They have encouraged the development of new technologies to handle tasks from dealing with avalanches of data to simulating interactions on the scales of both the cosmos and the quantum realm.
But this influence doesn't just go one way. Computing plays an essential role in particle physics and astrophysics as well. As computing has grown increasingly sophisticated, its progress has enabled new scientific discoveries and breakthroughs.
Illustration by Sandbox Studio, Chicago with Ariel Davis
In 1973, scientists at Fermi National Accelerator Laboratory in Illinois got their first big mainframe computer: a 7-year-old hand-me-down from Lawrence Berkeley National Laboratory. Called the CDC 6600, it weighed about 6 tons. Over the next five years, Fermilab added five more large mainframe computers to its collection.
Then came the completion of the Tevatron (at the time, the world's highest-energy particle accelerator), which would provide the particle beams for numerous experiments at the lab. By the mid-1990s, two four-story particle detectors, the Collider Detector at Fermilab and the DZero detector, would begin selecting, storing and analyzing data from millions of particle collisions per second at the Tevatron. These new experiments threatened to overpower the lab's computational abilities.
In December of 1983, a committee of physicists and computer scientists released a 103-page report highlighting "the urgent need for an upgrading of the laboratory's computer facilities." The report said the lab should "continue the process of catching up" in terms of computing ability, and that this should remain the laboratory's top computing priority "for the next few years."
Instead of simply buying more large computers (which were incredibly expensive), the committee suggested a new approach: increasing computational power by distributing the burden over clusters or "farms" of hundreds of smaller computers.
Thanks to Intel's 1971 development of a new commercially available microprocessor the size of a domino, computers were shrinking. Fermilab was one of the first national labs to try the concept of clustering these smaller computers together, treating each particle collision as a computationally independent event that could be analyzed on its own processor.
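The farm idea works precisely because each collision event can be analyzed without reference to any other. A minimal sketch of that pattern, using Python's standard `multiprocessing` module and invented stand-in data (Fermilab's actual farm software was custom-built, so everything below is purely illustrative):

```python
from multiprocessing import Pool

def analyze_event(event):
    # Toy "analysis": total up the energies recorded in one collision event.
    # Each event is independent, so any worker can process any event.
    return sum(event)

if __name__ == "__main__":
    # Stand-in for detector data: each inner list is one collision event.
    events = [[1.2, 0.7, 3.1], [0.4, 2.2], [5.0, 0.1, 0.3, 1.1]]
    # Farm the events out across a pool of worker processes, just as the
    # lab farmed collisions out across a cluster of small computers.
    with Pool(processes=4) as pool:
        results = pool.map(analyze_event, events)
    print(results)
```

Because no event depends on another, adding more workers (or more machines) scales the throughput almost linearly, which is what made the "farm" approach competitive with a single big mainframe.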
Like many new ideas in science, it wasn't accepted without some pushback.
Joel Butler, a physicist at Fermilab who was on the computing committee, recalls, "There was a big fight about whether this was a good idea or a bad idea."
"A lot of people were enchanted with the big computers," he says. "They were impressive-looking and reliable, and people knew how to use them. And then along came this swarm of little tiny devices, packaged in breadbox-sized enclosures."
The computers were unfamiliar, and the companies building them weren't well established. On top of that, it wasn't clear how well the clustering strategy would work.
As for Butler? "I raised my hand [at a meeting] and said, 'Good idea,' and suddenly my entire career shifted from building detectors and beamlines to doing computing," he chuckles.
Not long afterward, an innovation sparked by the needs of particle physics enabled another leap in computing. In 1989, Tim Berners-Lee, a computer scientist at CERN, launched the World Wide Web to help CERN physicists share data with research collaborators all over the world.
To be clear, Berners-Lee didn't create the internet; that was already underway in the form of the ARPANET, developed by the US Department of Defense. But the ARPANET connected only a few hundred computers, and it was difficult to share information across machines with different operating systems.
The web Berners-Lee created was an application that ran on the internet, like email, and started as a collection of documents connected by hyperlinks. To get around the problem of accessing files between different types of computers, he developed HTML (HyperText Markup Language), a markup language that formatted and displayed files in a web browser independent of the local computer's operating system.
Berners-Lee also developed the first web browser, allowing users to access files stored on the first web server (Berners-Lee's computer at CERN). He implemented the concept of a URL (Uniform Resource Locator), specifying how and where to access desired web pages.
What started out as an internal project to help particle physicists share data within their institution fundamentally changed not just computing, but how most people experience the digital world today.
Back at Fermilab, cluster computing wound up working well for handling the Tevatron data. Eventually, it became an industry standard, adopted by tech giants like Google and Amazon.
Over the next decade, other US national laboratories adopted the idea, too. SLAC National Accelerator Laboratory (then called Stanford Linear Accelerator Center) transitioned from big mainframes to clusters of smaller computers to prepare for its own extremely data-hungry experiment, BaBar. Both SLAC and Fermilab were also early adopters of Berners-Lee's web server. The labs set up the first two websites in the United States, paving the way for this innovation to spread across the continent.
In 1989, in recognition of the growing importance of computing in physics, Fermilab Director John Peoples elevated the computing department to a full-fledged division. The head of a division reports directly to the lab director, making it easier to get resources and set priorities. Physicist Tom Nash formed the new Computing Division, along with Butler and two other scientists, Irwin Gaines and Victoria White. Butler led the division from 1994 to 1998.
"These computational systems worked well for particle physicists for a long time," says Berkeley Lab astrophysicist Peter Nugent. "That is, until Moore's Law started grinding to a halt."
Moore's Law is the observation that the number of transistors in a circuit doubles roughly every two years, making computers faster and cheaper. The term was coined in the mid-1970s, and the trend held reliably for decades. But now, computer manufacturers are starting to hit the physical limit of how many tiny transistors they can cram onto a single microchip.
Because of this, says Nugent, particle physicists have been looking to take advantage of high-performance computing instead.
Nugent says high-performance computing is "something more than a cluster, or a cloud-computing environment that you could get from Google or AWS, or at your local university."
"What it typically means," he says, "is that you have high-speed networking between computational nodes, allowing them to share information with each other very, very quickly. When you are computing on up to hundreds of thousands of nodes simultaneously, it massively speeds up the process."
On a single traditional computer, he says, 100 million CPU hours translates to more than 11,000 years of continuous calculations. But for scientists using a high-performance computing facility at Berkeley Lab, Argonne National Laboratory or Oak Ridge National Laboratory, 100 million hours is a typical, large allocation for one year.
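The arithmetic behind that 11,000-years figure is easy to check; a quick back-of-envelope sketch (the 100,000-node count below is just an illustrative assumption, echoing the "hundreds of thousands of nodes" mentioned above):

```python
cpu_hours = 100_000_000            # a typical large yearly allocation
hours_per_year = 24 * 365          # ~8,760 hours in a year

# On one core, the allocation would take this many years of nonstop work:
serial_years = cpu_hours / hours_per_year
print(f"{serial_years:,.0f} years")   # ≈ 11,416 years

# Spread over 100,000 nodes working in parallel, wall-clock time shrinks:
parallel_hours = cpu_hours / 100_000
print(f"{parallel_hours:,.0f} hours")  # 1,000 hours, about six weeks
```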
Although astrophysicists have always relied on high-performance computing for simulating the birth of stars or modeling the evolution of the cosmos, Nugent says they are now using it for their data analysis as well.
This includes rapid image-processing computations that have enabled the observations of several supernovae, including SN 2011fe, captured just after it began. "We found it just a few hours after it exploded, all because we were able to run these pipelines so efficiently and quickly," Nugent says.
According to Berkeley Lab physicist Paolo Calafiura, particle physicists also use high-performance computing for simulations: for modeling not the evolution of the cosmos, but rather what happens inside a particle detector. "Detector simulation is significantly the most computing-intensive problem that we have," he says.
Scientists need to evaluate multiple possibilities for what can happen when particles collide. To properly correct for detector effects when analyzing particle detector experiments, they need to simulate more data than they collect. "If you collect 1 billion collision events a year," Calafiura says, "you want to simulate 10 billion collision events."
Calafiura says that right now, he's more worried about finding a way to store all of the simulated and actual detector data than he is about producing it, but he knows that won't last.
"When does physics push computing?" he says. "When computing is not good enough. We see that in five years, computers will not be powerful enough for our problems, so we are pushing hard with some radically new ideas, and lots of detailed optimization work."
That's why the Department of Energy's Exascale Computing Project aims to build, in the next few years, computers capable of performing a quintillion (that is, a billion billion) operations per second. The new computers will be 1,000 times faster than the current fastest computers.
The exascale computers will also be used for other applications ranging from precision medicine to climate modeling to national security.
Innovations in computer hardware have enabled astrophysicists to push the kinds of simulations and analyses they can do. For example, Nugent says, the introduction of graphics processing units has sped up astrophysicists' ability to do calculations used in machine learning, leading to an explosive growth of machine learning in astrophysics.
With machine learning, which uses algorithms and statistics to identify patterns in data, astrophysicists can simulate entire universes in microseconds.
Machine learning has been important in particle physics as well, says Fermilab scientist Nhan Tran. "[Physicists] have very high-dimensional data, very complex data," he says. "Machine learning is an optimal way to find interesting structures in that data."
The same way a computer can be trained to tell the difference between cats and dogs in pictures, it can learn how to identify particles from physics datasets, distinguishing between things like pions and photons.
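As an illustration only, here is a toy version of that idea: a nearest-centroid classifier separating hypothetical "photon" and "pion" events using two invented shower-shape features. Real experiments use far larger datasets and far richer models (such as deep neural networks); every number below is made up for the sketch.

```python
def centroid(points):
    # Average each of the two features over a class's training examples.
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(2))

def classify(x, centroids):
    # Assign x to the class whose centroid is nearest (squared distance).
    return min(centroids,
               key=lambda label: sum((a - b) ** 2
                                     for a, b in zip(x, centroids[label])))

# Toy training data: (shower width, energy fraction) pairs, invented
# purely for illustration; photons here are narrow and energy-dense.
train = {
    "photon": [(0.2, 0.9), (0.3, 0.85), (0.25, 0.95)],
    "pion":   [(0.7, 0.5), (0.8, 0.45), (0.75, 0.55)],
}
centroids = {label: centroid(pts) for label, pts in train.items()}

print(classify((0.28, 0.9), centroids))   # lands near the photon centroid
print(classify((0.77, 0.5), centroids))   # lands near the pion centroid
```

The point mirrors the cats-and-dogs analogy: once the model has seen labeled examples, it can sort new, unlabeled events into the classes it learned.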
Tran says using computation this way can accelerate discovery. "As physicists, we've been able to learn a lot about particle physics and nature using non-machine-learning algorithms," he says. "But machine learning can drastically accelerate and augment that process, and potentially provide deeper insight into the data."
And while teams of researchers are busy building exascale computers, others are hard at work trying to build another type of supercomputer: the quantum computer.
Remember Moore's Law? Previously, engineers were able to make computer chips faster by shrinking the size of electrical circuits, reducing the amount of time it takes for electrical signals to travel. "Now our technology is so good that literally the distance between transistors is the size of an atom," Tran says. "So we can't keep scaling down the technology and expect the same gains we've seen in the past."
To get around this, some researchers are redefining how computation works at a fundamental level. Like, really fundamental.
The basic unit of data in a classical computer is called a bit, which can hold one of two values: 1, if it has an electrical signal, or 0, if it has none. But in quantum computing, data is stored in quantum systems: things like electrons, which have either up or down spins, or photons, which are polarized either vertically or horizontally. These data units are called qubits.
Here's where it gets weird. Through a quantum property called superposition, qubits have more than just two possible states. An electron can be up, down, or in a variety of states in between.
What does this mean for computing? A collection of three classical bits can exist in only one of eight possible configurations: 000, 001, 010, 011, 100, 101, 110 or 111. But through superposition, three qubits can be in all eight of these configurations at once. A quantum computer can use that information to tackle problems that are impossible to solve with a classical computer.
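A small sketch makes the counting concrete. The code below enumerates the eight classical configurations and represents an equal-superposition three-qubit state as a plain dictionary of amplitudes; this is a pedagogical toy, not a quantum simulator.

```python
import itertools
import math

# The eight classical configurations of three bits:
configs = ["".join(bits) for bits in itertools.product("01", repeat=3)]
print(configs)  # ['000', '001', '010', '011', '100', '101', '110', '111']

# A classical register holds exactly ONE of these at a time. A three-qubit
# register in equal superposition assigns an amplitude to EVERY configuration
# at once; measuring it yields each outcome with probability |amplitude|^2.
amplitude = 1 / math.sqrt(8)
state = {c: amplitude for c in configs}
probabilities = {c: abs(a) ** 2 for c, a in state.items()}
print(sum(probabilities.values()))  # total probability ≈ 1
```

The doubling is the key point: each added qubit doubles the number of amplitudes the state carries, which is why even a few dozen qubits describe more configurations than a classical machine can enumerate.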
Fermilab scientist Aaron Chou likens quantum problem-solving to throwing a pebble into a pond: "The ripples move through the water in every possible direction, simultaneously exploring all of the possible things that it might encounter."
In contrast, a classical computation can explore only one path at a time.
But this makes quantum computers faster than classical computers only when it comes to solving certain types of problems. "It's not like you can take any classical algorithm and put it on a quantum computer and make it better," says University of California, Santa Barbara physicist John Martinis, who helped build Google's quantum computer.
Although quantum computers work in a fundamentally different way than classical computers, designing and building them wouldn't be possible without traditional computing laying the foundation, Martinis says. "We're really piggybacking on a lot of the technology of the last 50 years or more."
The kinds of problems that are well suited to quantum computing are intrinsically quantum mechanical in nature, says Chou.
For instance, Martinis says, consider quantum chemistry. Solving quantum chemistry problems with classical computers is so difficult, he says, that 10 to 15% of the world's supercomputer usage is currently dedicated to the task. Quantum chemistry problems are hard for the very reason a quantum computer is powerful: to complete them, you have to consider all the different quantum-mechanical states of all the individual atoms involved.
Because making better quantum computers would be so useful in physics research, and because building them requires skills and knowledge that physicists possess, physicists are ramping up their quantum efforts. In the United States, the National Quantum Initiative Act of 2018 called for the National Institute of Standards and Technology, the National Science Foundation and the Department of Energy to support programs, centers and consortia devoted to quantum information science.
In the early days of computational physics, the line between who was a particle physicist and who was a computer scientist could be fuzzy. Physicists used commercially available microprocessors to build custom computers for experiments. They also wrote much of their own software, ranging from printer drivers to the programs that coordinated analysis across the clustered computers.
Nowadays, roles have somewhat shifted. Most physicists use commercially available devices and software, allowing them to focus more on the physics, Butler says. But some people, like Anshu Dubey, work right at the intersection of the two fields. Dubey is a computational scientist at Argonne National Laboratory who works with computational physicists.
When a physicist needs to computationally interpret or model a phenomenon, sometimes they will sign up a student or postdoc in their research group for a programming course or two and then ask them to write the code to do the job. Although these codes are mathematically complex, Dubey says, they aren't logically complex, making them relatively easy to write.
A simulation of a single physical phenomenon can be neatly packaged within fairly straightforward code. "But the real world doesn't want to cooperate with you in terms of its modularity and encapsularity," she says.
Multiple forces are always at play, so to accurately model real-world complexity, you have to use more complex software, ideally software that doesn't become impossible to maintain as it gets updated over time. "All of a sudden," says Dubey, "you start to require people who are creative in their own right, in terms of being able to architect software."
That's where people like Dubey come in. At Argonne, Dubey develops software that researchers use to model complex multi-physics systems, incorporating processes like fluid dynamics, radiation transfer and nuclear burning.
Hiring computer scientists for research projects in physics and other fields of science can be a challenge, Dubey says. Most funding agencies specify that research money can be used for hiring students and postdocs, but not for paying for software development or hiring dedicated engineers. "There is no viable career path in academia for people whose careers are like mine," she says.
In an ideal world, universities would establish endowed positions for a team of research software engineers in physics departments with a nontrivial amount of computational research, Dubey says. These engineers would write reliable, well-architected code, and their institutional knowledge would stay with a team.
Physics and computing have been closely intertwined for decades. However the two develop, whether toward new analyses using artificial intelligence, for example, or toward the creation of better and better quantum computers, it seems they will remain on this path together.
Originally published as "The coevolution of particle physics and computing" in Symmetry magazine.