Double eureka: Breakthroughs could lead to quantum ‘FM radio’ and the end of noise – The Next Web

A team of scientists from the University of Chicago discovered a method by which quantum states can be integrated and controlled in everyday electronics. The team's breakthrough research resulted in the experimental creation of what they're dubbing a quantum FM radio to transmit data over long distances. This feels like a eureka moment for quantum computing.

The team's work involves silicon carbide, a naturally occurring semiconductor used to make all sorts of electronics including light-emitting diodes (LEDs) and circuit boards. It's also used in rocketry due to its ability to withstand high temperatures, and in the production of sandpaper, presumably because it's coarse. What we're excited about is its potential as a conduit for controlling quantum states.

Today's quantum computers under the IBM/Google/MIT paradigm are giant, unwieldy things that absolutely won't fit on your desktop. They require lasers and sub-zero temperatures to function. You need a team of physicists standing by in an expensive laboratory just to get started. But the University of Chicago team's work may change all that.

They used good old-fashioned electricity, something we're pretty good at controlling, to initiate and direct quantum states in silicon carbide. That means they didn't need fancy lasers, a super-cold environment, or any of that mainframe-sized stuff to produce quantum results. This wasn't the result of a single experiment, but in fact involved two significant breakthroughs.

The first, the ability to control quantum states in silicon carbide, has the potential to solve quantum computing's exotic materials problem. Silicon carbide is plentiful and relatively easy to work with compared to the standard fare physicists use, which includes levitated atoms, laser-ready metals, and perfectly-flawed diamonds. This is cool, and could fundamentally change the direction most quantum computing research goes in 2020 and beyond. But it's the second breakthrough that might be the most exciting.

According to a press release from the University of Chicago, the team's method solves quantum computing's noise problem. Per Chris Anderson, a co-author on the team's paper:

Impurities are common in all semiconductor devices, and at the quantum level, these impurities can scramble the quantum information by creating a noisy electrical environment. This is a near-universal problem for quantum technologies.

Co-author Alexandre Bourassa added:

In our experiments we need to use lasers, which unfortunately jostle the electrons around. It's like a game of musical chairs with electrons; when the light goes out everything stops, but in a different configuration. The problem is that this random configuration of electrons affects our quantum state. But we found that applying electric fields removes the electrons from the system and makes it much more stable.

The work is still early, but it has incredible implications for the field of quantum computing. With a little tweaking, it appears that this silicon carbide-based method of wrangling quantum states could lead us to the unhackable quantum communications network sooner than many experts believed. According to the team, it would work with the existing fiber optic network that already transmits 90 percent of the world's data.

On the outside, a quantum FM radio, one that essentially sends data along frequency-modulated waves, could augment or replace existing wireless communication methods and bring about an entirely new class of technology. We're thinking something like Star Trek's tricorders, a gadget that records environmental data, processes it instantly, and uses quantum AI to analyze and interpret the results.

For more information read the Chicago team's research papers here and here.

H/t: Phys.Org

D-Wave partners with NEC to build hybrid HPC and quantum apps – TechCrunch

D-Wave Systems announced a partnership with Japanese industrial giant NEC today to build what they call hybrid apps and services that work on a combination of NEC high-performance computers and D-Wave's quantum systems.

The two companies also announced that NEC will be investing $10 million in D-Wave, which has raised $204 million prior to this, according to Crunchbase data.

D-Wave's chief product officer and EVP of R&D, Alan Baratz, whom the company announced this week will be taking over as CEO effective January 1st, says the company has been able to do a lot of business in Japan, and the size of this deal could help push the technology further. Our collaboration with global pioneer NEC is a major milestone in the pursuit of fully commercial quantum applications, he said in a statement.

The company says this is one of the earliest deals between a quantum vendor and a multinational IT company with the size and scale of NEC. The deal involves three key elements. First of all, NEC and D-Wave will come together to develop hybrid services that combine NEC's supercomputers and other classical systems with D-Wave's quantum technology. The hope is that by combining the classical and quantum systems, they can create better performance for lower cost than you could get if you tried to do similar computing on a strictly classical system.

The two companies will also work together with NEC customers to build applications that will take advantage of this hybrid approach. Also, NEC will be an authorized reseller of D-Wave cloud services.

For NEC, which claims to have demonstrated the world's first quantum bit device way back in 1999, it is about finding ways to keep advancing commercial quantum computing. Quantum computing development is critical for the future of every industry tasked with solving today's most complex problems. Hybrid applications and greater access to quantum systems is what will allow us to achieve truly commercial-grade quantum solutions, Motoo Nishihara, executive vice president and CTO at NEC Corporation, said in a statement.

This deal should help move the companies toward that goal.

Shaping the technology transforming our society | News – Fermi National Accelerator Laboratory

Technology and society are intertwined. Self-driving cars and facial recognition technologies are no longer science fiction, and data and efficiency are harbingers of this new world.

But these new technologies are only the beginning. In the coming decades, further advances in artificial intelligence and the dawn of quantum computing are poised to change lives in both discernible and inconspicuous ways.

Even everyday technology, like a smartphone app, affects people in significant ways that they might not realize, said Fermilab scientist Daniel Bowring. If there are concerns about something as familiar as an app, then we need to take more opaque and complicated technology, like AI, very seriously.

A two-day workshop took place from Oct. 31 to Nov. 1 at the University of Chicago to raise awareness and generate strategies for the ethical development and implementation of AI and quantum computing. The workshop was organized by the Chicago Quantum Exchange, a Chicago-based intellectual hub and community of researchers whose aim is to promote the exploration of quantum information technologies, and funded by the Kavli Foundation and the Center for Data and Computing, a University of Chicago center for research driven by data science and AI approaches.

Members of the Chicago Quantum Exchange engage in conversation at a workshop at the University of Chicago. Photo: Anne Ryan, University of Chicago

At the workshop, industry experts, physicists, sociologists, journalists and more gathered to learn, share insights and identify next steps as AI and quantum computing advance.

AI and quantum computing are developing tools that will affect everyone, said Bowring, a member of the workshop organizing team. It was important to us to get as many stakeholders in the room as possible.

Workshop participants listened to presentations that framed concerns such as power asymmetries, algorithmic bias and privacy before breaking out into small groups to deliberate these topics and develop actionable strategies. Groups reported to all attendees after each breakout session. On the last day of the workshop, participants considered how they would nurture the dialogue.

At one of the breakout sessions, participants discussed the balance between collaborative quantum computing research and national security. Today, the results of quantum computing research are dispersed in a wide variety of academic journals, and a lot of code is accessible and open source. However, because of its potential implications for cybersecurity and encryption, quantum computing is also of interest to national security, so it may be subject to intelligence and export controls. What endeavors, if any, should be open source or private? Are these outcomes realizable? What level of control should be maintained? How should these technologies be regulated?

We're already behind on setting ground rules for these technologies, which, if left to progress on their own, could increase power asymmetries in society, said Brian Nord, Fermilab and University of Chicago scientist and member of the workshop organizing team. Our research programs, for example, need to be crafted in a way that does not reinforce or exacerbate these asymmetries.

Workshop participants will continue the dialogue through online and in-person meetings to address key ethical and societal issues in the quantum and AI space. Potential future activities include writing proposals for joint research projects that consider ethical and societal implications, white papers addressed to academic audiences, and media editorials, as well as developing community action plans.

Organizers are planning to hold a panel next spring to engage the public, as well.

The spring event will help us continue to spread awareness and engage a variety of groups on issues of ethics in AI and quantum computing, Nord said.

The workshop was sponsored by the Kavli Foundation in partnership with the Center for Data and Computing at the University of Chicago. Artificial intelligence and quantum information science are two of six initiatives identified as special priority by the Department of Energy Office of Science.

The Kavli Foundation is dedicated to advancing science for the benefit of humanity, promoting public understanding of scientific research, and supporting scientists and their work. The foundation's mission is implemented through an international program of research institutes, initiatives and symposia in the fields of astrophysics, nanoscience, neuroscience, and theoretical physics, as well as the Kavli Prize and a program in public engagement with science. Visit kavlifoundation.org.

The Chicago Quantum Exchange catalyzes research activity across disciplines and member institutions. It is anchored by the University of Chicago, Argonne National Laboratory, Fermi National Accelerator Laboratory, and the University of Illinois at Urbana-Champaign and includes the University of Wisconsin-Madison, Northwestern University and industry partners. Visit chicagoquantum.org.

Inside the weird, wild, and wondrous world of quantum video games – Digital Trends

Image credit: IBM Research

In 1950, a man named John Bennett, an Australian employee of the now-defunct British technology firm Ferranti, created what may be history's first gaming computer. It could play a game called Nim, a long-forgotten parlor game in which players take turns removing matches from several piles. The player who loses is the one who removes the last match. For his computerized version, Bennett created a vast machine 12 feet wide, 5 feet tall, and 9 feet deep. The majority of this space was taken up by light-up vacuum tubes which depicted the virtual matches.

Bennett's aim wasn't to create a game-playing machine for the sake of it, the reason somebody might build a games PC today. As writer Tristan Donovan observed in Replay, his superlative 2010 history of video games: Despite suggesting Ferranti create a game-playing computer, Bennett's aim was not to entertain but to show off the ability of computers to do [math].

Jump forward almost 70 years and a physicist and computer scientist named Dr. James Robin Wootton is using games to demonstrate the capabilities of another new, and equally large, experimental computer. The computer in question is a quantum computer, a dream of scientists since the 1980s, now finally becoming a scientific reality.

Quantum computers encode information as delicate correlations with an incredibly rich structure. This allows for potentially mind-boggling densities of information to be stored and manipulated. Unlike a classical computer, which encodes information as a series of ones and zeroes, the bits (called qubits) in a quantum computer can be either a one, a zero, or both at the same time. These qubits are composed of subatomic particles, which conform to the rules of quantum rather than classical mechanics. They play by their own rules a little bit like Tom Cruise's character Maverick from Top Gun if he spent less time buzzing the tower and more time demonstrating properties like superpositions and entanglement.
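The article stays at the level of analogy, but the superposition and entanglement it describes can be illustrated with a tiny state-vector simulation. The NumPy sketch below is an illustration only, not IBM's hardware or any quantum SDK: it builds a two-qubit Bell state and shows that while each qubit on its own is a coin flip, the two always agree when measured.

```python
import numpy as np

# Minimal state-vector sketch of a two-qubit Bell state (illustration only;
# not the API of any real quantum SDK).
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])                # flips qubit 1 when qubit 0 is 1

state = np.array([1, 0, 0, 0], dtype=complex)  # start in |00>
state = np.kron(H, I) @ state                  # put qubit 0 into superposition
state = CNOT @ state                           # entangle the two qubits

probs = np.abs(state) ** 2                     # measurement probabilities
print(dict(zip(["00", "01", "10", "11"], probs.round(3))))
# -> {'00': 0.5, '01': 0.0, '10': 0.0, '11': 0.5}: each qubit alone is random,
#    but the outcomes '01' and '10' never occur, which is the entanglement.
```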

I met Wootton at IBM's research lab in Zurich on a rainy day in late November. Moments prior, I had squeezed into a small room with a gaggle of other excited onlookers, where we stood behind a rope and stared at one of IBM's quantum computers like people waiting to be allowed into an exclusive nightclub. I was reminded of the way that people, in John Bennett's day, talked about the technological priesthood surrounding computers: then enormous mainframes sequestered away in labyrinthine chambers, tended to by highly qualified people in white lab coats. Lacking the necessary seminary training, we quantum computer visitors could only bask in its ambience from a distance, listening in reverent silence to the weird vee-oing vee-oing vee-oing sound of its cooling system.

Wootton's interest in quantum gaming came about from exactly this scenario. In 2016, he attended a quantum computing event at the same Swiss ski resort where, in 1925, Erwin Schrödinger had worked out his famous Schrödinger wave equation while on vacation with a girlfriend. If there is a ground zero for quantum computing, this was it. Wootton was part of a consortium, sponsored by the Swiss government, to do (and help spread the word about) quantum computing.

At that time quantum computing seemed like it was something that was very far away, he told Digital Trends. Companies and universities were working on it, but it was a topic of research, rather than something that anyone on the street was likely to get their hands on. We were talking about how to address this.

Wootton has been a gamer since the early 1990s. I won a Game Boy in a competition in a wrestling magazine, he said. It was a Slush Puppy competition where you had to come up with a new flavor. My Slush Puppy flavor was called something like Rollin' Redcurrant. I'm not sure if you had to use the adjective. Maybe that's what set me apart.

While perhaps not a straight path, Wootton knew how an interest in gaming could lead people to an interest in other aspects of technology. He suggested that making games using quantum computing might be a good way of raising public awareness of the technology. He applied for support and, for the next year, was given, to my amazement, the chance to go and build an educational computer game about quantum computing. At the time, a few people warned me that this was not going to be good for my career, he said. [They told me] I should be writing papers and getting grants; not making games.

But the idea was too tantalizing to pass up.

That same year, IBM launched its Quantum Experience, an online platform granting the general public (at least those with a background in linear algebra) access to IBM's prototype quantum processors via the cloud. Combined with Project Q, a quantum SDK capable of running jobs on IBM's devices, this took care of both the hardware and software component of Wootton's project. What he needed now was a game. Wootton's first attempt at creating a quantum game for the public was a version of the game Rock-Paper-Scissors, named Cat-Box-Scissors after the famous Schrödinger's cat thought experiment. Wootton later dismissed it as [not] very good, little more than a random number generator with a story.

But others followed. There was Battleships, his crack at the first multiplayer game made with a quantum computer. There was Quantum Solitaire. There was a text-based dungeon crawler, modeled on 1973's Hunt the Wumpus, called Hunt the Quantpus. Then the messily titled, but significant, Battleships with partial NOT gates, which Wootton considers the first true quantum computer game, rather than just an experiment. And so on. As games, these don't exactly make Red Dead Redemption 2 look like yesterday's news. They're more like Atari 2600 or Commodore 64 games in their aesthetics and gameplay. Still, that's exactly what you'd expect from the embryonic phases of a new computing architecture.

If you'd like to try out a quantum game for yourself, you're best off starting with Hello Quantum, available for both iOS and Android. It reimagines the principles of quantum computing as a puzzle game in which players must flip qubits. It won't make you a quantum expert overnight, but it will help demystify the process a bit. (With every level, players can hit a learn more button for a digestible tutorial on quantum basics.)

Quantum gaming isn't just about educational outreach, though. Just as John Bennett imagined Nim as a game that would exist to show off a computer's abilities, only to unwittingly kickstart a $130 billion a year industry, so quantum games are moving beyond just teaching players lessons about quantum computing. Increasingly, Wootton is excited about what he sees as real world uses for quantum computing. One of the most promising of these is taking advantage of quantum computing's random number generation to create random terrain within computer games. In Zurich, he showed me a three-dimensional virtual landscape reminiscent of Minecraft. However, while much of the world of Minecraft is user generated, in this case the blocky, low-resolution world was generated using a quantum computer.

Quantum mechanics is known for its randomness, so the easiest possibility is just to use quantum computing as a [random number generator], Wootton said. I have a game in which I use only one qubit: the smallest quantum computer you can get. All you can do is apply operations that change the probabilities of getting a zero or one as output. I use that to determine the height of the terrain at any point in the game map.
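Wootton doesn't spell out his exact routine here, so the following is only a hedged sketch of the idea as he describes it: simulate a single qubit, vary its rotation angle with map position, and let the probability of reading 1 set the terrain height. Every function name, the choice of angle formula, and the noise-free single-qubit model are assumptions for illustration, not his implementation.

```python
import numpy as np

rng = np.random.default_rng(42)

def qubit_one_probability(theta):
    """Probability of measuring |1> after rotating |0> by angle theta (Ry rotation)."""
    # State after Ry(theta)|0> is [cos(theta/2), sin(theta/2)].
    return np.sin(theta / 2) ** 2

def terrain_height(x, y, samples=64, max_height=16):
    """Hypothetical height map: the rotation angle depends on the map position,
    and repeated single-qubit 'measurements' estimate the height."""
    theta = np.pi * (np.sin(0.3 * x) + np.cos(0.2 * y) + 2) / 4
    hits = rng.random(samples) < qubit_one_probability(theta)  # simulated shots
    return int(hits.mean() * max_height)

heightmap = [[terrain_height(x, y) for x in range(8)] for y in range(8)]
for row in heightmap:
    print(" ".join(f"{h:2d}" for h in row))
```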

Plenty of games made with classical computers have already included procedurally generated elements over the years. But as the requirements for these elements, ranging from randomly generated enemies to entire maps, increase in complexity, quantum could help.

Gaming is an industry that is very dependent on how fast things run, he continued. If there's a factor of 10 difference in how long it takes something to run, that determines whether you can actually use it in a game. He sees today as a great jumping-on point for people in the gaming industry to get involved to help shape the future development of quantum computing. It's going to be driven by what people want, he explained. If people find an interesting use-case and everyone wants to use quantum computing for a game where you have to submit a job once per frame, that will help dictate the way that the technology is made.

He's now reached the point where he thinks the race may truly be on to develop the first commercial game using a quantum computer. We've been working on these proof-of-principle projects, but now I want to work with actual game studios on actual problems that they have, he continued. That means finding out what they want and how they want the technology to be [directed].

One thing that's for certain is that Wootton is no longer alone in developing his quantum games. In the last couple of years, a number of quantum game jams have popped up around the world. What most people have done is to start small, Wootton said. They often take an existing game and use one or two qubits to help allow you to implement a quantum twist on the game mechanics. Following this mantra, enthusiasts have used quantum computing to make remixed versions of existing games, including Dr. Qubit (a quantum version of Dr. Mario), Quantum Cat-sweeper (a quantum version of Minesweeper), and Quantum Pong (a quantum version of, err, Pong).

The world of quantum gaming has moved beyond its 1950 equivalent of Nim. Now we just have to wait and see what happens next. The decades which followed Nim gave us MIT's legendary Spacewar in the 1960s, the arcade boom of the 1970s and 80s, the console wars of Sega vs. Nintendo, the arrival of the Sony PlayStation in the 1990s, and so on. In the process, classical computers became part of our lives in a way they never were before. As Whole Earth Catalog founder Stewart Brand predicted as far back as 1972, in Rolling Stone, in his classic essay on Spacewar: Ready or not, computers are coming to the people.

At present, quantum gaming's future is at a crossroads. Is it an obscure niche occupied by just a few gaming physics enthusiasts or a powerful tool that will shape tomorrow's industry? Is it something that will teach us all to appreciate the finer points of quantum physics or a tool many of us won't even realize is being used, that will nevertheless give us some dope ass games to play?

Like Schrödinger's cat, right now it's both at once. What a superposition to be in.

China is beating the US when it comes to quantum security – MIT Technology Review

It's been six years since hackers linked with China breached the US Office of Personnel Management's computer system and stole sensitive information about millions of federal employees and contractors. It was the sort of information that's collected during background checks for security clearances: very personal stuff. But not all was lost. Even though there were obviously some massive holes in the OPM's security setup, some of its data was encrypted. It was useless to the attackers.

Perhaps not for much longer. It's only a matter of time before even encrypted data is at risk. That's the view of John Prisco, CEO of Quantum Xchange, a cybersecurity firm based in Bethesda, Maryland. Speaking at the EmTech Future Compute event last week, he said that China's aggressive pursuit of quantum computing suggests it will eventually have a system capable of figuring out the key to access that data. Current encryption doesn't stand much of a chance against a quantum system tasked with breaking it.

China is moving forward with a harvest today, read tomorrow approach, said Prisco. The country wants to steal as much data as possible, even if it can't access it yet, because it's banking on a future when it finally can, he said. Prisco says China is outspending the US in quantum computing 10 times over. It's allegedly spending $10 billion alone to build the National Laboratory for Quantum Information Sciences, scheduled to open next year (although this number is disputed). America's counterpunch is just $1.2 billion over five years toward quantum information science. We're not really that safe, he said.

Part of China's massive investment has gone toward quantum security itself, including the development of quantum key distribution, or QKD. This involves sending encrypted data as classical bits (strictly binary information) over a fiber-optic network, while sending the keys used to decrypt the information in the form of qubits (which can represent more than just two states, thanks to quantum superposition). The mere act of trying to observe the key changes its state, alerting the sender and receiver of a security breach.
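As a rough illustration of why trying to observe the key reveals an intruder, here is a toy BB84-style simulation in plain Python with an intercept-and-resend eavesdropper. It is a deliberately simplified model for intuition only, not a description of the Chinese network or of any commercial QKD product.

```python
import random

random.seed(1)
N = 2000                      # raw qubits sent from Alice to Bob
EAVESDROP = True              # toggle the intercept-and-resend attacker

def measure(bit, prep_basis, meas_basis):
    """Matching bases reproduce the bit; mismatched bases give a random result."""
    return bit if prep_basis == meas_basis else random.randint(0, 1)

alice_bits  = [random.randint(0, 1) for _ in range(N)]
alice_bases = [random.choice("XZ") for _ in range(N)]
bob_bases   = [random.choice("XZ") for _ in range(N)]

bob_bits = []
for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
    prep_basis = a_basis
    if EAVESDROP:                                 # Eve measures and resends
        eve_basis = random.choice("XZ")
        bit = measure(bit, prep_basis, eve_basis)
        prep_basis = eve_basis
    bob_bits.append(measure(bit, prep_basis, b_basis))

# Sifting: keep only the positions where Alice's and Bob's bases agree.
sifted = [(a, b) for a, b, ab, bb in zip(alice_bits, bob_bits, alice_bases, bob_bases)
          if ab == bb]
error_rate = sum(a != b for a, b in sifted) / len(sifted)
print(f"sifted key length: {len(sifted)}, error rate: {error_rate:.1%}")
# ~0% without Eve; roughly 25% with Eve, which is how the breach is detected.
```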

But it has its limits. QKD requires sending information-carrying photons over incredibly long distances (tens to hundreds of miles). The best way to do this right now is by installing a fiber-optic network, a costly and time-consuming process.

It's not foolproof, either. The signals eventually scatter and break down over long stretches of fiber optics, so you need to build nodes that will continue to boost them forward. These networks are also point-to-point only (as opposed to a broadcast connection), so you can communicate with only one other party at a time.

Nevertheless, China looks to be all in on QKD networks. It's already built a 1,263-mile link between Beijing and Shanghai to deliver quantum keys. And a successful QKD demonstration by the Chinese Micius satellite was reported across the 4,700 miles between Beijing and Vienna.

Even Europe is making aggressive strides: the European Union's OPENQKD initiative calls for using a combination of fiber optics and satellites to create a QKD-safe communications network covering 13 nations. The US, Prisco argues, is incredibly far behind, for which he blames a lack of urgency. The closest thing it has is a 500-mile fiber-optic cable running down the East Coast. Quantum Xchange has inked a deal to use the cable to create a QKD network that secures data transfers for customers (most notably the financial companies based around New York City).

With Europe and China already taking QKD seriously, Prisco wants to see the US catch up, and fast. It's a lot like the space race, he said. We really can't afford to come in second place.

Update: This story has been amended to note that the funding figures for the National Laboratory for Quantum Information Sciences are disputed among some experts.

Atos Boosts Quantum Application Development Through the Creation of the First Quantum User Group – AiThority

Following on from the 6th meeting of its Quantum Scientific Council, Atos, a global leader in digital transformation, announces that it is continuing to enrich its quantum development ecosystem through the creation of a global User Group of the Atos Quantum Learning Machine (QLM), which will be chaired by a representative from French multinational energy company Total. This announcement follows the commercial success of the QLM, the world's highest-performing quantum programming appliance, which allows users for the first time to simulate quantum behaviors. This ecosystem is supported by the Atos Quantum Scientific Council, which includes universally recognized quantum physicists. It is also further enhanced by partners such as leading software company Zapata and start-up Xofia.

Just two years on from its launch in 2017, Atos QLM users continue to grow as the QLM is being used in numerous countries worldwide including Austria, France, Germany, Ireland, Mexico, the Netherlands, the UK and the United States, empowering major research programs in various sectors.

The User Group will bring together current QLM customers and their ecosystems of users from around the world, including research centers, universities and global industrial companies. It will be chaired by a representative from Total, Henri Calandra, Expert in Numerical Methods and High Performance Computing. This QLM User Group aims to drive advances in quantum programming and simulation, as well as to develop and enrich collaboration between users and share best practice and support. Feedback will be used to influence Atos QLM evolutions and further enhance the technical support that it provides its customers, paving the road towards the new world of quantum computing.

Atos is committed to enriching its quantum ecosystem and, with this, its research program in order to continue to provide researchers worldwide with the right conditions and solutions so that they can take advantage of the innovative opportunities provided by quantum computing. We have some of the world's leading scientists on our Quantum Scientific Council which, together with our rich base of QLM customers, means we are creating the most advanced quantum ecosystem, said Elie Girard, CEO of Atos. Now, with the creation of this Group of Atos QLM Users, we are ensuring that we continue to support them to develop new advances in deep learning, algorithmics and artificial intelligence with the support of the breakthrough computing acceleration capacities that quantum simulation provides.

As President of this new User Group, Total is involved in the advancement of quantum research, together with Atos. Quantum simulation enables us to explore new ways of solving complex problems, improve performance and drive significant technological advances to prepare the future of low carbon energy. This contributes to realizing Total's ambition: to become the responsible energy major, said Marie-Noelle Semeria, Senior Vice President, Group CTO at Total.

The Quantum Scientific Council is made up of universally recognized quantum physicists, including Nobel prize laureate in Physics, Serge Haroche; Research Director, CEA Saclay, and Head of Quantronics, Daniel Estève; professor at the Institut d'Optique and Ecole Polytechnique, Alain Aspect; Alexander von Humboldt Professor, Director of the Institute for Theoretical Nanoelectronics at the Juelich Research Center, David DiVincenzo; and Professor of Quantum Physics at the Mathematical Institute, University of Oxford and Singapore, Artur Ekert.

Atos' ambitious program to anticipate the future of quantum computing, and to be prepared for the opportunities as well as the risks that come with it, the Atos Quantum program, was launched in November 2016. As a result of this initiative, Atos was the first organization to offer a quantum noisy simulation module, the Atos QLM. Earlier this year, it launched myQLM, a free tool that allows a broader ecosystem to get acquainted with quantum programming and discover some features of the Atos QLM.

Quantum computing should make it possible, in the years to come, to deal with the explosion of data, which Big Data and the Internet of Things bring about. With its targeted and unprecedented compute acceleration capabilities, notably based on the exascale-class supercomputer BullSequana, quantum computing should also promote advances in deep learning, algorithmics and artificial intelligence for areas as various as pharmaceuticals or new materials.

Quantum Computers Are the Ultimate Paper Tiger – The National Interest Online

Google announced this fall to much fanfare that it had demonstrated quantum supremacy, that is, it performed a specific quantum computation far faster than the best classical computers could achieve. IBM promptly critiqued the claim, saying that its own classical supercomputer could perform the computation at nearly the same speed with far greater fidelity and, therefore, the Google announcement should be taken with a large dose of skepticism.

This wasn't the first time someone cast doubt on quantum computing. Last year, Michel Dyakonov, a theoretical physicist at the University of Montpellier in France, offered a slew of technical reasons why practical quantum supercomputers will never be built in an article in IEEE Spectrum, the flagship journal of electrical and computer engineering.

So how can you make sense of what is going on?

As someone who has worked on quantum computing for many years, I believe that due to the inevitability of random errors in the hardware, useful quantum computers are unlikely to ever be built.

What's a quantum computer?

To understand why, you need to understand how quantum computers work since they're fundamentally different from classical computers.

A classical computer uses 0s and 1s to store data. These numbers could be voltages on different points in a circuit. But a quantum computer works on quantum bits, also known as qubits. You can picture them as waves that are associated with amplitude and phase.

Qubits have special properties: They can exist in superposition, where they are both 0 and 1 at the same time, and they may be entangled so they share physical properties even though they may be separated by large distances. It's a behavior that does not exist in the world of classical physics. The superposition vanishes when the experimenter interacts with the quantum state.

Due to superposition, a quantum computer with 100 qubits can represent 2^100 solutions simultaneously. For certain problems, this exponential parallelism can be harnessed to create a tremendous speed advantage. Some code-breaking problems could be solved exponentially faster on a quantum machine, for example.
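To make the 2^100 figure concrete, a quick back-of-the-envelope calculation (assuming a naive state-vector simulation with one 16-byte complex amplitude per basis state) shows why classical machines cannot even store, let alone emulate, such a device:

```python
# Memory needed to store the full state vector of an n-qubit register,
# assuming one 128-bit complex amplitude per basis state (16 bytes each).
for n in (10, 30, 50, 100):
    amplitudes = 2 ** n
    bytes_needed = amplitudes * 16
    print(f"{n:3d} qubits -> 2^{n} amplitudes, about {bytes_needed / 2**30:.3g} GiB")
# 10 qubits fit in kilobytes; 50 qubits already need petabytes;
# 100 qubits would need far more memory than exists on Earth.
```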

There is another, narrower approach to quantum computing called quantum annealing, where qubits are used to speed up optimization problems. D-Wave Systems, based in Canada, has built optimization systems that use qubits for this purpose, but critics also claim that these systems are no better than classical computers.

Regardless, companies and countries are investing massive amounts of money in quantum computing. China has developed a new quantum research facility worth US$10 billion, while the European Union has developed a €1 billion ($1.1 billion) quantum master plan. The United States National Quantum Initiative Act provides $1.2 billion to promote quantum information science over a five-year period.

Breaking encryption algorithms is a powerful motivating factor for many countries: if they could do it successfully, it would give them an enormous intelligence advantage. But these investments are also promoting fundamental research in physics.

Many companies are pushing to build quantum computers, including Intel and Microsoft in addition to Google and IBM. These companies are trying to build hardware that replicates the circuit model of classical computers. However, current experimental systems have less than 100 qubits. To achieve useful computational performance, you probably need machines with hundreds of thousands of qubits.

Noise and error correction

The mathematics that underpins quantum algorithms is well established, but there are daunting engineering challenges that remain.

For computers to function properly, they must correct all small random errors. In a quantum computer, such errors arise from the non-ideal circuit elements and the interaction of the qubits with the environment around them. For these reasons the qubits can lose coherency in a fraction of a second and, therefore, the computation must be completed in even less time. If random errors, which are inevitable in any physical system, are not corrected, the computer's results will be worthless.

In classical computers, small noise is corrected by taking advantage of a concept known as thresholding. It works like the rounding of numbers. Thus, in the transmission of integers where it is known that the error is less than 0.5, if what is received is 3.45, the received value can be corrected to 3.

Further errors can be corrected by introducing redundancy. Thus if 0 and 1 are transmitted as 000 and 111, then at most one bit-error during transmission can be corrected easily: a received 001 would be interpreted as 0, and a received 101 would be interpreted as 1.
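A short sketch of the classical repetition code just described, with majority-vote decoding, shows how redundancy suppresses errors; the 5% channel error rate here is an arbitrary choice for illustration.

```python
import random

random.seed(0)
FLIP = 0.05          # per-bit flip probability of the noisy channel
TRIALS = 100_000

def noisy(bits):
    """Flip each transmitted bit independently with probability FLIP."""
    return [b ^ (random.random() < FLIP) for b in bits]

def decode(codeword):
    """Majority vote over the 3-bit repetition code: corrects any single flip."""
    return int(sum(codeword) >= 2)

raw_errors = coded_errors = 0
for _ in range(TRIALS):
    bit = random.randint(0, 1)
    raw_errors += noisy([bit])[0] != bit              # send the bit unprotected
    coded_errors += decode(noisy([bit] * 3)) != bit   # send it as 000 / 111
print(f"unprotected error rate:      {raw_errors / TRIALS:.3%}")
print(f"repetition-code error rate:  {coded_errors / TRIALS:.3%}")
# Roughly 5% drops to about 3*p^2, around 0.7%, since two of the three
# copies must flip before the majority vote gets it wrong.
```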

Quantum error correction codes are a generalization of the classical ones, but there are crucial differences. For one, the unknown qubits cannot be copied to incorporate redundancy as an error correction technique. Furthermore, errors present within the incoming data before the error-correction coding is introduced cannot be corrected.

Quantum cryptography

While the problem of noise is a serious challenge in the implementation of quantum computers, it isn't so in quantum cryptography, where people are dealing with single qubits, for single qubits can remain isolated from the environment for a significant amount of time. Using quantum cryptography, two users can exchange the very large numbers known as keys, which secure data, without anyone being able to break the key exchange system. Such key exchange could help secure communications between satellites and naval ships. But the actual encryption algorithm used after the key is exchanged remains classical, and therefore the encryption is theoretically no stronger than classical methods.

Quantum cryptography is being commercially used in a limited sense for high-value banking transactions. But because the two parties must be authenticated using classical protocols, and since a chain is only as strong as its weakest link, it's not that different from existing systems. Banks are still using a classical-based authentication process, which itself could be used to exchange keys without loss of overall security.

Quantum cryptography technology must shift its focus to quantum transmission of information if it's going to become significantly more secure than existing cryptography techniques.

Commercial-scale quantum computing challenges

While quantum cryptography holds some promise if the problems of quantum transmission can be solved, I doubt the same holds true for generalized quantum computing. Error-correction, which is fundamental to a multi-purpose computer, is such a significant challenge in quantum computers that I don't believe they'll ever be built at a commercial scale.

Subhash Kak, Regents Professor of Electrical and Computer Engineering, Oklahoma State University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Image: Reuters

Quantum supremacy is here, but smart data will have the biggest impact – Quantaneo, the Quantum Computing Source

Making fast and powerful quantum computing available through the cloud can enable tasks to be processed millions of times faster, and could shape lives and businesses as we know them. For example, applications using quantum computing could reduce or prevent traffic congestion, cybercrimes, and cancer. However, reaching the quantum supremacy landmark doesn't mean that Google can take its foot off the gas. Rather, the company has thrown down the gauntlet and the race to commercialize quantum computing is on. Delivering this killer technology is still an uphill battle to harness the power of highly fickle machines and move around quantum bits of information, which is inherently error-prone.

To deliver quantum cloud services, whether for commercial or academic research, Google must tie together units of quantum information (qubits) and wire data, which is part of every action and transaction across the entire IT infrastructure. If quantum cloud services get to the big league, they will still rely on traffic flows based on wire data to deliver value to users. This raises a conundrum for IT and security professionals who must assure services and deliver a flawless user experience. On one hand, the quantum cloud service solves a million computations in parallel and in real time. On the other hand, the results are delivered through wire data across a cloud, SD-WAN, or 5G network. It does not matter if a quantum computer today or tomorrow can crank out an answer 100 million times faster than a regular computer chip if an application that depends on it experiences performance problems, or if a threat actor is lurking in your on-premises data centre or has penetrated the IT infrastructure's first and last lines of defence.

No matter what the quantum computing world will look like in the future, IT teams such as NetOps and SecOps will still need to use wire data to gain end-to-end visibility into their on-premises data centres and cloud environment. Wire data is used to fill the visibility gap and see what others can't; to gain actionable intelligence to detect cyber-attacks or quickly solve service degradations. Quantum computing may increase speed, but it also adds a new dimension of infrastructure complexity and the potential for something breaking anywhere along the service delivery path. With that said, reducing risk therefore requires removing service delivery blind spots. A proven way to do that is by turning wire data into smart data to cut through infrastructure complexity and gain visibility without borders. When that happens, the IT organization will fully understand with precise accuracy the issues impacting service performance and security.

In the rush to embrace quantum computing, wire data therefore cannot, and should not, be ignored. Wire data can be turned into contextual, useful smart data. With a smart data platform, the IT organization can help make quantum computing a success by protecting user experience across different industries including automotive, manufacturing and healthcare. Therefore, while Google is striving for high quality qubits and blazing new quantum supremacy trails, success ultimately relies on using smart data for service assurance and security in an age of infinite devices, cloud applications and exponential scalability.

Ron Lifton, Senior Enterprise Solutions Manager, NETSCOUT

InfoQ’s 2019, and Software Predictions for 2020 – InfoQ.com

Looking back, 2019 saw some significant announcements in quantum computing. In May, IBM published a paper in Nature that suggested they may have found a path to dealing with decoherence in current quantum computers. Writing for InfoQ, Sergio De Simone pointed out:

"The main issue with decoherence is the fast decay of a wave function, which has the undesirable effect of generating noise and errors after a very short time period. The paper proposes two approaches, one called probabilistic error correction and the other zero noise extrapolation, to keep decoherence under control."

In September, Google announced, also via a paper in Nature, it had built a machine that achieved quantum supremacy - the point at which a quantum computer can solve problems which classical computers practically cannot. The claim was disputed by IBM, and the practical application of Google's achievement is still limited, but both these announcements demonstrate real progress in the field.

Also significant - the news that Microsoft open-sourced Q#, its Language for Quantum Computing.

A surprise this year was the decline of interest in Virtual Reality, at least in the context of Smart-phone-based VR. Sergio notes:

"Google's decision to stop supporting its Daydream VR headset seemingly marks the end of phone-based virtual reality, a vision that attempted to combine the use of smartphones with 'dumb' VR headsets to bring VR experiences to the masses. Google's decision is accompanied by the BBC disbanding its VR content team after two years of successful experimentation."

JavaScript, Java, and C# remain the most popular languages we cover, but we're also seeing strong interest in Rust, Swift, and Go, and our podcast with Bryan Cantrill on "Rust and Why He Feels It's the Biggest Change in Systems Development in His Career" is one of the top-performing podcasts we've published this year. We've also seen a growing interest in Python this year, probably fuelled by its popularity for machine learning tasks.

After a rather turbulent 2018, Java seems to be settling into its bi-annual release cycle. According to our most-recent reader survey, Java is the most used language amongst InfoQ readers, and there continues to be a huge amount of interest in the newer language features and how the language is evolving. We also continue to see strong and growing interest in Kotlin.

It has been interesting to see Microsoft's growing involvement in Java, joining the OpenJDK, acquiring JClarity, and hiring other well known figures including Monica Beckwith.

Our podcast with Rod Johnson in which he chats about the early days of the Spring Framework, Languages Post-Java, & Rethinking CI/CD was another of our top-performing podcasts this year.

Matt Raible's JHipster book, now in its 5th version, was one of our most-downloaded books of the year.

In the Java programming language trends report, we noted increased adoption of non-HotSpot JVMs, and we believe OpenJ9 is now within the early-adopter stage. At the time we noted that:

"We believe that the increasing adoption of cloud technologies within all types of organisation is driving the requirements for JREs that embrace associated "cloud-native" principles such as fast start-up times and a low memory footprint. Graal in itself may not be overly interesting, but the ability to compile Java application to native binaries, in combination with the support of polyglot languages, is ensuring that we keep a close watch on this project."

Since the report came out, we feel that the GraalVM has demonstrated significant potential, and will continue to watch its progress with interest.

Our top performing content for Java this year included:

Although it didn't quite make it in the top five list, an honourable mention should go to Brian Goetz's fantastic "Java Feature Spotlight" article "Local Variable Type Inference."

The release of .NET Core 3 in September generated a huge buzz on InfoQ and produced some of our most-popular .NET content of the year. WebAssembly has been another area of intense interest, and we saw a corresponding surge in interest for Blazor, a new framework in ASP.NET Core that allows developers to create interactive web applications using C# and HTML. Blazor comes in multiple editions, including Blazor WebAssembly which allows single-page applications to run in the client's web browser using a WebAssembly-based .NET runtime.

According to our most-recent reader survey, C# is the second-most widely used language among InfoQ readers after Java, and interest in C#8 in particular was also strong.

Our top performing .NET content included:

Jonathan Allen's piece in the list is part of an excellent series of articles and news posts that he wrote for InfoQ during 2019. Others included:

Unsurprisingly, the majority of InfoQ readers write at least some JavaScript - around 70% according to the most recent reader survey - making it the most widely used language among our readers. The dominant JavaScript frameworks for InfoQ readers seem to currently be Vue and React. We also saw interest in using Javascript for machine learning via TensorFlow.js. Away from JavaScript, we saw strong interest in some of the transpiler options. In addition to Blazor, mentioned above, we saw strong interest in Web Assembly, Typescript, Elm and Svelte.

Top-performing content included:

It's unsurprising that distributed computing, and in particular the microservices architecture style, remains a huge part of our news and feature content. We see strong interest in related topics, with our original Domain Driven Design Quickly book, and our more-recent eMag "Domain-Driven Design in Practice" continuing to perform particularly well, and interest in topics like observability and distributed tracing. We also saw interest in methods of testing distributed systems, including a strong performance from our Chaos Engineering eMag, and a resurgence in reader interest in some of the core architectural topics such as API design, diagrams, patterns, and models.

Our top performing architecture content was:

Our podcast with Grady Booch on today's Artificial Intelligence reality and what it means for developers was one of our most popular podcasts of the year, and revealed strong interest in the topic from InfoQ readers.

Key AI stories in 2019 were MIT introducing GEN, a Julia-based language for artificial intelligence, Google's ongoing work on ML Kit, and discussions around conversational interfaces, as well as more established topics such as streaming.

It's slightly orthogonal to the rest of the pieces listed here, but we should also mention "Postgres Handles More Than You Think" by Jason Skowronski which performed amazingly well.

Our top-performing content for AI and ML was:

If there was an overarching theme to our culture and methods coverage this year it might best be summed up as "agile done wrong" and many of our items focused on issues with agile, and/or going back to the principles outlined in the Agile Manifesto.

We also saw continued interest in some of the big agile methodologies, notably Scrum, with both "Scrum and XP from the Trenches", and "Kanban and Scrum - Making the Most of Both" performing well in our books department.

We also saw strong reader interest in remote working, with Judy Rees' eMag on "Mastering Remote Meetings", and her corresponding podcast, performing well, alongside my own talk on "Working Remotely and Managing Remote Teams" from Aginext this year.

Our most popular published content for Culture and Methods was:

Our top performing culture podcasts were:

In our DevOps and Cloud trends report, we noted that Kubernetes has effectively cornered the market for container orchestration, and is arguably becoming the cloud-agnostic compute abstraction. The next "hot topics" in this space appear to be "service meshes" and developer experience/workflow tooling. We continue to see strong interest in all of these among InfoQ's readers.

A trend we're also starting to note is a number of languages which are either infrastructure- or cloud-orientated. In our Programming Languages trends report, we noted increased interest and innovation related to infrastructure-aware or cloud-specific languages, DSLs, and SDKs like Ballerina and Pulumi. In this context we should also mention Dark, a new language currently still in private beta, but already attracting a lot of interest. Somewhat related, we should also mention the Ecstasy language, co-created by Tangosol founders Cameron Purdy and Gene Gleyzer. Chris Swan, CTO for Global Delivery at DXC Technology, spoke to Cameron Purdy about the language and the problems it's designed to solve.

In the eMags department "Kubernetes: Past, Present and Future", and "DevSecOps in Practice" were among our top performers:

Making predictions in software is notoriously hard to do, but we expect to see enterprise development teams consolidate their cloud-platform choices as Kubernetes adoption continues. Mostly this will be focussed on the "big five" cloud providers - Amazon, Google, IBM (plus Red Hat), Microsoft, and VMware (plus Pivotal). We think that, outside China, Alibaba will struggle to gain traction, as will Oracle, Salesforce, and SAP.

In the platform/operations space we're expecting that service meshes will become more integrated with the underlying orchestration frameworks (e.g. Kubernetes). We're also hopeful that the developer workflow for interacting with service meshes becomes more integrated with current workflows, technologies, and pipelines.

Ultimately developers should be able to control deploy, release, and debugging via the same continuous/progressive delivery pipeline. For example, using a "GitOps" style pipeline to deploy a service by configuring k8s YAML (or some higher-level abstraction), controlling the release of the new functionality using techniques like canarying or shadowing via the configuration of some traffic management k8s custom resource definition (CRD) YAML, and enabling additional logging or debug tooling via some additional CRD config.

In regards to architecture, next year will hopefully be the year of "managing complexity". Architectural patterns such as microservices and functions-as-a-service have enabled developers to better separate concerns, implement variable rates of change via independent isolated deployments, and ultimately work more effectively at scale. However, our ability to comprehend the complex distributed systems we are now building, along with the availability of related tooling, has not kept pace with these developments. We're looking forward to seeing what the open source community and vendors are working on in the understandability, observability, and debuggability space.

We expect to see more developers experimenting with "low code" platforms. This is partly fueled by a renewed push from Microsoft for its PowerApps, Flow, Power BI, and Power Platform products.

In the .NET ecosystem, we believe that Blazor will keep gaining momentum among web developers. .NET 5 should also bring significant changes to the ecosystem with the promised interoperability with Java, Objective-C, and Swift. Although it is early to say, Microsoft's recent efforts on IoT and AI (with ML.NET) should also help to raise the interest in .NET development. Relatedly, we expect to see the interest in WebAssembly continue and hope that the tooling here will start to mature.

Despite the negative news around VR this year, we still think that something in the AR/VR space, or some other form of alternative computer/human interaction, is likely to come on the market in the next few years and gain significant traction, though it does seem that the form factor for this hasn't really arrived.

Charles Humble took over as editor-in-chief at InfoQ.com in March 2014, guiding our content creation including news, articles, books, video presentations and interviews. Prior to taking on the full-time role at InfoQ, Charles led our Java coverage, and was CTO for PRPi Consulting, a remuneration research firm that was acquired by PwC in July 2012. For PRPi he had overall responsibility for the development of all the custom software used within the company. He has worked in enterprise software for around 20 years as a developer, architect and development manager. In his spare time he writes music as 1/3 of London-based ambient techno group Twofish, whose debut album came out in February 2014 after 14 years of messing about with expensive toys, and spends as much time as he can with his wife and young family.

Erik Costlow is a software security expert with extensive Java experience. He manages developer relations for Contrast Security and its public Community Edition. Contrast weaves sensors into applications, giving them the ability to detect security threats based on how the application uses its data. Erik was the principal product manager in Oracle focused on security of Java 8, joining at the height of hacks and departing after a two-year absence of zero-day vulnerabilities. During that time, he learned the details of Java at both a corporate/commercial and community level. He also assisted Turbonomic's product management team to achieve $100M annual revenue in data center/cloud performance automation. Erik also led product management for Fortify static code analyzer, a tool that helps developers find and fix vulnerabilities in custom source code. Erik has also published several developer courses through Packt Publishing on data analysis, statistics, and cryptography.

Arthur Casals is a Computer Science researcher working in the area of Artificial Intelligence / Multi-agent Systems. He has been developing software for 20+ years, in different markets/industries. Arthur has also assumed different roles in the past: startup founder, CTO, tech manager, software engineer. He holds a B.Sc. degree in Computer Engineering and an MBA degree.

Daniel Bryant works as an Independent Technical Consultant and Product Architect at Datawire. His technical expertise focuses on 'DevOps' tooling, cloud/container platforms, and microservice implementations. Daniel is a Java Champion, and contributes to several open source projects. He also writes for InfoQ, O'Reilly, and TheNewStack, and regularly presents at international conferences such as OSCON, QCon and JavaOne. In his copious amounts of free time he enjoys running, reading and traveling.

Bruno Couriol holds an MSc in Telecommunications, a BSc in Mathematics and an MBA from INSEAD. Most of his career has been spent as a business consultant, helping large companies address their critical strategic, organizational and technical issues. In the last few years, he developed a focus on the intersection of business, technology and entrepreneurship.

Ben Linders is an Independent Consultant in Agile, Lean, Quality and Continuous Improvement, based in The Netherlands. Author of Getting Value out of Agile Retrospectives, Waardevolle Agile Retrospectives, What Drives Quality, The Agile Self-assessment Game and Continuous Improvement. As an adviser, coach and trainer he helps organizations by deploying effective software development and management practices. He focuses on continuous improvement, collaboration and communication, and professional development, to deliver business value to customers. Ben is an active member of networks on Agile, Lean and Quality, and a frequent speaker and writer. He shares his experience in a bilingual blog (Dutch and English) and as an editor for Agile at InfoQ. Follow him on Twitter: @BenLinders.

Shane Hastie is the Director of Community Development for ICAgile, a global accreditation and certification body dedicated to improving the state of agile learning. Since first using XP in 2000 Shane's been passionate about helping organizations and teams adopt sustainable, humanistic ways of working irrespective of the brand or label they go by. Shane was a Director of the Agile Alliance from 2011 until 2016. Shane leads the Culture and Methods editorial team for InfoQ.com

Breakthrough in creation of gamma ray lasers that use antimatter – Big Think

Scientists are closer to taming the most powerful light in the Universe. A physicist at the University of California has figured out how to make stable positronium atoms, which may lead to the creation of gamma ray lasers.

Gamma rays are the product of electromagnetic radiation that is caused by the radioactive decay of atomic nuclei. Harnessing these extremely bright (and usually very brief) lights, which have the highest photon energy, could lead to next-generation technologies. The highly penetrating gamma rays are shorter in wavelength than x-rays, and can be utilized for spacecraft propulsion, advanced medical imaging and treating cancers.

Creating a gamma ray laser requires manipulating positronium, a hydrogen-like atom that is a mixture of matter and antimatter, in particular of electrons and their antiparticles, known as positrons. The collision of a positron with an electron results in the production of gamma ray photons.
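For a sense of scale (a standard textbook estimate, not a figure from the new study): when an electron and a positron annihilate at rest, each of the two resulting photons carries the particles' rest-mass energy of about 511 keV, which is firmly in the gamma-ray band.

```python
# Back-of-the-envelope energy of annihilation photons, using standard constants.
m_e = 9.109_383_7e-31      # electron (and positron) rest mass, kg
c   = 2.997_924_58e8       # speed of light, m/s
eV  = 1.602_176_634e-19    # joules per electronvolt

rest_energy_joules = m_e * c ** 2
rest_energy_keV = rest_energy_joules / eV / 1e3
print(f"each annihilation photon carries about {rest_energy_keV:.0f} keV")  # ~511 keV
```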

To make gamma-ray laser beams, the positronium atoms need to be in the same quantum state, called a Bose-Einstein condensate. The new study from Professor Allen Mills of the UC Riverside Department of Physics and Astronomy shows that hollow spherical bubbles filled with a positronium atom gas can be kept stable in liquid helium.

"My calculations show that a bubble in liquid helium containing a million atoms of positronium would have a number density six times that of ordinary air and would exist as a matter-antimatter Bose-Einstein condensate," said Mills.

Mills thinks helium would work as the stabilizing container because at extremely low temperatures, the gas would turn to liquid and actually repel positronium. This results from its negative affinity for positronium and would cause bubbles to be created, which would be the source of the necessary Bose-Einstein condensates.

Testing these ideas and actually configuring an antimatter beam to produce such bubbles in liquid helium is the next goal for the Positron laboratory at UC Riverside that Mills directs.

"Near term results of our experiments could be the observation of positronium tunneling through a graphene sheet, which is impervious to all ordinary matter atoms, including helium, as well as the formation of a positronium atom laser beam with possible quantum computing applications," explained the physicist.

Check out the new study in Physical Review A.

Professor Allen Mills of the UC Riverside Department of Physics and Astronomy.

Credit: I. Pittalwala, UC Riverside.
