What Is Quantum Computing and How Does it Work? – Built In

Accustomed to imagining worst-case scenarios, many cryptography experts are more concerned than usual these days: one of the most widely used schemes for safely transmitting data is poised to become obsolete once quantum computing reaches a sufficiently advanced state.

The cryptosystem known as RSA provides the safety structure for a host of privacy and communication protocols, from email to internet retail transactions. Current standards rely on the fact that no one has the computing power to test every possible way to de-scramble your data once encrypted, but a mature quantum computer could try every option within a matter of hours.
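
To make the stakes concrete, here is a minimal, purely illustrative sketch of RSA using toy numbers (the primes, exponent and message are made up, and real keys use moduli thousands of bits long). It shows why the scheme stands or falls with the difficulty of factoring the public modulus n: anyone who can factor n can rebuild the private key.

```python
# Toy RSA with tiny primes, to show why factoring n breaks the scheme.
# Real deployments use ~2048-bit moduli; these numbers are illustration only.
p, q = 61, 53                  # secret primes (an attacker only sees n)
n = p * q                      # public modulus: 3233
phi = (p - 1) * (q - 1)        # Euler's totient; computing it requires p and q
e = 17                         # public exponent, coprime with phi
d = pow(e, -1, phi)            # private exponent: e*d = 1 (mod phi)

message = 42
ciphertext = pow(message, e, n)          # encrypt with the public key (e, n)
assert pow(ciphertext, d, n) == message  # decrypt with the private key d

# An attacker who can factor n recovers p and q, and with them d.
# Brute-force trial division stands in here for Shor's algorithm.
factor = next(k for k in range(2, n) if n % k == 0)
d_recovered = pow(e, -1, (factor - 1) * (n // factor - 1))
assert pow(ciphertext, d_recovered, n) == message
```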

It should be stressed that quantum computers haven't yet hit that level of maturity, and won't for some time. But when a large, stable device is built (or if it's built, as an increasingly diminishing minority argue), its unprecedented ability to factor large numbers would essentially leave the RSA cryptosystem in tatters. Thankfully, the technology is still a ways away, and the experts are on it.

"Don't panic." That's what Mike Brown, CTO and co-founder of quantum-focused cryptography company ISARA Corporation, advises anxious prospective clients. The threat is far from imminent. "What we hear from the academic community and from companies like IBM and Microsoft is that a 2026-to-2030 timeframe is what we typically use from a planning perspective in terms of getting systems ready," he said.

Cryptographers from ISARA are among several contingents currently taking part in the Post-Quantum Cryptography Standardization project, a contest of quantum-resistant encryption schemes. The aim is to standardize algorithms that can resist attacks levied by large-scale quantum computers. The competition was launched in 2016 by the National Institute of Standards and Technology (NIST), a federal agency that helps establish tech and science guidelines, and is now gearing up for its third round.

Indeed, the level of complexity and stability required of a quantum computer to launch the much-discussed RSA attack is extremely high, according to John Donohue, scientific outreach manager at the University of Waterloo's Institute for Quantum Computing. "Even granting that timelines in quantum computing, particularly in terms of scalability, are points of contention, the community is pretty comfortable saying that's not something that's going to happen in the next five to 10 years," he said.

When Google announced that it had achieved quantum supremacy, meaning it had used a quantum computer to run, in minutes, an operation that would take thousands of years to complete on a classical supercomputer, that machine operated on 54 qubits, the computational bedrocks of quantum computing. While IBM's 53-qubit Q system operates at a similar level, many current prototypes operate on as few as 20 or even five qubits.

But how many qubits would be needed to crack RSA? "Probably on the scale of millions of error-tolerant qubits," Donohue told Built In.

Scott Aaronson, a computer scientist at the University of Texas at Austin, underscored the same point last year in his popular blog after presidential candidate Andrew Yang tweeted that "no code is uncrackable" in the wake of Google's proof-of-concept milestone.

That's the good news. The bad news is that, while cryptography experts gain more time to keep our data secure from quantum computers, the technology's numerous potential upsides, ranging from drug discovery to materials science to financial modeling, are also largely forestalled. And that question of error tolerance continues to stand as quantum computing's central, Herculean challenge. But before we wrestle with that, let's get a better elemental sense of the technology.

Quantum computers process information in a fundamentally different way than classical computers. Traditional computers operate on binary bits: information processed in the form of ones or zeroes. But quantum computers transmit information via quantum bits, or qubits, which can exist either as one or zero or both simultaneously. That's a simplification, and we'll explore some nuances below, but that capacity, known as superposition, lies at the heart of quantum's potential for exponentially greater computational power.
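
For readers who prefer code to metaphor, a single qubit can be written as a two-component vector of complex amplitudes. The sketch below, a hand-rolled example in plain NumPy rather than any vendor's toolkit, puts a qubit into an equal superposition and reads off the measurement probabilities via the Born rule.

```python
# A single qubit as a 2-component state vector (NumPy only).
import numpy as np

ket0 = np.array([1, 0], dtype=complex)               # the "zero" basis state
ket1 = np.array([0, 1], dtype=complex)               # the "one" basis state

H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard gate

psi = H @ ket0                       # equal superposition (|0> + |1>) / sqrt(2)
probabilities = np.abs(psi) ** 2     # Born rule: |amplitude|^2
print(probabilities)                 # [0.5 0.5] -- half the time 0, half the time 1
```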

Such fundamental complexity both cries out for and resists succinct laymanization. When the New York Times asked 10 experts to explain quantum computing in the length of a tweet, some responses raised more questions than they answered:

Microsoft researcher David Reilly:

A quantum machine is a kind of analog calculator that computes by encoding information in the ephemeral waves that comprise light and matter at the nanoscale.

D-Wave Systems executive vice president Alan Baratz:

If we're honest, everything we currently know about quantum mechanics can't fully describe how a quantum computer works.

Quantum computing also cries out for a digestible metaphor. Quantum physicist Shohini Ghose, of Wilfrid Laurier University, has likened the difference between quantum and classical computing to light bulbs and candles: "The light bulb isn't just a better candle; it's something completely different."

Rebecca Krauthamer, CEO of quantum computing consultancy Quantum Thought, compares quantum computing to a crossroads that allows a traveler to take both paths. "If you're trying to solve a maze, you'd come to your first gate, and you can go either right or left," she said. "We have to choose one, but a quantum computer doesn't have to choose one. It can go right and left at the same time."

"It can, in a sense, look at these different options simultaneously and then instantly find the most optimal path," she said. "That's really powerful."

The most commonly used example of quantum superposition is Schrödinger's cat, the thought-experiment feline that is simultaneously alive and dead inside its sealed box until someone looks.

Despite its ubiquity, many in the QC field aren't so taken with Schrödinger's cat. "The more interesting fact about superposition, rather than the two-things-at-once point of focus, is the ability to look at quantum states in multiple ways, and ask it different questions," said Donohue. That is, rather than having to perform tasks sequentially, like a traditional computer, quantum computers can run vast numbers of parallel computations.

Part of Donohue's professional charge is clarifying quantum's nuances, so it's worth quoting him here at length:

"In superposition I can have state A and state B. I can ask my quantum state, 'Are you A or B?' And it will tell me, 'I'm A,' or, 'I'm B.' But I might have a superposition of A + B, in which case, when I ask it, 'Are you A or B?' it'll tell me A or B randomly.

"But the key of superposition is that I can also ask the question, 'Are you in the superposition state of A + B?' And then in that case, they'll tell me, 'Yes, I am the superposition state A + B.'

"But there's always going to be an opposite superposition. So if it's A + B, the opposite superposition is A - B."

That's about as simplified as we can get before trotting out equations. But the top-line takeaway is that superposition is what lets a quantum computer try all paths at once.
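
Donohue's "different questions" can also be phrased in a few lines of linear algebra. In the sketch below (plain NumPy, with A and B standing in for the basis states), the superposition A + B answers randomly when asked "A or B?" but answers deterministically when asked "A + B or A - B?".

```python
# Measuring the same superposition in two different bases (NumPy only).
import numpy as np

ketA = np.array([1, 0], dtype=complex)          # state A
ketB = np.array([0, 1], dtype=complex)          # state B
plus = (ketA + ketB) / np.sqrt(2)               # the superposition A + B
minus = (ketA - ketB) / np.sqrt(2)              # the "opposite" superposition A - B

def outcome_probabilities(state, basis):
    """Probability of each outcome when measuring `state` in `basis`."""
    return [abs(np.vdot(b, state)) ** 2 for b in basis]

print(outcome_probabilities(plus, [ketA, ketB]))   # [0.5, 0.5] -> random A or B
print(outcome_probabilities(plus, [plus, minus]))  # [1.0, 0.0] -> definitely A + B
```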

That's not to say that such unprecedented computational heft will displace or render moot classical computers. "One thing that we can really agree on in the community is that it won't solve every type of problem that we run into," said Krauthamer.

But quantum computing is particularly well suited for certain kinds of challenges. Those include probability problems, optimization (what is, say, the best possible travel route?) and the incredible challenge of molecular simulation for use cases like drug development and materials discovery.

The cocktail of hype and complexity has a way of fuzzing outsiders' conception of quantum computing, which makes this point worth underlining: quantum computers exist, and they are being used right now.

They are not, however, presently solving climate change, turbocharging financial forecasting probabilities or performing other similarly lofty tasks that get bandied about in reference to quantum computing's potential. QC may have commercial applications related to those challenges, which we'll explore further below, but that's well down the road.

Today, we're still in what's known as the NISQ era: Noisy, Intermediate-Scale Quantum. In a nutshell, quantum noise makes such computers incredibly difficult to stabilize. As such, NISQ computers can't be trusted to make decisions of major commercial consequence, which means they're currently used primarily for research and education.

"The technology just isn't quite there yet to provide a computational advantage over what could be done with other methods of computation at the moment," said Donohue. Most [commercial] interest is from a long-term perspective. "[Companies] are getting used to the technology so that when it does catch up (and that timeline is a subject of fierce debate) they're ready for it."

Also, it's fun to sit next to the cool kids. "Let's be frank. It's good PR for them, too," said Donohue.

But NISQ computers' R&D practicality is demonstrable, if decidedly small-scale. Donohue cites the molecular modeling of lithium hydride. That's a small enough molecule that it can also be simulated using a supercomputer, but the quantum simulation provides an important opportunity to check answers against the classical result. NISQ machines have also delivered some results for problems in high-energy particle physics, Donohue noted.

One breakthrough came in 2017, when researchers at IBM modeled beryllium hydride, the largest molecule simulated on a quantum computer at that time. Another key step arrived in 2019, when IonQ researchers used quantum computing to go bigger still, by simulating a water molecule.

"These are generally still small problems that can be checked using classical simulation methods. But it's building toward things that will be difficult to check without actually building a large particle physics experiment, which can get very expensive," Donohue said.
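
The classical cross-check Donohue describes often amounts to exact diagonalization: for a small enough system, the ground-state energy a quantum processor estimates can be computed directly from the Hamiltonian matrix. The sketch below uses a made-up two-qubit Hamiltonian for illustration, not the actual lithium hydride or beryllium hydride models.

```python
# Exact diagonalization as a classical check on a small quantum simulation.
# The 4x4 Hamiltonian below is a made-up two-qubit example, not a real molecule.
import numpy as np

H = np.array([[-1.05,  0.39,  0.39,  0.00],
              [ 0.39, -0.45,  0.00,  0.39],
              [ 0.39,  0.00, -0.45,  0.39],
              [ 0.00,  0.39,  0.39, -1.05]])

energies = np.linalg.eigvalsh(H)   # exact spectrum of the toy Hamiltonian
print(energies[0])                 # ground-state energy a quantum device would estimate
```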

And curious minds can get their hands dirty right now. Users can operate small-scale quantum processors via the cloud through IBM's online Q Experience and its open-source software Qiskit. Late last year, Microsoft and Amazon both announced similar platforms, dubbed Azure Quantum and Braket. "That's one of the cool things about quantum computing today," said Krauthamer. "We can all get on and play with it."
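
Getting started is a small lift. The sketch below builds a two-qubit entangling circuit with Qiskit; only circuit construction is shown, because how you then run it (on IBM's cloud backends or a local simulator) depends on your Qiskit version and account setup.

```python
# A minimal two-qubit circuit in Qiskit: superposition plus entanglement.
from qiskit import QuantumCircuit

qc = QuantumCircuit(2, 2)
qc.h(0)                      # put qubit 0 into superposition
qc.cx(0, 1)                  # entangle qubit 0 with qubit 1 (a Bell pair)
qc.measure([0, 1], [0, 1])   # read both qubits out into classical bits
print(qc.draw())             # text diagram of the circuit
```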

Quantum computing may still be in its fussy, uncooperative stage, but that hasn't stopped commercial interests from diving in.

IBM announced at the recent Consumer Electronics Show that its so-called Q Network had expanded to more than 100 companies and organizations. Partners now range from Delta Air Lines to health insurer Anthem to Daimler AG, which owns Mercedes-Benz.

Some of those partnerships hinge on quantum computing's aforementioned promise in terms of molecular simulation. Daimler, for instance, is hoping the technology will one day yield a way to produce better batteries for electric vehicles.

Elsewhere, partnerships between quantum computing startups and leading companies in the pharmaceutical industry, like those established between 1QBit and Biogen, and ProteinQure and AstraZeneca, point to quantum molecular modeling's drug-discovery promise, distant though it remains. (Today, drug development is done through expensive, relatively low-yield trial and error.)

Researchers would need millions of qubits to compute the chemical properties of a novel substance, noted theoretical physicist Sabine Hossenfelder in the Guardian last year. But the conceptual underpinning, at least, is there. "A quantum computer knows quantum mechanics already, so I can essentially program in how another quantum system would work and use that to echo the other one," explained Donohue.

There's also hope that large-scale quantum computers will help accelerate AI, and vice versa, although experts disagree on this point. "The reason there's controversy is, things have to be redesigned in a quantum world," said Krauthamer, who considers herself an AI-quantum optimist. "We can't just translate algorithms from regular computers to quantum computers, because the rules are completely different at the most elemental level."

Some believe quantum computers can help combat climate change by improving carbon capture. Jeremy O'Brien, CEO of Palo Alto-based PsiQuantum, wrote last year that quantum simulation of larger molecules, if achieved, could help build a catalyst for scrubbing carbon dioxide directly from the atmosphere.

Long-term applications tend to dominate headlines, but they also lead us back to quantum computing's defining hurdle, and the reason coverage remains littered with terms like "potential" and "promise": error correction.

Qubits, it turns out, are higher maintenance than even the most meltdown-prone rock star. Any number of simple actions or variables can send error-prone qubits into decoherence, or the loss of a quantum state (mainly that all-important superposition). Things that can cause a quantum computer to crash include measuring qubits and running operations; in other words, using it. Even small vibrations and temperature shifts will cause qubits to decohere, too.

That's why quantum computers are kept isolated, and why the ones that run on superconducting circuits (the most prominent method, favored by Google and IBM) have to be kept at near-absolute zero (a cool -460 degrees Fahrenheit).

The challenge is two-fold, according to Jonathan Carter, a scientist at Berkeley Quantum. First, individual physical qubits need better fidelity, which would conceivably happen through better engineering, optimal circuit layouts and the right combination of components. Second, those physical qubits have to be arranged to form logical qubits.

Estimates range from hundreds to thousands to tens of thousands of physical qubits required to form one fault-tolerant logical qubit. "I think it's safe to say that none of the technology we have at the moment could scale out to those levels," Carter said.
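
Where do "hundreds to thousands" of physical qubits per logical qubit come from? The back-of-the-envelope below uses the textbook surface-code scaling law; the error threshold, prefactor and qubit-count formula are assumed round numbers for illustration, not measured values for any real device.

```python
# Rough surface-code overhead estimate (illustrative constants, not real hardware).
def surface_code_estimate(p_physical, p_logical_target, p_threshold=1e-2):
    """Return (code distance d, physical qubits per logical qubit)."""
    d = 3
    while True:
        # logical error rate shrinks roughly as (p/p_th)^((d+1)/2)
        p_logical = 0.1 * (p_physical / p_threshold) ** ((d + 1) / 2)
        if p_logical < p_logical_target:
            return d, 2 * d * d            # ~2*d^2 data plus ancilla qubits
        d += 2                             # surface-code distances are odd

print(surface_code_estimate(1e-3, 1e-10))  # (19, 722) under these assumptions
```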

From there, researchers would also have to build ever-more complex systems to handle the increase in qubit fidelity and numbers. So how long will it take until hardware-makers actually achieve the necessary error correction to make quantum computers commercially viable?

"Some of these other barriers make it hard to say yes to a five- or 10-year timeline," Carter said.

Donohue invokes, and rejects, the same figure. "Even the optimist wouldn't say it's going to happen in the next five to 10 years," he said. At the same time, some small optimization problems, specifically in terms of random number generation, could happen very soon.

"We've already seen some useful things in that regard," he said.

For people like Michael Biercuk, founder of quantum-engineering software company Q-CTRL, the only technical commercial milestone that matters now is quantum advantage, or, as he uses the term, when a quantum computer provides some time or cost advantage over a classical computer. Count him among the optimists: he foresees a five-to-eight-year time scale to achieve such a goal.

Another open question: Which method of quantum computing will become standard? While superconducting has borne the most fruit so far, researchers are exploring alternative methods that involve trapped ions, quantum annealing or so-called topological qubits. In Donohue's view, it's not necessarily a question of which technology is better so much as one of finding the best approach for different applications. For instance, superconducting chips naturally dovetail with the magnetic field technology that underpins neuroimaging.

The challenges that quantum computing faces, however, aren't strictly hardware-related. The magic of quantum computing resides in algorithmic advances, not speed, Greg Kuperberg, a mathematician at the University of California at Davis, is quick to underscore.

"If you come up with a new algorithm, for a question that it fits, things can be exponentially faster," he said, using "exponential" literally, not metaphorically. (There are currently 63 algorithms listed and 420 papers cited at the Quantum Algorithm Zoo, an online catalog of quantum algorithms compiled by Microsoft quantum researcher Stephen Jordan.)

Another roadblock, according to Krauthamer, is a general lack of expertise. "There's just not enough people working at the software level or at the algorithmic level in the field," she said. Tech entrepreneur Jack Hidary's team set out to count the number of people working in quantum computing and found only about 800 to 850 people, according to Krauthamer. "That's a bigger problem to focus on, even more than the hardware," she said. "Because the people will bring that innovation."

While the community underscores the importance of outreach, the term "quantum supremacy" has itself come under fire. "In our view, 'supremacy' has overtones of violence, neocolonialism and racism through its association with 'white supremacy,'" 13 researchers wrote in Nature late last year. The letter has kickstarted an ongoing conversation among researchers and academics.

But the field's attempt to attract and expand also comes at a time of uncertainty in terms of broader information-sharing.

Quantum computing research is sometimes framed in the same adversarial terms as conversations about trade and other emerging tech; that is, U.S. versus China. An oft-cited statistic from patent analytics consultancy Patinformatics states that, in 2018, China filed 492 patents related to quantum technology, compared to just 248 in the United States. That same year, the think tank Center for a New American Security published a paper that warned, "China is positioning itself as a powerhouse in quantum science." By the end of 2018, the U.S. had passed and signed into law the National Quantum Initiative Act. Many in the field believe legislators were compelled to act by China's perceived growing advantage.

The initiative has spurred domestic research (the Department of Energy recently announced up to $625 million in funding to establish up to five quantum information research centers), but the geopolitical tensions give some in the quantum computing community pause, namely for fear of collaboration-chilling regulation. "As quantum technology has become prominent in the media, among other places, there has been a desire suddenly among governments to clamp down," said Biercuk, who has warned of poorly crafted and nationalistic export controls in the past.

"What they don't understand often is that quantum technology, and quantum information in particular, really are deep research activities where open transfer of scientific knowledge is essential," he added.

The National Science Foundation, one of the government agencies given additional funding and directives under the act, generally has a positive track record in terms of avoiding draconian security controls, Kuperberg said. Even still, the antagonistic framing tends to obscure the on-the-ground facts. "The truth behind the scenes is that, yes, China would like to be doing good research in quantum computing, but a lot of what they're doing is just scrambling for any kind of output," he said.

Indeed, the majority of the aforementioned Chinese patents are for quantum tech broadly, but not quantum computing tech, which is where the real promise lies.

The Department of Energy has an internal list of sensitive technologies that it could potentially restrict DOE researchers from sharing with counterparts in China, Russia, Iran and North Korea. It has not yet implemented that curtailment, however, DOE Office of Science director Chris Fall told the House committee on science, space and technology and clarified to Science, in January.

Along with such multi-agency government spending, there's been a tsunami of venture capital directed toward commercial quantum computing interests in recent years. A Nature analysis found that, in 2017 and 2018, private funding in the industry hit at least $450 million.

Still, funding concerns linger in some corners. Even as Google's quantum supremacy proof of concept has helped heighten excitement among enterprise investors, Biercuk has also flagged the beginnings of a contraction in investment in the sector.

Even as exceptional cases dominate headlines (he points to PsiQuantum's recent $230 million venture windfall), there are lesser-reported signs of struggle. "I know of probably four or five smaller shops that started and closed within about 24 months; others were absorbed by larger organizations because they struggled to raise," he said.

At the same time, signs of at least moderate investor agitation and internal turmoil have emerged. The Wall Street Journal reported in January that much-buzzed-about quantum computing startup Rigetti Computing saw its CTO and COO, among other staff, depart amid concerns that the company's tech wouldn't be commercially viable in a reasonable time frame.

Investor expectations had become inflated in some instances, according to experts. "Some very good teams have faced more investor skepticism than I think has been justified... This is not six months to mobile application development," Biercuk said.

In Kuperberg's view, part of the problem is that venture capital and quantum computing operate on completely different timelines. "Putting venture capital into this in the hope that some profitable thing would arise quickly, that doesn't seem very natural to me in the first place," he said, adding the caveat that he considers the majority of QC money prestige investment rather than strictly ROI-focused.

But some startups themselves may have had some hand in driving financiers' over-optimism. "I won't name names, but there definitely were some people giving investors outsize expectations, especially when people started coming up with some pieces of hardware, saying that advantages were right around the corner," said Donohue. "That very much rubbed the academic community the wrong way."

Scott Aaronson recently called out two prominent startups for what he described as a sort of calculated equivocation. He wrote of a pattern in which a party will speak of a quantum algorithm's promise without asking "whether there are any indications that your approach will ever be able to exploit interference of amplitudes to outperform the best classical algorithm."

And, mea culpa, some blame for the hype surely lies with tech media. "Trying to crack an area for a lay audience means you inevitably sacrifice some scientific precision," said Biercuk. (Thanks for understanding.)

It's all led to a willingness to serve up a glass of cold water now and again. As Juani Bermejo-Vega, a physicist and researcher at the University of Granada in Spain, recently told Wired, the machine on which Google ran its milestone proof of concept is mostly still a useless quantum computer for practical purposes.

Bermejo-Vega's quote came in a story about the emergence of a Twitter account called Quantum Bullshit Detector, which decrees, @artdecider-like, a "bullshit" or "not bullshit" quote tweet of various quantum claims. The fact that leading quantum researchers are among the account's 9,000-plus followers would seem to indicate that some weariness exists among the ranks.

But even with the various challenges, cautious optimism seems to characterize much of the industry. "For good and ill, I'm vocal about maintaining scientific and technical integrity while also being a true optimist about the field, sharing the excitement that I have and trying to excite others about what's coming," Biercuk said.

This year could prove to be formative in the quest to use quantum computers to solve real-world problems, said Krauthamer. "Whenever I talk to people about quantum computing, without fail, they come away really excited. Even the biggest skeptics, who say, 'Oh no, they're not real. It's not going to happen for a long time.'"

Quantum computing has arrived, but we still don’t really know what to do with it – ZDNet

As of 2019, the UK is halfway through a ten-year national programme designed to boost quantum technologies, which has so far benefited from a combined £1 billion investment from government and industry. The verdict? Quantum has a lot of potential, but we're not sure what for.

Speaking at a conference in London, Claire Cramer, from the US Department of Energy, said: "There is a lot of promise in quantum, but we don't have a transformative solution yet. In reality, we don't know what impact the technology will have."

That is not to say, of course, that the past five years have been a failure. Quite the opposite: researchers around the world can now effectively trial and test quantum technology, because the hardware has been developed. In other words, quantum computers are no longer a feat of the imagination. The devices exist, and that in itself is a milestone.

Earlier this month at the Consumer Electronics Show, in fact, IBM went to great lengths to remind the public that the IBM Q System One, a 20-qubit quantum computer that the company says is capable of performing reliable quantum computations, is gaining momentum among researchers.

The Q System One has been deployed to 15 companies and laboratories so far, as a prototype that research teams can run to work out how quantum computers may be used to solve problems in the future.

Finding out what those problems might be is quantum's next challenge. Liam Blackwell, deputy director at the Engineering and Physical Sciences Research Council, said: "A lot of money has been invested, and we need to start seeing actual outcomes that will benefit the UK. The challenge now, really, is that we have to deliver."

Research teams are not leaping into the unknown: there are already a few potential applications of quantum technology that have been put forward, ranging from enhancing security with quantum cryptography to improving the accuracy of GPS.

Pharmaceuticals and drug discovery have been identified as fields that could hugely benefit from the new technology as well. Last year, for example, neuroscience firm Biogen partnered with quantum computing research firm 1QBit to better tackle diseases like Alzheimer's and multiple sclerosis.

For Cramer, though, this is only scratching the surface. "Look at laser technology, for example," she said. "Seventy years ago, people didn't think lasers could even exist, and now you wouldn't think twice about holding a laser pointer in your hand.

"It's the same thing with quantum. We can't imagine what the transformative applications will be yet; so we need to maintain a culture of discovery."

There is only one secret to achieve a successful "culture of discovery", she continued: research, research, and more research. In the US, for example, the Department of Commerce recently created the Quantum Economic Development Consortium (QEDC). Its objectives? To "identify technology solutions" and "highlight use cases and grand challenges to accelerate development efforts".

It is not enough, however, to pump money into labs. Once blue-sky researchers have come up with an unexplored application of quantum, they still have to be able to commercialise their idea and bridging between labs and industry might be easier said than done.

In the UK, the issue is not confined to quantum technology. A recent report by VC company Octopus Ventures showed that trillions of pounds are lost every year because of the difficulty of bringing new ideas from university labs to the stock exchange.

In contrast, in the US, over 26,000 companies have been started by research teams from the Massachusetts Institute of Technology (MIT). Combined, these businesses have an annual turnover of over $2 trillion (£1.5 trillion).

"The UK has a very strong lead on research in quantum, but we have lessons to learn from the US," said Elham Kashefi, professor of computer science at the University of Edinburgh. "We need to push research to the next level, to connect it to industry."

UKRI, an organisation that directs innovation funding through the budget of the Department for Business, Energy and Industrial Strategy, recently stressed that commercialising quantum technology would be a priority.

The UK organisation invested £20 million in "pioneer funding" for start-ups leveraging quantum technology to develop "products of the future". Four projects benefited from the award to develop prototypes ranging from quantum sensors that can detect objects underground, to encryption tools that keep data safe.

UKRI is now investing another £153 million in new projects, alongside a £205 million investment from industry. Presenting the organisation's plans for the future, UKRI's director for quantum technologies, Roger McKinlay, said: "I don't know what's coming next, but I hope that we can continue to support what I believe is by far the most interesting emerging technology at the moment."

It doesn't seem, therefore, that quantum uncertainty will be resolved anytime soon, but it certainly is worth watching this space.

IBM Just Called Out Google Over Their "Quantum Computer" – The National Interest Online

On Oct. 23, 2019, Google published a paper in the journal Nature entitled "Quantum supremacy using a programmable superconducting processor." The tech giant announced its achievement of a much-vaunted goal: quantum supremacy.

This perhaps ill-chosen term (coined by physicist John Preskill) is meant to convey the huge speedup that processors based on quantum-mechanical systems are predicted to exhibit, relative to even the fastest classical computers.

Google's benchmark was achieved on a new type of quantum processor, code-named Sycamore, consisting of 54 independently addressable superconducting junction devices (of which only 53 were working for the demonstration).

Each of these devices allows the storage of one bit of quantum information. In contrast to the bits in a classical computer, which can only store one of two states (0 or 1, in the digital language of binary code), a quantum bit, or qubit, can store information in a coherent superposition state, which can be considered to contain fractional amounts of both 0 and 1.

Sycamore uses technology developed by the superconductivity research group of physicist John Martinis at the University of California, Santa Barbara. The entire Sycamore system must be kept cold at cryogenic temperatures using special helium dilution refrigeration technology. Because of the immense challenge involved in keeping such a large system near the absolute zero of temperature, it is a technological tour de force.

Contentious findings

The Google researchers demonstrated that the performance of their quantum processor in sampling the output of a pseudo-random quantum circuit was vastly better than a classical computer chip like the kind in our laptops could achieve. Just how vastly became a point of contention, and the story was not without intrigue.

An inadvertent leak of the Google group's paper on the NASA Technical Reports Server (NTRS) occurred a month prior to publication, during the blackout period when Nature prohibits discussion by the authors regarding as-yet-unpublished papers. The lapse was momentary, but long enough that The Financial Times, The Verge and other outlets picked up the story.

A well-known quantum computing blog by computer scientist Scott Aaronson contained some oblique references to the leak. The reason for this obliqueness became clear when the paper was finally published online and Aaronson could at last reveal himself to be one of the reviewers.

Challenges to Google's story

The story had a further controversial twist when the Google group's claims were immediately countered by IBM's quantum computing group. IBM shared a preprint posted on the arXiv (an online repository for academic papers that have yet to go through peer review) and a blog post dated Oct. 21, 2019 (note the date!).

While the Google group had claimed that a classical (super)computer would require 10,000 years to simulate the same 53-qubit random quantum circuit sampling task that their Sycamore processor could do in 200 seconds, the IBM researchers showed a method that could reduce the classical computation time to a mere matter of days.
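
The benchmark itself is easy to state in code. The toy version below (plain NumPy) builds a small random circuit and samples bit strings from its output distribution, which is the task Sycamore performed; the catch is that at 53 qubits the state vector has 2^53 entries, far beyond a laptop's memory. The gate pattern and depth here are arbitrary choices for illustration.

```python
# Toy random-circuit sampling on 4 qubits (NumPy only); Sycamore did 53.
import numpy as np

rng = np.random.default_rng(0)
n_qubits = 4
dim = 2 ** n_qubits

def random_single_qubit_gate():
    """A random 2x2 unitary, built via QR decomposition."""
    m = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
    q_mat, r = np.linalg.qr(m)
    return q_mat * (np.diag(r) / np.abs(np.diag(r)))

def apply_one_qubit(state, gate, target):
    """Apply a 2x2 gate to one qubit of the n-qubit state vector."""
    psi = state.reshape([2] * n_qubits)
    psi = np.tensordot(gate, psi, axes=([1], [target]))
    psi = np.moveaxis(psi, 0, target)
    return psi.reshape(dim)

CZ = np.diag([1, 1, 1, -1]).astype(complex).reshape(2, 2, 2, 2)

def apply_cz(state, a, b):
    """Apply a controlled-Z gate to qubits a and b."""
    psi = state.reshape([2] * n_qubits)
    psi = np.tensordot(CZ, psi, axes=([2, 3], [a, b]))
    psi = np.moveaxis(psi, [0, 1], [a, b])
    return psi.reshape(dim)

state = np.zeros(dim, dtype=complex)
state[0] = 1.0                                 # start in |0000>
for layer in range(8):                         # alternate random rotations and CZs
    for q in range(n_qubits):
        state = apply_one_qubit(state, random_single_qubit_gate(), q)
    pairs = range(0, n_qubits - 1, 2) if layer % 2 == 0 else range(1, n_qubits - 1, 2)
    for q in pairs:
        state = apply_cz(state, q, q + 1)

probabilities = np.abs(state) ** 2
samples = rng.choice(dim, size=10, p=probabilities)   # the "sampling" in the benchmark
print([format(int(s), f"0{n_qubits}b") for s in samples])
```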

However, the IBM classical computation would have to be carried out on the world's fastest supercomputer, the IBM-developed Summit OLCF-4 at Oak Ridge National Laboratory in Tennessee, with clever use of secondary storage to achieve this benchmark.

While of great interest to researchers like myself working on hardware technologies related to quantum information, and important in terms of establishing academic bragging rights, the IBM-versus-Google aspect of the story is probably less relevant to the general public interested in all things quantum.

For the average citizen, the mere fact that a 53-qubit device could beat the world's fastest supercomputer (containing more than 10,000 multi-core processors) is undoubtedly impressive. Now we must try to imagine what may come next.

Quantum futures

The reality of quantum computing today is that very impressive strides have been made on the hardware front. A wide array of credible quantum computing hardware platforms now exists, including ion traps, superconducting device arrays similar to those in Google's Sycamore system and isolated electrons trapped in NV-centres in diamond.

These and other systems are all now in play, each with benefits and drawbacks. So far researchers and engineers have been making steady technological progress in developing these different hardware platforms for quantum computing.

What has lagged quite a bit behind are custom-designed algorithms (computer programs) written to run on quantum computers and able to take full advantage of possible quantum speed-ups. While several notable quantum algorithms exist (Shor's algorithm for factorization, for example, which has applications in cryptography, and Grover's algorithm, which might prove useful in database search), the total set of quantum algorithms remains rather small.
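
To give a flavour of what such an algorithm looks like, here is Grover's search written as plain linear algebra (NumPy) over a list of 16 items; the marked index and list size are arbitrary choices. The point is that roughly the square root of N iterations concentrate the measurement probability on the marked item, versus roughly N classical checks.

```python
# Grover's search over N = 16 items, as explicit matrices (NumPy only).
import numpy as np

N = 16
marked = 11                                    # the item we are searching for

state = np.ones(N) / np.sqrt(N)                # uniform superposition over all items
oracle = np.eye(N)
oracle[marked, marked] = -1                    # flips the sign of the marked amplitude
diffuser = 2 * np.full((N, N), 1 / N) - np.eye(N)   # "inversion about the mean"

iterations = int(round(np.pi / 4 * np.sqrt(N)))     # ~sqrt(N) iterations; 3 for N = 16
for _ in range(iterations):
    state = diffuser @ (oracle @ state)

probabilities = state ** 2
print(probabilities[marked])   # ~0.96: a measurement almost certainly returns item 11
```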

Much of the early interest (and funding) in quantum computing was spurred by the possibility of quantum-enabled advances in cryptography and code-breaking. A huge number of online interactions ranging from confidential communications to financial transactions require secure and encrypted messages, and modern cryptography relies on the difficulty of factoring large numbers to achieve this encryption.

Quantum computing could be very disruptive in this space, as Shors algorithm could make code-breaking much faster, while quantum-based encryption methods would allow detection of any eavesdroppers.

The interest various agencies have in unbreakable codes for secure military and financial communications has been a major driver of research in quantum computing. It is worth noting that all these code-making and code-breaking applications of quantum computing ignore to some extent the fact that no system is perfectly secure; there will always be a backdoor, because there will always be a non-quantum human element that can be compromised.

Quantum applications

More appealing for the non-espionage and non-hacker communities (in other words, the rest of us) are the possible applications of quantum computation to solve very difficult problems that are effectively unsolvable using classical computers.

Ironically, many of these problems emerge when we try to use classical computers to solve quantum-mechanical problems, such as quantum chemistry problems that could be relevant for drug design and various challenges in condensed matter physics including a number related to high-temperature superconductivity.

So where are we in the wonderful and wild world of quantum computation?

In recent years, we have had many convincing demonstrations that qubits can be created, stored, manipulated and read using a number of futuristic-sounding quantum hardware platforms. But the algorithms lag. So while the prospect of quantum computing is fascinating, it will likely be a long time before we have quantum equivalents of the silicon chips that power our versatile modern computing devices.

Michael Bradley, Professor of Physics & Engineering Physics, University of Saskatchewan. This article is republished from The Conversation under a Creative Commons license. Read the original article.

Saving salmon and coronavirus outbreak: News from the College | Imperial News – Imperial College London

Here's a batch of fresh news and announcements from across Imperial.

From a new project to preserve safe havens for salmon, to Imperial researchers analysing the extent of the coronavirus outbreak, here is some quick-read news from across the College.

The population of wild North Atlantic salmon is now at its lowest level ever recorded, inspiring the new Six Rivers Project, led by the Marine and Freshwater Research Institute (MFRI) Iceland and Imperial College London, and funded by Sir Jim Ratcliffe and Ineos.

The project, which had its inaugural conference this month, is focused on preserving both the land and river ecosystems across six rivers in northeast Iceland, supporting one of the last safe havens where salmon populations still thrive.

Imperial's Professor Guy Woodward said: "The North Atlantic salmon is a keystone species in the ecosystem. Iceland's rivers have simple ecosystems, providing ideal research conditions. Their latitude also brings with it a potential sensitivity to the effects of climate change, more so than in other parts of the world."

Read more about the inaugural conference of the Six Rivers Project.

Provost Ian Walmsley discussed the future of quantum computing at the Digital-Life-Design (DLD) conference in Munich.

He made the case for global collaboration in the race to develop a viable quantum computer, and spoke to German media, including BR24.

Other participants in DLD, one of the world's most important technology events, included Nick Clegg, Ursula von der Leyen and Garry Kasparov.

The first Photonics Online Meetup, a free, online-only global conference for photonics researchers, went ahead with great success this month.

The five-hour-long conference on 13 January 2020 brought together 1,100 researchers in 37 countries across six continents in real time. More than 635 of these researchers gathered at 66 local hubs in 27 countries to join in together.

There was also a Twitter-based poster session, with 59 virtual posters averaging 3,000 views each. Videos of the event are now available online, with around 150 people downloading the videos in the first 24 hours.

Read more at the Photonics Online Meetup.

The Early Years Centre (EYC) has reopened after an extensive refurbishment. Staff, parents and children attended the opening event on Thursday 16 January.

Tracy Halsey, Early Years Centre Manager, thanked staff for their efforts, and Professor Emma McCoy, Early Years Committee chair, declared the new centre open with a ribbon-cutting ceremony.

This £8m investment has expanded the EYC's capacity, creating an additional 56 places and refurbishing the existing indoor and outdoor space. The extra places are being introduced in response to the growing demand for affordable childcare onsite. The EYC can now offer places to over 200 children and will reduce the average waiting time for a place. The EYC will celebrate its fiftieth anniversary this year.

Read more about the project on our news site.

Pembridge Hall has become one of the top halls in the UK to complete the Student Switch Off campaign's climate change quiz. Over 1,000 Imperial students took the quiz, with over 500 pledging to save energy, save water and recycle. Pembridge Hall will receive 50 tubs of Ben & Jerry's ice cream as its reward.

The Student Switch Off campaign, aimed at encouraging sustainability, also includes a microgrant scheme which gives students funding to organise their own pro-environmental activities. Imperial undergraduate Lauren Wheeler has become the first student in the UK to receive a microgrant to run an event; she will raise funds to help those affected by the recent wildfires in Australia.

The hall that gets the most student engagement over the year will receive £250 for its hall committee. The campaign will continue this term.

Imperial researchers are helping with the global response to the spread of coronavirus. They are also leading voices on the matter in the media worldwide, appearing in over a thousand media articles and broadcast news packages about the outbreak.

An ongoing series of reports from the MRC Centre for Global Infectious Disease Analysis and J-IDEA at Imperial is looking at the number of cases and understanding the transmissibility of the disease. Other researchers at the College are working on areas including vaccine development and helping the UK to respond.

Commentary from eleven Imperial experts has featured in global outlets including the BBC World Service, CNN and The New York Times.

For further updates, visit the Centre's website.

Want to be kept up to date on news at Imperial?

Sign up for our free quick-read daily e-newsletter, Imperial Today.

Detailed Analysis and Report on Topological Quantum Computing Market By Microsoft, IBM, Google. – New Day Live

The Topological Quantum Computing market has been changing all over the world, and its growth is expected to be substantial by 2026. In this report, we provide a comprehensive valuation of the marketplace. The growth of the market is driven by key factors such as manufacturing activity, market risks, acquisitions, new trends, and the assessment and implementation of new technologies. The report covers all of the aspects required to gain a complete understanding of pre-market and current conditions, as well as a well-measured forecast.

The report has been segmented according to the essential aspects examined, such as sales, revenue and market size, as well as other factors posting good growth numbers in the market.

Top companies covered in this report: Microsoft, IBM, Google, D-Wave Systems, Airbus, Raytheon, Intel, Hewlett Packard, Alibaba Quantum Computing Laboratory, IonQ.

Description:

In this report, we provide our readers with the most up-to-date data on the Topological Quantum Computing market. International markets have been changing very rapidly over the past few years and have become harder to grasp, so our analysts have prepared a detailed report that takes the history of the market into consideration, along with a detailed forecast, the market's issues and their solutions.

The report focuses on the key aspects of the market to ensure maximum benefit and growth potential for our readers, and our extensive analysis of the market will help them achieve this much more efficiently. The report has been prepared using primary as well as secondary analysis, in accordance with Porter's five forces analysis, which has been a game-changer for many in the Topological Quantum Computing market. The research sources and tools that we use are highly reliable and trustworthy. The report offers effective guidelines and recommendations for players to secure a position of strength in the market. Newly arrived players can increase their growth potential by a great amount, and the current dominators of the market can extend their dominance, with the help of our report.

Topological Quantum Computing Market Type Coverage:

Software, Hardware, Service

Topological Quantum Computing Market Application Coverage:

Civilian, Business, Environmental, National Security, Others

Market Segment by Regions, regional analysis covers

North America (United States, Canada, Mexico)

Asia-Pacific (China, Japan, Korea, India, Southeast Asia)

South America (Brazil, Argentina, Colombia, etc.)

Europe, Middle East and Africa (Germany, France, UK, Russia and Italy, Saudi Arabia, UAE, Egypt, Nigeria, South Africa)

Competition analysis

As the markets have advanced, competition has increased manifold, completely changing the way competition is perceived and dealt with. In our report, we discuss the complete analysis of the competition, how the big players in the Topological Quantum Computing market have been adapting to new techniques, and the problems that they are facing.

Our report, which includes a detailed description of mergers and acquisitions, will help you to get a complete idea of the market competition and also give you extensive knowledge on how to excel ahead and grow in the market.


Superconductivity? Stress is the Answer For Once – Cornell University The Cornell Daily Sun

We've been told all of our lives to avoid stress, but in physics, stress might just be the key to unlocking the secret of superconductivity.

Superconductivity, the phenomenon in which the electrical resistance of a material suddenly drops to zero when cooled below a certain temperature, has been a scientific curiosity ever since its discovery in the early 20th century.

A group of Cornell researchers led by Prof. Katja Nowack, physics, published a paper on Oct. 11 in Science that investigates how physically deforming a material can cause it to show traits of partial superconductivity.

The interest first arose in the work of collaborator Philip Moll, a researcher at the Institute of Materials Science and Engineering at École Polytechnique Fédérale de Lausanne in Switzerland, during his investigation of the superconductive properties of the metal cerium iridium indium-5 (CeIrIn5).

In an attempt to establish superconductivity, Moll discovered that the critical temperature was changing depending on the placement of the wire contacts. This collides directly with the conventional belief about superconductivity, which is that the entire material must be either completely, uniformly superconductive, or not.

Nowack learned of these strange results from Prof. Brad Ramshaw, physics, and decided to investigate them using a superconducting quantum interference device (SQUID), which can measure local resistivities of small areas.

"What we found in the end was that in these little microstructures, superconductivity doesn't uniformly form in the device, but forms in a very spatially modulated, nonuniform fashion. So there's these little puddles of superconductivity in some parts of the device, and other parts stay non-superconductive down to much lower temperatures," Nowack said.

They also discovered that these superconductive puddles correlated with the varying amounts of physical stress produced during the creation of the samples. Moll's team had created the samples by gluing CeIrIn5 crystals to a sapphire substrate and etching patterns into them using a focused ion beam, similar to a miniature sandblaster.

According to Nowack, CeIrIn5 shrinks by about 0.3 percent as it cools due to its metallic properties, whereas sapphire does not shrink at all. The resulting strain seemed to be causing the irregular superconductivity noticed by Moll.

"Actually, in the literature, it was known that the superconducting transition temperature of the material must depend on strain," Nowack said. However, only some simple strains, like a single stretch along one axis, had been tested. Using this theory, the Cornell group developed a model relating strain to superconductivity and, upon comparing the model's predictions to the more complex deformations of the CeIrIn5 samples, found that the findings correlated exactly.

These findings open up a whole host of possible applications. This correlation between strain and superconductivity may become a new way of investigating the superconductive properties of other metals, which in turn could help refine physicists' understanding of this relationship even further.

The group hopes to investigate how these new discoveries could affect existing devices, like the Josephson junction, a device which utilizes two superconductors and has applications in quantum computing. "We're [also] thinking we can apply this to interesting magnetic systems that have interesting magnetic order, and change the properties of the magnetic order using strain," Nowack said.

Healthcare venture investment in 2020: Quantum computing gets a closer look – Healthcare IT News

Among the healthcare technologies venture firms will be looking at most closely in 2020, various artificial intelligence and machine learning applications are atop the list, of course. But so are more nuts-and-bolts tools like administrative process automation and patient engagement platforms, VCs say.

Other, more leading-edge technologies, such as genomics-focused data and analytics and even quantum computing, are among the areas attracting investor interest this year.

"We expect 2020 to mark the first year where health IT venture firms will start to look at quantum computing technology for upcoming solutions," Dr. Anis Uzzaman, CEO and general partner of Pegasus Tech Ventures, told Healthcare IT News.

"With the breakthrough supremacy announcement from Google validating the technology and the subsequent launch of the service Amazon Braket in 2019, there is sure to be a new wave of entrepreneurial activity starting in 2020."

He said quantum computing technology holds a lot of promise for the healthcare industry, with potential breakthroughs possible throughout the health IT stack, from operations and administration to security.

Among the promising companies, Uzzaman pointed to Palo Alto-based QC Ware, a startup pioneering a software solution that enables companies to use a variety of quantum hardware platforms such as Rigetti and IBM to solve a variety of enterprise problems, including those specifically related to healthcare.

He also predicted artificial intelligence would continue to be at the forefront for health IT venture firms in 2020 as it becomes more clear which startups may be winners in their initial target sectors.

"There has been consistent growth of investment activity over the past few years into healthcare startups using artificial intelligence to target a range of areas from imaging to diagnostics," he said.

However, Uzzaman also noted that regulation and long enterprise sales cycles have largely slowed these companies' ability to significantly scale their revenues.

"Therefore, we anticipate 2020 will be the year where it will become clearer to health IT venture firms who will be winners in applying artificial intelligence to imaging, pathology, genomics, operations, diagnostics, transcription, and more," he said. "We will also continue to see moderate growth in the overall investment amount in machine learning and AI companies, but will see a notable decrease in the number of companies receiving an investment.

Uzzaman explained that there were already some signs in late 2019 that we could be late in a short-term innovation cycle for artificial intelligence, with many companies, particularly those applying machine learning and AI to robotics, shutting down.

"However, we anticipate many companies will reach greater scale with their solutions and separate themselves from the competition, which will translate into more mega funding rounds," he said.

Ezra Mehlman, managing partner with Health Enterprise Partners, explained that at the beginning of each year, the firm conducts a market mapping exercise to determine which healthcare IT categories are rising to the top of the prioritization queue of its network of hospital and health plan limited partners.

"In the past year, we have seen budgets meaningfully open for automation solutions in administrative processing, genomics-focused data and analytics offerings, aging-in-place technologies and, in particular, patient engagement platforms rooted in proven clinical use cases," he said. "We are actively looking at all of these spaces."

He pointed out that in 2018, more than $2 billion was invested into artificial intelligence and machine learning healthcare IT companies, which represented a quarter of the total dollars invested into digital health companies that year.

"We view this as a recognition of two things: the meteoric aspirations that the market has assigned to AI and machine learning's potential, and a general sense that the underlying healthcare data infrastructure has reached the point of maturity, where it is possible to realize ROI from AI/machine learning initiatives," he said.

However, he said Health Enterprise Partners is still waiting for the "breakout" to occur in adoption.

"We believe we have now reached the point where category leaders will emerge in each major healthcare AI subsector and the usage will become more widespread we have made one such investment in the clinical AI space in the last year," Mehlman said.

Heading into 2020, Mehlman said companies that cannot deliver high-six-figure, year-one ROI in the form of increased revenue or reduced cost will struggle, and companies that cannot crisply answer the question, "Who is the buyer and what is the budget?" will be challenged.

"If one applies these tests to some of the areas that have attracted the most healthcare VC investment--social determinants of health, blockchain and digital therapeutics to name a few the number of viable companies sharply drops off," he said.

Mehlman noted that while these sound like simple principles, the current environment of rapidly consolidating, budget-constrained hospitals, vertically integrating health plans, and big tech companies making inroads into healthcare has raised the bar on what is required for a healthcare startup to gain meaningful market traction.

What Is Quantum Computing, And How Can It Unlock Value For Businesses? – Computer Business Review

We are at an inflection point

Ever since Professor Alan Turing proposed the principle of the modern computer in 1936, computing has come a long way. While advancements to date have been promising, the future is even brighter, all thanks to quantum computing, which performs calculations based on the behaviour of particles at the sub-atomic level, writes Kalyan Kumar, CVP and CTO, IT Services, HCL Technologies.

Quantum computing promises to unleash unimaginable computing power that's not only capable of addressing current computational limits, but also of unearthing new solutions to unsolved scientific and social mysteries. What's more, thanks to increasing advancement since the 1980s, quantum computing can now drive some incredible social and business transformations.

Quantum computing holds immense promise in defining a positive, inclusive and human-centric future, which is what the WEF Future Council on Quantum Computing envisages. The most anticipated uses of quantum computing are driven by its potential to simulate quantum structures and behaviours across chemicals and materials. This promise is viewed guardedly by current scientists, who claim quantum computing is still far from making a meaningful impact.

That said, quantum computing is expected to open up much-needed possibilities in medical research. Drug development, which typically takes 10 to 12 years and billions of dollars of investment, could be shortened considerably, alongside the potential to explore unique chemical compositions that may simply be beyond the limits of current classical computing. Quantum computing could also enable more accurate weather forecasting, providing information that helps protect large amounts of agricultural production from damage.

Quantum computing promises a better and improved future, and while humans stand to benefit greatly from this revolution, businesses too can expect unparalleled value.

When it comes to quantum computing, much of the world is still at the "they don't know what they don't know" stage. Proof points are appearing, and it is becoming clear that quantum computing can solve problems that cannot be addressed by today's computers. Within transportation, for example, quantum computing is being used to develop battery and self-driving technologies, while Volkswagen has also been using quantum computing to match patterns and predict traffic conditions in advance, ensuring smoother traffic flow. In supply chains, logistics and trading are receiving a significant boost from the greater computing power and high-resolution modelling quantum computing provides, adding a huge amount of intelligence through new approaches to machine learning.

The possibilities for businesses are immense and extend well beyond the examples above, into domains such as healthcare, financial services and IT. Yet a new approach is required. The companies that succeed with quantum computing will be those that create value chains to exploit the new insights, and build a management system to match the high-resolution view of the business that will emerge.

While some early-stage quantum devices are already available, they are still far from what the world has been envisaging. The top multinational technology companies have been investing heavily in this field, but they still have some way to go. There has recently been talk of a prototype quantum computer performing, in 200 seconds, a computation that would previously have taken 10,000 years. Though impressive, this is just one of the many steps needed to achieve full success in quantum computing.

It is vital to understand how and when we are going to adopt quantum computing, so we know the right time to act. The aforementioned prototype should be a wake-up call to early adopters seeking to create a durable competitive advantage. We have even recently seen a business announce plans to make a prototype quantum computer available on its cloud, something we will all be able to buy or access in time. If organisations truly understand the value and applications of quantum computing, they will be able to create new products and services that nobody else has. However, productising and embedding quantum computing into products may take a little more time.

One important question arises from all this: are we witnessing the beginning of the end for classical computing? On the evidence, it seems not. With the advent of complete and practical quantum computers, we are likely to see a hybrid computing model emerge, in which digital binary computers co-process and co-exist with quantum qubit computers. Processing and resource sharing would be optimised using real-time analysis, with the quantum machine taking over exponentially hard computational tasks. Quantum computing is not about replacing digital computing, but about coexistence: composed computing that handles different tasks at the same time, much as humans rely on left and right brain hemispheres for analytical and creative work.

If one thing is for sure, it's that we are at an inflection point, witnessing what could arguably be one of the most disruptive changes in human existence. Taking a systematic, planned approach to adopting quantum computing will not only take some of its mystery away, but also reveal its true strategic value, helping us to know when and how to become part of this once-in-a-lifetime revolution.

Read more:
What Is Quantum Computing, And How Can It Unlock Value For Businesses? - Computer Business Review

New Centers Lead the Way towards a Quantum Future – Energy.gov

The world of quantum is the world of the very, very small. At sizes near those of atoms and smaller, the rules of physics start morphing into something unrecognizable, at least to us in the regular world. While quantum physics seems bizarre, it offers huge opportunities.

Quantum physics may hold the key to vast technological improvements in computing, sensing, and communication. Quantum computing may be able to solve problems in minutes that would take lifetimes on today's computers. Quantum sensors could act as extremely high-powered antennas for the military. Quantum communication systems could be nearly unhackable. But we don't have the knowledge or capacity to take advantage of these benefits yet.

The Department of Energy (DOE) recently announced that it will establish Quantum Information Science Centers to help lay the foundation for these technologies. As Congress put forth in the National Quantum Initiative Act, the DOE's Office of Science will make awards for at least two and up to five centers.

These centers will draw on both quantum physics and information theory to give us a soup-to-nuts understanding of quantum systems. Teams of researchers from universities, DOE national laboratories, and private companies will run them. Their expertise in quantum theory, technology development, and engineering will help each center undertake major, cross-cutting challenges. The centers' work will range from discovery research up to developing prototypes. They'll also address a number of different technical areas. Each center must tackle at least two of these subjects: quantum communication, quantum computing and emulation, quantum devices and sensors, materials and chemistry for quantum systems, and quantum foundries for synthesis, fabrication, and integration.

The impacts won't stop at the centers themselves. Each center will have a plan in place to transfer technologies to industry or other research partners. They'll also work to leverage DOE's existing facilities and collaborate with non-DOE projects.

As the nation's largest supporter of basic research in the physical sciences, the Office of Science is thrilled to head this initiative. Although quantum physics depends on the behavior of very small things, the Quantum Information Science Centers will be a very big deal.

The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, please visit https://www.energy.gov/science.

Read more here:
New Centers Lead the Way towards a Quantum Future - Energy.gov

ASC20 Finals to be Held in Shenzhen, Tasks Include Quantum Computing Simulation and AI Language Exam – HPCwire

BEIJING, Jan. 21, 2020. The 2020 ASC Student Supercomputer Challenge (ASC20) has announced the tasks for the new season: using supercomputers to simulate quantum circuits and training AI models to take an English test. These tasks pose unprecedented challenges for the 300+ ASC teams from around the world. From April 25 to 29, 2020, the top 20 finalists will compete at SUSTech in Shenzhen, China.

ASC20 includes a quantum computing task for the first time. Teams will use QuEST (the Quantum Exact Simulation Toolkit) running on supercomputers to simulate 30 qubits in two cases: quantum random circuits (random.c) and quantum fast Fourier transform circuits (GHZ_QFT.c). Quantum computing is a disruptive technology, considered to be the next generation of high-performance computing. However, the development of real quantum computers is lagging because of the unique properties of quantum systems, which makes it difficult for scientists to apply them today to pressing problems such as particle physics modeling, cryptography, genetic engineering, and quantum machine learning. From this perspective, the quantum computing task in the ASC20 challenge will hopefully inspire new algorithms and architectures in this field.
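To give a concrete sense of what such a simulation looks like, below is a minimal sketch of the kind of entangling (GHZ-style) circuit the GHZ_QFT.c case builds, written against QuEST's public C API (createQuESTEnv, createQureg, hadamard, controlledNot, calcProbOfOutcome). It is not taken from the contest sources, and it is scaled down from 30 qubits to 5 so it runs on a laptop; a 30-qubit state vector in double precision needs roughly 16 GB of memory, which is why the contest runs require supercomputer nodes.

    #include <stdio.h>
    #include "QuEST.h"

    int main(void) {
        /* Initialise the QuEST environment (selects the CPU/GPU/MPI back end). */
        QuESTEnv env = createQuESTEnv();

        /* A 5-qubit register in the |00000> state; the ASC20 task uses 30 qubits. */
        Qureg qubits = createQureg(5, env);
        initZeroState(qubits);

        /* GHZ-style preparation: Hadamard on qubit 0, then a chain of CNOTs. */
        hadamard(qubits, 0);
        for (int q = 0; q < 4; q++)
            controlledNot(qubits, q, q + 1);

        /* For a GHZ state, each qubit reads out 1 with probability 0.5. */
        qreal p = calcProbOfOutcome(qubits, 4, 1);
        printf("P(qubit 4 = 1) = %g\n", (double) p);

        destroyQureg(qubits, env);
        destroyQuESTEnv(env);
        return 0;
    }

Compiled and linked against the QuEST library, the same code scales to larger registers simply by raising the qubit count, which is why memory and communication, rather than gate logic, become the limiting factors in the contest runs.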

The other task revealed is the Language Exam Challenge. Teams will take on the challenge of training AI models on an English cloze-test dataset, vying to achieve the highest test scores. The dataset covers multiple levels of English-language tests in China, including the college entrance examination, College English Test Band 4 and Band 6, and others. Teaching machines to understand human language is one of the most elusive and long-standing challenges in AI. The ASC20 AI task reflects that challenge by using human-oriented problems to evaluate the performance of neural networks.

Wang Endong, ASC Challenge initiator, member of the Chinese Academy of Engineering and Chief Scientist at Inspur Group, said that through these tasks, students from all over the world get to access and learn the most cutting-edge computing technologies. ASC strives to foster supercomputing and AI talent with a global vision, inspiring technical innovation.

Dr. Lu Chun, Vice President of SUSTech, host of the ASC20 Finals, commented that supercomputers are important infrastructure for scientific innovation and economic development. SUSTech is making focused efforts to develop supercomputing and to host ASC20, hoping to drive the training of supercomputing talent, international exchange and cooperation, and interdisciplinary development at SUSTech.

Furthermore, during January 15-16, 2020, the ASC20 organizing committee held a competition training camp in Beijing to help student teams prepare for the ongoing competition. HPC and AI experts from the State Key Laboratory of High-end Server and Storage Technology, Inspur, Intel, NVIDIA, Mellanox, Peng Cheng Laboratory and the Institute of Acoustics of the Chinese Academy of Sciences gathered to provide on-site coaching and guidance. Previous ASC winning teams also shared their successful experiences.

About ASC

The ASC Student Supercomputer Challenge is the world's largest student supercomputer competition, sponsored and organized by the Asia Supercomputer Community in China and supported by Asian, European, and American experts and institutions. The main objectives of ASC are to encourage exchange and training of young supercomputing talent from different countries, improve supercomputing applications and R&D capacity, boost the development of supercomputing, and promote technical and industrial innovation. The annual ASC Supercomputer Challenge was first held in 2012 and has since attracted over 8,500 undergraduates from all over the world. Learn more about ASC at https://www.asc-events.org/.

Source: ASC

Follow this link:
ASC20 Finals to be Held in Shenzhen, Tasks Include Quantum Computing Simulation and AI Language Exam - HPCwire