On the Path to Exascale, Q-Exa Consortium Tightens the Bonds Between Quantum Computers and Traditional Supercomputing – HPCwire

Nov. 15, 2021 During a press conference on Nov. 15, 2021, German Federal Minister for Education and Research Anja Karliczek announced the beginning of the Q-Exa consortium, an ambitious project aimed at accelerating European quantum computing technologies with the assistance of traditional high-performance computing (HPC).

Q-Exa brings together experts from academia and industry to deploy a 20-qubit quantum demonstrator at the end of 2023 and integrate it into the Leibniz Supercomputing Centre's (LRZ's) HPC ecosystem. LRZ, one of the three centers comprising the Gauss Centre for Supercomputing, is partnering with quantum computer hardware company IQM, software developer HQS, and supercomputer manufacturer Atos. The project is funded with €40 million and will run for three years.

LRZ Director Prof. Dr. Dieter Kranzlmüller indicated that in addition to developing applications for quantum computing, Q-Exa also serves as an important milestone on the path to exascale computing, the next major milestone in traditional HPC, representing a 40-fold increase in supercomputing power over LRZ's current flagship computer, SuperMUC-NG.

"At LRZ, we are focused on more than just faster computers; we are looking at new ways of computing, and have been developing and implementing our integrated supercomputing architecture," he said. "The Q-Exa project fits in perfectly with our goals in that regard, and also serves as a foundational piece to our Quantum Integration Centre and the Munich Quantum Valley. With Q-Exa, we are able to enhance our current large-scale computing resources with this quantum demonstrator."

Kranzlmüller also emphasized that by participating in a co-design project with IQM and HQS, LRZ would be able to bring its decades of experience in delivering new computing technologies to science and industry to a new disruptive computing technology, ensuring that these systems are designed with users from academia and industry in mind and that applications can be ported and scaled to take advantage of the promise of quantum computers.

For more information on the Q-Exa project, read the BMBF press release (in German) or watch the livestream of the event.

Source: Gauss Centre for Supercomputing


What Europe can learn from France when it comes to quantum computing – Sifted

The French ambition to become a world leader in deeptech is one of Europe's worst-kept secrets.

Not only does the country have one of the biggest deeptech funds in Europe, Bpifrance, but more importantly it has the people and the pipeline of talent through a best-in-breed university system, which is helping the country become a hotbed for innovation.

Quantum is one segment of deeptech where the French are leaving the rest of Europe, and in fact most other nations, far behind. The ambition to set up a quantum hub in the Paris region, linking large corporations and startups, is truly impressive and far-reaching.

Not only is the region focusing on nurturing homegrown talents, but they are also actively scouting for overseas companies to set up European headquarters in the cluster. How would we know? Well, we were one of the very few UK companies targeted.

France has always been at the forefront of cryptography and has one of the richest ecosystems for quantum pioneers. That history includes figures ranging from Nobel Prize in Physics winners Albert Fert and Serge Haroche to French National Centre for Scientific Research (CNRS) Gold Medallist Alain Aspect, whose pioneering research spans quantum entanglement and quantum simulators.

To build on this, earlier this year the French government announced a €1.8bn strategy to boost research in quantum technologies over five years. This will see public investment in the field increase from €60m to €200m a year.

Not only is investment increasing, but the often overlooked part is that funding is being funnelled into various fields of quantum computing. France recognises that quantum computing is not a homogenous industry and that various aspects require attention outside the development of actual quantum computers.

France is building a framework to make the country a key player across the entire quantum ecosystem

For example, one such area is security. Once a functioning quantum computer emerges, the cryptography that is used to secure all data and communications will become obsolete overnight.

Compounding this risk is the "harvest now, decrypt later" threat. Nefarious hackers might intercept data today and then hold onto it until quantum computers are advanced enough to decrypt it. To tackle this, new encryption methods that can stand up to these powerful computers are being developed, known collectively as post-quantum cryptography (PQC).

It's clear France recognises this threat, with plans to put €150 million directly into R&D in the field of PQC. This is in addition to the €780 million being devoted to computing development alone, and the €870 million set aside for sensor research, quantum communications and other related technologies.

Taken together, France is building a framework for industrial and research forces to make the country a key player across the entire quantum ecosystem, from computing development to post-quantum security.

So how does the rest of Europe compare? The short answer is that it's lagging far behind.

France's closest competitor is Germany, with its government recently pledging to invest €2bn in quantum computing and related technologies over five years. That's a larger number than France's commitment, but it appears the scope is only to build a competitive quantum computer in five years while growing a network of companies to develop applications.

France is well on its way to protecting itself against the very real security threats quantum computers will pose

Investments by other individual governments across the rest of Europe are minimal, with many relying on the EU's Quantum Technologies Flagship programme to lead the way. However, with $1.1bn earmarked to cover 27 countries, little attention is being placed beyond computing R&D into adjacent fields like quantum security and communications.

Even if we focus on the security side of the coin, France is well on its way to protecting itself against the very real security threats quantum computers will pose, with the rest of Europe leaving themselves vulnerable.

It is also the case that France, in my opinion, is keeping pace with the traditional leaders (the US, China and Canada) and even pushing ahead in some areas.

While the US, Canadian and Chinese governments have committed impressive amounts to quantum, much of the focus in these countries is on developing a functioning computer, without recognising that a successful quantum strategy needs to be much broader. For example, although it has now developed a broad security roadmap, the US Department of Homeland Security's budget for next year makes scant reference to quantum computing and the technology that is going to underpin post-quantum security.

If we measure success in quantum by not only how quickly we can develop such computers, but also how effectively they can be applied and how robust our protection is against the darker side of the technology, then I'd argue that France has the world's most balanced and systemic approach.

France is firmly Europe's trailblazing nation; the rest of the continent ought to take note.

Andersen Cheng is CEO of Post-Quantum and Nomidio


Quantum computing leaders founded Zapata to accelerate the …

Christopher Savoie, Ph.D., JD

CEO & Founder

CSO, Founder & Professor at University of Toronto

CTO & Founder

Professional Service Lead & Founder

Associate Director for Quantum Science IP & Founder

Lead Research Scientist & Founder

We founded Zapata to develop quantum algorithms and software that deliver real-world advances for applications on near term quantum computers.

Christopher Savoie, CEO & Founder

CEO & Founder

CSO, Founder & Professor at University of Toronto

Managing Director, C Sensei Group LLC

Board Director; CEO of RealPage, Inc.

Managing Director, Comcast Ventures

Principal at Prelude Ventures

Founding CEO & Vice Chairman of GRAIL, Former SVP of Google Ads, Apps, Maps and [x]

Board Director; Chair & President Family Foundation; Retired CSO & CMO Honeywell

Partner at Pillar VC (Board Observer)

Strong teams built around innovation in quantum algorithms are going to be the key to make these advances practical and widely available.

Alán Aspuru-Guzik, Co-Founder & CSO

Professor in the Department of Computer Science and Institute for Advanced Computer Studies at the University of Maryland

Associate Professor of Physics at the MIT Center for Theoretical Physics

Landon T. Clay Professor of Mathematics and Theoretical Science at Harvard

Professor of Quantum Physics at Freie Universität Berlin

Chair of the Zapata SAB, Associate Professor of Physics and Astronomy at Tufts University

Associate Professor of Electrical Engineering and Computer Science and Lincoln Laboratory Fellow at MIT, Director of the MIT Center for Quantum Engineering, and Associate Director of the MIT Research Laboratory of Electronics

To realize the full promise of quantum computing will take time, science, and engineering across the board. Zapata has brought together a fantastic team of researchers who want to work with academia and industry to develop tomorrow's quantum algorithms.

Will Oliver, Associate Professor MIT

Vice President Business Development

Vice President of Engineering

Chief Orquestra Evangelist

Chief Marketing Officer

Chief Financial Officer

General Counsel

Vice President of Corporate Operations

Director, Global Channel Partnerships

Director, Quantum Solutions

Deputy General Counsel

Senior Legal Analyst

Zapata is all about bridging the gap and helping those interested parties get into the quantum ecosystem and connect them to the right hardware partner.

Jonny Olson, Co-Founding Scientist

Associate Director of Quantum AI

Lead Quantum Software Engineer

Quantum Application Scientist

Quantum Application Scientist

Quantum Application Scientist

Quantum Application Scientist

Quantum Application Scientist

Quantum Application Scientist

Sr ML-DevOps Engineer

Quantum Software Engineer

Quantum Software Engineer

Quantum Software Engineer

Quantum Software Engineer

Cloud Engineer

Quantum Research Scientist

Quantum Research Scientist

Quantum Platform Engineering Manager

Quantum Application Scientist

Sr Quantum Platform Architect

Senior Quantum Platform Engineer

Quantum Application Scientist

Quantum Application Scientist

Quantum Software Engineer

Machine Learning Engineer

Quantum Platform Engineer

Quantum Platform Engineer

People at Zapata come from many different backgrounds and domains. Everyone is extremely driven, working at quantum's edges, but also genuinely thoughtful and caring.

Michał Stęchły, Quantum Software Engineer

Quantum Solutions Engineer

Quantum Solution Engineer

Quantum Solution Engineer

UK/EU Business Development

Quantum Solutions Engineer

Strategic Partner Alliance Manager

Controller

Executive Assistant

People Operations Manager

Operations Administrator

Product Marketing Manager

Marketing Specialist

Marketing & Product Intern

Marketing Coordinator

Administrative Assistant


Supercomputers are becoming another cloud service. Here’s what it means – ZDNet

These days supercomputers aren't necessarily esoteric, specialised hardware; they're made up of high-end servers that are densely interconnected and managed by software that deploys high performance computing (HPC) workloads across that hardware. Those servers can be in a data centre, but they could also be in the cloud.

When it comes to large simulations, like the computational fluid dynamics used to simulate a wind tunnel, processing the millions of data points needs the power of a distributed system, and the software that schedules these workloads is designed for HPC systems. If you want to simulate 500 million data points and you want to do that 7,000 or 8,000 times to look at a variety of different conditions, that's going to generate about half a petabyte of data; even if a cloud virtual machine (VM) could cope with that amount of data, the compute time would take millions of hours, so you need to distribute it, and the tools to do that efficiently need something that looks like a supercomputer, even if it lives in a cloud data centre.
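Those figures hang together arithmetically. As a quick back-of-the-envelope check, a minimal Python sketch is shown below; the 500 million points, the roughly 8,000 runs and the half-petabyte total come from the article, while the bytes-per-point value is an assumed, illustrative number chosen only to show the scale.

```python
# Back-of-the-envelope check of the data volume quoted above.
# The point count, run count and the "about half a petabyte" figure come from
# the article; BYTES_PER_POINT is an illustrative assumption, not a sourced number.

POINTS_PER_RUN = 500_000_000   # 500 million data points per simulation
RUNS = 8_000                   # ~7,000-8,000 different conditions
BYTES_PER_POINT = 128          # assumed storage per point (e.g. a handful of doubles)

total_bytes = POINTS_PER_RUN * RUNS * BYTES_PER_POINT
print(f"Estimated output: {total_bytes / 1e15:.2f} PB")  # ~0.51 PB, i.e. about half a petabyte
```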


When the latest Top 500 list came out this summer, Azure had four supercomputers in the top 30; for comparison, AWS had one entry on the list, in 41st place.


HPC users on Azure run computational fluid dynamics, weather forecasting, geoscience simulation, machine learning, financial risk analysis, modelling for silicon chip design (a popular enough workload that Azure has FX-series VMs with an architecture specifically for electronic design automation), medical research, genomics, biomedical simulations and physics simulations, as well as workloads like rendering.

They do some of that on traditional HPC hardware; Azure offers Cray XC and CS supercomputers, and the UK's Met Office is getting four Cray EX systems on Azure for its new weather-forecasting supercomputer. But you can also put together a supercomputer from H- and N-series VMs (using hardware like Nvidia A100 Tensor Core GPUs and Xilinx FPGAs, as well as the latest Epyc 7003 CPUs) with HPC images.

One reason the Met Office picked a cloud supercomputer was the flexibility to choose whatever the best solution is in 2027. As Richard Lawrence, the Met Office IT Fellow for supercomputing, put it at the recent HPC Forum, they wanted "to spend less time buying supercomputers and more time utilizing them".

But how does Microsoft build Azure to support HPC well when the requirements can be somewhat different? "There are things that cloud generically needs that HPC doesn't, and vice versa," Andrew Jones from Microsoft's HPC team told us.

Everyone needs fast networks, everybody needs fast storage, fast processors and more memory bandwidth, but the focus on how all that is integrated together is clearly different, he says.

HPC applications need to perform at scale, which cloud is ideal for, but they need to be deployed differently in cloud infrastructure from typical cloud applications.


"If you're deploying a whole series of independent VMs it makes sense to spread them out across the data centre so that they are relatively independent and resilient from each other, whereas in the HPC world you want to pack all your VMs as close together as possible, so they have the tightest possible network connections between each other to get the best performance," he explains.

Some HPC infrastructure proves very useful elsewhere. "The idea of high-performance interconnects that really drive scalable application performance and latency is a supercomputing and HPC thing," Jones notes. "It turns out it also works really well for other things like AI and some aspects of gaming and things like that."

Although high speed interconnects are enabling disaggregation in the hyperscale data centre, where you can split the memory and compute into different hardware and allocate as much as you need of each, that may not be useful for HPC even though more flexibility in allocating memory would be helpful, because it's expensive and not all the memory you allocate to a cluster will be used for every job.

"In the HPC world we are desperately trying to drag every bit of performance out of the interconnect we can and distributing stuff all over the data centre is probably not the right path to take for performance reasons. In HPC, we're normally stringing together large numbers of things that we mostly want to be as identical as possible to each other, in which case you don't get those benefits of disaggregation," he says.

What will cloud HPC look like in the future?

"HPC is a big enough player that we can influence the overall hardware architectures, so we can make sure that there are things like high memory bandwidth considerations, things like considerations for higher power processes and, therefore, cooling constraints and so on are built into those architectures," he points out.

The HPC world has tended to be fairly conservative, but that might be changing, Jones notes, which is good timing for cloud. "HPC has been relatively static in technology terms over the last however many years; all this diversity and processor choice has really only been common in the last couple of years," he says. GPUs have taken a decade to become common in HPC.


The people involved in HPC have often been in the field for a while. But new people are coming into HPC who have different backgrounds; they're not all from the traditional scientific computing background.

"I think that diversity of perspectives and viewpoints coming into both the user side, and the design side will change some of the assumptions we'd always made about what was a reasonable amount of effort to focus on to get performance out of something or the willingness to try new technologies or the risk reward payoff for trying new technologies," Jone predicts.

So just as HPC means some changes for cloud infrastructure, cloud may mean big changes for HPC.


Quantum computing pioneer Umesh Vazirani to give Cruickshank Lecture as part of three-day conference – EurekAlert

KINGSTON, R.I., Oct. 12, 2021 – University of California, Berkeley Professor Umesh Vazirani, a pioneer in quantum computing algorithms and complexity theory, will deliver the annual University of Rhode Island Cruickshank Lecture on Monday, Oct. 18, in conjunction with the three-day Frontiers in Quantum Computing conference.

Frontiers in Quantum Computing, which celebrates the launch this semester of URI's new master's degree in quantum computing, will take place Oct. 18-20 on the Kingston Campus. More than 30 experts in the fields of quantum computing and quantum information science will deliver daily talks on such topics as the future of quantum computing, research and industry developments, and educational initiatives for the next generation of experts in the field.

"This will be an impressive gathering," said Vanita Srinivasa, director of URI's Quantum Information Science program and a conference organizer. "These scientists have made seminal contributions to quantum computing and quantum information science. We have speakers who are well-established in quantum information science, even before it was a major field, and we have speakers who are up and coming and are now among the top researchers in their fields."

Vazirani, the Roger A. Strauch Professor of Electrical Engineering and Computer Science at UC Berkeley and director of the Berkeley Quantum Computation Center, is considered one of the founders of the field of quantum computing. His talk will explore quantum computing's impact on the foundations of quantum mechanics and the philosophy of science.

"There are several different theories about how quantum mechanics can be interpreted. Advances in quantum computing will change our understanding of the foundations of quantum mechanics and maybe our overall view of the universe," said Leonard Kahn, chair of the URI Department of Physics, who helped organize the conference.

Vazirani's virtual talk, "A Quantum Wave in Computing," will be presented to an in-person audience in room 100 of the Beaupre Center for Chemical and Forensic Sciences, 140 Flagg Road, on the Kingston campus, at 6:30 p.m. on Oct. 18. The lecture can also be viewed live with a link from the conference's website.

The conference's list of speakers includes U.S. Sen. Jack Reed, who will deliver an address at 9:45 a.m. on the opening day of the conference, along with experts from around the U.S. as well as Australia, Canada, the Netherlands, and Denmark.

Jacob Taylor, a physicist at the National Institute of Standards and Technology, Joint Quantum Institute Fellow, and founder of the national effort overseeing implementation of the National Quantum Initiative Act, will deliver the conference's opening keynote address on Monday, Oct. 18, at 8 a.m. in the Ballroom of the Memorial Union.

Charles Tahan, assistant director for Quantum Information Science and director of the National Quantum Coordination Office in the White House Office of Science and Technology Policy (OSTP), will give the keynote address before the roundtable discussion on the future of quantum computing, sponsored by D-Wave, on Tuesday, Oct. 19, at 5:15 p.m. in the ballroom.

The panel will include Taylor, the first assistant director for Quantum Information Science at the OSTP; Michelle Simmons, a pioneer in atomic electronics and silicon-based quantum computing and director of the Australian Research Council's Centre of Excellence for Quantum Computation and Communication Technology; Catherine McGeoch, Senior Scientist with D-Wave; and Christopher Lirakis, IBM Quantum Lead for Quantum Systems Deployment.

"The panelists will provide their perspectives on the future of quantum computing from industry, government and academia," said Srinivasa. "The future is uncertain, but hopeful, and there are exciting challenges along the way. Quantum computing technology has progressed from something that's been a dream to something that can actually be built."

Quantum computers have the promise of solving key problems that would take a prohibitively long time to execute on classical computers. Because of the nature of the quantum bit, as compared to the classical bit, some of those intractable calculations can be done on a quantum computer in minutes rather than thousands of years. The impact on many problems from molecular simulations to encryption of credit card data will have far-reaching consequences.

"I don't think there's been a time when there's been this much publicity and press about quantum computing," said Kahn. "There's clearly a path forward, but there are a lot of hurdles along the way."

With the conference celebrating URI's master's in quantum computing, education will be an important topic. Daily speakers will explore education initiatives, including developing curriculum at all levels to make the field more accessible to students. Presenters will include Chandralekha Singh, president of the American Association of Physics Teachers; Charles Robinson, IBM Quantum Computing Public Sector leader; and Robert Joynt, of the University of Wisconsin-Madison.

Other topics include implementation of quantum computing and industry developments, including talks by Christopher Savoie '92, founder and chief executive officer of Zapata Computing and a conference organizer, and Andrew King, director of Performance Research at D-Wave.

"It's going to be amazing science that will be talked about at the conference," said Srinivasa, whose research focuses on quantum information processing theory for semiconductor systems. "Christopher Savoie has commented that this conference is equivalent to any of the major conferences on quantum computing that he's been to."

###

Frontiers in Quantum Computing is free and open to the public. Except for the Cruickshank Lecture, all events will be held in the Memorial Union Ballroom, 50 Lower College Road, on the Kingston Campus. While events are in-person, some speakers will take part virtually. All sessions can also be viewed online. For more information or to take part, go to the conference's website.

The conference is sponsored by Zapata Computing, D-Wave, IBM Quantum, PSSC Labs, and Microway, along with URI's College of Arts and Sciences, University Libraries, Information Technology Services, the Office of the Provost, and the Department of Physics.

The Alexander M. Cruickshank Endowed Lectureship was established in 1999. It is named for Alexander M. Cruickshank, who served on the URI chemistry faculty for 30 years and was subsequently the director of the Gordon Research Conferences until his retirement in 1993. The lecture series is sponsored by the URI Department of Physics, the Gordon Research Center and URI's College of Arts and Sciences.

For more information, contact Leonard Kahn at lenkahn@uri.edu.


Quantum venture funding dipped 12% in 2020, but quantum investments rose 46% – VentureBeat

Sorting through the hype surrounding quantum computing these days isn't easy for enterprises trying to figure out the right time to jump in. Skeptics say any real impact is still years away, and yet quantum startups continue to seduce venture capitalists in search of the next big thing.

A new report from CB Insights may not resolve this debate, but it does add some interesting nuance. While the number of venture capital deals for quantum computing startups rose 46% to 37 in 2020 compared to 2019, the total amount raised in this sector fell 12% to $365 million.

Looking at just the number of deals, the annual tally has ticked up steadily from just 6 deals in 2015. As for the funding total, while it was down from $417 million in 2019, it remains well above the $73 million raised in 2015.

There are a couple of conclusions to draw from this.

First, the number of startups being drawn into this space is clearly rising. As research has advanced, more entrepreneurs with the right technical chops feel the time is now to start building their startup.

Second, the average deal size for 2020 was just under $10 million. And given that IQM's $46 million round accounts for a sizable chunk of that total, the average for the other deals is squeezed down even further. That certainly demonstrates optimism, but it's far from the kind of financial gusher or valuations that would indicate any kind of quantum bubble.
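For readers who want to retrace the arithmetic, the averages follow directly from the CB Insights figures quoted above; the only added step in the sketch below is setting IQM's $46 million round aside to see what the typical remaining deal looks like.

```python
# Reproduce the average deal sizes discussed above from the CB Insights figures
# already quoted in the article: $365M raised across 37 deals in 2020, one of
# which was IQM's $46M round.

total_2020 = 365_000_000
deals_2020 = 37
iqm_round = 46_000_000

avg_all = total_2020 / deals_2020
avg_excluding_iqm = (total_2020 - iqm_round) / (deals_2020 - 1)

print(f"Average deal size, all deals:     ${avg_all / 1e6:.1f}M")            # ~$9.9M
print(f"Average deal size, excluding IQM: ${avg_excluding_iqm / 1e6:.1f}M")  # ~$8.9M
```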

Finally, it's important to remember that startups are likely a tiny slice of what's happening in quantum these days. A leading indicator? Perhaps. But a large part of the agenda is still being driven by tech giants who have massive resources to invest in a technology that may have a long horizon and could be years away from generating sufficient revenues. That includes Intel, IBM, Google, Microsoft, and Amazon.

Indeed, Amazon just rolled out a new blog dedicated to quantum computing. Last year, Amazon Web Services launched Amazon Braket, a product that lets enterprises start experimenting with quantum computing. Even so, AWS quantum computing director Simone Severini wrote in the inaugural blog post that business customers are still scratching their heads over the whole phenomenon.

"We heard a recurring question, 'When will quantum computing reach its true potential?' My answer was, 'I don't know,'" she wrote. "No one does. It's a difficult question because there are still fundamental scientific and engineering problems to be solved. The uncertainty makes this area so fascinating, but it also makes it difficult to plan. For some customers, that's a real issue. They want to know if and when they should focus on quantum computing, but struggle to get the facts, to discern the signal from all the noises."


How researchers are mapping the future of quantum computing, using the tech of today – GeekWire

Pacific Northwest National Laboratory computer scientist Sriram Krishnamoorthy. (PNNL Photo)

Imagine a future where new therapeutic drugs are designed far faster and at a fraction of the cost they are today, enabled by the rapidly developing field of quantum computing.

The transformation of healthcare and personalized medicine would be tremendous, yet these are hardly the only fields this novel form of computing could revolutionize. From cryptography to supply-chain optimization to advances in solid-state physics, the coming era of quantum computers could bring about enormous changes, assuming its potential can be fully realized.

Yet many hurdles still need to be overcome before all of this can happen. This is one of the reasons the Pacific Northwest National Laboratory (PNNL) and Microsoft have teamed up to advance this nascent field.

The developer of the Q# programming language, Microsoft Quantum recently announced the creation of an intermediate bridge that will allow Q# and other languages to be used to send instructions to different quantum hardware platforms. This includes the simulations being performed on PNNL's own powerful supercomputers, which are used to test the quantum algorithms that could one day run on those platforms. While scalable quantum computing is still years away, these simulations make it possible to design and test many of the approaches that will eventually be used.

"We have extensive experience in terms of parallel programming for supercomputers," said PNNL computer scientist Sriram Krishnamoorthy. "The question was, how do you use these classical supercomputers to understand how a quantum algorithm and quantum architectures would behave while we build these systems?"

That's an important question given that classical and quantum computing are so extremely different from each other. Quantum computing isn't Classical Computing 2.0. A quantum computer is no more an improved version of a classical computer than a lightbulb is a better version of a candle. While you might use one to simulate the other, that simulation will never be perfect because they're such fundamentally different technologies.

Classical computing is based on bits, pieces of information that are either off or on to represent a zero or one. But a quantum bit, or qubit, can represent a zero or a one or any proportion of those two values at the same time. This makes it possible to perform computations in a very different way.

However, a qubit can only do this so long as it remains in a special state known as superposition. This, along with other features of quantum behavior such as entanglement, could potentially allow quantum computing to answer all kinds of complex problems, many of which are exponential in nature. These are exactly the kind of problems that classical computers can't readily solve, if they can solve them at all.
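As a rough illustration of what superposition means in practice, the short NumPy sketch below (generic code, not tied to any vendor toolkit mentioned in this article) represents a single qubit as two complex amplitudes whose squared magnitudes give the probabilities of measuring 0 or 1.

```python
import numpy as np

# A single-qubit state is a length-2 complex vector (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1: measurement gives 0 with probability |alpha|^2
# and 1 with probability |beta|^2 -- the "proportion of both values" described above.

zero = np.array([1.0, 0.0], dtype=complex)                          # definite 0, like a classical bit
hadamard = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

superposed = hadamard @ zero                                        # equal superposition of 0 and 1
print(np.abs(superposed) ** 2)                                      # [0.5 0.5] -- measurement probabilities

# Losing superposition (decoherence) can be pictured as the off-diagonal terms of
# the state's density matrix decaying away, leaving only classical probabilities.
rho = np.outer(superposed, superposed.conj())
print(np.round(rho.real, 3))                                        # off-diagonal 0.5 entries carry the coherence
```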

For instance, much of the world's electronic privacy is based on encryption methods that rely on prime numbers. While it's easy to multiply two prime numbers, it's extremely difficult to reverse the process by factoring the product of two primes. In some cases, a classical computer could run for 10,000 years and still not find the solution. A quantum computer, on the other hand, might be capable of performing the work in seconds.
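A toy illustration of that asymmetry is sketched below, using deliberately small primes; the specific numbers are illustrative, and the naive trial-division factoring shown here is exactly the kind of approach that becomes hopeless at the hundreds-of-digits scale used in real cryptography, which is where a quantum algorithm such as Shor's would change the picture.

```python
import time

# Multiplying two primes is effectively instant; recovering them from the product
# by naive trial division already takes noticeable time at ~8 digits each, and the
# cost grows roughly exponentially with the number of digits -- which is why the
# 600+ digit moduli used in real cryptography are out of reach classically.

p, q = 15_485_863, 179_424_673      # the 1,000,000th and 10,000,000th primes
n = p * q                           # multiplication: effectively free

def factor_by_trial_division(n):
    """Return a factor pair of an odd composite n by checking odd divisors."""
    d = 3
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 2
    return None

start = time.time()
print(factor_by_trial_division(n))                    # (15485863, 179424673)
print(f"factoring took {time.time() - start:.1f} s")  # seconds, vs. microseconds to multiply
```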

That doesn't mean quantum computing will replace all tasks performed by classical computers. This includes programming the quantum computers themselves, which the very nature of quantum behaviors can make highly challenging. For instance, just the act of observing a qubit can make it decohere, causing it to lose its superposition and entangled states.

Such challenges drive some of the work being done by Microsoft Azure's Quantum group. Expecting that both classical and quantum computing resources will be needed for large-scale quantum applications, Microsoft Quantum has developed a bridge they call QIR, which stands for quantum intermediate representation. The motivation behind QIR is to create a common interface at a point in the programming stack that avoids interfering with the qubits. Doing this makes the interface both language- and platform-agnostic, which allows different software and hardware to be used together.

"To advance the field of quantum computing, we need to think beyond just how to build a particular end-to-end system," said Bettina Heim, senior software engineering manager with Microsoft Quantum, during a recent presentation. "We need to think about how to grow a global ecosystem that facilitates developing and experimenting with different approaches."

Because these are still very early days (think of where classical computing was 75 years ago), many fundamental components still need to be developed and refined in this ecosystem, including quantum gates, algorithms and error correction. This is where PNNL's quantum simulator, DM-SIM, comes in. By designing and testing different approaches and configurations of these elements, researchers can discover better ways of achieving their goals.

As Krishnamoorthy explains: "What we currently lack and what we are trying to build with this simulation infrastructure is a turnkey solution that could allow, say, a compiler writer or a noise model developer or a systems architect, to try different approaches in putting qubits together and ask the question: If they do this, what happens?"
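DM-SIM itself is a large-scale density-matrix simulator built for PNNL's supercomputers; the snippet below is not its API but a minimal, generic NumPy sketch of the underlying idea, showing why a density-matrix representation lets you ask the "if they do this, what happens?" question with a noise model included.

```python
import numpy as np

# Generic single-qubit density-matrix toy (NOT DM-SIM's API): a density matrix,
# unlike a pure state vector, can also carry the effect of a noise channel,
# which is what makes this representation useful for studying noisy hardware.

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def apply_gate(rho, U):
    """Ideal (noise-free) evolution of a density matrix under a unitary U."""
    return U @ rho @ U.conj().T

def depolarize(rho, p):
    """Standard depolarizing noise channel with error probability p."""
    return (1 - p) * rho + (p / 3) * (apply_gate(rho, X) + apply_gate(rho, Y) + apply_gate(rho, Z))

rho = np.array([[1, 0], [0, 0]], dtype=complex)   # start in the |0> state
rho = apply_gate(rho, H)                          # ideal Hadamard: equal superposition
rho = depolarize(rho, 0.05)                       # then a 5% depolarizing error

# The off-diagonal (coherence) terms shrink under noise; the diagonal entries
# are the probabilities of measuring 0 and 1.
print(np.round(rho, 3))
```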

Of course, there will be many challenges and disappointments along the way, such as an upcoming retraction of a 2018 paper in the journal Nature. The original study, partly funded by Microsoft, declared evidence of a theoretical particle called a Majorana fermion, which could have been a major quantum breakthrough. However, errors since found in the data contradict that claim.

But progress continues, and once reasonably robust and scalable quantum computers are available, all kinds of potential uses could become possible. Supply chain and logistics optimization might be ideal applications, generating new levels of efficiency and energy savings for business. Since quantum computing should also be able to perform very fast searches on unsorted data, applications that focus on financial data, climate data analysis and genomics are likely uses, as well.

That's only the beginning. Quantum computers could be used to accurately simulate physical processes from chemistry and solid-state physics, ushering in a new era for these fields. Advances in material science could become possible because we'll be better able to simulate and identify molecular properties much faster and more accurately than we ever could before. Simulating proteins using quantum computers could lead to new knowledge about biology that would revolutionize healthcare.

In the future, quantum cryptography may also become common, due to its potential for truly secure encrypted storage and communications. That's because it's impossible to precisely copy quantum data without violating the laws of physics. Such encryption will be even more important once quantum computers are commonplace, because their unique capabilities will also allow them to swiftly crack traditional methods of encryption as mentioned earlier, rendering many currently robust methods insecure and obsolete.

As with many new technologies, it can be challenging to envisage all of the potential uses and problems quantum computing might bring about, which is one reason why business and industry need to become involved in its development early on. Adopting an interdisciplinary approach could yield all kinds of new ideas and applications and hopefully help to build what is ultimately a trusted and ethical technology.

"How do you all work together to make it happen?" asks Krishnamoorthy. "I think for at least the next couple of decades, for chemistry problems, for nuclear theory, etc., we'll need this hypothetical machine that everyone designs and programs for at the same time, and simulations are going to be crucial to that."

The future of quantum computing will bring enormous changes and challenges to our world. From how we secure our most critical data to unlocking the secrets of our genetic code, it's technology that holds the keys to applications, fields and industries we've yet to even imagine.


Major Quantum Computing Projects And Innovations Of 2020 – Analytics India Magazine

Quantum computing has opened multiple doors of possibilities for quick and accurate computation of complex problems, something traditional methods fail to do. The pace of experimentation in quantum computing has very naturally increased in recent years. 2020 too saw its share of such breakthroughs, which lay the groundwork for future innovations. We list some of the significant quantum computing projects and experiments of 2020.

IT services company Atos devised Q-Score for measuring quantum performance. As per the company, this is the first universal quantum metric that applies to all programmable quantum processors. The company said that in comparison to qubits, the standard figure of merit for performance assessment, Q-Score provides explicit, reliable, objective, and comparable results when solving real-world optimisation problems.

The Q-Score is calculated against three parameters: application-driven, ease of use, and objectiveness and reliability.

Google's AI Quantum team performed the largest chemical simulation to date on a quantum computer. Explaining the experiment in a paper titled "Hartree-Fock on a superconducting qubit quantum computer," the team said it used a variational quantum eigensolver (VQE) to simulate chemical mechanisms using quantum algorithms.

It was found that the calculations performed in this experiment were two times larger than the previous similar experiments and contained about ten times the number of quantum gate operations.

The University of Sydney developed an algorithm for characterising noise in large scale quantum computers. Noise is one of the major obstacles in building quantum computers. With this newly developed algorithm, they have tried to tame the noise by reducing interference and instability.

A new method was introduced to return an estimate of the effective noise with relative precision. The method could also detect all correlated errors, enabling the discovery of long-range two-qubit correlations in the 14 qubit device. In comparison, the previous methods would render infeasible for device size above 10 qubits.

The tool is highly scalable, and it has been tested successfully on the IBM Quantum Experience device. The team believes that with this, the efficiency of quantum computers in solving computing problems will be addressed.

Canadian quantum computing company D-Wave Systems announced the general availability of its next-generation quantum computing platform. This platform offers new hardware, software, and tools for accelerating the delivery of quantum computing applications. The platform is now available in the Leap quantum cloud service and has additions such as the Advantage quantum system with 5,000 qubits and 15-way qubit connectivity.

It also has an expanded solver service that can perform calculations of up to one million variables. With these capabilities, the platform is expected to assist businesses that are running real-time quantum applications for the first time.

Physicists at MIT reported evidence of Majorana fermions on the surface of gold. Majorana fermions are particles that are theoretically their own antiparticle; it is the first time these have been observed on a metal as common as gold. With this discovery, physicists believe this could prove to be a breakthrough for stable and error-free qubits for quantum computing.

The future innovation in this direction would be based on the idea that combinations of Majorana fermions pairs can build qubit in such a way that if noise error affects one of them, the other would still remain unaffected, thereby preserving the integrity of the computations.

In December, Intel introduced Horse Ridge II. It is the second generation of its cryogenic control chip, considered a milestone towards developing scalable quantum computers. Based on its predecessor, Horse Ridge I, it supports a higher level of integration for the quantum systems control. It can read qubit states and control several gates simultaneously to entangle multiple qubits. One of its key features is the Qubit readout that provides the ability to read the current qubit state.

With this feature, Horse Ridge II allows for faster on-chip, low latency qubit state detection. Its multigate pulsing helps in controlling the potential of qubit gates. This ability allows for the scalability of quantum computers.

I am a journalist with a postgraduate degree in computer network engineering. When not reading or writing, one can find me doodling away to my heart's content.


The Quantum Computing market is expected to grow from USD 472 million in 2021 to USD 1,765 million by 2026, at a CAGR of 30.2% – GlobeNewswire

New York, Feb. 10, 2021 (GLOBE NEWSWIRE) -- Reportlinker.com announces the release of the report "Quantum Computing Market with COVID-19 impact by Offering, Deployment, Application, Technology, End-use Industry and Region - Global Forecast to 2026" - https://www.reportlinker.com/p05064748/?utm_source=GNW. Several companies are focusing on the adoption of QCaaS post-COVID-19. This, in turn, is expected to contribute to the growth of the quantum computing market. However, stability and error correction issues are expected to restrain the growth of the market.

Services segment expected to hold the largest share of the quantum computing market. The growth of the services segment can be attributed to the increasing number of startups across the world that are investing in research and development activities related to quantum computing technology. This technology is used in optimization, simulation, and machine learning applications, thereby leading to optimum utilization costs and highly efficient operations in various end-use industries.

Cloud-based deployment to witness the highest growth in the quantum computing market in coming years. With the development of highly powerful systems, the demand for cloud-based deployment of quantum computing systems and services is expected to increase. This, in turn, is expected to result in a significant revenue source for service providers, with users paying for access to noisy intermediate-scale quantum (NISQ) systems that can solve real-world problems.

The limited lifespan of rapidly advancing quantum computing systems also favors cloud service providers. The flexibility of access offered to users is another factor fueling the adoption of cloud-based deployment of quantum computing systems and services.

For the foreseeable future, quantum computers are expected not to be portable. Cloud can provide users with access to different devices and simulators from their laptops.

Optimization accounted for a major share of the overall quantum computing market. Optimization is the largest application for quantum computing and accounted for a major share of the overall market. Companies such as D-Wave Systems, Cambridge Quantum Computing, QC Ware, and 1QB Information Technologies are developing quantum computing systems for optimization applications.

Networked Quantum Information Technologies Hub (NQIT) is expanding to incorporate optimization solutions for resolving problems faced by the practical applications of quantum computing technology.

Trapped ions segment to witness the highest CAGR of the quantum computing market during the forecast period. The trapped ions segment of the market is projected to grow at the highest CAGR during the forecast period, as quantum computing systems based on trapped ions offer more stability and better connectivity than quantum computing systems based on other technologies. IonQ, Alpine Quantum Technologies, and Honeywell are a few companies that use trapped ions technology in their quantum computing systems.

Banking and finance expected to hold a major share of the quantum computing market during the forecast period. In the banking and finance end-use industry, quantum computing is used for risk modeling and trading applications. It is also used to detect market instabilities by identifying stock market risks and to optimize trading trajectories, portfolios, and asset pricing and hedging.

As the financial sector is difficult to understand, the quantum computing approach is expected to help users understand the complexities of the banking and finance end-use industry. Moreover, it can help traders by suggesting solutions to overcome financial challenges.

APAC to witness the highest growth of the quantum computing market during the forecast period. The APAC region is a leading hub for several industries, including healthcare and pharmaceuticals, banking and finance, and chemicals. Countries such as China, Japan, and South Korea are the leading manufacturers of consumer electronics, including smartphones, laptops, and gaming consoles, in APAC.

There is a requirement to resolve complications in optimization, simulation, and machine learning applications across these industries. The large-scale development witnessed by emerging economies of APAC and the increased use of advanced technologies in the manufacturing sector are contributing to the development of large and medium enterprises in the region.

This, in turn, is fueling the demand for quantum computing services and systems in APAC. In APAC, the investments look promising, as most countries, such as China, Japan, and South Korea, have successfully contained the virus compared with the US and European countries. China is easing the restrictions placed on factory lockdowns and worker movement.

Despite being the epicenter of COVID-19, China has maintained its dominant position as a global network leader.

The break-up of primary participants for the report is shown below:
By Company Type: Tier 1 - 18%, Tier 2 - 22%, and Tier 3 - 60%
By Designation: C-level Executives - 21%, Manager Level - 35%, and Others - 44%
By Region: North America - 45%, Europe - 38%, APAC - 12%, and RoW - 5%

The Quantum Computing market was dominated by International Business Machines (US), D-Wave Systems (Canada), Microsoft (US), Amazon (US), and Rigetti Computing (US).

Research Coverage: This research report categorizes the quantum computing market based on offering, deployment, application, technology, end-use industry, and region. The report describes the major drivers, restraints, challenges, and opportunities pertaining to the quantum computing market and forecasts the same till 2026.

Key Benefits of Buying the Report

The report would help leaders/new entrants in this market in the following ways:
1. This report segments the quantum computing market comprehensively and provides the closest market size projection for all subsegments across different regions.
2. The report helps stakeholders understand the pulse of the market and provides them with information on key drivers, restraints, challenges, and opportunities for market growth.
3. This report would help stakeholders understand their competitors better and gain more insights to improve their position in the business. The competitive landscape section includes product launches and developments, partnerships, and collaborations.
4. This report would help stakeholders understand the pre- and post-COVID-19 scenarios for how the penetration of quantum computing will look over the forecast period. The region segment includes a country-wise impact analysis of COVID-19 and initiatives taken to overcome these impacts.

Read the full report: https://www.reportlinker.com/p05064748/?utm_source=GNW

About Reportlinker: ReportLinker is an award-winning market research solution. Reportlinker finds and organizes the latest industry data so you get all the market research you need - instantly, in one place.

__________________________


Tech trends in 2021: How artificial intelligence and technology will reshape businesses – The Financial Express

What better time than now to unveil what to look out for in the world of AI and technology in 2021.

By Prithwis De

The year 2020 will be marked as an unprecedented year in history due to the adverse impact of coronavirus worldwide. This pandemic has started bringing extraordinary changes in some key areas. The trends of faster drug development, effective remote care, efficient supply chain, etc, will continue into 2021. Drone technology is already playing a vital role in delivering food and other essentials alongside relief activities.

With Covid-19 came a new concept of the Internet of Behaviour within organisations to track human behaviour in the work environment and trace any slack in maintaining guidelines. Now on, organisations are set to capture and combine behaviour-related data from different sources and use it. We can assertively say it will affect the way organisations interact with people, going forward. Students are experiencing distance learning, taking examinations under remotely-monitored and proctored surveillance systems through identity verification and authentication in real time.

All these will have a high impact on technology, which will shape our outlook in the future. Businesses around the globe are taking the giant leap to become tech-savvy with quantum computing, artificial intelligence (AI), cybersecurity, etc. AI and cloud computing are alluring us all towards an environment of efficiency, security, optimisation and confidence. What better time than now to unveil what to look out for in the world of AI and technology in 2021.

What 2020 has paved the way for is quantum computing. Now, be prepared to adapt to a hybrid computing approach (conventional cum quantum computing) to problem-solving. This paradigm shift in computing will result in the emergence of implausible ways to solve existing business problems and ideate new opportunities. Its effects will be visible on our ability to perform better in diverse areas: financial forecasting, weather predictions, drug and vaccine development, blood-protein analysis, supply chain planning and optimisation, etc. Quantum Computing as a Service (QCaaS) will be a natural choice for organisations to plug into the experiments as we advance. Forward-thinking businesses are excited to take the quantum leap, but the transition is still in a nascent stage. This new year will be a crucial stepping stone towards the future of things to change in the following years.

Cloud providers such as Amazon (AWS), Microsoft (Azure) and Google will continue to hog the limelight as the AI tool providers for most companies leaning towards real-time experiments in their business processes in the months to follow. Efficiency, security and customisation are the advantages for which serverless and hybrid cloud computing are gaining firm ground with big enterprises. It will continue to do so in 2021.

Going forward, the aim is to make the black box of AI transparent with explainable AI. The lack of clarity hampers our ability to trust AI yet. Automated machine learning (AutoML), another crucial area, is likely to be very popular in the near future. One more trend that caught on like wildfire in 2020 is Machine Learning Operations (MLOps). It provides organisations visibility of their models and has become an efficient tool to steer clear of duplicated efforts in AI. Most of the companies have been graduating from AI experimentations and pilot projects to implementation. This endeavour is bound to grow further and enable AI experts to have more control over their work from end-to-end now onwards.

Cybersecurity will gain prime importance in 2021 and beyond as there is no doubt that hacking and cybercrime prevention are priorities for all businesses with sensitive data becoming easily accessible with advanced phishing tools. Advanced prediction algorithms, along with AI, will play a decisive role in the future to prevent such breaches in data security.

AI and the Internet of Things along with edge computing, which is data processing nearer the source closer to the device at the edge of the network, will usher in a new era for actionable insights from the vast amount of data. The in-memory-accelerated-real-time AI will be needed, particularly when 5G has started creating new opportunities for disruption.

In 2020, there was a dip in overall funding as the pandemic had badly impacted the investment sector due to a reduction in activity. Some of the technology start-ups are still unable to cope with the challenges created by Covid-19 and the consequent worsening economic conditions. According to NASSCOM, around 40% of Indian start-ups were forced to stop their operations. In 2021, mergers and acquisitions of start-ups are expected. The larger companies are likely to target smaller companies, specialised mainly in niche and innovative areas such as drug development, cybersecurity, AI chips, cloud computing, MLOps, etc.

The businesses in 2021 and beyond will develop into efficient workplaces for everybody who believes in the power of technology. It is important to bear in mind that all trends are not necessarily independent of each other, but rather form the support base of the other as well as work in tandem with human intervention. So, are the hybrid trends and solutions here to stay for the next few years for the smooth running of various organisations? Only time will tell. But the need for AI and newer technology adoption and modernisation increases manifold.

The author is an analytics and AI professional, based in London, working in a big IT company. Views are personal




Farewell 2020: Bleak, Yes. But a Lot of Good Happened Too – HPCwire

Here on the cusp of the new year, the catchphrase "2020 hindsight" has a distinctly different feel. Good riddance, yes. But also proof of science's power to mobilize and do good when called upon. There's gratitude from those who came through less scathed and, maybe, more willingness to assist those who didn't.

Despite the unrelenting pandemic, high performance computing (HPC) proved itself an able member of the worldwide community of pandemic fighters. We should celebrate that, perhaps quietly, since the work isn't done. HPC made a significant difference in speeding up and enabling vastly distributed research and funneling the results to those who could turn them into patient care, epidemiology guidance, and now vaccines. Remarkable, really. Necessary, of course, but it actually got done too. (Forget the quarreling; that's who we are.)

Across the Tabor family of publications, we've run more than 200 pandemic-related articles. I counted nearly 70 significant pieces in HPCwire. The early standing up of Fugaku at RIKEN, now comfortably astride the Top500 for a second time and by a significant margin, to participate in COVID-19 research is a good metaphor for HPC's mobilization. Many people and organizations contributed to the HPC v. pandemic effort, and that continues.

Before spotlighting a few pandemic-related HPC activities and digging into a few other topics, let's do a speed-drive through the 2020 HPC/AI technology landscape.

Consolidation continued among chip players (Nvidia/Arm, AMD/Xilinx) while the AI chip newcomers (Cerebras, Habana (now Intel), SambaNova, Graphcore et al.) were winning deals. Nvidia's new A100 GPU is amazing and virtually everyone else is taking potshots for just that reason. Suddenly RISC-V looks very promising. Systems makers weathered 2020's storm with varying success while IBM seems to be winding down its HPC focus; it also plans to split/spin off its managed infrastructure services. Firing up Fugaku (notably a non-accelerated system) quickly was remarkable. The planned Frontier (ORNL) supercomputer now has the pole position in the U.S. exascale race ahead of the delayed Aurora (ANL).

The worldwide quantum computing frenzy is in full froth as the U.S. looks for constructive ways to spend its roughly $1.25 billion (U.S. Quantum Initiative) and, impressively, China just delivered a demonstration of quantum supremacy. There's a quiet revolution going on in storage and memory (just ask VAST Data). Nvidia/Mellanox introduced its line of 400 Gbps network devices while Ethernet launched its 800 Gbps spec. HPC-in-the-cloud is now a thing, not a soon-to-be thing. AI is no longer an oddity but is quickly infusing throughout HPC (that happened fast).

Last but not least, hyperscalers demonstrably rule the IT roost. Chipmakers used to, consistently punching above their weight in sales volume. Not so much now.

Ok then. Apologies for the many important topics omitted (e.g. exascale and leadership systems, neuromorphic tech, software tools (can oneAPI flourish?), newer fabrics, optical interconnect, etc.).

Lets start.

I want to highlight two HPC pandemic-related efforts, one current and one early on, and also single out the efforts of Oliver Peckham, the HPCwire editor who leads our pandemic coverage, which began in earnest with articles on March 6 (Summit Joins the Fight Against the Coronavirus) and March 13 (Global Supercomputing Is Mobilizing Against COVID-19). Actually, the very first piece, Tech Conferences Are Being Canceled Due to Coronavirus (March 3), was more about interrupted technology events, and we picked it up from our sister pub, Datanami, which ran it on March 2. We've since become a virtualized event world.

Here's an excerpt from the first Summit piece about modeling COVID-19's notorious spike:

Micholas Smith, a postdoctoral researcher at the University of Tennessee/ORNL Center for Molecular Biophysics (UT/ORNL CMB), used early studies and sequencing of the virus to build a virtual model of the spike protein. [A]fter being granted time on Summit through a discretionary allocation, Smith and his colleagues performed a series of molecular dynamics simulations on the protein, cycling through 8,000 compounds within a few days and analyzing how they bound to the spike protein, if at all.

"Using Summit, we ranked these compounds based on a set of criteria related to how likely they were to bind to the S-protein spike," Smith said in an interview with ORNL. In total, the team identified 77 candidate small-molecule compounds (such as medications) that they considered worthy of further experimentation, helping to narrow the field for medical researchers.

"It took us a day or two whereas it would have taken months on a normal computer," said Jeremy Smith, director of UT/ORNL CMB and principal researcher for the study. "Our results don't mean that we have found a cure or treatment for the Wuhan coronavirus. We are very hopeful, though, that our computational findings will both inform future studies and provide a framework that experimentalists will use to further investigate these compounds. Only then will we know whether any of them exhibit the characteristics needed to mitigate this virus."

The flood (and diversity) of efforts that followed was startling. Oliver's advice on what to highlight catches the flavor of the challenge: "You could go with something like the Fugaku vs. COVID-19 piece or the grocery store piece, maybe contrast them a bit, earliest vs. current simulations of viral particle spread or something like the LANL retrospective piece vs. the piece I just wrote up on their vaccine modeling. Think that might work for a 'how far we've come' angle, either way."

There's too much to cover.

Last week we ran Oliver's article on LANL efforts to optimize vaccine distribution (At Los Alamos National Lab, Supercomputers Are Optimizing Vaccine Distribution). Here's a brief excerpt:

The new vaccines from Pfizer and Moderna have been deemed highly effective by the FDA; unfortunately, doses are likely to be limited for some time. As a result, many state governments are struggling to weigh difficult choices: should the most exposed, like frontline workers, be vaccinated first? Or perhaps the most vulnerable, like the elderly and immunocompromised? And after them, who's next?

LANL was no stranger to this kind of analysis: earlier in the year, the lab had used supercomputer-powered tools like EpiCast to simulate virtual cities populated by individuals with demographic characteristics to model how COVID-19 would spread under different conditions. "The first thing we looked at was whether it made a difference to prioritize certain populations such as healthcare workers or to just distribute the vaccine randomly," said Sara Del Valle, the LANL computational epidemiologist who is leading the lab's COVID-19 modeling efforts. "We learned that prioritizing healthcare workers first was more effective in reducing the number of COVID cases and deaths."

You get the idea. The well of HPC efforts to tackle and stymie COVID-19 is extremely deep. Turning unproven mRNA technology into a vaccine in record time was awe-inspiring and required many disciplines. For those unfamiliar with the mRNA mechanism, here's a brief CDC explanation as it relates to the new vaccines. Below are links to a few HPCwire articles on the worldwide effort to bring HPC computational power to bear. (The last is a link to the HPCwire COVID-19 Archive, which has links to all our major pandemic coverage):

COVID COVERAGE LINKS

Global Supercomputing Is Mobilizing Against COVID-19 (March 12, 2020)

Gordon Bell Special Prize Goes to Massive SARS-CoV-2 Simulations (November 19, 2020)

Supercomputer Research Leads to Human Trial of Potential COVID-19 Therapeutic Raloxifene (October 29, 2020)

AMDs Massive COVID-19 HPC Fund Adds 18 Institutions, 5 Petaflops of Power (September 14, 2020)

Supercomputer-Powered Research Uncovers Signs of Bradykinin Storm That May Explain COVID-19 Symptoms (July 28, 2020)

Researchers Use Frontera to Investigate COVID-19s Insidious Sugar Coating (June 16, 2020)

COVID-19 HPC Consortium Expands to Europe, Reports on Research Projects (May 28, 2020)

At SC20, an Expert Panel Braces for the Next Pandemic (December 17, 2020)

Whats New in Computing vs. COVID-19: Cerebras, Nvidia, OpenMP & More (May 18, 2020)

Billion Molecules Against COVID-19 Challenge to Launch with Massive Supercomputing Support (April 22, 2020)

Pandemic Wipes Out 2020 HPC Market Growth, Flat to 12% Drop Expected (March 31, 2020)

Folding@home Turns Its Massive Crowdsourced Computer Network Against COVID-19 (March 16, 2020)

2020 HPCwire Awards Honor a Year of Remarkable COVID-19 Research (December 23, 2020)

HPCWIRE COVID-19 COVERAGE ARCHIVE

Making sense of the processor world is challenging. Microprocessors are still the workhorses in mainstream computing, with Intel retaining its giant market share despite AMD's encroachment. That said, the rise of heterogeneous computing and blended AI/HPC requirements has shifted focus to accelerators. Nvidia's A100 GPU (54 billion transistors on 826 mm² of silicon, the world's largest seven-nanometer chip) was launched this spring. Then at SC20 Nvidia announced an enhanced version of the A100, doubling its memory to 80GB; it now delivers 2 TB/s of memory bandwidth. The A100 is an impressive piece of work.

The A100's most significant advantage, says Rick Stevens, associate lab director at Argonne National Laboratory, is its multi-instance GPU capability.

"For many people the problem is achieving high occupancy, that is, being able to fill the GPU up, because that depends on how much work you have to do. [By] introducing this MIG, this multi-instance stuff that they have, they're able to virtualize it. Most of the real-world performance wins are actually kind of throughput wins by using the virtualization. What we've seen is our big performance improvement is not that individual programs run much faster, it's that we can run up to seven parallel things on each GPU. When you add up the aggregate performance, you get these factors of three to five improvement over the V100," said Stevens.

Meanwhile, Intel's Xe GPU line is slowly trickling to market, mostly in card form. At SC20 Intel announced plans to make its high performance discrete GPUs available to early access developers. Notably, the new chips have been deployed at ANL and will serve as a transitional development vehicle for the future (2022) Aurora supercomputer, subbing in for the delayed Intel Xe-HPC (Ponte Vecchio) GPUs that are the computational backbone of the system.

AMD, also at SC20, launched its latest GPU, the MI100. AMD says it delivers 11.5 teraflops peak double-precision (FP64), 46.1 teraflops peak single-precision matrix (FP32), 23.1 teraflops peak single-precision (FP32), 184.6 teraflops peak half-precision (FP16) floating-point performance, and 92.3 peak teraflops of bfloat16 performance. HPCwire reported, "AMD's MI100 GPU presents a competitive alternative to Nvidia's A100 GPU, rated at 9.7 teraflops of peak theoretical performance. However, the A100 is returning even higher performance than that on its FP64 Linpack runs." It will be interesting to see the specs of the GPU AMD eventually fields for use in its exascale system wins.

The stakes are high in what could become a GPU war. Today, Nvidia is the market leader in HPC.

Turning back to CPUs, which many in HPC/AI have begun to regard as the lesser half of CPU/GPU pairings. Perhaps that will change with the spectacular showing of Fujitsu's A64FX at the heart of Fugaku. Nvidia's proposed acquisition of Arm, not a done deal yet (regulatory concerns), would likely inject fresh energy into what was already a surging Arm push into the datacenter. Of course, Nvidia has jumped into the systems business with its DGX line and presumably wants a home-grown CPU. The big mover of the last couple of years, AMD's Epyc microprocessor line, continues its steady incursion into Intel x86 territory.

There's not been much discussion around Power10 beyond IBM's summer announcement that Power10 would offer a ~3x performance gain and ~2.6x core efficiency gain over Power9. The new executive director of the OpenPOWER Foundation, James Kulina, says attracting more chipmakers to build Power devices is a top goal. We'll see. RISC-V is definitely drawing interest, but exactly how it fits into the processor puzzle is unclear. Esperanto unveiled a machine learning chip with 1,100 low-power cores based on the open-source RISC-V architecture and reported a goal of 4,000 cores on a single device. Europe is betting on RISC-V. However, at least near-term, RISC-V variants are seen as specialized chips.

The CPU waters are murkier than ever.

Sort of off in a land of their own are AI chip/system players. Their proliferation continues with the early movers winning important deployments. Some observers think 2021 will start sifting winners from the losers. Let's not forget that last year Intel stopped development of its newly-acquired Nervana line in favor of its even more newly-acquired Habana products. It's a high-risk, high-reward arena still.

PROCESSOR COVERAGE LINKS

Intel Xe-HP GPU Deployed for Aurora Exascale Development

Is the Nvidia A100 GPU Performance Worth a Hardware Upgrade?

LLNL, ANL and GSK Provide Early Glimpse into Cerebras AI System Performance

David Patterson Kicks Off AI Hardware Summit Championing Domain Specific Chips

Graphcores IPU Tackles Particle Physics, Showcasing Its Potential for Early Adopters

Intel Debuts Cooper Lake Xeons for 4- and 8-Socket Platforms

Intel Launches Stratix 10 NX FPGAs Targeting AI Workloads

Nvidia's Ampere A100 GPU: Up to 2.5X the HPC, 20X the AI

AMD Launches Three New High-Frequency Epyc SKUs Aimed at Commercial HPC

IBM Debuts Power10; Touts New Memory Scheme, Security, and Inferencing

AMD's Road Ahead: 5nm Epyc, CPU-GPU Coupling, 20% CAGR

AI Newcomer SambaNova GAs Product Lineup and Offers New Service

Japan's AIST Benchmarks Intel Optane; Cites Benefit for HPC and AI

Storage and memory don't get the attention they deserve. 3D XPoint memory (Intel and Micron), declining flash costs, and innovative software are transforming this technology segment. Hard disk drives and tape aren't going away, but traditional storage management approaches such as tiering based on media type (speed/capacity/cost) are under attack. Newcomers WekaIO, VAST Data, and MemVerge are all-in on solid state, and a few leading-edge adopters (NERSC/Perlmutter) are taking the plunge. Data-intensive computing, driven by the data flood and AI compute requirements (gotta keep those GPUs busy!), is a big driver.

"Our storage systems typically see over an exabyte of I/O annually. Balancing this I/O intensive workload with the economics of storage means that at NERSC, we live and breathe tiering. And this is a snapshot of the storage hierarchy we have on the floor today at NERSC. Although it makes for a pretty picture, we don't have storage tiering because we want to, and in fact, I'd go so far as to say it's the opposite of what we and our users really want. Moving data between tiers has nothing to do with scientific discovery," said NERSC storage architect Glenn Lockwood during an SC20 panel.

"To put some numbers behind this, last year we did a study that found that between 15% and 30% of that exabyte of I/O is not coming from our users' jobs, but instead coming from data movement between storage tiers. That is to say that 15% to 30% of the I/O at NERSC is a complete waste of time in terms of advancing science. But even before that study, we knew that both the changing landscape of storage technology and the emerging large-scale data analysis and AI workloads arriving at NERSC required us to completely rethink our approach to tiered storage," said Lockwood.

Not surprisingly, Intel and Micron (Optane/3D XPoint) are trying to accelerate the evolution. Micron released what it calls a heterogeneous-memory storage engine (HSE) designed for solid-state drives, memory-based storage and, ultimately, applications requiring persistent memory. "Legacy storage engines born in the era of hard disk drives have historically failed to architecturally provide for the increased performance and reduced latency of next-generation nonvolatile media," said the company. Again, we'll see.

Software-defined storage leveraging newer media has all the momentum at the moment, with all of the established players (IBM, DDN, Panasas, etc.) mixing those capabilities into their product sets. WekaIO and Intel have battled it out for the top IO500 spot the last couple of years, and Intel's DAOS (distributed asynchronous object store) is slated for use in Aurora.

"The concept of asynchronous IO is very interesting," noted Ari Berman, CEO of the BioTeam research consultancy. "It's essentially a queue mechanism at the system write level so system waits in the processors don't have to happen while a confirmed write-back comes from the disks. So asynchronous IO allows jobs to keep running while you're waiting on storage to happen, to a limit of course. That would really improve the data input-output pipelines in those systems. It's a very interesting idea. I like asynchronous data writes and asynchronous storage access. I can see there very easily being corruption that creeps into those types of things and data without very careful sequencing. It will be interesting to watch. If it works it will be a big innovation."
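
To make the idea concrete, here is a minimal sketch of the pattern Berman describes, not tied to DAOS or any particular storage engine: queue the writes, keep computing, and only wait on storage at the end. The file name and the stand-in compute step are hypothetical.

```python
import asyncio

async def write_result(path: str, data: bytes) -> None:
    # Hand the blocking file write to a worker thread (Python 3.9+) so the
    # event loop, i.e. the "compute", is not stalled waiting on the disk.
    def _blocking_write() -> None:
        with open(path, "ab") as f:
            f.write(data)
    await asyncio.to_thread(_blocking_write)

async def compute_step(step: int) -> bytes:
    await asyncio.sleep(0.01)  # stand-in for real computation
    return f"result {step}\n".encode()

async def main() -> None:
    pending = []
    for step in range(10):
        data = await compute_step(step)
        # Queue the write and move straight on to the next compute step.
        pending.append(asyncio.create_task(write_result("results.log", data)))
    await asyncio.gather(*pending)  # settle outstanding writes at the end

asyncio.run(main())
```

As Berman notes, the ordering and integrity of those deferred writes is exactly where careful sequencing matters.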

Change is afoot and the storage technology community is adapting. Memory technology is also advancing.

Micron introduced a 176-layer 3D NAND flash memory at SC20 that it says increases read and write densities by more than 35 percent. JEDEC published the DDR5 SDRAM spec, the next-generation standard for random access memory (RAM), in the summer. Compared to DDR4, the DDR5 spec will deliver twice the performance and improved power efficiency, addressing ever-growing demand from datacenter and cloud environments, as well as artificial intelligence and HPC applications. At launch, DDR5 modules will reach 4.8 Gbps, providing a 50 percent improvement versus the previous generation. Density goes up four-fold, with maximum density increasing from 16 gigabits per die to 64 gigabits per die in the new spec. JEDEC representatives indicated there will be 8 Gb and 16 Gb DDR5 products at launch.
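
As a rough back-of-the-envelope illustration (our arithmetic, not a figure from the JEDEC announcement), a 4.8 Gbps per-pin data rate across a standard 64-bit DIMM data bus works out to

$$ \frac{4.8\ \text{Gbit/s per pin} \times 64\ \text{pins}}{8\ \text{bits/byte}} = 38.4\ \text{GB/s per module}, $$

versus about 25.6 GB/s for a DDR4-3200 module computed the same way, which is roughly the 50 percent improvement cited.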

There are always wildcards. IBM's memristive technology is moving closer to practical use. One outlier is DNA-based storage. Dave Turek, longtime IBMer, joined DNA storage start-up Catalog this year and says Catalog is working on proofs of concept with government agencies and a number of Fortune 500 companies. "Some of these are who's-who HPC players, but some are non-HPC players, many names you would recognize... We're at what I would say is the beginning of the commercial beginning." Again, we'll see.

STORAGE & MEMORY LINKS

SC20 Panel – OK, You Hate Storage Tiering. What's Next Then?

Intel's Optane/DAOS Solution Tops Latest IO500

Startup MemVerge on Memory-centric Mission

HPC Strategist Dave Turek Joins DNA Storage (and Computing) Company Catalog

DDN-Tintri Showcases Technology Integration with Two New Products

Intel Refreshes Optane Persistent Memory, Adds New NAND SSDs

Micron Boosts Flash Density with 176-Layer 3D NAND

DDR5 Memory Spec Doubles Data Rate, Quadruples Density

IBM Touts STT MRAM Technology at IEDM 2020

The Distributed File Systems and Object Storage Landscape: Who's Leading?

It's tempting to omit quantum computing this year. Too much happened to summarize easily, and the overall feel is of steady carry-on progress from 2019. There was, perhaps, a stronger pivot (at least by press release count) towards seeking early applications for near-term noisy intermediate-scale quantum (NISQ) computers. Ion trap qubit technology got another important player in Honeywell, which formally rolled out its effort and first system. Intel also stepped out from the shadows a bit in terms of showcasing its efforts. D-Wave launched a giant 5000-qubit machine (Advantage), again using a quantum annealing approach that's different from universal gate-based quantum systems. IBM announced a stretch goal of achieving one million qubits!

Calling quantum computing a market is probably premature, but monies are being spent. The Quantum Economic Development Consortium (QED-C) and Hyperion Research issued a forecast that projects the global quantum computing (QC) market, worth an estimated $320 million in 2020, to grow at 27% CAGR between 2020 and 2024, reaching approximately $830 million by 2024. Chump change? Perhaps, but real activity.

IBM's proposed Quantum Volume metric has drawn support as a broad benchmark of quantum computer performance. Honeywell promoted the 128 QV score of its launch system. In December IBM reported it too had achieved a 128 QV. The first QV reported by IBM was 16 in 2019 at the APS March meeting. Just what a QV of 128 means in determining practical usefulness is unclear, but it is steady progress, and even Intel agrees that QV is as good as any measure at the moment. DoE is also working on benchmarks, focusing a bit more on performance on given workloads.
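
For readers unfamiliar with the metric, a simplified paraphrase of IBM's definition (our summary, not a quote from the article) is that Quantum Volume grows exponentially with the size of the largest "square" random circuit a machine can run reliably:

$$ \mathrm{QV} = 2^{n^{*}}, \qquad n^{*} = \max\{\, n : n\text{-qubit, depth-}n\ \text{model circuits pass the heavy-output test} \,\}. $$

So a reported QV of 128 corresponds to n* = log2(128) = 7, i.e. seven-qubit circuits of depth seven, while IBM's 2019 score of 16 corresponded to n* = 4.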

"[One] major component of benchmarking is asking what kind of resources does it take to run this or that interesting problem. Again, these are problems of interest to DoE, so basic science problems in chemistry and nuclear physics and things like that. What we'll do is take applications in chemistry and nuclear physics and convert them into what we consider a benchmark. We consider it a benchmark when we can distill a metric from it. So the metric could be the accuracy, the quality of the solution, or the resources required to get a given level of quality," said Raphael Pooser, PI for DoE's Quantum Testbed Pathfinder project at ORNL, during an HPCwire interview.

Next year seems likely to bring more benchmarking activity around system quality, qubit technology, and performance on specific problem sets. Several qubit technologies still vie for sway: superconducting, trapped ion, optical, quantum dots, cold atoms, et al. The need to operate at near-zero (kelvin) temperatures complicates everything. Google claimed to achieve quantum supremacy last year. This year a group of Chinese researchers also did so. The groups used different qubit technologies (superconducting vs. optical), and China's effort tried to skirt criticisms that were lobbed at Google's effort. Frankly, both efforts were impressive. Russia reported early last year it would invest $790 million in quantum, with achieving quantum supremacy as one goal.

What's happening now is a kind of pell-mell rush among a larger and increasingly diverse quantum ecosystem (hardware, software, consultants, governments, academia). Fault-tolerant quantum computing still seems distant, but clever algorithms and error mitigation strategies to make productive use of NISQ systems, likely on narrow applications, look more and more promising.

Here are a few snapshots:

The persistent question is when will all of these efforts pay off and will they be as game-changing as many believe. With new money flowing into quantum, one has the sense there will be few abrupt changes in the next couple years barring untoward economic turns.

QUANTUM COVERAGE LINKS

IBM's Quantum Race to One Million Qubits

Google's Quantum Chemistry Simulation Suggests Promising Path Forward

Intel Connects the (Quantum) Dots in Accelerating Quantum Computing Effort

D-Wave Delivers 5000-qubit System; Targets Quantum Advantage

Honeywell Debuts Quantum System, Subscription Business Model, and Glimpse of Roadmap

Global QC Market Projected to Grow to More Than $800 million by 2024

ORNLs Raphael Pooser on DoEs Quantum Testbed Project

Rigetti Computing Wins $8.6M DARPA Grant to Demonstrate Practical Quantum Computing

Braket: Amazon's Cloud-First Quantum Environment Is Generally Available

IBM-led Webinar Tackles Quantum Developer Community Needs

Microsoft's Azure Quantum Platform Now Offers Toshiba's Simulated Bifurcation Machine

As always, there's personnel shuffling. Lately hyperscalers have been taking HPC folks. Two long-time Intel executives, Debra Goldfarb and Bill Magro, recently left for the cloud: Goldfarb to AWS as director for HPC products and strategy, and Magro to Google as CTO for HPC. Going in the other direction, John Martinis left Google's quantum development team and recently joined Australian start-up Silicon Quantum Computing. Ginni Rometty, of course, stepped down as CEO and chairman at IBM. IBM's long-time HPC exec Dave Turek left to take a position with DNA storage start-up Catalog, and last January, IBMer Brad McCredie joined AMD as corporate VP, GPU platforms.

View post:
Farewell 2020: Bleak, Yes. But a Lot of Good Happened Too - HPCwire

Bitcoin is quantum computing resistant regardless of rising fears among investors – FXStreet

All cryptocurrencies are based on cryptography and require miners to solve extremely complex mathematical problems in order to secure the network. The concern around quantum computing is that it could crack Bitcoin's cryptography much faster than the network can respond.

The basic principle is that Bitcoin's network has to be sufficiently fast that a quantum attacker would not have enough time to derive the private key behind a specific public key before the network confirms the transaction.

So far, it seems that a quantum computer would take around 8 hours to derive a Bitcoin private key, which, in theory, means the network is secure against such attacks. The relevant mark is Bitcoin's block time of roughly 10 minutes: if quantum computers can get close to this, the Bitcoin network could be compromised.

It's also important to note that quantum computing poses a threat not only to Bitcoin and cryptocurrencies but to other platforms, even banks. Many platforms use encryption that would be broken if practical quantum computing becomes a reality, which means the implications of this technology go way beyond just cryptocurrencies.

Theoretically, cryptocurrencies have several ways to mitigate or completely stop quantum computing attacks in the future. For instance, a soft fork on the network of an asset could be enough to at least move some of the assets that are insecure.

Additionally, there are many algorithms that are theorized to be quantum-resistant. In fact, SHA-256, which Bitcoin currently uses for hashing, should be resistant to these types of attacks. According to recent statistics, around 25% of Bitcoin in circulation remains vulnerable to quantum attacks because the public keys behind those coins are already exposed. Transferring coins to a fresh p2pkh address, which only publishes a hash of the public key until the coins are spent, helps keep them safe.
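
To illustrate why a fresh p2pkh address offers some shelter, here is a minimal sketch, with a made-up public key and omitting the Base58Check encoding of a real address, of the hashing that stands between a published address and the underlying public key:

```python
import hashlib

def hash160(public_key: bytes) -> bytes:
    """SHA-256 followed by RIPEMD-160, the digest a p2pkh address encodes."""
    sha = hashlib.sha256(public_key).digest()
    # Note: ripemd160 support depends on the local OpenSSL build.
    ripemd = hashlib.new("ripemd160")
    ripemd.update(sha)
    return ripemd.digest()

# Hypothetical 33-byte compressed public key, for illustration only.
fake_pubkey = bytes.fromhex("02" + "11" * 32)
print(hash160(fake_pubkey).hex())

# Until the owner spends from the address, the chain only exposes this 20-byte
# hash; the public key a quantum attacker would need in order to run Shor's
# algorithm is revealed only when a spending transaction is broadcast.
```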

Originally posted here:
Bitcoin is quantum computing resistant regardless of rising fears among investors - FXStreet

What the Hell Is Quantum Chess? | IE – Interesting Engineering

Have you ever heard of Quantum Chess? If not, we are confident you are in for a real treat.

Read on to find out more about this interesting take on a very ancient strategy game. But brace yourself, things are about to get a little "spooky".


Quantum Chess is a variant of the classical strategy game that incorporates the principles of quantum physics. For example, unlike traditional chess, the pieces can be placed into a superposition of two locations, meaning that a piece can occupy more than one square.

Unlike pieces in the conventional game, where, for example, a pawn is always a pawn, a quantum chess piece is a superposition of "states", with each state representing a different conventional piece.

Conventional chess is a very complex game, although it is possible for computer algorithms to beat the world's greatest chess players by accurately determining the moves necessary to win the game at any point.

The main rationale behind the creation of Quantum Chess is to introduce an element of unpredictability into the game, and thereby place the computer and the human on a more equal footing. The game can also help "level the playing field" somewhat between human players of widely different skills and experience with chess.

"It's like you're playing in a multiverse but the different boards [in different universes] are connected to each other," said Caltech physicist Spiros Michalakis during a livestream of a recent Quantum Chess tournament. "It makes 3D chess from Star Trek look silly."

But don't let the term intimidate you. New players to the game don't need to be experts in quantum physics; a basic understanding of chess is actually more important.

While it might sound like something of a gimmick, Quantum Chess is an interesting and entertaining spin on the classic game that many find enjoyable. Unless, of course, you cannot live without knowing for sure what and where each piece is at any given time.

If that is the case, you might find this one of the most frustrating games ever created!

Quantum Chess, as you have probably already worked out, is not like any game of classical chess you have ever played. But, it is important to note that there are also several variants of Quantum Chess.

The best known is probably the one created by Chris Cantwell when he was a graduate student at the University of Southern California. This variant differs from other examples in that it is more "truly quantum" than the others.

"My initial goal was to create a version of quantum chess that was truly quantum in nature, so you get to play with the phenomenon," Cantwell said in an interview with Gizmodo back in 2016.

"I didn't want it to just be a game that taught people quantum mechanics. The idea is that by playing the game, a player will slowly develop an intuitive sense of the rules governing the quantum realm. In fact, I feel like I've come to more intuitively understand quantum phenomena myself, just by making the game," he added.

In Cantwell's version of Quantum Chess, this superposition of pieces is indicated by a ring that details the probability that the piece can actually be found in a given square. Not only that, but when moving a piece, each action can also be governed by probability.

You can think of the pieces of the game existing on multiple boards in which their numbers are also not fixed. The board you see is a kind of overview of all of these other boards and a single move acts on other boards at the same time.

Whenever a piece moves, many calculations are made behind the scenes to determine the actual outcome, which could be completely unexpected.

That being said, moves do follow the basic rules of traditional chess, including things like castling and en passant. However, there are a few important differences:

Pieces in this version of Quantum Chess can make a series of either "quantum moves" (except for pawns) or regular chess moves. In this sense, the pieces can occupy more than one square on the multiverse of boards simultaneously.

These moves also come in a variety of "flavors".

The first is a move called a "split move". This can be performed by all non-pawn pieces and allows a piece to actually occupy two different target squares that it could traditionally reach in normal chess.

But, this can only be done if the target square is unoccupied or is occupied by pieces of the same color and type. A white knight, for example, could use this kind of move to occupy the space of another white knight.

Such a move cannot, however, be used to capture an opponent's piece.

Another interesting move is called a "merge move". This can be performed by all pieces except pawns and, like a split move, can only be performed on an unoccupied square or one occupied by a piece of the same type and color.

Using our previous example of a white knight, this would mean that two white knights could merge together on the same square. Again, this move cannot be used to capture enemy pieces.
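
A toy model helps make split, merge, and measurement concrete. The sketch below is our own simplified illustration, not Quantum Realm Games' engine: it tracks a single piece as amplitudes over squares and collapses it when "measured".

```python
import math
import random

class QuantumPiece:
    """A single piece tracked as amplitudes over board squares."""

    def __init__(self, square: str):
        self.amplitudes = {square: 1.0}  # starts fully on one square

    def split(self, source: str, target_a: str, target_b: str) -> None:
        # A split move spreads the source amplitude evenly over two targets.
        amp = self.amplitudes.pop(source)
        for target in (target_a, target_b):
            self.amplitudes[target] = self.amplitudes.get(target, 0.0) + amp / math.sqrt(2)

    def merge(self, source_a: str, source_b: str, target: str) -> None:
        # A merge move recombines two branches of the same piece; in this toy
        # model it is simply the inverse of an even split.
        amp = self.amplitudes.pop(source_a, 0.0) + self.amplitudes.pop(source_b, 0.0)
        self.amplitudes[target] = self.amplitudes.get(target, 0.0) + amp / math.sqrt(2)

    def measure(self) -> str:
        # Measurement collapses the superposition: one square is chosen with
        # probability |amplitude|^2 and the piece is then definitely there.
        squares = list(self.amplitudes)
        weights = [a * a for a in self.amplitudes.values()]
        found = random.choices(squares, weights=weights, k=1)[0]
        self.amplitudes = {found: 1.0}
        return found

knight = QuantumPiece("b1")
knight.split("b1", "a3", "c3")  # the knight now "occupies" both a3 and c3
print(knight.amplitudes)
print(knight.measure())         # a capture attempt would force a collapse like this
```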

So how do you take pieces in Quantum Chess?

Well, when two pieces of different colors meet on the same square, the game makes a series of measurements. These measurements are designed to answer a specific yes or no question.

For example, the game's mechanics will look at certain squares to determine if they are occupied or not. The outcome of this can be to cause a piece's "superposition" state to "collapse".

If the superposition state collapses, then the desired move will be performed. If not, the move is not made and the player's turn ends.

Capturing is also very different in a game of Quantum Chess. When a player attempts to do this, the game will make calculations for the square where the piece is situated and for its target square, as well as any other squares in its path, to answer the question, "is the attacking piece present and can it reach the target?".

If the answer is no, it is important to note that this doesn't necessarily mean the attacking piece is not present. Nor does it mean that its path is blocked.

Another interesting concept in Quantum Chess is called "exclusion". If a move's target square is occupied by a piece in superposition that cannot be captured by that move, the move is called an exclusion move.

Again, calculations are made for the target square and any squares in the path of an allowed move by a piece in superposition. This is done to answer the same question as capturing, with similar outcomes.

Castling is also very different in Quantum Chess. This move always involves two targets, and the same measurements are made for both targets. Castling cannot be used to capture, and will always be an exclusion move.

So, you might be wondering how you actually win a game of Quantum Chess?

Just like traditional chess, the aim of the game is to capture the opponent's king. However, unlike in traditional chess, the concept of checkmate does not exist.

To win, the enemy king must no longer actually exist on the board. As any piece, including the king, can exist in a state of superposition, it may or may not end up captured, which further complicates the issue.

The game, therefore, continues until it is known, with certainty, that a particular player has no king left. For this reason, it is possible for both players to lose their king at the same time and the game would then be considered a draw.

Another important thing to note is that each player has a set amount of time for the game. For this reason, you can also win by running an opponent's time out.

How you play Quantum Chess depends on the variant of the game you are playing. We have already covered the rules of one variant above, and that game can be played through Quantum Realm Games. But another version, created by Alice Wismath at the School of Computing at Queen's University in Canada, has some slightly different rules.

You can try that game for yourself here.

In her version, each player has sixteen pieces. These pieces are in a quantum state of superposition of two types: a primary and a secondary type.

They are also in an unknown (quantum) type or a known (classical) type. When a piece is "touched" it collapses into its classical state and has an equal probability of becoming either a primary or secondary type. The king, however, is an exception, and is always in a classical state.

Each player has one king and its position is always known.

All other pieces are assigned the following primary piece types: left rook, left bishop, left knight, queen, right knight, right bishop, right rook, and pawns one through eight. Secondary piece types are then randomly assigned from this same list of piece types so that each type occurs exactly twice in the player's pieces.

Each piece is created at the start of each game and superpositions are not changed throughout the game. Pieces also start as they would in regular chess, on the first two rows, according to their primary piece type with all, except the king, in a state of superposition.

Once a quantum state piece is touched (i.e. chosen to move), it collapses into one of its two predetermined states, and this state is suddenly revealed to both players.

This can mean that a pawn in the front row can suddenly become a white knight once the piece has been "touched". You won't know until the piece's quantum state collapses.

Quantum Chess boards are the same as regular chess boards except that when a piece lands on a white square it remains in its classical state. When pieces land on black squares, however, they undergo a quantum transformation and regain, if lost, their quantum superposition.

This means that a previously "revealed" pawn can also suddenly transform into a queen if that was one of its predetermined primary or secondary types. A very interesting concept indeed.
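
As another toy illustration, our own sketch of the rules as described above rather than Wismath's implementation, the touch-to-collapse and black-square "re-quantization" mechanics can be modeled like this:

```python
import random

class WismathPiece:
    """A piece that is a 50/50 superposition of a primary and a secondary type."""

    def __init__(self, primary: str, secondary: str):
        self.primary = primary
        self.secondary = secondary
        self.known_type = None  # None means the piece is still in superposition

    def touch(self) -> str:
        # Touching a quantum piece collapses it, with equal probability,
        # into its primary or secondary type, revealed to both players.
        if self.known_type is None:
            self.known_type = random.choice([self.primary, self.secondary])
        return self.known_type

    def land_on(self, square_color: str) -> None:
        # Landing on a black square restores the superposition; on a white
        # square the piece keeps its classical state.
        if square_color == "black":
            self.known_type = None

piece = WismathPiece(primary="pawn 3", secondary="queen")
print(piece.touch())    # collapses to "pawn 3" or "queen"
piece.land_on("black")  # superposition restored
print(piece.touch())    # may collapse differently this time
```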

To play the game, each player chooses a piece to move and must move it. If the quantum piece collapses into a piece type with no possible moves, then the player's move is over.

Pieces in classical states with no possible moves cannot be chosen. All pieces move as they would in classical chess with some of the following exceptions:

Pieces can also be captured as normal, and quantum pieces collapse from their superposition state and are removed from play.

If a player touches a quantum piece that collapses into a state that puts the opponent's king in check, their move is over. The opponent, however, is not required to get out of check in such circumstances.

Pawns that reach the opposite side of the board can be promoted to a queen, bishop, rook, or knight, regardless of the number of pieces of that type already in the game. Also, if a piece in the quantum state on the far row is touched and revealed to be a pawn, it is promoted, but the promotion takes up the turn. The superimposed piece type is not affected.

To win the game, each player must capture the enemy's king, as a checkmate does not happen in Quantum Chess. For this reason, kings can actually move into a position that would normally be considered check.

Games are considered a draw if both opponents are left with only their king in play or 100 consecutive moves have been made with no captures or pawn movements by either player.

It was recently announced that the world's first Quantum Chess tournament had been won by Aleksander Kubica, a postdoctoral fellow at Canada's Perimeter Institute for Theoretical Physics and Institute for Quantum Computing. The tournament was held on the 9th of December 2020 at the Q2B 2020 conference.

The tournament games are timed, and Kubica managed to beat his opponent, Google's Doug Strain, by letting him run out of time. This currently makes Kubica officially the best Quantum Chess player in the world.

Not a bad way to see out one of the worst years in living memory.

And that, ladies and gentlemen, is a wrap.

If you like the sound of playing Quantum Chess, why not check out either of the versions we have discussed above? Who knows, you might get proficient enough to challenge Kubica for the title in the not-too-distant future.

Follow this link:
What the Hell Is Quantum Chess? | IE - Interesting Engineering

Tech trends to watch in 2021 – India Today

The year 2020 has been one of the most unpredictable years, and in parallel we have seen technology transition across various sectors in ways that have really helped humanity predict and prepare for catastrophic conditions. With the Covid-19 pandemic as one such situation, many scientists, engineers and other technologists have realized that a lot of development is still required to make life easier with accessible technology. With that in mind, here are some of the top tech trends to watch in 2021:

In the last decade we have seen that there is no limit to technology, and with the rise of digitalization in India there will be a need for quantum computing to protect banking systems and IT security from cybercrime. With large-scale data processing a critical strength of quantum computing, technologies such as artificial intelligence will be among the applications that benefit most from the superior processing power of quantum computers.

It follows that there will be intense competition among the big IT companies to provide quantum-backed services in cybersecurity, drug development, climate prediction and more.

In IoT applications there have been two main challenges: range and battery life. Both are now being overcome with the help of NB-IoT (Narrowband IoT). Given that approximately 21 billion devices are expected to be connected by 2025, there will be huge competition between telecoms such as Jio, Airtel and Vodafone to provide cost-effective and efficient solutions to their consumers in SaaS (Software as a Service) and PaaS (Platform as a Service) models. Moreover, India is working actively on NB-IoT: in a first, BSNL, together with Skylo, has launched the world's first satellite-based NB-IoT service to streamline various sectors, including fishing, farming, construction, mining and logistics enterprises.


IPA, or Intelligent Process Automation, is the advanced version of RPA (Robotic Process Automation); it is essentially a combination of RPA and machine learning. Due to the outbreak of Covid-19, much of the IT industry has signalled the possibility of permanent work from home, and some companies, including TCS, Deloitte and Twitter, have already announced it. In this scenario it is imperative for any business to track the engagement, productivity and output of its workforce.

IPA techniques are therefore expected to increase process efficiency, improve customer experience, optimize workforce productivity and drive a surge in revenue. In 2019-2020 we saw how chatbots helped firms automate customer interaction, thereby reducing operational costs. Similarly, IPA techniques will help firms of all kinds turn raw data into structured data, reducing human error and enhancing customer satisfaction.

Artificial intelligence will expand its footprint across various sectors, including military, defence, agriculture, automotive, education, medicine and construction; the power and scope of AI is effectively endless. According to Fox News, an artificial intelligence algorithm developed by Heron Systems swept a human F-16 pilot 5-0 in a simulated dogfight in August 2020. Additionally, OpenAI has launched GPT-3, an autoregressive language model that uses deep learning to produce human-like text.

The model generates text of such quality that it can be difficult to distinguish whether it was written by a human or a machine. In agriculture, too, AI techniques are expected to increase crop productivity and thereby farmers' incomes.

With the announcement of NEP 2020 by the Ministry of Education, learning patterns will change across institutions, with a rise in technologies such as artificial intelligence, machine learning, big data and blockchain. The education ministry will put strenuous effort into upgrading the quality of India's education to build a skilled workforce.

The much-awaited 5G, or fifth-generation, cellular network services are expected to launch in 2021 as telecom giants including Bharti Airtel, Jio and Vodafone Idea ramp up to move early trials to commercialization with their respective partners.

Meanwhile, Reliance CMD Mukesh Ambani has already declared that Jio is ready with the infrastructure and will pioneer the 5G revolution in India in the second half of 2021. Technology has made our lives easier and better through many unfortunate situations; going forward, our primary need will be to let humans and machines work together to protect people.

-Article by Abhishek Gupta, CEO & Co-founder, Hex N Bit


See the rest here:
Tech trends to watch in 2021 - India Today

Encryption, zero trust and the quantum threat security predictions for 2021 – BetaNews

We've already looked at the possible cybercrime landscape for 2021, but what about the other side of the coin? How are businesses going to set about ensuring they are properly protected next year?

Josh Bregman, COO of CyGlass, thinks security needs to put people first: "2020 has been incredibly stressful. Organizations should therefore look to put people first in 2021. Cybersecurity teams are especially stressed. They've been tasked with securing a changing environment where more people than ever before are working remotely. They've also faced new threats as cyber criminals have looked to take advantage of the pandemic: whether through phishing attacks or exploiting weaknesses in corporate infrastructure. Being proactive, encouraging good cyber hygiene and executing a well thought out cyber program will go a long way towards promoting a peaceful and productive 2021, not least because it will build resiliency."

Mary Writz, VP of product management at ForgeRock, thinks quantum computing will change how we think about secure access: "When quantum becomes an everyday reality, certain types of encryption and thereby authentication (using encrypted tokens) will be invalidated. Public Key Infrastructure (PKI) and digital signatures will no longer be considered secure. Organizations will need to be nimble to modernize identity and access technology."

Gaurav Banga, CEO and founder of Balbix, also has concerns over quantum computing's effect on encryption: "Quantum computing is likely to become practical soon, with the capability to break many encryption algorithms. Organizations should plan to upgrade to TLS 1.3 and quantum-safe cryptographic ciphers soon. Big Tech vendors Google and Microsoft will make updates to web browsers, but the server-side is for your organization to review and change. Kick off a Y2K-like project to identify and fix your organization's encryption before it is too late."
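
As a concrete first step toward the TLS 1.3 upgrade Banga describes, here is a minimal client-side sketch (the hostname is a placeholder; server-side configuration and the cipher review are separate work) that refuses to negotiate anything older than TLS 1.3:

```python
import socket
import ssl

context = ssl.create_default_context()
# Refuse anything older than TLS 1.3 (requires Python 3.7+ and OpenSSL 1.1.1+).
context.minimum_version = ssl.TLSVersion.TLSv1_3

hostname = "example.com"  # placeholder host for illustration
with socket.create_connection((hostname, 443)) as sock:
    with context.wrap_socket(sock, server_hostname=hostname) as tls:
        # Report what was actually negotiated; an older server would have failed the handshake.
        print(tls.version(), tls.cipher())
```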

Sharon Wagner, CEO of Sixgill, predicts greater automation: "We'll see organizations ramp up investment in security tools that automate tasks. The security industry has long been plagued by talent shortages, and companies will look toward automation to even the playing field. While many of these automated tools were previously only accessible to large enterprises, much of this technology is becoming available to businesses of all sizes. With this, security teams will be able to cover more assets, eliminate blindspots at scale, and focus more on the most pressing security issues."

Michael Rezek, VP of cybersecurity strategy at Accedian, sees room for a blend of tools and education: "As IT teams build out their 2021 cybersecurity strategy, they should look most critically to network detection & response solutions (NDR), and other complementary solutions like endpoint security platforms that can detect advanced persistent threats (APT) and malware. For smaller companies, managed security services such as managed defense and response are also good options. However, a comprehensive security strategy must also include educating all employees about these threats and what to watch out for. Simple cybersecurity practices like varying and updating passwords and not clicking on suspicious links can go a long way in defending against ransomware. Perhaps most importantly, since no security plan is foolproof, companies should have a plan in the event of a ransomware attack. This is especially important since attackers might perform months of reconnaissance before actually striking. Once they have enough data, they'll typically move laterally inside the network in search of other prized data. Many cybercrime gangs will then install ransomware and use the stolen data as a back-up plan in case the organization refuses to pay. The more rapidly you can detect a breach and identify what information was exploited, the better your chances of mitigating this type of loss. Having a plan and the forensic data to back it up will ensure your organization and its reputation are protected."

Amir Jerbi, CTO at Aqua Security, sees more automation too, "As DevOps moves more broadly to use Infrastructure as Code (IaC) to automate provisioning of cloud native platforms, it is only a matter of time before vulnerabilities in these processes are exploited. The use of many templates leaves an opening for attackers to embed deployment automation of their own components, which when executed may allow them to manipulate the cloud infrastructure of their attack targets."

Marlys Rodgers, chief information security officer and head of technology oversight at CSAA Insurance Group and inaugural member of the AttackIQ Informed Defenders Council, says, "Despite the global COVID-19 pandemic, businesses still have to function and deliver on their promises to customers. This means adapting and finding new ways to enable employees to be productive from the safety of their homes. As CISO and Head of Technology Oversight for my company, I am dedicated to structuring and sustaining a security program that enables the business, as opposed to restricting capabilities in the name of minimizing risk. Additionally, I believe in complete transparency regarding the company's security posture across all levels, including the C-suite and board, so that we may work together to understand our risk and prioritize security investments accordingly. These two guiding principles have served me well throughout my career, but in 2020 especially, they allowed my company to innovate to better serve our customers while simultaneously scaling the security program."

Devin Redmond, CEO and co-founder of Theta Lake, believes we'll see more focus on the security of collaboration tools: "Incumbent collaboration tools (Zoom, Teams, Webex) are going to get dragged into conversations about privacy law and big tech, further pressuring them to stay on top of security and compliance capabilities. At least two regulatory agencies will make explicit statements about regulatory obligations to retain and supervise collaboration conversations. Additionally, collaboration tools will replace many call center interactions and force organizations on related compliance, privacy, and security risks."

Cybersecurity needs to become 'baked in' according to Charles Eagan, CTO at BlackBerry:

Cybersecurity is, in all too many ways, an after-market add-on. But this kind of model can become a roadblock to comprehensive security -- like plugging the sink while the faucet is already on.

Take, for instance, the connected vehicle market: vehicles continue to make use of data-rich sensors to deliver safety and comfort features to the driver. But if these platforms aren't built with security as a prerequisite, it's easy to open up a new cyberattack vector with each new feature. In many cases, the data that drives Machine Learning and AI is only useful -- and safe -- if it cannot be compromised. Cybersecurity must become a pillar of product and platform development from day one, instead of added on after the architecture is established.

Tony Lauro, Akamai's director of security technology and strategy, thinks multi-factor authentication must become the norm: "Over the past 12 months, attacks against remote workers have increased dramatically, and the techniques used to do so have also increased in complexity. In 2021 security-conscious organizations will be compelled to re-evaluate their requirements for using multi-factor authentication (MFA) technology for solutions that incorporate a strong crypto component to defend against man-in-the-middle and phishing-based 2FA bypasses."
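
For context on what one widely used second factor actually computes, here is a self-contained sketch of the time-based one-time password (TOTP) algorithm from RFC 6238. The Base32 secret below is a made-up example, and, as Lauro suggests, real deployments would pair or replace this with phishing-resistant, crypto-backed factors.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Compute a time-based one-time password per RFC 6238 (HMAC-SHA1 variant)."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // interval                 # 30-second time step
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                             # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Made-up demo secret; a real secret comes from the authenticator enrollment step.
print(totp("JBSWY3DPEHPK3PXP"))
```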

Jerry Ray, COO of enterprise data security and encryption company SecureAge, thinks we'll see greater use of encryption, "Throughout most of 2020, VPNs, access controls, and zero trust user authentication became all the rage in the immediate push to allow employees to work from home. As the year ends and 2021 unfolds, though, a greater appreciation for data encryption has been slowly coming to life. As work from home will continue throughout 2021 and the ploys used by hackers to get into the untamed endpoints become more refined and clever, data that can't be used even if stolen or lost will prove the last, best line of defense."
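
To show how small the code footprint of data-at-rest encryption can be, here is a minimal sketch using the Fernet recipe from the third-party Python cryptography package; the file name is a placeholder, and real deployments need key management, which this omits.

```python
from cryptography.fernet import Fernet  # pip install cryptography

# Generate and safeguard a key once; losing it means losing the data.
key = Fernet.generate_key()
fernet = Fernet(key)

plaintext = b"quarterly-results.xlsx contents"  # stand-in for real file data
token = fernet.encrypt(plaintext)               # authenticated encryption (AES-CBC + HMAC)
with open("quarterly-results.enc", "wb") as f:  # placeholder output path
    f.write(token)

# Stolen ciphertext is useless without the key; the rightful holder can still decrypt.
assert fernet.decrypt(token) == plaintext
```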

Mike Riemer, global chief technology officer of Ivanti, thinks organizations must adopt zero trust: "As employees continue to work from home, enterprises must come to terms with the reality that it may not be just the employee accessing a company device. Other people, such as a child or spouse, may use a laptop, phone, or tablet and inadvertently download ransomware or other types of software malware. Then, when the employee starts using the device to access a corporate network or specific corporate cloud application, it becomes a rogue device. Without having eyes on employees, how do businesses ensure the user and device are trusted? And what about the application, data and infrastructure? All of these components must be verified on a continual basis every few minutes to maintain a superior secure access posture. That is why organizations must adopt a Zero Trust Access solution capable of handling the hyper-converged technology and infrastructure within today's digital workplace by providing a unified, cloud-based service that enables greater accessibility, efficiency, and risk reduction."

Casey Ellis, CTO, founder, and chairman of Bugcrowd, thinks more governments around the world will adopt vulnerability disclosure as a default:

Governments are collectively realizing the scale and distributed nature of the threats they face in the cyber domain, as well as the league of good-faith hackers available to help them balance forces. When you're faced with an army of adversaries, an army of allies makes a lot of sense.

Judging by the language used in the policies released in 2020, governments around the world (including the UK) are also leaning in to the benefit of transparency inherent to a well-run VDP to create confidence in their constituents (neighborhood watch for the internet). The added confidence, ease of explanation, and the fact that security research and incidental discovery of security issues happen whether there is an invitation or not is making this an increasingly easy decision for governments to make.


Read more here:
Encryption, zero trust and the quantum threat security predictions for 2021 - BetaNews

Two Years into the Government’s National Quantum Initiative – Nextgov

Monday marked two years since the passage of the National Quantum Initiative, or NQI, Act, and in that time federal agencies have followed through on its early calls and helped lay the groundwork for new breakthroughs across the U.S. quantum realm.

Now, the sights of those helping implement the law are set on the future.

"I would say in five years, something we'd love to see is ... a better idea of, 'What are the applications for a quantum computer that's buildable in the next five to 10 years, that would be beneficial to society?'" the Office of Science and Technology Policy Assistant Director for Quantum Information Science Dr. Charles Tahan told Nextgov in an interview Friday. He also serves as the director of the National Quantum Coordination Office, a cooperation-pushing hub established by the legislation.

Tahan reflected on some foundational moves made over the last 24 months and offered a glimpse into his team's big-ticket priorities for 2021.

Quantum devices and technologies make up an ever-evolving field that homes in on phenomena at the atomic scale. Potential applications are coming to light and are expected to radically reshape science, engineering, computing, networking, sensing, communication and more. They offer promises like unhackable internet or navigation support in places disconnected from GPS.

Federal agencies have a long history of exploring physical sciences and quantum-related pursuits, but previous efforts were often siloed. Signed by President Donald Trump in 2018, the NQI Act sought to provide for a coordinated federal program to accelerate quantum research and development for the economic and national security of America. It assigned specific jobs to the National Institute of Standards and Technology, the Energy Department and the National Science Foundation, among others, and mandated new collaborations to boost the nation's quantum workforce talent pipeline and strengthen society's grasp of this relatively fresh area of investment. The functions of the National Quantum Coordination Office, or NQCO, were also set forth in the bill, and it was officially instituted in early 2019. Since then, the group has helped connect an array of relevant stakeholders and facilitate new initiatives proposed by the law.

"Now, everything that's been called out in the act has been established; it's started up," Tahan explained. He noted the three agencies with weighty responsibilities spent 2019 planning out their courses of action within their communities and this year launched substantial new efforts.

One of the latest was unveiled in August by the Energy Department, which awarded $625 million over five years, subject to appropriations, to its Argonne, Brookhaven, Fermi, Oak Ridge and Lawrence Berkeley national laboratories to establish QIS Research Centers. In each, top thinkers will link up to push forward collaborative research spanning many disciplines. Academic and private-sector institutions also pledged to provide $340 million in contributions for the work.

"These are about $25 million each; that's a tremendous amount of students, and postdocs, and researchers," Tahan said. "And those are spread out across the country, focusing on all different areas of quantum: computing, sensing and networking."

NSF this summer also revealed the formation of new Quantum Leap Challenge Institutes to tackle fundamental research hurdles in quantum information science and engineering over the next half-decade. The University of Colorado, the University of Illinois at Urbana-Champaign, and the University of California, Berkeley are set to head and house the first three institutes, though Tahan confirmed more could be launched next year. The initiative is backed by $75 million in federal funding, and while it will take advantage of existing infrastructure, the non-governmental entities involved are also making their own investments and constructing new facilities.

"That's the foundation, you know," Tahan said. "The teams have been formed, the research plans have been written, that's a tremendous amount of work, and now they're off actually working. So now, we start to reap the rewards because all the heavy lifting of getting people organized has been done."

Together with NSF, OSTP also helped set in motion the National Q-12 Education Partnership. It intends to connect public, private and academic sector quantum players and cohesively create and release learning materials to help U.S. educators produce new courses that engage students with quantum fields. The work is ultimately meant to spur K-12 students' interest in these emerging areas earlier in their education, and NSF will award nearly $1 million across QIS education efforts through the work.

And beyond the government's walls and those of academia, the NQI Act also presented new opportunities for industry. Meeting the law's requirements, NIST helped convene a consortium of cross-sector stakeholders to strategically confront existing quantum-related technology, standards and workforce gaps and needs. This year, that group, the Quantum Economic Development Consortium, or QED-C, bloomed in size, established a more formal membership structure and announced the companies that make up its steering committee.

"It took a year or more to get all these companies together and then write partnership agreements. So, that partnership agreement was completed towards the beginning of summer, and the steering committee signed it over the summer, and now there are I think 100 companies or so who have signed it," Tahan said. "So, it's up and running. It's a real economic development consortium, that's a technical thing, and that's a big deal. And how big it is, and how fast it's growing is really, really remarkable."

This fall also brought the launch of quantum.gov, a one-stop website streamlining federal work and policies. The quantum coordination office simultaneously released a comprehensive roadmap pinpointing crucial areas of needed research, deemed the Quantum Frontiers Report.

That assessment incorporates data collected from the many workshops and prior efforts OSTP held to promote the national initiative, and it establishes eight frontiers containing core problems and fundamental questions confronting QIS today that must be addressed to push forward research and development breakthroughs in the space. They include expanding opportunities for quantum technologies to benefit society, characterizing and mitigating quantum errors, and more.

"It tries to cut through the hype a little bit," Tahan explained. "It's a field that requires deep technical expertise. So, it's easy to be led in the wrong direction if you don't have all the data. So we try to narrow it down into: here are the important problems, here's what we really don't know, here's what we do know, and go this way, and that will, hopefully, benefit the whole enterprise."

Quantum-focused strides have also been made by the U.S. on the international front. Tahan pointed to the first quantum cooperation agreement signed between America and Japan late last year, which laid out basic core values guiding how the two countries will work together.

"We've been using that as a model to engage with other countries. We've had high-level meetings with Australia, industry collaborations with the U.K., and we're engaging with other countries. So, that's progressing," Tahan said. "Many countries are interested in quantum, as you can guess; there's a lot of investments around the world, and many want to work with us on going faster together."

China had also made its own notable quantum investments (some predating the NQI Act), and touted new claims of quantum supremacy, following Google, on the global stage this year.

"I wouldn't frame it as a competition ... We are still very much in the research phase here, and we'll see how those things pan out," Tahan said. "I think we're taking the right steps, collectively. The U.S. ecosystem of companies, nonprofits and governments are, based on our strategy, both technical and policies, going in the right direction and making the right investments."

Vice President-elect Kamala Harris previously put forth legislation to broadly advance quantum research, but at this point, the Biden administration hasn't publicly shared any intentions to prioritize ongoing or future government-steered quantum efforts.

"[One of] the big things we're looking towards in the next year is workforce development. We have a critical shortage or need for talent in this space. It's a very diverse set of skills. With these new centers, just do the math. How many students and postdocs are you going to need to fill up those, to do all that research? It's a very large number," Tahan said. "And so we're working on something to create that pipeline."

In that light, the team will work to continue developing NSF's ongoing Q-12 partnership. They'll also reflect on what's been built so far through the national initiative to identify any crucial needs that may have been overlooked.

"As you stand something up that's really big, you're always going to make some mistakes. What have you missed?" Tahan noted.

And going forward, the group plans to home in more deeply on balancing the economic and security implications of the burgeoning fields.

"As the technology gets more and more advanced, how do we be first to realize everything but also protect our investments?" Tahan said. "And getting that balance right is going to require careful policy thinking about how to update the way the United States does things."

Go here to see the original:
Two Years into the Government's National Quantum Initiative - Nextgov

Here’s Why the Quantum World Is Just So Strange – Walter Bradley Center for Natural and Artificial Intelligence

In this week's podcast, "Enrique Blair on quantum computing," Walter Bradley Center director Robert J. Marks talks with fellow computer engineer Enrique Blair about why quantum mechanics pioneer Niels Bohr said, "If quantum mechanics hasn't profoundly shocked you, you haven't understood it yet." Let's look at some of the reasons he said that:

The Show Notes and transcript follow.

Enrique Blair: It's really quite different from our daily experience. Quantum mechanics really is a description of the world at the microscopic scale. And it's really weird, because there are things that initially we thought maybe were particles but then we learned that they have wave-like behaviors. And there are other things that we thought were waves and then we discovered they have particle-like behaviors.

But that's hardly the strangest part. The strangest part is that a quantum particle does not actually have a position until we measure it, according to the generally accepted Copenhagen interpretation of quantum mechanics.

Robert J. Marks: What's the Copenhagen interpretation?

Enrique Blair (pictured): It's that the quantum mechanical wave function describes measurement outcomes in probabilities. You can't predict with certainty the outcome of a measurement. Which is really shocking, because in the classical world, if you have a particle and you know its position and its velocity, you can predict where it's going to be in the next second or minute or hour. Now in quantum mechanics, the really weird thing is, we say that a particle doesn't even have a position until you measure its position.

Robert J. Marks: It doesn't exist?

Enrique Blair: Not that it doesn't exist, but its position is not defined.
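To put the Copenhagen picture in concrete terms, here is a minimal numerical sketch (not from the podcast, and not tied to any real device): a two-level quantum state assigns amplitudes to its possible outcomes, and the Born rule turns those amplitudes into the probabilities with which individual measurements land. The amplitudes chosen below are arbitrary illustrative values.

```python
import numpy as np

# Minimal sketch of the Born rule: a state assigns complex amplitudes to outcomes,
# and a measurement returns each outcome with probability |amplitude|^2.
# The amplitudes below are arbitrary illustrative values, not from any experiment.
amplitudes = np.array([1 / np.sqrt(3), np.sqrt(2 / 3)])  # state alpha|0> + beta|1>
probabilities = np.abs(amplitudes) ** 2                  # Born rule gives [1/3, 2/3]

rng = np.random.default_rng(seed=0)
outcomes = rng.choice([0, 1], size=10_000, p=probabilities)

# Each single outcome is unpredictable; only the long-run statistics are fixed.
print("fraction of 0s:", (outcomes == 0).mean())  # close to 0.333
print("fraction of 1s:", (outcomes == 1).mean())  # close to 0.667
```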

Dr. Marks compared quantum mechanics (QM) to one of the characters in a 1999 film, Mystery Men, featuring inept amateur superheroes, including one who says, "I'm invisible as long as nobody's looking at me." With QM, that's not a joke. The quantum particle doesn't have a position until we measure it. But how did we discover this? The story goes back to the early 1800s, when British physicist Thomas Young (1773–1829) did a famous experiment with a card held up to a small window.

Enrique Blair: Young's double-slit experiment goes all the way back to 1801, where Young shot light at a couple of slits and then the light passing through the slits would show up on a screen behind them.

So light behaves like a wave, with interference patterns. But what happens when we try doing the same thing with a single particle of light, a photon? That's something we can do nowadays.

Enrique Blair: We can reduce a beam of light so that it's a single photon. One photon is emitted at a time, and we're shooting it at our double slit again.

What happens when each particle of light goes through these slits? Well, each particle splats up against this screen, and so you can know where the photon hits. But if you do this over a long period of time, the interference pattern shows up again. You have particles hitting the screen, so we see the particle behavior. But we also see the interference pattern, which suggests that, okay, we've got some wave interference going on here.

So the only way to explain both of these at the same time is that each photon, which is an indivisible packet of light, has to go through both slits at the same time and interfere with itself, and then the buildup of many, many photons gives you that interference pattern.
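That photon-by-photon build-up can be mimicked with a short classical simulation. The sketch below is only an illustration of the far-field two-slit probability pattern; the wavelength, slit spacing and screen distance are made-up numbers, and hit positions are sampled one at a time from the interference intensity so the fringes emerge in the histogram.

```python
import numpy as np

# Toy build-up of a two-slit interference pattern, one "photon" at a time.
# The geometry below is arbitrary and only for illustration.
wavelength = 500e-9       # 500 nm light
slit_separation = 50e-6   # 50 micrometres between the slits
screen_distance = 1.0     # 1 m from slits to screen

x = np.linspace(-0.02, 0.02, 2001)  # screen positions in metres

# Far-field approximation: equal amplitudes from the two slits with a
# position-dependent phase difference; intensity ~ |psi1 + psi2|^2.
phase = 2 * np.pi * slit_separation * x / (wavelength * screen_distance)
intensity = np.cos(phase / 2) ** 2

p = intensity / intensity.sum()       # probability of a hit at each screen position
rng = np.random.default_rng(1)
hits = rng.choice(x, size=5000, p=p)  # 5000 photons, detected one by one

counts, _ = np.histogram(hits, bins=80, range=(-0.02, 0.02))
print(counts)  # alternating high/low bins: the fringes emerge from individual hits
```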

Robert J. Marks: A particle was hypothesized to go through both slits?

Enrique Blair: Yes, and that's the mind-blowing ramification of this thing.

Robert J. Marks: How do we decide which slit the particles go through? Suppose we went down and we tried to measure? We put out one photon and we put it through the double slit. We've tried to measure which slit it went through. If it's a particle, it can only go through one, right?

Enrique Blair: Right. That introduces this concept of measurement. Like you said, which slit does it go through? Now the interesting thing is, if we know which slit it goes through, maybe we set up a detector and we say, "Hey, did it go through Slit One or Slit Two?" We detect that, we measure it, and the interference pattern goes away because now it's gone through one slit only, not both.

Robert J. Marks: Just by the act of observation, we are restricting that photon to go through one slit or the other. Observation really kind of screws things up.

Enrique Blair: That's right. This is one of the things that is hard to understand about quantum mechanics. In the classical world that we deal with every day, we can just observe something and we don't have to interact with it. So we can measure something's position or its velocity without altering it. But in quantum mechanics, observation or measurement inherently includes interacting with that thing, that particle.

Again, you've got this photon that goes through both slits, but then you measure it and it actually ends up going through one, once you measure it.
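The difference measurement makes can be written down directly: with no which-path information the two amplitudes add before squaring, while with a which-path detector the two probabilities add instead. A minimal sketch, reusing the same made-up geometry as above:

```python
import numpy as np

# Same toy two-slit geometry; compare "paths not measured" with "path measured".
x = np.linspace(-0.02, 0.02, 2001)
phase = 2 * np.pi * 50e-6 * x / (500e-9 * 1.0)

psi1 = np.exp(+1j * phase / 2) / np.sqrt(2)  # amplitude for the path through slit 1
psi2 = np.exp(-1j * phase / 2) / np.sqrt(2)  # amplitude for the path through slit 2

unmeasured = np.abs(psi1 + psi2) ** 2             # amplitudes add first -> fringes
measured = np.abs(psi1) ** 2 + np.abs(psi2) ** 2  # probabilities add -> no fringes

print("no which-path info: intensity swings from",
      round(float(unmeasured.min()), 3), "to", round(float(unmeasured.max()), 3))
print("which-path measured: intensity is flat at",
      round(float(measured.mean()), 3))
```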

Robert J. Marks: This reminds me again of Invisible Boy in Mystery Men. The photon goes through one of the two slits while you're looking at it. Unless you look away. Then it goes through both slits.

Enrique Blair: Right. Very tricky, those photons.

Next: How scientists have learned to work with the quantum world

Note: The illustration of the double-slit experiment in physics is courtesy NekoJaNekoJa and Johannes Kalliauer (CC BY-SA 4.0).

You may also enjoy: A materialist gives up on determinism. Evolutionary biologist Jerry Coyne undercuts his own argument against free will by admitting that quantum phenomena are real (Michael Egnor)

Quantum randomness gives nature free will. Whether or not quantum randomness explains how our brains work, it may help us create unbreakable encryption codes (Robert J. Marks)

Podcast Transcript Download

Read more:
Here's Why the Quantum World Is Just So Strange - Walter Bradley Center for Natural and Artificial Intelligence

Quantum Computing Market Share Will Exhibit a Prominent Uptick and Experience Great Demand: D-Wave Solutions, IBM, Google, Microsoft – Murphy’s Hockey…

The following report offers a comprehensive and thorough assessment of the Quantum Computing market and focuses on the key growth contributors of the market, to help clients better understand the current market scenario while taking into consideration the market's history over past years as well as the longer-term scope of growth and forecast, both of which are discussed comprehensively within the report.

The report covers most worldwide regions, including APAC, North America, South America, Europe, the Near East, and Africa, ensuring a worldwide and evenly distributed picture of growth as the market matures over time.

Top market players covered in this report are: D-Wave Solutions, IBM, Google, Microsoft, Rigetti Computing, Intel, Origin Quantum Computing Technology, Anyon Systems Inc., and Cambridge Quantum Computing Limited.

The report takes into consideration the important factors and aspects that are crucial for clients to post good growth and establish themselves within the Quantum Computing market. A variety of these aspects, such as sales, revenue, market size, mergers and acquisitions, risks, demands, and new trends and technologies, are taken into consideration to give a complete and detailed understanding of market conditions.

Get Sample PDF [emailprotected] https://www.reportsintellect.com/sample-request/1420966?aaash

Description:

This report has updated data on the Quantum Computing market. Since international markets have changed very rapidly over the past few years, they have become harder to get a grasp of, so the analysts at Reports Intellect have prepared an in-depth report that takes the market's history into consideration along with a very detailed forecast, the market's issues, and their solutions. The report focuses on the key aspects of the market to ensure maximum benefit and growth potential for clients, and our extensive analysis of the market will help clients understand it much more efficiently. The report has been prepared using primary as well as secondary analysis in accordance with Porter's five forces analysis, which has been a game-changer for many in the Quantum Computing market. The research sources and tools used to assess the report are highly reliable and trustworthy.

Product type segmentation: Hardware, Software, Cloud Service

Industry segmentation: Medical, Chemistry, Transportation, Manufacturing

Market Segment by Regions and Nations included:

North America

South America

Asia

Europe

Discount PDF Brochure @ https://www.reportsintellect.com/discount-request/1420966?aaash

Analysis:

The report offers effective guidelines and suggestions for players to secure a strong footing within the Quantum Computing market. Newly arrived players can raise their growth potential considerably, and the market's current leaders can sustain their dominance for longer, by using this report. The report includes a close description of mergers and acquisitions that will help readers form a complete idea of the market competition, along with extensive knowledge of how to excel and grow within the market.

Reasons to buy:

About Us:

Reports Intellect is your one-stop solution for everything related to research and market intelligence. We understand the importance of market intelligence and its need in today's competitive world. Our professional team works hard to fetch the most authentic research reports, backed with impeccable data figures, which guarantee outstanding results every time. So whether it is the newest report from our researchers or a custom requirement, our team is here to help you in the best way.

Contact Us:

[emailprotected]

Phone No: + 1-706-996-2486

US Address: 225 Peachtree Street NE, Suite 400, Atlanta, GA 30303

View post:
Quantum Computing Market Share Will Exhibit a Prominent Uptick and Experience Great Demand: D-Wave Solutions, IBM, Google, Microsoft - Murphy's Hockey...

Quantum computing now is a bit like SQL was in the late 80s: Wild and wooly and full of promise – ZDNet

Quantum computing is bright and shiny, with demonstrations by Google suggesting a kind of transcendent ability to scale beyond the heights of known problems.

But there's a real bummer in store for anyone with their head in the clouds: All that glitters is not gold, and there's a lot of hard work to be done on the way to someday computing NP-hard problems.

"ETL, if you get that wrong in this flow-based programming, if you get the data frame wrong, it's garbage in, garbage out," according to Christopher Savoie, who is the CEO and a co-founder of a three-year-old startup, Zapata Computing of Boston, Mass.

"There's this naive idea you're going to show up with this beautiful quantum computer, and just drop it in your data center, and everything is going to be solved it's not going to work that way," said Savoie, in a video interview with ZDNet. "You really have to solve these basic problems."

"There's this naive idea you're going to show up with this beautiful quantum computer, and just drop it in your data center, and everything is going to be solved it's not going to work that way," said Savoie, in a video interview with ZDNet. "You really have to solve these basic problems."

Zapata sells a programming tool for quantum computing, called Orquestra. It can let developers invent algorithms to be run on real quantum hardware, such as Honeywell's trapped-ion computer.

But most of the work of quantum today is not writing pretty algorithms, it's just making sure data is not junk.

"Ninety-five percent of the problem is data cleaning," Savoie told ZDNet. "There wasn't any great toolset out there, so that's why we created Orquestra to do this."

The company on Thursday announced it has received a Series B round of investment totaling $38 million from large investors that include Honeywell's venture capital outfit and returning Series A investors Comcast Ventures, Pitango, and Prelude Ventures, among others. The company has now raised $64.4 million.

Also: Honeywell introduces quantum computing as a service with subscription offering

Zapata was spun out of Harvard University in 2017 by scholars including Alán Aspuru-Guzik, who has done fundamental work on quantum. But a lot of what is coming up are the mundane matters of data prep and other gotchas that can be a nightmare in a bold new world of only partially-understood hardware.

Things such as extract, transform, load, or ETL, which become maddening when prepping a quantum workload.

"We had a customer who thought they had a compute problem because they had a job that was taking a long time; it turned out, when we dug in, just parallelizing the workflow, the ETL, gave them a compute advantage," recalled Savoie.

Such pitfalls, said Savoie, are things that companies don't know are an issue until they get ready to spend valuable time on a quantum computer and code doesn't run as expected.

"That's why we think it's critical for companies to start now," he said, even though today's noisy intermediate-scale quantum, or NISQ, machines have only a handful of qubits.

"You have to solve all these basic problems we really haven't even solved yet in classical computing," said Savoie.

The present moment in the young field of quantum sounds a bit like the early days of microcomputer-based relational databases. And, in fact, Savoie likes to make an analogy to the era of the 1980s and 1990s, when Oracle database was taking over workloads from IBM's DB/2.

Also: What the Google vs. IBM debate over quantum supremacy means

"Oracle is a really good analogy, he said. "Recall when SQL wasn't even a thing, and databases had to be turned on a per-on-premises, as-a-solution basis; how do I use a database versus storage, and there weren't a lot of tools for those things, and every installment was an engagement, really," recalled Savoie.

"There are a lot of close analogies to that" with today's quantum, said Savoie. "It's enterprise, it's tough problems, it's a lot of big data, it's a lot of big compute problems, and we are the software company sitting in the middle of all that with a lot of tools that aren't there yet."

Mind you, Savoie is a big believer in quantum's potential, despite pointing out all the challenges. He has seen how technologies can get stymied, but also how they ultimately triumph. He helped found startup Dejima, one of the companies that became a component of Apple's Siri voice assistant, in 1998. Dejima didn't produce an AI wave; it sold out to database giant Sybase.

"We invented this natural language understanding engine, but we didn't have the great SpeechWorks engine, we didn't have 3G, never mind 4G cell phones or OLED displays," he recalled. "It took ten years from 1998 till it was a product, till it was Siri, so I've seen this movie before I've been in that movie."

But the technology of NLP did survive and is now thriving. Similarly, the basic science of quantum, as with the basic science of NLP, is real, is validated. "Somebody is going to be the iPhone" of quantum, he said, although along the way there may be a couple of Apple Newtons, too, he quipped.

Even an Apple Newton of quantum will be a breakthrough. "It will be solving real problems," he said.

Also: All that glitters is not quantum AI

In the meantime, handling the complexity that's cropping up now, with things like ETL, suggests there's a role for a young company that can be for quantum what Oracle was for structured query language.

"You build that out, and you have best practices, and you can become a great company, and that's what we aspire to," he said.

Zapata has fifty-eight employees and has had contract revenue since its first year of operations; that revenue has doubled each year, said Savoie.

View post:
Quantum computing now is a bit like SQL was in the late 80s: Wild and wooly and full of promise - ZDNet

NTT's Kazuhiro Gomi says Bio Digital Twin, quantum computing the next-gen tech – Backend News

At the recently concluded Philippine Digital Convention (PH Digicon 2020) by PLDT Enterprise, Kazuhiro Gomi, president and CEO of NTT Research, shared the fundamental research milestones coming out of its three labs, the Physics and Informatics (PHI) Lab, the Cryptography and Information Security (CIS) Lab, and the Medical and Health Informatics (MEI) Lab, which are hoped to lead to monumental tech innovations.

The three-day virtual convention drew in more than 3,000 views during the live stream broadcast of the plenary sessions and breakout sessions covering various topics.

Gomi headlined the second day with his topic "Upgrading Reality," a glimpse into breakthrough research that NTT Research is currently working on that could hasten digital transformations.

In a discussion with Cathy Yap-Yang, FVP and head of Corporate Communications at PLDT, Gomi elaborated on next-generation technologies, particularly the Bio Digital Twin project, which could potentially be game-changing in the medical field, as well as quantum computing and advanced cryptography.

Bio Digital Twin

The Bio Digital Twin is an initiative in which a digital replica of a patient's internal systems functions first as a model for testing procedures and chemical reactions and seeing possible results before they are actually applied to the person.

"We are trying to create an electronic replica of the human body. If we are able to create something like that, the future of clinical and medical activities will be very different," Gomi said. "If we have a precise replica of your human body, you can predict what type of disease or what type of problem you might have maybe three years down the road. Or, if your doctor needs to test a new drug for you, he can do so onto the digital twin."

NTT Research is a fundamental research organization in Silicon Valley that carries out advanced research for some of the world's most important and impactful technologies, including quantum computing, cryptography, information security, and medical and health informatics.

Computing power

However, to get there and make the Bio Digital Twin possible, there are hurdles across various disciplines, including computing power.

Gomi explained that people believe today's computers can do everything, but in reality it might take years for them to solve complex problems that a quantum computer could solve in seconds.

There are different kinds of quantum computers, but all are based upon quantum physics. At NTT Research, Gomi revealed that their group is working on a quantum computer called a coherent Ising machine which could solve combinatorial optimization problems.

"We may be able to bring those superfast machines to market, to reality, much quicker. That is what we are aiming for," he said.

Basically, using many parameters and complex optimization, the machine finds in a matter of seconds the best solution to problems that might take months or years on conventional computers.

Some examples where quantum computing may be applied include lead optimization problems, such as effects on small-molecule drugs, peptide drugs, and biocatalysts, and resource optimization challenges, such as logistics, traffic control, or wireless network use. Gomi also expounded on compressed sensing cases, including uses in astronomical telescopes, magnetic resonance imaging (MRI), and computed tomography.
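The combinatorial optimization problems a coherent Ising machine targets can be phrased as choosing spins s_i in {-1, +1} that minimize an Ising energy E(s) = -Σ J_ij s_i s_j. The sketch below is not the coherent Ising machine itself; it is a plain classical simulated-annealing pass over a small random instance, included only to make the shape of that problem class concrete.

```python
import numpy as np

# Classical simulated annealing on a random Ising instance (illustration only;
# this is not the coherent Ising machine, just the problem class it targets).
rng = np.random.default_rng(42)
n = 30
J = np.triu(rng.choice([-1.0, 1.0], size=(n, n)), 1)
J = J + J.T  # symmetric couplings, zero diagonal

def energy(s: np.ndarray) -> float:
    return float(-0.5 * s @ J @ s)  # E(s) = -sum_{i<j} J_ij s_i s_j

s = rng.choice([-1.0, 1.0], size=n)
best_s, best_e = s.copy(), energy(s)

steps = 20_000
for step in range(steps):
    T = max(0.01, 3.0 * (1 - step / steps))  # simple linear cooling schedule
    i = rng.integers(n)
    delta = 2.0 * s[i] * (J[i] @ s)          # energy change if spin i is flipped
    if delta < 0 or rng.random() < np.exp(-delta / T):
        s[i] = -s[i]
        if energy(s) < best_e:
            best_s, best_e = s.copy(), energy(s)

print("best Ising energy found:", best_e)
```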

Quantum computing

Apart from quantum computing, Gomi reiterated the issues of cybersecurity and privacy. Today, encryption is able to address those challenges, but it will soon require a more advanced and sophisticated type of technology if we are to upgrade reality.

"From the connected world, obviously we want to exchange more data among each other, but we have to make sure that security and privacy are maintained. We have to have those things together to get the best out of a connected world," he said.

Among next-generation advanced encryption schemes, Gomi highlighted attribute-based encryption, in which the various decryption keys define access control over the encrypted data. For example, depending on the user (or the type of key he or she has), what they are allowed to view differs and is controlled by the key issuer.
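To make that idea concrete without pretending to be real cryptography, the toy sketch below (all names invented) mimics only the access-control semantics: the data carries a policy, each key carries attributes, and decryption succeeds only when the attributes satisfy the policy. A genuine attribute-based encryption scheme enforces this mathematically inside the ciphertext rather than with an if-statement.

```python
from dataclasses import dataclass

@dataclass
class Key:
    attributes: set  # e.g. {"doctor", "cardiology"}

@dataclass
class Ciphertext:
    policy: set      # attributes a key must hold to read the data
    payload: str

def encrypt(payload: str, required_attributes: set) -> Ciphertext:
    # Toy stand-in: in real ABE the policy is enforced inside the ciphertext itself.
    return Ciphertext(policy=required_attributes, payload=payload)

def decrypt(ciphertext: Ciphertext, key: Key) -> str:
    # Access is granted only if the key's attributes satisfy the data's policy
    # (here, a simple "must hold every required attribute" rule).
    if ciphertext.policy <= key.attributes:
        return ciphertext.payload
    raise PermissionError("key attributes do not satisfy the policy")

record = encrypt("patient vitals ...", {"doctor", "cardiology"})
print(decrypt(record, Key({"doctor", "cardiology", "on-call"})))  # allowed
# decrypt(record, Key({"nurse"}))  # would raise PermissionError
```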

He noted that "in the next couple of years, we should be able to commercialize this type of technology. We can maintain privacy while encouraging the sharing of data with this mechanism."

Gomi reiterated that we are at the stage of all kinds of digital transformations.

Digital transformation

"Those digital transformations are making our lives so much richer and business so much more interesting and efficient. I would imagine those digital transformations will continue to advance even more," he said.

However, there are limiting factors that could impede or slow down those digital transformations, such as energy consumption; the limits of Moore's law, since we cannot expect much more from the capacities of the electronic chips in current computers; and the issues of privacy and security. Hence, those factors need to be addressed.

PH Digicon 2020 is the annual convention organized by PLDT Enterprise which gathered global industry leaders to speak on the latest advancements in the digital landscape. This year's roster of speakers included tech experts and heads from Cisco, Nokia, Salesforce, and NTT Research, as well as goop CEO and multi-awarded Hollywood actress Gwyneth Paltrow, who headlined the first virtual run.

More here:
NTT's Kazuhiro Gomi says Bio Digital Twin, quantum computing the next-gen tech - Backend News