The Computational Limits of Deep Learning Are Closer Than You Think – Discover Magazine

Deep in the bowels of the Smithsonian National Museum of American History in Washington, DC, sits a large metal cabinet the size of a walk-in wardrobe. The cabinet houses a remarkable computer: the front is covered in dials, switches and gauges, and the inside is filled with potentiometers controlled by small electric motors. Behind one of the cabinet doors is a 20 x 20 array of light-sensitive cells, a kind of artificial eye.

This is the Perceptron Mark I, a simplified electronic version of a biological neuron. It was designed in the late 1950s by the American psychologist Frank Rosenblatt at Cornell University, who taught it to recognize simple shapes such as triangles.

Rosenblatt's work is now widely recognized as the foundation of modern artificial intelligence, but at the time it was controversial. Despite this early success, researchers were unable to build on it, not least because more complex pattern recognition required vastly more computational power than was available at the time. This insatiable appetite prevented further study of artificial neurons and the networks they create.

Today's deep learning machines also eat power, lots of it. And that raises an interesting question about how much they will need in future. Is this appetite sustainable as the goals of AI become more ambitious?

Today we get an answer thanks to the work of Neil Thompson at the Massachusetts Institute of Technology in Cambridge and several colleagues. This team has measured the improved performance of deep learning systems in recent years and shown how it depends on increases in computing power.

By extrapolating this trend, they say that future advances will soon become unfeasible. "Progress along current lines is rapidly becoming economically, technically, and environmentally unsustainable," say Thompson and colleagues, echoing the problems that emerged for Rosenblatt in the 1960s.

The team's approach is relatively straightforward. They analyzed over 1,000 papers on deep learning to understand how learning performance scales with computational power. The answer is that the correlation is clear and dramatic.

In 2009, for example, deep learning was too demanding for the computer processors of the time. "The turning point seems to have been when deep learning was ported to GPUs, initially yielding a 5-15x speed-up," they say.

This provided the horsepower for a neural network called AlexNet, which famously triumphed in a 2012 image recognition challenge where it wiped out the opposition. The victory created huge and sustained interest in deep neural networks that continues to this day.

But while deep learning performance increased by 35x between 2012 and 2019, the computational power behind it increased by an order of magnitude each year. Indeed, Thompson and co say this and other evidence suggests the computational power for deep learning has increased 9 orders of magnitude faster than the performance.

So how much computational power will be required in future? Thompson and co say that the error rate for image recognition is currently 11.5 percent, achieved using 10^14 gigaflops of computational power at a cost of millions of dollars (i.e., 10^6 dollars).

They say achieving an error rate of just 1 percent will require 10^28 gigaflops. And extrapolating at the current rate, this will cost 10^20 dollars. By comparison, the total amount of money in the world right now is measured in trillions, i.e., 10^12 dollars.
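
To see how those figures hang together, here is a quick back-of-the-envelope sketch (my own illustration, not the researchers' model). It assumes cost scales linearly with gigaflops and that the error rate follows a simple power law in compute, using only the numbers quoted above.

```python
import math

# Figures quoted above: 11.5% error at 10^14 gigaflops costing ~10^6 dollars,
# and a projected 10^28 gigaflops for a 1% error rate.
flops_now, cost_now, error_now = 1e14, 1e6, 11.5
flops_target, error_target = 1e28, 1.0

# Assumption: cost scales linearly with compute.
dollars_per_gigaflop = cost_now / flops_now
cost_target = dollars_per_gigaflop * flops_target
print(f"Extrapolated cost: 10^{math.log10(cost_target):.0f} dollars")  # 10^20

# Implied power-law exponent k in error ~ compute^(-k): the error rate
# falls extremely slowly as compute grows, which is the core of the argument.
k = math.log(error_now / error_target) / math.log(flops_target / flops_now)
print(f"Implied exponent k = {k:.3f}")  # ~0.076
```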

What's more, the environmental cost of such a calculation will be enormous: an increase in the amount of carbon produced of 14 orders of magnitude. "Progress along current lines is rapidly becoming economically, technically, and environmentally unsustainable," conclude Thompson and colleagues.

The future isn't entirely bleak, however. Thompson and co's extrapolations assume that future deep learning systems will use the same kinds of computers that are available today.

But various new approaches offer much more efficient computation. For example, in some tasks the human brain can outperform the best supercomputers while running on little more than a bowl of porridge. Neuromorphic computing attempts to copy this. And quantum computing promises orders of magnitude more computing power with relatively little increase in power consumption.

Another option is to abandon deep learning entirely and concentrate on other forms of machine learning that are less power hungry.

Of course, there is no guarantee that these new techniques and technologies will work. But if they don't, it's hard to see how artificial intelligence will get much better than it is now.

Curiously, something like this happened after the Perceptron Mark I first appeared, a period that lasted for decades and is now known as the AI winter. The Smithsonian doesn't currently have it on display, but it surely marks a lesson worth remembering.

Ref: arxiv.org/abs/2007.05558: The Computational Limits of Deep Learning


Quantum Computing: Navigating Towards The Future Of Computers – Analytics Insight

Computing power has reached its saturation point. If we continue following the same path, soon we may not have enough power to run the machines of the world. The solution to this lies in quantum computing. The origins of quantum computing go back to 1981, when the renowned physicist Richard Feynman asked at a Massachusetts Institute of Technology conference, "Can we simulate physics on a computer?" Quantum computing works on the principles of quantum mechanics, making use of two properties in particular: superposition and entanglement.

Current conventional computer systems are built around the idea of binary bits and Boolean logic. A bit can be physically represented as a switch with a value of 0 (off) or 1 (on). When these switches are connected using Boolean logic gates (AND, OR, XOR and others), they can perform all the complex operations of a modern microprocessor. In contrast, quantum computers use qubits (quantum bits), which can be in both states at the same time, a quantum property called superposition. In addition, qubits are capable of pairing up, which is known as entanglement: the state of one qubit cannot be described independently of the state of the others, so their measurement outcomes remain correlated no matter how far apart they are. As per an IDC report, 25 percent of the Fortune Global 500 will gain a competitive edge from quantum computing by 2023.
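
For readers who want to see superposition and entanglement concretely, here is a minimal sketch that simulates a two-qubit Bell state with ordinary linear algebra (plain NumPy rather than any particular quantum SDK). The state vectors are tiny here, which is exactly why classical simulation stops scaling as qubit counts grow.

```python
import numpy as np

# Single-qubit |0> state and the Hadamard gate.
zero = np.array([1, 0], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

# Superposition: H|0> = (|0> + |1>)/sqrt(2) -- equal amplitudes for 0 and 1.
plus = H @ zero
print("Superposition amplitudes:", plus)

# Entanglement: a CNOT on (H|0>) (x) |0> produces the Bell state (|00> + |11>)/sqrt(2).
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
bell = CNOT @ np.kron(plus, zero)

# Only |00> and |11> have nonzero probability, each 0.5: measuring one qubit
# tells you the other's outcome, however far apart the qubits are.
print("Measurement probabilities:", np.round(np.abs(bell) ** 2, 3))
```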

Meanwhile, tech giants like Google, Microsoft, and IBM are battling to be the first to make a working, practically useful quantum computer. Every month there are extensive updates from these companies about their work. Recently, Google announced that a quantum computer in its lab (one that uses quantum annealing) is 100 million times faster than any classical computer there. Further, the interest in quantum computing has been mirrored by investments in this field by players from a broad array of industries.

Quantum computers have four fundamental capabilities that differentiate them from today's conventional computers:

1. quantum simulation, in which quantum computers model complex molecules;

2. optimization (that is, solving multivariable problems with unprecedented speed);

3. quantum artificial intelligence, with better algorithms that could transform machine learning across industries as diverse as pharma and automotive;

4. prime factorization, which could revolutionize encryption.

These advanced computers are predicted to solve previously unapproachable problems, creating valuable solutions for industry and disrupting current techniques. For instance, NASA is looking at using quantum computing to analyze the enormous amount of data it collects about the universe, as well as to research better and safer methods of space travel. Automotive leader Volkswagen is using quantum computers to develop battery, transportation, and self-driving technology. Quantum computing can also be utilized to boost security, since it can enhance the accuracy of measurements and enable new sensing modalities; for example, it can accurately detect masses moving underwater, such as submarines.

Oil and gas companies can employ quantum computing to calculate how atoms and molecules can be configured to protect equipment from corrosion. In pharmaceuticals, the technology promises to aid the discovery of new drugs, shorten development times and suggest new ways to synthesize compounds. In the chemicals industry, it could provide a better understanding of catalytic reactions, reducing the cost of industrial processes. Quantum entanglement has also led to the possibility of quantum teleportation.

It is important to note that quantum computers are very fragile. Any vibration will disturb the atoms and cause decoherence. Also, at present, quantum computers need highly sophisticated hardware and supporting infrastructure. Some of the existing models use superconductivity to create and maintain a quantum state, which means that qubits must be kept at a temperature near absolute zero using a dilution refrigerator. This is why the inside of D-Wave Systems' quantum computer is -460 degrees Fahrenheit. So, companies may need a cloud model to access quantum services instead of installing their own quantum computers on-premises. Therefore, not every organization will have its own quantum system, at least not in the near future.

Moreover, people need to realize that while quantum computers may be the future, they will not replace standard computers either. Instead, they should be thought of as devices that enhance the usability of conventional general-purpose computers. According to this model, a core application is executed on a traditional computer, which can also handle data storage and other infrastructure-related tasks. At the same time, the quantum part is applied to only the subset of the overall workload that's best suited to its particular strengths.


Hear how three startups are approaching quantum computing differently at TC Disrupt 2020 – TechCrunch

Quantum computing is at an interesting point. It's at the cusp of being mature enough to solve real problems. But like in the early days of personal computers, there are lots of different companies trying different approaches to solving the fundamental physics problems that underlie the technology, all while another set of startups is looking ahead and thinking about how to integrate these machines with classical computers and how to write software for them. At Disrupt 2020 on September 14-18, we will have a panel with D-Wave CEO Alan Baratz, Quantum Machines co-founder and CEO Itamar Sivan and IonQ president and CEO Peter Chapman. The leaders of these three companies are all approaching quantum computing from different angles, yet all with the same goal of making this novel technology mainstream.

D-Wave may just be the best-known quantum computing company thanks to an early start and smart marketing in its early days. Alan Baratz took over as CEO earlier this year after a few years as chief product officer and executive VP of R&D at the company. Under Baratz, D-Wave has continued to build out its technology and especially its D-Wave quantum cloud service. Leap 2, the latest version of its efforts, launched earlier this year. D-Wave's technology is also very different from that of many other efforts thanks to its focus on quantum annealing. That drew a lot of skepticism in its early days, but it's now a proven technology and the company is now advancing both its hardware and software platform.

Like Baratz, IonQ's Peter Chapman isn't a founder either. Instead, he was the engineering director for Amazon Prime before joining IonQ in 2019. Under his leadership, the company raised a $55 million funding round in late 2019, which it extended by another $7 million last month. He is also continuing IonQ's bet on its trapped ion technology, which makes it relatively easy to create qubits and which, the company argues, allows it to focus its efforts on controlling them. This approach also has the advantage that IonQ's machines are able to run at room temperature, while many of its competitors have to cool their machines to as close to zero Kelvin as possible, which is an engineering challenge in itself, especially as these companies aim to miniaturize their quantum processors.

Quantum Machines plays in a slightly different part of the ecosystem from D-Wave and IonQ. The company, which recently raised $17.5 million in a Series A round, is building a quantum orchestration platform that combines novel custom hardware for controlling quantum processors (because once quantum machines reach a bit more maturity, a standard PC won't be fast enough to control them) with a matching software platform and its own QUA language for programming quantum algorithms. Quantum Machines is Itamar Sivan's first startup, which he launched with his co-founders after getting his Ph.D. in condensed matter and material physics at the Weizmann Institute of Science.

Come to Disrupt 2020 and hear from these companies and others on September 14-18. Get a front-row seat with your Digital Pro Pass for just $245 or with a Digital Startup Alley Exhibitor Package for $445. Prices are increasing next week, so grab yours today to save up to $300.


Pros and Cons to Buying Microsoft (MSFT) Stock – WTOP

Twenty years ago, Microsoft Corp. (ticker: MSFT) was the most valuable company in the world.

Today, along with competitors Apple (AAPL) and Amazon.com (AMZN), Microsoft is worth around $1.5 trillion, and tech titans like Alphabet (GOOG, GOOGL), Facebook (FB) and Netflix (NFLX) are not too far behind. The company that brought you Windows may not be alone at the top anymore, but Microsoft is far from obsolete and remains relevant in markets around the world.

Is Microsoft stock still a buy in mid-2020? Here's a look at the biggest pros and cons associated with MSFT.

Microsoft Stock at a Glance

Rising to prominence in the late 1970s and early 1980s, Microsoft's software became the industry standard for early PCs made by the likes of IBM (IBM) and Apple. This gave Microsoft a crucial first-mover advantage.

By the 1990s, computers had become small and economical enough for the average American household or typical elementary school to afford one. The end market wasn't just corporations and academia anymore, propelling Microsoft further.

As home computers became commonplace, so too did the operating system most of them used: Windows, the pre-installed, Microsoft-made software. Consumers loved the Windows user experience and its practical capabilities, especially the Microsoft Office suite of applications such as Word, Excel and PowerPoint.

By earning a hefty licensing fee on each computer sold with Windows and Office, Microsoft was able to achieve previously unimaginable scale over a short period.


A few decades later, Windows is still a major cash cow for Microsoft. But the company has also been able to diversify, and its most exciting future growth prospects are expected to come from other areas like cloud computing, social networking, remote work apps and video games.

Pros of Buying Microsoft Stock

There have been three CEOs since Microsoft was founded in 1975: co-founder Bill Gates (1975-2000), Steve Ballmer (2000-2014) and Satya Nadella. Gates' tenure was characterized by a company that experienced virtually unprecedented growth, making him the richest person in the world by the 1990s. Ballmer's tenure was a struggle, as Microsoft failed to stay at the forefront of tech, largely missing the boat on huge growth industries it was perfectly positioned to dominate, such as smartphones, search engines and social networks.

Since 2014, Microsoft has been led by Nadella, a period that thus far has been characterized by a return to Wall Street prominence, outperformance, revenue diversification and its biggest theme: cloud computing.

Today, one of Microsoft's biggest pros is essentially the same as what it was 20 years ago: The company has an unbelievable moat, a high barrier to entry. Many users around the world have learned everything they know about computers using Microsoft's Windows operating system.

If you don't have an Apple computer, Windows is by far the operating system of choice for manufacturers and consumers alike, holding the majority of the desktop market share worldwide.

But Microsoft doesn't have to release a new version of Windows just to profit from Windows. In the fourth quarter of its fiscal 2020 year, Microsoft's Windows original equipment manufacturer (OEM) revenue increased by 7%.


Amid the pandemic, the tech company's Windows OEM non-Pro revenue increased 34%, thanks to consumer demand driven by remote work and remote learning, illustrating the strength of the Windows brand and the value of the product.

Windows is part of a business segment Microsoft labels More Personal Computing, and the segment also accounts for the Xbox and associated services, sales of the Surface tablet and advertising revenue from Bing.

Besides Bing, a perennial loser lagging behind Google's search engine, More Personal Computing saw great success in the fourth quarter. People quarantined at home sought escape in video games, sending gaming revenue up 64% and Xbox content and services revenue up 65%; meanwhile, stay-at-home orders also encouraged consumers to snag a Surface tablet for remote work and education, resulting in a 28% increase in Surface revenue.

Combined with Windows, these diverse businesses propelled revenue in the More Personal Computing segment up 14% in a quarter when the vast majority of companies around the world could only dream of such returns. But the big growth driver at Microsoft right now is the cloud.

The second major pro to buying Microsoft stock is its growing focus on the cloud. The company does this in two ways: First, it offers its suite of productivity applications, Microsoft Office, as a cloud-based software-as-a-service offering. Instead of earning a one-time cut when someone buys a Windows- and Office-equipped computer, consumers now pay Microsoft $99.99 a year to use Office across all devices.

The second way Microsoft is cashing in on the cloud is with its cloud computing offering, Azure. It's the second-largest player in the rapidly growing field, trailing only Amazon and its Amazon Web Services. In the fourth quarter, Microsoft Azure revenue grew by 47% year over year, fueling the 19% increase in server products and cloud services revenue year over year.

But the strength of Azure and Microsoft's cloud services was enough to propel the Intelligent Cloud segment to 17% revenue growth this past quarter, and that growth will likely only continue thanks to key contracts like Microsoft's recent $10 billion deal with the U.S. Department of Defense.

The cloud, personal computing and Microsoft's final segment, Productivity and Business Processes, all enjoyed strong revenue growth in the fourth quarter of fiscal 2020 and combined to push Microsoft's fourth-quarter revenue up 13% year over year. As for the fiscal year itself, Microsoft's revenue increased by 14% year over year, while earnings per share increased by 14%, too.

MSFT shareholders who have spent 2020 watching their portfolios take a roller coaster ride must be relieved that their investments include a company as stable as Microsoft. This brings up the final pro for investing in the house that Gates built: stability.


The risk you take on by investing in Microsoft is fairly low for long-term investors. Not only is Microsoft notably absent from the U.S. government's looming antitrust investigations into Big Tech peers Facebook, Alphabet and Amazon, but Microsoft is one of just two U.S. companies that all major credit rating agencies actually consider to be a lower default risk than the federal government.

That's right: Microsoft, along with Johnson & Johnson (JNJ), is more likely to pay back your loan than Uncle Sam. It's hard to be much more financially secure than that.

Cons of Buying Microsoft Stock

The cons to buying Microsoft stock? Those are a bit harder to find.

The most glaring risk might seem trite, but in simple terms, it's that MSFT stock may be too high right now. By traditional metrics like the price-earnings ratio (PE) and price-earnings to growth ratio (PEG), Microsoft is trading at richer valuations than the S&P 500.

There's nothing wrong with that on its face. Most growth stocks trade for higher multiples than the market at large, for the rational reason that their earnings are expected to grow more quickly than the wider market's.

The question, however, is whether a trillion-dollar company like Microsoft can still be expected to grow at a quick enough rate to justify its PE of 35. Back in the '80s and '90s, it wasn't unusual for earnings to double every two years or so, and it's much easier to go from numbers like $100 million to $200 million than $1 trillion to $2 trillion. There's only so much money, and so much growth, in the world, especially considering the FAANG (Facebook, Apple, Amazon, Netflix, Google) stocks that are constantly vying for Microsoft's business.
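
To make those valuation metrics concrete, here is a small illustrative calculation using the PE of 35 cited above and, as a stand-in for expected growth, the roughly 14% earnings-per-share growth reported for fiscal 2020 elsewhere in this article (a simplification on my part; analysts would normally plug in a forward growth estimate).

```python
# Illustrative only, using figures quoted in this article.
pe_ratio = 35          # price divided by earnings per share
eps_growth_pct = 14    # fiscal 2020 EPS growth, used here as a proxy for expected growth

peg_ratio = pe_ratio / eps_growth_pct
print(f"PEG ratio: {peg_ratio:.2f}")  # ~2.5; readings well above 1 are often taken as richly valued
```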

Speaking of FAANG stocks, there's another potential risk to keep abreast of: If a competitor develops a breakthrough in something like quantum computing, artificial intelligence, smart home devices or entertainment, areas where Microsoft should've been competing more aggressively, that's a missed opportunity. But Nadella is far less likely to miss those massive paradigm shifts than the less technologically sophisticated Ballmer.

That said, Microsoft faces stiff competition in nearly every industry in which it dabbles. Surface sales may increase, but it's doubtful they'll ever eclipse the iPad. Google is unlikely to lose out to Bing anytime soon. Azure is steadily gaining ground, but Amazon still remains the market leader. There are always new competitors ready to take on Microsoft's dominance: Slack (WORK) is challenging Microsoft Teams, while the new Xbox Series X will face off against Sony's (SNE) new PlayStation 5 this holiday season.

The key to Microsoft's ongoing success remains Windows and the Office suite of products. That was true in the 1990s, and it is still true in 2020. As long as Microsoft remains dominant in those markets, it will be a viable company with a bright future ahead, but investors should always be wary of new competitors lurking just over the horizon.

The Bottom Line on Microsoft Stock

The fact that the biggest risks associated with Microsoft stock are mostly just the usual risks of buying any stock says a lot in itself.

For a company of its size to not have extreme legal or antitrust woes or hardcore competition threatening its bread and butter is remarkable. The fact that its financial security is considered safer than U.S. bonds is almost without parallel.

Microsoft has a great moat in an industry that will almost certainly still be around a decade from now; on top of that, at the time of this writing, it pays a modest 0.96% dividend. That's slightly more than the 10-year Treasury at 0.6% right now. So if you can sit on your hands with 10-year Treasurys, you might as well buy some Microsoft: you'll get the dividend, and likely some sizable capital gains, unless something goes horribly wrong or Nadella decides to channel his inner Ballmer.

When you look at the risk versus reward, Microsoft is a phenomenal stock to own.


Pros and Cons to Buying Microsoft (MSFT) Stock originally appeared on usnews.com

Update 07/23/20: This story was published at an earlier date and has been updated with new information.


UVA Pioneers Study of Genetic Diseases With Mind-Bending Quantum Computing – University of Virginia

University of Virginia School of Medicine scientists are harnessing the mind-bending potential of quantum computers to help us understand genetic diseases even before quantum computers are a thing.

UVA's Stefan Bekiranov and colleagues have developed an algorithm to allow researchers to study genetic diseases using quantum computers, once there are much more powerful quantum computers to run it. The algorithm, a complex set of operating instructions, will help advance quantum computing algorithm development and could advance the field of genetic research one day.

Quantum computers are still in their infancy. But when they come into their own, possibly within a decade, they may offer computing power on a scale unimaginable using traditional computers.

"We developed and implemented a genetic sample classification algorithm that is fundamental to the field of machine learning on a quantum computer in a very natural way using the inherent strengths of quantum computers," Bekiranov said. "This is certainly the first published quantum computer study funded by the National Institute of Mental Health and may be the first study using a so-called universal quantum computer funded by the National Institutes of Health."

Traditional computer programs are built on 1s and 0s, either-or. But quantum computers take advantage of a freaky fundamental of quantum physics: Something can be and not be at the same time. Rather than 1 or 0, the answer, from a quantum computer's perspective, is both, simultaneously. That allows the computer to consider vastly more possibilities, all at once.

The challenge is that the technology is, to put it lightly, technically demanding. Many quantum computers have to be kept at near absolute zero, the equivalent of more than 450 degrees below zero Fahrenheit. Even then, the movement of molecules surrounding the quantum computing elements can mess up the calculations, so algorithms not only have to contain instructions for what to do, but for how to compensate when errors creep in.

"Our goal was to develop a quantum classifier that we could implement on an actual IBM quantum computer. But the major quantum machine learning papers in the field were highly theoretical and required hardware that didn't exist. We finally found papers from Dr. Maria Schuld, who is a pioneer in developing implementable, near-term, quantum machine-learning algorithms. Our classifier builds on those developed by Dr. Schuld," Bekiranov said. "Once we started testing the classifier on the IBM system, we quickly discovered its current limitations and could only implement a vastly oversimplified, or toy, problem successfully, for now."

The new algorithm essentially classifies genomic data. It can determine if a test sample comes from a disease or control sample exponentially faster than a conventional computer. For example, if they used all four building blocks of DNA (A, G, C or T) for the classification, a conventional computer would execute 3 billion operations to classify the sample. The new quantum algorithm would need only 32.
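
One way to read the figure of 32 (my own interpretation; the article does not spell out the encoding): if the roughly 3 billion values a conventional computer would have to touch are instead packed into the amplitudes of a quantum state, the register only needs about log2 of that many qubits.

```python
import math

# ~3 billion classical operations, roughly one per position in a human genome.
classical_ops = 3_000_000_000

# With amplitude encoding, N values fit into ceil(log2(N)) qubits.
qubits_needed = math.ceil(math.log2(classical_ops))
print(qubits_needed)  # 32
```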

That will help scientists sort through the vast amount of data required for genetic research. But it's also proof-of-concept of the usefulness of the technology for such research.

Bekiranov and collaborator Kunal Kathuria were able to create the algorithm because they were trained in quantum physics, a field that even scientists often find opaque. Such algorithms are more likely to emerge from physics or computer science departments than medical schools. (Both Bekiranov and Kathuria conducted the study in the School of Medicine's Department of Biochemistry and Molecular Genetics. Kathuria is currently at the Lieber Institute for Brain Development.)

Because of the researchers' particular set of skills, officials at the National Institutes of Health's National Institute of Mental Health supported them in taking on the challenging project. Bekiranov and Kathuria hope what they have developed will be a great benefit to quantum computing and, eventually, human health.

"Relatively small-scale quantum computers that can solve toy problems are in existence now," Bekiranov said. "The challenges of developing a powerful universal quantum computer are immense. Along with steady progress, it will take multiple scientific breakthroughs. But time and again, experimental and theoretical physicists, working together, have risen to these challenges. If and when they develop a powerful universal quantum computer, I believe it will revolutionize computation and be regarded as one of the greatest scientific and engineering achievements of humankind."

The scientists have published their findings in the scientific journal Quantum Machine Intelligence. The algorithm-development team consisted of Kathuria, Aakrosh Ratan, Michael McConnell and Bekiranov.

The work was supported by NIH grants 3U01MH106882-04S1, 5U01MH106882-05 and P30CA044579.

To keep up with the latest medical research news from UVA, subscribe to the Making of Medicine blog.


The Hyperion-insideHPC Interviews: Dr. Michael Resch Talks about the Leap from von Neumann: ‘I Tell My PhD Candidates: Go for Quantum’ – insideHPC

Dr. Michael M. Resch of the University of Stuttgart has professorships, degrees, doctorates and honorary doctorates from around the world, and he has studied and taught in Europe and the U.S. But for all the work he has done in supercomputing for the past three-plus decades, he boils down his years in HPC to working with the same, if always improving, von Neumann architecture. He's eager for the next new thing: quantum. "Going to quantum computing, we have to throw away everything and we have to start anew," he says. "This is a great time."

In This Update. From The HPC User Forum Steering Committee

By Steve Conway and Thomas Gerard

After the global pandemic forced Hyperion Research to cancel the April 2020 HPC User Forum planned for Princeton, New Jersey, we decided to reach out to the HPC community in another way by publishing a series of interviews with members of the HPC User Forum Steering Committee. Our hope is that these seasoned leaders' perspectives on HPC's past, present and future will be interesting and beneficial to others. To conduct the interviews, Hyperion Research engaged insideHPC Media.

We welcome comments and questions addressed to Steve Conway, sconway@hyperionres.com or Earl Joseph, ejoseph@hyperionres.com.

This interview is with Prof. Dr. Dr. h.c. mult. Michael M. Resch. He is dean of the faculty for energy-, process- and biotechnology of the University of Stuttgart, director of the High Performance Computing Center Stuttgart (HLRS), the Department for High Performance Computing, and the Information Center (IZUS), all at the University of Stuttgart, Germany. He was an invited plenary speaker at SC07. He chairs the board of the German Gauss Center for Supercomputing (GCS) and serves on the advisory councils for Triangle Venture Capital Group and several foundations. He is on the advisory board of the Paderborn Center for Parallel Computing (PC2). He holds a degree in technical mathematics from the Technical University of Graz, Austria, and a Ph.D. in engineering from the University of Stuttgart. He was an assistant professor of computer science at the University of Houston and was awarded honorary doctorates by the National Technical University of Donezk (Ukraine) and the Russian Academy of Science.

He was interviewed by Dan Olds, HPC and big data consultant at Orionx.net.

The HPC User Forum was established in 1999 to promote the health of the global HPC industry and address issues of common concern to users. More than 75 HPC User Forum meetings have been held in the Americas, Europe and the Asia-Pacific region since the organization's founding in 2000.

Olds: Hello, I'm Dan Olds on behalf of Hyperion Research and insideHPC, and today I'm talking to Michael Resch, who is an honorable professor at the HPC Center in Stuttgart, Germany. How are you, Michael?

Resch: I am fine, Dan. Thanks.

Olds: Very nice to talk to you. I guess let's start at the beginning. How did you get involved in HPC in the first place?

Resch: That started when I was a math student and I was invited to work as a student research assistant and, by accident, that was roughly the month when a new supercomputer was coming into the Technical University of Graz. So, I put my hands on that machine and I never went away again.

Olds: You sort of made that machine yours, I guess?

Resch: We were only three users. There were three user groups and I was the most important user of my user group because I did all the programming.

Olds: Fantastic, that's a way to make yourself indispensable, isn't it?

Resch: In a sense.

Olds: So, can you kind of summarize your HPC background over the years?

Resch: I started doing blood flow simulations, so I at first looked into this very traditional Navier-Stokes equation that was driving HPC for a long time. Then I moved on to groundwater flow simulations: pollution of groundwater, tunnel construction work, and everything, until after like five years I moved to the University of Stuttgart, where I started to work with supercomputers, more focusing on the programming side, the performance side, than on the hardware side. This is sort of my background in terms of experience.

In terms of education, I studied a mixture of mathematics, computer science and economics, and then did a Ph.D. in engineering, which was convenient if you're working in Navier-Stokes equations. So, I try to bring all of these things together to make an impact in HPC.

Olds: What are some of the biggest changes you've seen in HPC over your career?

Resch: Well, the biggest change is probably that when I started, as I said, there were three user groups. These were outstanding experts in their field, but supercomputing was nothing for the rest of the university. Today, everybody is using HPC. That's probably the biggest change, that we are moving from something where you had one big system and a few experts around that system, and you moved to a larger number of systems and tens of thousands of experts working with them.

Olds: And, so, the systems have to get bigger, of course.

Resch: Well, certainly, they have to get bigger. And they have to get, I would say, more usable. That's another feature, that now things are more hidden from the user, which makes it easier to use them. But at the same time, it takes away some of the performance. There is this combination of hiding things away from the user and then the massive parallelism that we saw, and that's the second most important thing that I think we saw in the last three decades. That has made it much more difficult to get high sustained performance.

Olds: Where do you see HPC headed in the future? Is there anything that has you particularly excited or concerned?

Resch: [Laughs] I'm always excited and concerned. That's just normal. That's what happens when you go into science and that's normal when you work with supercomputers. I see, basically, two things happening. The first thing is that people will merge everything that has to do with data and everything that has to do with simulation. I keep saying it's data analytics, machine learning, artificial intelligence. It's sort of a development from raw data to very intelligent handling of data. And these data-intensive things start to merge with simulation, like we see people trying to understand what they did over the last 20 years by employing artificial intelligence to work its way through the data trying to find what we have already done and what should we do next, things like that.

The second thing that is exciting is quantum computing. It's exciting because it's out of the ordinary, in a sense. You might say that over the last 32 years the only thing I did was work with improved technology and improved methods and improved algorithms or whatever, but I was still working in the same John von Neumann architecture concept. Going to quantum computing we have to throw away everything and we have to start anew. This is a great time. I keep telling my Ph.D. candidates, go for quantum computing. This is where you make an impact. This is where you have a wide-open field of things you can explore and this is what is going to make the job exciting for the next 10, 12, 15 years or so.

Olds: That's fantastic and your enthusiasm for this really comes through. Your enthusiasm for HPC, for the new computing methods, and all that. And, thank you so much for taking the time.

Resch: It was a pleasure. Thank you.

Olds: Thank you, really appreciate it.


Microsoft Executive Vice President Jason Zander: Digital Transformation Accelerating Across the Energy Spectrum; Being Carbon Negative by 2030; The…

WASHINGTON--(BUSINESS WIRE)--Microsoft Executive Vice President Jason Zander says the company has never been busier partnering with the energy industry on cloud technologies and energy transition; that the combination of COVID-19 and the oil market shock has condensed years of digital transformation into a two-month period; and that the company is returning to its innovative roots, with a goal of having removed all of the company's historic carbon emissions by 2050, in the latest edition of CERAWeek Conversations.

In a conversation with IHS Markit (NYSE: INFO) Vice Chairman Daniel Yergin, Zander, who leads the company's cloud services business, Microsoft Azure, discusses Microsoft's rapid and massive deployment of cloud-based apps that have powered work and commerce in the COVID-19 economy; how cloud technologies are optimizing business and vaccine research; the next frontiers of quantum computing and its potential to take problems "that would take, literally, a thousand years" and solve them in 10 seconds; and more.

The complete video is available at: http://www.ceraweek.com/conversations

Selected excerpts: Interview recorded Thursday, July 16, 2020

(Edited slightly for brevity only)

Watch the complete video at: http://www.ceraweek.com/conversations

We've already prepositioned in over 60 regions around the world hundreds of data centers, millions and millions of server nodes; they're already there. If you can imagine, with COVID, you had to go back and do a procurement exercise and figure out a place to put the equipment, and the supply chains were actually shut down for a while because of COVID. That's why I say, even three to five years ago we as industries would have been pretty challenged to respond as quickly as we had.

That's on the more tactical end of the spectrum. On the other end we've also done a lot of things around data sets and advanced data work. How do we find a cure? We've done things like [protein] folding at home and making sure that those things could be hosted on the cloud. These are things that will be used in the search for a vaccine for the virus. Those are wildly different ends of the spectrum, from the tactical 'we need to manage and do logistics' to 'we need a search for things that are going to get us all back to basically normal.'

There's also a whole bunch of stimulus packages and payment systems that are getting created and deployed. We've had financial services companies that run on top of the cloud. They may have been doing a couple of hundred big transactions a day; we've had them do tens to hundreds of thousands a day when some of this kicked in.

The point is, with the cloud I can just go to the cloud, provision it, use it, and eventually when things cool back down, I can just shut it off. I don't have to worry about having bought servers, finding a place for them to live, or hiring people to take care of them.

There was disruption in the supply chain also. Many of us saw this, at least in the States: if you think about even the food supply chain, every once in a while you'd see some hiccups. There's a whole bunch of additional work that we've done around how we do even better planning around that, making sure we can hit the right levels of scale in the future. God forbid we should have another one of these, but I think we can and should be responsible to make sure that we've got it figured out.

The policy and investment side: it has never been more important for us to collaborate with healthcare, universities, and with others. We've kicked off a whole bunch of new partnerships and work that will benefit us in the future. This was a good wake-up call for all of us in figuring out how to marshal and be able to respond even better in the future.

We've had a lot of cases where people have been moving out of their own data centers and into ours. Let us basically take care of that part of the system; we can run it cheaply and efficiently. I'm seeing a huge amount of data center acceleration: folks that really want to move even faster on getting their workloads moved. That's true for oil and gas, but it's also true for the financial sector and retail.

Specifically, for oil and gas, one of the things that we're trying to do in particular is bring this kind of cloud efficiency, this kind of AI, and especially help out with places where you are doing exploration. What these have in common is the ability to take software, especially from the [independent software vendors] that work in the space (reservoir simulation, exploration), and marry that to these cloud resources where I can spin things up and spin things down. I can take advantage of that technology that I've got, and I am more efficient. I am not spending capex; I can perhaps do even more jobs than I was doing before. That allows me to go do that scale. If you're going to have fewer resources to do something, you of course want to increase your hit rate; increase your efficiency. Those are some of the core things that we're seeing.

A lot of folks, especially in oil and gas, have some of the most sophisticated high-performance computing solutions that are out there today. What we want to be able to do with the cloud is to be able to enable you to do even more of those solutions in a much more efficient way. We've got cases where people have been able to go from running one reservoir simulation job a day on premises [to] where they can actually go off to the cloud and, since we have all of this scale and all of this equipment, spin up and do 100 in one day. If that is going to be part of how you drive your efficiency, then being able to subscribe to that and go up and down is helping you do that job much more efficiently than you used to and giving you a lot more flexibility.

We're investing in a $1 billion fund over the next four years for carbon removal technology. We also are announcing a Microsoft sustainability calculator for cloud customers. Basically, you can get transparency into your Scope 1, 2, and 3 carbon emissions to get control. You can think of us this way: we want to hit this goal, we want to do it ourselves, we want to figure out how we build technology to help us do that, and then we want to share that technology with others. And then all along the way we want to partner with energy companies so that we can all be partnering together on this energy transition.

From a corporate perspective we've made pledges around being carbon negative, but then also around working with our energy partners. The way that we look at this is that you're going to have continued requirements and improvements in standards of living around the entire planet. One of the core, critical aspects to that is energy. The world needs more energy, not less. There are absolutely the existing systems that we have out there that we need to continue to improve, but they are also a core part of how things operate.

What we want to do is have a very responsible program where we're doing things like figuring out how to go carbon negative and figuring out ways that we as a company can go carbon negative. At the same time, taking those same techniques and allowing others to do the same and then partnering with energy companies around energy transformation. We still want the investments in renewables. We want to figure out how to be more efficient at the last mile when we think about the grid. I generally find that when you get that comprehensive answer back to our employees, they understand what we are doing and are generally supportive.

Coming up is a digital feedback loop where you get enough data that's coming through the system that you can actually start to be making smart decisions. Our expectation is we'll have an entire connected environment. Now we start thinking about smart cities, smart factories, hospitals, campuses, etc. Imagine having all of that level of data that's coming through and the ability to do smart work shedding or shaping of electrical usage, things where I can actually control brownout conditions and other things based on energy usage. There's also the opportunity to be doing smart sharing of systems where we can do very efficient usage of systems; intelligent edge and edge deployments are a core part of that.

How do we keep all the actual equipment that people are using safe? If you think about 5G and additional connectivity, we're getting all this cool new technology that's there. You have to figure out a way in which you're leveraging silicon, you're leveraging software and the best in security, and we're investing in all three.

The idea of being able to harness particle physics to do computing and be able to figure out things in minutes that would literally take centuries to pull off otherwise in classical computing is kind of mind-blowing. We're actually working with a lot of the energy companies on figuring out how quantum-inspired algorithms could make them more efficient today. As we get to full-scale quantum computing, then they would run natively in hardware and would be able to do even more amazing things. That one has just the potential to really, really change the world.

The meta point is that problems that would take, literally, a thousand years, you might be able to solve in 10 seconds. We've proven how that kind of technology can work. The quantum-inspired algorithms therefore allow us to take those same kinds of techniques, but we can run them on the cloud today using some of the classic cloud computers that are there. Instead of taking 1,000 years, maybe it's something that we can get done in 10 days, but in the future 10 seconds.

About CERAWeek Conversations:

CERAWeek Conversations features original interviews and discussion with energy industry leaders, government officials and policymakers, leaders from the technology, financial and industrial communities, and energy technology innovators.

The series is produced by the team responsible for the world's preeminent energy conference, CERAWeek by IHS Markit.

New installments will be added weekly at http://www.ceraweek.com/conversations.

Recent segments also include:

A complete video library is available at http://www.ceraweek.com/conversations.

About IHS Markit (www.ihsmarkit.com)

IHS Markit (NYSE: INFO) is a world leader in critical information, analytics and solutions for the major industries and markets that drive economies worldwide. The company delivers next-generation information, analytics and solutions to customers in business, finance and government, improving their operational efficiency and providing deep insights that lead to well-informed, confident decisions. IHS Markit has more than 50,000 business and government customers, including 80 percent of the Fortune Global 500 and the world's leading financial institutions. Headquartered in London, IHS Markit is committed to sustainable, profitable growth.

IHS Markit is a registered trademark of IHS Markit Ltd. and/or its affiliates. All other company and product names may be trademarks of their respective owners. © 2020 IHS Markit Ltd. All rights reserved.


Tapping into Quantum Computing to Study Genetic Diseases – Genetic Engineering & Biotechnology News

Researchers at the University of Virginia School of Medicine say they are tapping into the potential of quantum computers to help us understand genetic diseases.

Stefan Bekiranov, PhD, and colleagues report the development of an algorithm in their new study, "Implementation of a Hamming distance-like genomic quantum classifier using inner products on ibmqx2 and ibmq_16_melbourne," published in Quantum Machine Intelligence, to allow researchers to study genetic diseases using quantum computers, once there are much more powerful quantum computers to run it. The algorithm, a complex set of operating instructions, will help advance quantum computing algorithm development and could advance the field of genetic research one day, according to Bekiranov.

Quantum computers are still in their infancy. But when they come into their own, possibly within a decade, they may offer computing power on a scale unimaginable using traditional computers.

"We developed and implemented a genetic sample classification algorithm that is fundamental to the field of machine learning on a quantum computer in a very natural way using the inherent strengths of quantum computers," Bekiranov said. "This is certainly the first published quantum computer study funded by the National Institute of Mental Health and may be the first study using a so-called universal quantum computer funded by the National Institutes of Health."

Traditional computer programs are built on 1s and 0s, either-or. But quantum computers take advantage of a fundamental of quantum physics: Something can be and not be at the same time. Rather than 1 or 0, the answer, from a quantum computer's perspective, is both, simultaneously. That allows the computer to consider vastly more possibilities, all at once.

The challenge is that the technology is technically demanding. Many quantum computers have to be kept at near absolute zero, the equivalent of more than 450 degrees below zero on the Fahrenheit scale. Even then, the movement of molecules surrounding the quantum computing elements can disturb the calculations, so algorithms not only have to contain instructions for what to do, but for how to compensate when errors occur.

"Our goal was to develop a quantum classifier that we could implement on an actual IBM quantum computer. But the major quantum machine learning papers in the field were highly theoretical and required hardware that didn't exist. We finally found papers from Maria Schuld, PhD, who is a pioneer in developing implementable, near-term, quantum machine learning algorithms. Our classifier builds on those developed by Schuld," Bekiranov said. "Once we started testing the classifier on the IBM system, we quickly discovered its current limitations and could only implement a vastly oversimplified, or toy, problem successfully, for now."

The new algorithm essentially classifies genomic data. It can determine if a test sample comes from a disease or control sample exponentially faster than a conventional computer. For example, if they used all four building blocks of DNA for the classification, a conventional computer would execute 3 billion operations to classify the sample. The new quantum algorithm would need only 32.

That will help scientists sort through the vast amount of data required for genetic research. But it's also proof-of-concept of the usefulness of the technology for such research.

Bekiranov and collaborator Kunal Kathuria, PhD, were able to create the algorithm because they were trained in quantum physics. Such algorithms are more likely to emerge from physics or computer science departments than medical schools. (Both Bekiranov and Kathuria conducted the study in the School of Medicine's Department of Biochemistry and Molecular Genetics. Kathuria is currently at the Lieber Institute for Brain Development.)

Because of the researchers' particular set of skills, officials at the National Institutes of Health's National Institute of Mental Health supported them in taking on the challenging project. Bekiranov and Kathuria hope what they have developed will be a great benefit to quantum computing and, eventually, human health.

"Relatively small-scale quantum computers that can solve toy problems are in existence now," Bekiranov said. "The challenges of developing a powerful universal quantum computer are immense. Along with steady progress, it will take multiple scientific breakthroughs. But time and again, experimental and theoretical physicists, working together, have risen to these challenges. If and when they develop a powerful universal quantum computer, I believe it will revolutionize computation and be regarded as one of the greatest scientific and engineering achievements of humankind."


UC Berkeley to lead $25 million quantum computing center – UC Berkeley

Artist's rendition of quantum entanglement. (NSF image by Nicolle R. Fuller)

As part of the federal government's effort to speed the development of quantum computers, the National Science Foundation (NSF) has awarded the University of California, Berkeley, $25 million over five years to establish a multi-university institute focused on advancing quantum science and engineering and training a future workforce to build and use quantum computers.

The UC Berkeley-led center is one of three Quantum Leap Challenge Institutes (QLCI) announced today (Tuesday, July 21) by NSF and represents a $75 million investment. The initiatives are a central part of the National Quantum Initiative Act of 2018, the White House's Industries of the Future program and NSF's ongoing Quantum Leap effort.

The QLCI for Present and Future Quantum Computation connects UC Berkeley, UCLA, UC Santa Barbara and five other universities around the nation, harnessing a wealth of experimental and theoretical quantum scientists to improve and determine how best to use today's rudimentary quantum computers, most of them built by private industry or government labs. The goal, ultimately, is to make quantum computers as common as the mobile phones (which are digital computers) in our pockets.

"There is a sense that we are on the precipice of a really big move toward quantum computing," said Dan Stamper-Kurn, UC Berkeley professor of physics and director of the institute. "We think that the development of the quantum computer will be a real scientific revolution, the defining scientific challenge of the moment, especially if you think about the fact that the computer plays a central role in just about everything society does. If you have a chance to revolutionize what a computer is, then you revolutionize just about everything else."

Situated near the heart of today's computer industry, Silicon Valley, and at major California universities and national labs, this center establishes California as the world center for research in quantum computing, he said.

Quantum computers are fundamentally different from the digital computers in our cellphones, laptops, cars and appliances. You can think of digital computers as a collection of millions of independent bits (either ones or zeros) that flip back and forth every billionth of a second based on a series of instructions called an algorithm. The harder the problem, the longer the list of instructions.

In a quantum computer, each bit is linked to every other bit (quantumly entangled), so that the description of the state of even 100 quantum bits would be far larger than could be stored on the biggest classical digital computer.
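
To put a rough number on that claim, the sketch below estimates the classical storage needed to write down the full state of a 100-qubit register, assuming one 16-byte complex amplitude per basis state (my own back-of-the-envelope figure, not from the article).

```python
# A classical description of n entangled qubits needs 2**n complex amplitudes.
n = 100
amplitudes = 2 ** n                 # ~1.3e30 basis states
bytes_needed = amplitudes * 16      # 16 bytes per complex128 amplitude

print(f"{bytes_needed:.2e} bytes")                  # ~2.0e31 bytes
print(f"{bytes_needed / 1e21:.2e} zettabytes")      # ~2.0e10 ZB
```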

"Translating this remarkable ability of quantum computers into actually solving a computational problem is very challenging and requires a completely new way of thinking about algorithms," said Umesh Vazirani, UC Berkeley professor of computer science and co-director of the institute. "Designing effective quantum algorithms is a key challenge in realizing the enormous potential of quantum computers."

IBM's quantum computer, called Q. (Photo courtesy of IBM)

Theoretical work has shown that quantum computers are the best way to do some important tasks: factoring large numbers, encrypting or decrypting data, searching databases or finding optimal solutions for problems. Using quantum mechanical principles to process information offers an enormous speedup over the time it takes to solve many computational problems on current digital computers.
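
As a rough illustration of one of those speedups (our own sketch, not from the article): for unstructured search, a classical computer needs on the order of N lookups to find a marked item among N possibilities, while Grover's quantum algorithm needs only about (pi/4) * sqrt(N) oracle queries. The counts below are idealized and ignore error correction and hardware overheads.

# Idealized query counts for unstructured search: classical scan vs. Grover's algorithm.
import math

for exponent in (6, 9, 12):                  # database sizes: 10^6, 10^9, 10^12 items
    n = 10 ** exponent
    classical = n / 2                        # expected lookups for a classical scan
    grover = (math.pi / 4) * math.sqrt(n)    # approximate optimal Grover iteration count
    print(f"N = 10^{exponent}: classical ~{classical:.2e} queries, Grover ~{grover:.2e}")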

"Scientific problems that would take the age of the universe to solve on a standard computer potentially could take only a few minutes on a quantum computer," said Eric Hudson, UCLA professor of physics and co-director of the new institute. "We may get the ability to design new pharmaceuticals to fight diseases on a quantum computer instead of in a laboratory. Learning the structure of molecules and designing effective drugs, each of which has thousands of atoms, are inherently quantum challenges. A quantum computer potentially could calculate the structure of molecules and how molecules react and behave."

"I think quantum computing is inevitable," Stamper-Kurn added. "I don't know the time scale (is it 100 years or 10 years?), but we are talking about exponential increases in capability."

At the moment, quantum computers typically yoke together a paltry 50 or fewer quantum bits, or qubits. But that is quite an achievement, Stamper-Kurn said, considering that it came about rapidly over the past decade and has already spawned a nascent quantum computer industry. In light of these advances, scientists and the federal government anticipate even faster progress if the government invests in basic research and education that complement the technical progress being made by companies such as Google, Microsoft Corp., Intel and IBM.

Google's Sycamore chip, a quantum computer, is kept cool inside its quantum cryostat. (Image by Eric Lucero/Google, Inc.)

The new institute, which also includes the University of Southern California, California Institute of Technology, University of Texas at Austin, Massachusetts Institute of Technology and University of Washington, Seattle, will tackle some of the major challenges in the field.

"We know that quantum computers are on their way; there are researchers across the country working to build and test them," said NSF program director Henry Warchall. "But anyone with a computer will tell you that hardware isn't useful without software to run on it, and that's where this center will lead us toward solutions. The NSF Quantum Leap Challenge Institute for Present and Future Quantum Computing will help us have key programming elements in place when quantum computing hardware is in place."

One of the institute's first challenges is to identify the applications for which current quantum computers are most suited, in order to make full use of today's first-generation computers.

"People talk about noisy intermediate-scale quantum computers, or NISQ devices, which is what we have at present. They are pretty limited in what they can do, most importantly because they don't know how to correct the errors that come up during the computation," Stamper-Kurn said. "They are going to be useful for short-scale or small-scale computation, but it is critical that we find ways to use them productively, because that will stimulate the whole field."

The institute will also address the long-term challenge of developing algorithms for the next generation of quantum computers that will enable critical scientific, economic and societal advances, as well as navigate the boundary between quantum and classical computational capabilities.

"Realizing the full power of quantum computation requires development of efficient schemes for correction of errors during operation of quantum machines, as well as protocols for testing and benchmarking," Vazirani said.
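
For intuition only (this is our own toy sketch, not a description of the institute's schemes): the simplest classical error-correcting code, the 3-bit repetition code, already shows the redundancy-plus-majority-vote idea that quantum codes build on. Quantum versions are subtler, since qubits cannot be copied and the encoded data cannot be measured directly, but the encode-and-vote structure is the conceptual seed.

# Toy classical repetition code: encode one bit as three, flip bits at random,
# then recover the logical bit by majority vote.
import random

def encode(bit: int) -> list[int]:
    """Repeat the logical bit three times."""
    return [bit, bit, bit]

def noisy_channel(codeword: list[int], flip_prob: float) -> list[int]:
    """Independently flip each physical bit with probability flip_prob."""
    return [b ^ (random.random() < flip_prob) for b in codeword]

def decode(codeword: list[int]) -> int:
    """Majority vote recovers the logical bit if at most one physical bit flipped."""
    return int(sum(codeword) >= 2)

random.seed(0)
trials, flip_prob = 100_000, 0.05
uncoded_errors = sum(random.random() < flip_prob for _ in range(trials))
coded_errors = sum(decode(noisy_channel(encode(0), flip_prob)) != 0 for _ in range(trials))
print(f"uncoded error rate ~{uncoded_errors / trials:.4f}, coded ~{coded_errors / trials:.4f}")

With a 5 percent flip probability, the encoded error rate drops to well under 1 percent, at the cost of tripling the number of physical bits; quantum error correction trades physical qubits for logical reliability in an analogous, though far more intricate, way.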

A vacuum chamber with an ion trap in the center. In this instance, calcium ions are held 100 micrometers above the surface by means of electrical fields. The ions are observed from the top. (UC Berkeley photo courtesy of Hartmut Häffner)

Understanding the computational capabilities of quantum computers is one of the most important challenges for the field and will be an important driver of progress moving forward. This will require a major increase in the number of computer scientists engaged in these questions.

"The Simons Institute for the Theory of Computing at UC Berkeley is uniquely poised to create this engagement," said Vazirani, who is leading the quantum computing effort at the institute. "The Simons Institute is a mecca for the foundations of computing and will host a number of researchers in quantum computing and facilitate the kind of intense, in-person, cross-disciplinary collaboration that can result in rapid progress."

Also key is a partnership with UCLA's Institute for Pure and Applied Mathematics, which will help apply mathematical and data science tools to the field.

The magnitude of the challenge also requires input of domain expertise from scientific and mathematical/computational disciplines, to allow quantum algorithm design to be tailored to specific problems.

"Quantum algorithm design is entering an era of co-design, where the specific scientific and computational constraints and the need to preserve the fragile quantum coherence underlying quantum algorithms are leveraged to generate an efficient solution to a particular scientific problem," said Birgitta Whaley, UC Berkeley professor of chemistry and co-director of the institute. "We know how to do this for small systems, but scaling up to large quantum machines brings new challenges for implementation. That is something we will tackle in the new institute."

Experimentalists and theorists from the fields of chemistry, physics, materials science, engineering, mathematics and computer science will tackle some of these outstanding problems, in particular how to scale up computers from tens to millions of qubits without losing the quantum properties of the ensemble of qubits.

"The big question is: How do you make a quantum system bigger and bigger without making it perform worse and worse?" Stamper-Kurn said. "What people see is that, as the thing gets bigger, more noise creeps in, calibration is more difficult, connectivity is difficult; it is hard to get one part of the computer to talk to the other."

The group plans to focus on three experimental platforms that use different quantum systems as qubits: trapped ions, trapped atoms and superconducting circuits.

"Some of these systems work great at a small number of qubits, so we can work really hard on increasing their fidelity, making them more accurate. Some naturally operate at a larger number of qubits, and we can test out ideas about how to operate a quantum computer with a limited range of controls, but a lot of qubits to work with," Stamper-Kurn said. "They are all early in the technological development curve, so by introducing some new technologies with the help of engineers, we can improve our ability to operate a lot of systems at once."

An argon plasma discharge is used to clean an ion trap to allow for better coherence during quantum information transfer. Trapped ions are one of the most advanced candidates for a scalable quantum processing device. (UC Berkeley photo courtesy of Hartmut Häffner)

The grant will foster interactions among researchers and doctoral students from many fields with the help of fellowships, conferences and workshops. But a major component will be training a future workforce, akin to the way computer science training at universities like UC Berkeley and Stanford fueled Silicon Valley's rise to become a tech giant. UCLA will pilot a master's degree program in quantum science and technology to train a quantum-smart workforce, while massive open online courses, or MOOCs, will help spread knowledge and understanding of quantum computers, even to high school students.

The team hopes to partner with Department of Energy laboratories, such as Lawrence Berkeley National Laboratory, which in 2018 launched an Advanced Quantum Testbed to further quantum computation based on superconducting circuits.

The project came to fruition, in part, thanks to a UC-wide consortium, the California Institute for Quantum Entanglement, funded by UC's Multicampus Research Programs and Initiatives (MRPI).

"The award recognizes the team's vision of how advances in computational quantum science can reveal new fundamental understanding of phenomena at the tiniest length-scale that can benefit innovations in artificial intelligence, medicine, engineering and more," said Theresa Maldonado, UC's vice president for research and innovation. "We are proud to lead the nation in engaging excellent students from diverse backgrounds in this field of study."

Co-directors of the institute are UCLA's Eric Hudson; Whaley, who is co-director of the Berkeley Quantum Information & Computation Center (QBIC); Vazirani, the Roger A. Strauch Professor of Electrical Engineering and Computer Sciences and also a co-director of QBIC; and Hartmut Häffner, UC Berkeley associate professor and the Mike Gyorgy Chair in Physics.

The two other $25 million Quantum Leap Challenge Institutes announced today are centered at the University of Colorado, Boulder, and the University of Illinois, Urbana-Champaign, and will focus on quantum sensing and quantum networks, respectively.

Read more here:
UC Berkeley to lead $25 million quantum computing center - UC Berkeley

Quantum Computing Market Segmentation By Qualitative And Quantitative Research Incorporating Impact Of Economic and Non-Economic Aspects By 2027 -…

New Jersey, United States - The recent report on the Quantum Computing Market offered by Verified Market Research comprises a comprehensive investigation into the geographical landscape and industry size, along with a revenue estimation of the business. Additionally, the report highlights the challenges impeding market growth and the expansion strategies employed by leading companies in the Quantum Computing market.

This is the most recent report and includes the effects of COVID-19 on the functioning of the market. It is well known that the pandemic brought changes, mostly for the worse, to all industries. The current scenario of the business sector and the pandemic's impact on the past and future of the industry are covered in this report.

In market segmentation by manufacturers, the report covers the following companies:

Exploring the growth rate over a period

Business owners looking to scale up their business can refer to this report, which contains data regarding the rise in sales within a given consumer base for the forecast period, 2020 to 2027. Product owners can use this information, along with the driving factors such as demographics and revenue generated from other products discussed in the report, to get a better analysis of their products and services. In addition, the research analysts have compared the market growth rate with product sales to enable business owners to determine the success or failure of a specific product or service.


The report, Global Quantum Computing Market Report 2020 - Market Size, Share, Price, Trend and Forecast, is a professional and in-depth study on the current state of the global Quantum Computing industry.

The report at a glance

The Quantum Computing market report focuses on economic developments and consumer spending trends across different countries for the forecast period 2019 to 2026. The research further reveals which countries and regions will be better positioned in the years to come. Apart from this, the study covers the growth rate, market share and recent developments in the Quantum Computing industry worldwide. Coverage of the major market players adds further value to the overall study.

Market segment by Region/Country including:

North America (United States, Canada and Mexico)

Europe (Germany, UK, France, Italy, Russia and Spain, etc.)

Asia-Pacific (China, Japan, Korea, India, Australia and Southeast Asia, etc.)

South America (Brazil, Argentina, Colombia and Chile, etc.)

Middle East & Africa (South Africa, Egypt, Nigeria and Saudi Arabia, etc.)

The research provides answers to the following key questions:

What is the expected growth rate of the Quantum Computing market? What will be the market size for the forecast period, 2020-2027?

What are the major driving forces responsible for transforming the trajectory of the industry?

Who are the major vendors dominating the Quantum Computing industry across different regions? What are their winning strategies to stay ahead of the competition?

What are the market trends business owners can rely upon in the coming years?

What are the threats and challenges expected to restrict the progress of the industry across different countries?

What are the key opportunities that business owners can bank on for the forecast period, 2020-2027?

Why Choose Verified Market Research?

To summarize, the global Quantum Computing market report studies the contemporary market to forecast the growth prospects, challenges, opportunities, risks, threats, and the trends observed in the market that can either propel or curtail the growth rate of the industry. The market factors impacting the global sector also include provincial trade policies, international trade disputes, entry barriers, and other regulatory restrictions.

About us:

Verified Market Research is a leading global research and consulting firm serving more than 5,000 customers. Verified Market Research provides advanced analytical research solutions while offering information-enriched research studies. We offer insight into strategic and growth analyses, the data necessary to achieve corporate goals, and critical revenue decisions.

Our 250 analysts and SMEs offer a high level of expertise in data collection and governance and use industrial techniques to collect and analyze data on more than 15,000 high-impact and niche markets. Our analysts are trained to combine modern data collection techniques, superior research methodology, expertise, and years of collective experience to produce informative and accurate research.

Contact us:

Mr. Edwyne Fernandes

US: +1 (650)-781-4080

UK: +44 (203)-411-9686

APAC: +91 (902)-863-5784

US Toll-Free: +1 (800)-7821768

Email: [emailprotected]

Read more here:
Quantum Computing Market Segmentation By Qualitative And Quantitative Research Incorporating Impact Of Economic and Non-Economic Aspects By 2027 -...