Quantum startup CEO suggests we are only five years away from a quantum desktop computer – TechCrunch

Today at TechCrunch Disrupt 2020, leaders from three quantum computing startups joined TechCrunch editor Frederic Lardinois to discuss the future of the technology. IonQ CEO and president Peter Chapman suggested we could be as little as five years away from a desktop quantum computer, but not everyone agreed on that optimistic timeline.

"I think within the next several years, five years or so, you'll start to see [desktop quantum machines]. Our goal is to get to a rack-mounted quantum computer," Chapman said.

But that seemed a tad optimistic to Alan Baratz, CEO at D-Wave Systems. He says the superconducting technology his company is building requires a special kind of rather large quantum refrigeration unit called a dilution fridge, and that unit makes a five-year goal of a desktop quantum PC highly unlikely.

Itamar Sivan, CEO at Quantum Machines, also believes we have a lot of steps to go, and a lot of hurdles to overcome, before we see that kind of technology.

"This challenge is not within a specific, singular problem about finding the right material or solving some very specific equation, or anything. It's really a challenge which is multidisciplinary to be solved here," Sivan said.

Chapman also sees a day when we could have edge quantum machines, for instance on a military plane, that couldn't access quantum machines from the cloud efficiently.

"You know, you can't rely on a system which is sitting in a cloud. So it needs to be on the plane itself. If you're going to apply quantum to military applications, then you're going to need edge-deployed quantum computers," he said.

One thing worth mentioning is that IonQ's approach to quantum is very different from D-Wave's and Quantum Machines'.

IonQ relies on technology pioneered in atomic clocks for its form of quantum computing. Quantum Machines doesn't build quantum processors. Instead, it builds the hardware and software layer to control these machines, which are reaching a point where that can't be done with classical computers anymore.

D-Wave, on the other hand, uses a concept called quantum annealing, which allows it to create thousands of qubits, but at the cost of higher error rates.

As the technology develops further in the coming decades, these companies believe they are offering value by giving customers a starting point into this powerful form of computing, which, when harnessed, will change the way we think of computing in a classical sense. But Sivan says there are many steps to get there.

"This is a huge challenge that would also require focused and highly specialized teams that specialize in each layer of the quantum computing stack," he said. One way to help solve that is by partnering broadly to help solve some of these fundamental problems, and working with the cloud companies to bring quantum computing, however they choose to build it today, to a wider audience.

"In this regard, I think that this year we've seen some very interesting partnerships form which are essential for this to happen. We've seen companies like IonQ and D-Wave, and others, partnering with cloud providers who deliver their own quantum computers through other companies' cloud services," Sivan said. And he said his company would be announcing some partnerships of its own in the coming weeks.

The ultimate goal of all three companies is to eventually build a universal quantum computer, one that can deliver true quantum power. "We can and should continue marching toward universal quantum to get to the point where we can do things that just can't be done classically," Baratz said. But he and the others recognize we are still in the very early stages of reaching that end game.


Are We Close To Realising A Quantum Computer? Yes And No, Quantum Style – Swarajya

Scientists have been hard at work to get a new kind of computer going for about a couple of decades. This new variety is not a simple upgrade over what you and I use every day. It is different. They call it a quantum computer.

The name doesn't leave much to the imagination. It is a machine based on the central tenets of the most successful theory of physics yet devised: quantum mechanics. And since it is based on such a powerful theory, it promises to be so advanced that a conventional computer, the one we know and recognise, cannot keep up with it.

Think of the complex real-world problems that are hard to solve, and it's likely that quantum computers will throw up answers to them someday. Examples include simulating complex molecules to design new materials, making better forecasts for weather, earthquakes or volcanoes, mapping out the reaches of the universe, and, yes, demystifying quantum mechanics itself.

"One of the major goals of quantum computers is to simulate a quantum system. It is probably the reason why quantum computation is becoming a major reality," says Dr Arindam Ghosh, professor at the Department of Physics, Indian Institute of Science.

Given that the quantum computer is full of promise, and work on it has been underway for decades, it's fair to ask: do we have one yet?

"This is a million-dollar question, and there is no simple answer to it," says Dr Rajamani Vijayaraghavan, the head of the Quantum Measurement and Control Laboratory at the Tata Institute of Fundamental Research (TIFR). Depending on how you view it, we already have a quantum computer, or we will have one in the future if the aim is to have one that is practical or commercial in nature.

We have it and don't. That sounds about quantum.

In the United States, Google has been setting new benchmarks in quantum computing.

Last year, in October, it declared quantum supremacy: a demonstration of a quantum computer's superiority over its classical counterpart. Google's Sycamore processor took 200 seconds to make a calculation that, the company claims, would have taken 10,000 years on the world's most powerful supercomputer.

This accomplishment came with conditions attached. IBM, whose supercomputer Summit (the world's fastest) came second-best to Sycamore, contested the 10,000-year claim and said that the calculation would have instead taken two and a half days with a tweak to how the supercomputer approached the task.

Some experts suggested that the nature of the task, generating random numbers in a quantum way, was not particularly suited to the classical machine. Besides, Google's quantum processor didn't dabble in a real-world application.

Yet, Google was on to something. Even for the harshest critic, it provided a glimpse of the spectacular processing power of a quantum computer and what's possible down the road.

Google did one better recently. They simulated a chemical reaction on their quantum computer: the rearrangement of hydrogen atoms around nitrogen atoms in a diazene molecule (nitrogen hydride, or N2H2).

The reaction was a simple one, but it opened the doors to simulating more complex molecules in the future, an eager expectation from a quantum computer.

But how do we get there? That would require scaling up the system. More precisely, the number of qubits in the machine would have to increase.

Short for quantum bits, qubits are the basic building blocks of quantum computers. They are equivalent to the classical binary bits, zero and one, but with an important difference. While classical bits can assume states of zero or one, quantum bits can accommodate both zero and one at the same time, thanks to a principle of quantum mechanics called superposition.

Similarly, quantum bits can be entangled. That is when two qubits in superposition are bound in such a way that one dictates the state of the other. It is what Albert Einstein in his lifetime described, and dismissed, as "spooky action at a distance".
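Both ideas can be made concrete with a few lines of linear algebra. Below is a minimal sketch, using plain NumPy rather than any quantum SDK, of a single qubit placed in superposition and of the two-qubit Bell state, the textbook example of entanglement; the gates and state vectors are standard conventions, not anything specific to the companies mentioned here.

```python
import numpy as np

# Computational basis states for one qubit.
zero = np.array([1, 0], dtype=complex)
one = np.array([0, 1], dtype=complex)

# Superposition: the Hadamard gate puts |0> into an equal mix of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
plus = H @ zero

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(plus) ** 2  # [0.5, 0.5]: zero and one, each half the time

# Entanglement: a CNOT gate after the Hadamard yields the Bell state
# (|00> + |11>)/sqrt(2), where measuring one qubit fixes the other.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
state = CNOT @ np.kron(plus, zero)

# Only |00> and |11> carry probability; |01> and |10> never occur.
bell_probs = np.abs(state) ** 2  # [0.5, 0, 0, 0.5]
```

The point of the toy model is the last line: the outcomes of the two qubits are perfectly correlated even though each one, taken alone, is random.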

Qubits in these counterintuitive states are what allow a quantum computer to work its magic.

Presently, the most qubits, 72, are found on a Google device. The Sycamore processor, the Google chip behind the simulation of a chemical reaction, has a 53-qubit configuration. IBM has 53 qubits too, and Intel has 49. Some of the academic labs working with quantum computing technology, such as the one at Harvard, have about 40-50 qubits. In China, researchers say they are on course to develop a 60-qubit quantum computing system within this year.

The grouping is evident. The convergence is, more or less, around 50-60 qubits. That puts us in an interesting place. "About 50 qubits can be considered the breakeven point, the one where the classical computer struggles to keep up with its quantum counterpart," says Dr Vijayaraghavan.

It is generally acknowledged that once qubits rise to about 100, the classical computer gets left behind entirely. That stage is not far away. According to Dr Ghosh of IISc, the rate of qubit increase is today faster than the development of electronics in the early days.

"Over the next couple of years, we can get to 100-200 qubits," Dr Vijayaraghavan says.

A few more years later, we could possibly reach 300 qubits. For a perspective on how high that is, this is what Harvard Quantum Initiative co-director Mikhail Lukin has said about such a machine: "If you had a system of 300 qubits, you could store and process more bits of information than the number of particles in the universe."
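Lukin's comparison is simple arithmetic to check: an n-qubit register has 2^n basis states, while the number of particles in the observable universe is commonly estimated at around 10^80.

```python
# A state vector for n qubits stores 2**n complex amplitudes.
# Compare 2**300 with the commonly cited ~10**80 particles
# in the observable universe.
states_300 = 2 ** 300   # roughly 2e90, a 91-digit number
particles = 10 ** 80
print(states_300 > particles)  # True
```

In fact 2^300 exceeds the particle estimate by about ten orders of magnitude, which is why even a few hundred well-behaved qubits would be impossible to simulate exactly on classical hardware.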

In Indian labs, we are working with much fewer qubits. There is some catching up to do. Typically, India is slow to get off the blocks to pursue frontier research. But the good news is that over the years, the pace is picking up, especially in the quantum area.

At TIFR, researchers have developed a unique three-qubit trimon quantum processor. Three qubits might seem small in comparison to examples cited earlier, but together they pack a punch. "We have shown that for certain types of algorithms, our three-qubit processor does better than the IBM machine. It turns out that some gate operations are more efficient on our system than the IBM one," says Dr Vijayaraghavan.

The special ingredient of the trimon processor is three well-connected qubits rather than three individual qubits, a subtle but important difference.

Dr Vijayaraghavan plans to build more of these trimon quantum processors going forward, hoping that the advantages of a single trimon system spill over on to the larger machines.

TIFR is simultaneously developing a conventional seven-qubit transmon (as opposed to trimon) system. It is expected to be ready in about one and a half years.

About a thousand kilometres south, at IISc, two labs under the Department of Instrumentation and Applied Physics are developing quantum processors too, with allied research underway in the Departments of Computer Science and Automation, and Physics, as well as the Centre for Nano Science and Engineering.

IISc plans to develop an eight-qubit superconducting processor within three years.

"Once we have the know-how to build a working eight-qubit processor, scaling it up to tens of qubits in the future is easier, as it is then a matter of engineering progression," says Dr Ghosh, who is associated with the Quantum Materials and Devices Group at IISc.

It is not hard to imagine India catching up with the more advanced players in the quantum field this decade. The key is not to think of India building the biggest or the best machine; it is not necessary to have the highest number of qubits. Little scientific breakthroughs that have the power to move the quantum dial decisively forward can come from any lab in India.

Zooming out to a global point of view, the trajectory of quantum computing is hazy beyond a few years. We have been talking about qubits in the hundreds, but, to have commercial relevance, a quantum computer needs to have lakhs (hundreds of thousands) of qubits in its armoury. That is the challenge, and a mighty big one.

It isn't even the case that simply piling up qubits will do the job. As the number of qubits goes up in a system, it needs to be ensured that they are stable, highly connected, and error-free. This is because qubits cannot hang on to their quantum states in the presence of environmental noise such as heat or stray atoms or molecules. In fact, that is the reason quantum computers are operated at temperatures in the range of a few millikelvin to a kelvin. The slightest disturbance can knock the qubits off their quantum states of superposition and entanglement, leaving them to operate as classical bits.

If you are trying to simulate a quantum system, that's no good.

For that reason, even if the qubits are few, quantum computation can work well if the qubits are highly connected and error-free.

Companies like Honeywell and IBM are, therefore, looking beyond the number of qubits and instead eyeing a parameter called quantum volume.

Honeywell claimed earlier this year that they had the world's highest-performing quantum computer on the basis of quantum volume, even though it had just six qubits.

Dr Ghosh says quantum volume is indeed an important metric. "Number of qubits alone is not the benchmark. You do need enough of them to do meaningful computation, but you need to look at quantum volume, which measures the length and complexity of quantum circuits. The higher the quantum volume, the higher is the potential for solving real-world problems."
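IBM's published definition can be paraphrased as log2(QV) = max over circuit widths m of min(m, d(m)), where d(m) is the largest depth at which the device still passes a statistical "heavy output" test on random model circuits. The sketch below captures only that bookkeeping; the achievable-depth function here is a hypothetical stand-in, whereas on real hardware it comes from running benchmark circuits.

```python
# Hedged sketch of the quantum-volume bookkeeping: for each candidate
# width m, achievable_depth(m) is the largest model-circuit depth the
# device still runs reliably. log2(QV) is the best min(width, depth),
# so QV rewards qubits that are both numerous and well behaved,
# not raw qubit count.

def quantum_volume(achievable_depth, max_width):
    best = 0
    for m in range(1, max_width + 1):
        best = max(best, min(m, achievable_depth(m)))
    return 2 ** best

# Hypothetical device: 6 qubits that all sustain depth-6 circuits
# (roughly the shape of Honeywell's 2020 claim of QV 64 with six qubits).
print(quantum_volume(lambda m: 6, 6))    # 64

# Hypothetical 53-qubit device whose noise limits reliable depth to 5:
print(quantum_volume(lambda m: 5, 53))   # 32
```

The second example is the punchline of the metric: many noisy qubits can score lower than a handful of clean ones.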

It comes down to error correction. Dr Vijayaraghavan says none of the big quantum machines in the US today use error-correction technology. "If that can be demonstrated over the next five years, it would count as a real breakthrough," he says.

Guarding the system against faults or "errors" is the focus of researchers now as they look to scale up the qubits in a system. Developing a system with hundreds of thousands of qubits without correcting for errors cancels the benefits of a quantum computer.

As is the case with any research in frontier areas, progress will have to be accompanied by scientific breakthroughs across several different fields, from software to physics to materials science and engineering.

In light of that, collaboration between academia and industry is going to play a major role going forward. Depending on each of their strengths, academic labs can focus on supplying the core expertise necessary to get a quantum computer going while the industry can provide the engineering muscle to build the intricate stuff. Both are important parts of the quantum computing puzzle. At the end of the day, the quantum part of a quantum computer is tiny. Most of the machine is high-end electronics. The industry can support that.

It is useful to recall at this point that even our conventional computers took decades to develop, starting from the first transistor in 1947 to the first microprocessor in 1971. The computers that we use today would be unrecognisable to people in the 1970s. In the same way, what quantum computing will look like in the future, say, 20 years down the line, is unknown to us today.

However, governments around the world, including India, are putting their weight behind the development of quantum technology. It is clear to see why. Hopefully, this decade can be the springboard that launches quantum computing higher than ever before. All signs point to it.


Spin-Based Quantum Computing Breakthrough: Physicists Achieve Tunable Spin Wave Excitation – SciTechDaily

Magnon excitation. Credit: Daria Sokol/MIPT Press Office

Physicists from MIPT and the Russian Quantum Center, joined by colleagues from Saratov State University and Michigan Technological University, have demonstrated new methods for controlling spin waves in nanostructured bismuth iron garnet films via short laser pulses. Presented in Nano Letters, the solution has potential for applications in energy-efficient information transfer and spin-based quantum computing.

A particle's spin is its intrinsic angular momentum, which always has a direction. In magnetized materials, the spins all point in one direction. A local disruption of this magnetic order is accompanied by the propagation of spin waves, whose quanta are known as magnons.

Unlike the electrical current, spin wave propagation does not involve a transfer of matter. As a result, using magnons rather than electrons to transmit information leads to much smaller thermal losses. Data can be encoded in the phase or amplitude of a spin wave and processed via wave interference or nonlinear effects.
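As a toy numerical illustration, and not the method used in the paper, here is how a bit encoded in a wave's phase can be read out by interference with a reference wave: the in-phase wave adds constructively, while a pi-shifted wave cancels.

```python
import numpy as np

# Toy model: encode a bit in the phase of a wave and read it out by
# interference. A 0-bit wave is in phase with the reference; a 1-bit
# wave is shifted by pi and interferes destructively.
t = np.linspace(0, 1, 1000)
freq = 10.0

def wave(phase):
    return np.cos(2 * np.pi * freq * t + phase)

reference = wave(0.0)

def read_bit(signal):
    # Superpose with the reference and check the power of the result:
    # high power means constructive interference (bit 0), low power
    # means destructive interference (bit 1).
    combined = signal + reference
    return 0 if np.mean(combined ** 2) > 1.0 else 1

print(read_bit(wave(0.0)))     # 0: constructive interference
print(read_bit(wave(np.pi)))   # 1: destructive interference
```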

Simple logical components based on magnons are already available as sample devices. However, one of the challenges of implementing this new technology is the need to control certain spin wave parameters. In many regards, exciting magnons optically is more convenient than by other means, with one of the advantages presented in the recent paper in Nano Letters.

The researchers excited spin waves in a nanostructured bismuth iron garnet. Even without nanopatterning, that material has unique optomagnetic properties. It is characterized by low magnetic attenuation, allowing magnons to propagate over large distances even at room temperature. It is also highly optically transparent in the near infrared range and has a high Verdet constant.

The film used in the study had an elaborate structure: a smooth lower layer with a one-dimensional grating formed on top, with a 450-nanometer period (Fig. 1). This geometry enables the excitation of magnons with a very specific spin distribution, which is not possible for an unmodified film.

To excite magnetization precession, the team used linearly polarized pump laser pulses, whose characteristics affected spin dynamics and the type of spin waves generated. Importantly, wave excitation resulted from optomagnetic rather than thermal effects.

Schematic representation of spin wave excitation by optical pulses. The laser pump pulse generates magnons by locally disrupting the ordering of spins shown as violet arrows in bismuth iron garnet (BiIG). A probe pulse is then used to recover information about the excited magnons. GGG denotes gadolinium gallium garnet, which serves as the substrate. Credit: Alexander Chernov et al./Nano Letters

The researchers relied on 250-femtosecond probe pulses to track the state of the sample and extract spin wave characteristics. A probe pulse can be directed to any point on the sample with a desired delay relative to the pump pulse. This yields information about the magnetization dynamics at a given point, which can be processed to determine the spin wave's spectral frequency, type, and other parameters.

Unlike the previously available methods, the new approach enables controlling the generated wave by varying several parameters of the laser pulse that excites it. In addition to that, the geometry of the nanostructured film allows the excitation center to be localized in a spot about 10 nanometers in size. The nanopattern also makes it possible to generate multiple distinct types of spin waves. The angle of incidence, the wavelength and the polarization of the laser pulses enable the resonant excitation of the waveguide modes of the sample, which are determined by the nanostructure characteristics, so the type of spin waves excited can be controlled. Each of the characteristics associated with optical excitation can be varied independently to produce the desired effect.

"Nanophotonics opens up new possibilities in the area of ultrafast magnetism," said the study's co-author, Alexander Chernov, who heads the Magnetic Heterostructures and Spintronics Lab at MIPT. "The creation of practical applications will depend on being able to go beyond the submicrometer scale, increasing operation speed and the capacity for multitasking. We have shown a way to overcome these limitations by nanostructuring a magnetic material. We have successfully localized light in a spot a few tens of nanometers across and effectively excited standing spin waves of various orders. This type of spin wave enables devices operating at high frequencies, up to the terahertz range."

The paper experimentally demonstrates an improved launch efficiency and ability to control spin dynamics under optical excitation by short laser pulses in a specially designed nanopatterned film of bismuth iron garnet. It opens up new prospects for magnetic data processing and quantum computing based on coherent spin oscillations.

Reference: "All-Dielectric Nanophotonics Enables Tunable Excitation of the Exchange Spin Waves" by Alexander I. Chernov, Mikhail A. Kozhaev, Daria O. Ignatyeva, Evgeniy N. Beginin, Alexandr V. Sadovnikov, Andrey A. Voronov, Dolendra Karki, Miguel Levy and Vladimir I. Belotelov, 9 June 2020, Nano Letters. DOI: 10.1021/acs.nanolett.0c01528

The study was supported by the Russian Ministry of Science and Higher Education.


The Hyperion-insideHPC Interviews: ORNL Distinguished Scientist Travis Humble on Coupling Classical and Quantum Computing – insideHPC

Oak Ridge National Lab's Travis Humble has worked at the headwaters of quantum computing research for years. In this interview, he talks about his particular areas of interest, including integration of quantum computing with classical HPC systems. "We've already recognized that we can accelerate solving scientific applications using quantum computers," he says. "These demonstrations are just early examples of how we expect quantum computers can take us to the most challenging problems for scientific discovery."

In This Update. From the HPC User Forum Steering Committee

By Steve Conway and Thomas Gerard

After the global pandemic forced Hyperion Research to cancel the April 2020 HPC User Forum planned for Princeton, New Jersey, we decided to reach out to the HPC community in another way: by publishing a series of interviews with leading members of the worldwide HPC community. Our hope is that these seasoned leaders' perspectives on HPC's past, present and future will be interesting and beneficial to others. To conduct the interviews, Hyperion Research engaged insideHPC Media. We welcome comments and questions addressed to Steve Conway, sconway@hyperionres.com, or Earl Joseph, ejoseph@hyperionres.com.

This interview is with Travis Humble, Deputy Director at the Department of Energy's Quantum Science Center, a Distinguished Scientist at Oak Ridge National Laboratory, and director of the lab's Quantum Computing Institute. Travis is leading the development of new quantum technologies and infrastructure to impact the DOE mission of scientific discovery through quantum computing. He is editor-in-chief for ACM Transactions on Quantum Computing, Associate Editor for Quantum Information Processing, and co-chair of the IEEE Quantum Initiative. Travis also holds a joint faculty appointment with the University of Tennessee Bredesen Center for Interdisciplinary Research and Graduate Education, where he works with students in developing energy-efficient computing solutions. He received his doctorate in theoretical chemistry from the University of Oregon before joining ORNL in 2005.

The HPC User Forum was established in 1999 to promote the health of the global HPC industry and address issues of common concern to users. More than 75 HPC User Forum meetings have been held in the Americas, Europe and the Asia-Pacific region since the organization's founding.

Doug Black: Hi, everyone. I'm Doug Black. I'm editor-in-chief at insideHPC, and today we are talking with Dr. Travis Humble. He is a distinguished scientist at Oak Ridge National Lab, where he is director of the lab's Quantum Computing Institute. Dr. Humble, welcome. Thanks for joining us today.

Travis Humble: Thanks for having me on, Doug.

Black: Travis, tell us, if you would, the area of quantum computing that you're working in and the research that you're doing that you're most excited about, that has what you would regard as the greatest potential.

Humble: Quantum computing is a really exciting area, so it's really hard to narrow it down to just one example. This is the intersection of quantum information (quantum mechanics) with computer science.

We've already recognized that we can accelerate solving scientific applications using quantum computers. At Oak Ridge, for example, we have already demonstrated examples in chemistry, material science and high-energy physics, where we can use quantum computers to solve problems in those areas. These demonstrations are just early examples of how we expect quantum computers can take us to the most challenging problems for scientific discovery.

My own research is actually focused on how we could integrate quantum computers with high-performance computing systems. Of course, we are adopting an accelerator model at Oak Ridge, where we are thinking about using quantum processors to offload the most challenging computational tasks. Now, this seems like an obvious approach: the best of both worlds. But the truth is that there are a lot of challenges in bringing those two systems together.
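The accelerator pattern Humble describes can be sketched in miniature. The sketch below is a generic hybrid loop, not ORNL's actual software stack: a classical host iterates and offloads only a cost evaluation to a "QPU", which is stubbed out here with a classical function so the control flow runs.

```python
import random

# Hedged sketch of the classical/quantum accelerator model: a classical
# outer loop proposes candidate parameters, and the hardest kernel (here
# a stand-in cost evaluation) is what would be offloaded to a QPU.

def quantum_offload(params):
    # Placeholder for a circuit executed on quantum hardware; a classical
    # surrogate cost function keeps the example runnable.
    return sum((p - 0.5) ** 2 for p in params)

def hybrid_minimize(n_params=3, iters=200, seed=0):
    rng = random.Random(seed)
    params = [rng.random() for _ in range(n_params)]
    best = quantum_offload(params)
    for _ in range(iters):
        # Classical host proposes a nearby candidate...
        trial = [p + rng.uniform(-0.1, 0.1) for p in params]
        # ...and the "QPU" scores it; keep it only if it improves.
        cost = quantum_offload(trial)
        if cost < best:
            params, best = trial, cost
    return best

print(hybrid_minimize())  # cost shrinks toward 0 as params approach 0.5
```

This is the same division of labor as a CPU/GPU pairing: the orchestration stays classical, and only the expensive evaluation crosses the device boundary.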

Black: It sounds like sort of a hybrid approach, almost a CPU/GPU, only we're talking about systems writ large. Tell us about DOE's and Oak Ridge's overall quantum strategy and how the Quantum Computing Institute works with vendors and academic institutions on quantum technology development.

Humble: The Oak Ridge National Laboratory has played an important role within the DOE's national laboratory system, a leading role in both research and infrastructure. In 2018, the President announced the National Quantum Initiative, which is intended to accelerate the development of quantum science and technology in the United States. Oak Ridge has taken the lead in the development of research, especially software applications and hardware, for how quantum computing can address scientific discovery.

At the same time, we've helped DOE establish a quantum computing user program; something we call QCUP. This is administered through the Oak Ridge Leadership Computing Facility and it looks for the best of the best in terms of approaches to how quantum computers could be used for scientific discovery. We provide access to the users through the user program in order for them to test and evaluate how quantum computers might be used to solve problems in basic energy science, nuclear physics, and other areas.

Black: Okay, great. So how far would you say we are from practical quantum computing and from what is referred to as quantum advantage, where quantum systems can run workloads faster than conventional or classical supercomputers?

Humble: This is such a great question. Quantum advantage, of course, is the idea that a quantum computer would be able to outperform any other conventional computing system on the planet. Very early in this fiscal year, back in October, there was an announcement from Google where they actually demonstrated an example of quantum advantage using their quantum computing hardware processor. Oak Ridge was part of that announcement, because we used our Summit supercomputer system as the baseline to compare that calculation.

But here's the rub: the Google demonstration was primarily a diagnostic check that their processor was behaving as expected, and the Summit supercomputer actually couldn't keep up with that type of diagnostic check. But when we look at the practical applications of quantum computing, still focusing on problems in chemistry, material science and other scientific disciplines, we appear to still be a few years away from demonstrating a quantum advantage for those applications. This is one of the hottest topics in the field at the moment, though. Once somebody can identify that, we expect to see a great breakthrough in how quantum computers can be used in these practical areas.

Black: Okay. So, how did you become involved in quantum in the first place? Tell us a little bit about your background in technology.

Humble: I started early on studying quantum mechanics through chemistry. My focus, early on in research, was on theoretical chemistry and understanding how molecules behave quantum mechanically. What has turned out to be one of the greatest ironies of my career is that quantum computers actually present significant opportunities to solve chemistry problems using quantum mechanics.

So I got involved in quantum computing relatively early. Certainly, the last 15 years or so have been a roller coaster ride, mainly going uphill in terms of developing quantum computers and looking at the question of how they can intersect with high-performance computing. Being at Oak Ridge, that's just a natural question for me to come across. I work every day with people who are using some of the world's fastest supercomputers in order to solve the same types of problems that we think quantum computers would be best at. So for me, the intersection between those two areas just seems like a natural path to go down.

Black: I see. Are there any other topics related to all this that you'd like to add?

Humble: I think that quantum computing has a certain mystique around it. It's an exciting area and it relies on a field of physics that many people don't yet know about, but I certainly anticipate that in the future that's going to change. This is a topic that is probably going to impact everyone's lives. Maybe it's 10 years from now, maybe it's 20 years, but it's certainly something that I think we should start preparing for in the long term, and Oak Ridge is really happy to be one of the places that is helping to lead that change.

Black: Thanks so much for your time. It was great to talk with you.

Humble: Thanks so much, Doug. It was great to talk with you, too.


NTT Research and University of Notre Dame Collaborate to Explore Continuous-Time Analog Computing – Business Wire

PALO ALTO, Calif.--(BUSINESS WIRE)--NTT Research, Inc., a division of NTT (TYO:9432), today announced that it has reached an agreement with the University of Notre Dame to conduct joint research between its Physics and Informatics (PHI) Lab and the University's Department of Physics. The five-year agreement covers research to be undertaken by Dr. Zoltán Toroczkai, a professor of theoretical physics, on the limits of continuous-time analog computing. Because the Coherent Ising Machine (CIM), an optical device that is key to the PHI Lab's research agenda, exhibits characteristics related to those of analog computers, one purpose of this project is to explore avenues for improving CIM performance.

The three primary research fields of the PHI Lab are quantum-to-classical crossover physics, neural networks and optical parametric oscillators. The work with Dr. Toroczkai addresses tradeoffs in the classical domain between analog computing performance and the ability to control variables with arbitrarily high precision. Interest in analog computing has rebounded in recent years thanks to modern manufacturing techniques and the technology's efficient use of energy, which leads to improved computational performance. Implemented with the Ising model, analog computing schemes now figure within some emerging quantum information systems. Special-purpose, continuous-time analog devices have been able to outperform state-of-the-art digital algorithms, but they also fail on some classes of problems. Dr. Toroczkai's research will explore the theoretical limits of analog computing and focus on two approaches to achieving improved performance using less precise variables, or (in the context of the CIM) a less identical pulse amplitude landscape.
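For readers unfamiliar with the Ising model, machines like the CIM search for spin assignments s_i in {-1, +1} that minimize an energy of the form E(s) = -Σ J_ij s_i s_j. A brute-force toy version, with a hypothetical 3-spin coupling matrix, shows what the hardware is racing to do at scales where enumeration becomes hopeless:

```python
import itertools

# Minimal Ising-model sketch: find spins s_i in {-1, +1} minimizing
# E(s) = -sum_{i<j} J[i][j] * s_i * s_j. Brute force works at toy size;
# the point of special-purpose analog hardware is to do this quickly
# when the number of spins is large and the couplings are frustrated.

def ising_energy(spins, J):
    n = len(spins)
    return -sum(J[i][j] * spins[i] * spins[j]
                for i in range(n) for j in range(i + 1, n))

def brute_force_ground_state(J):
    n = len(J)
    return min(itertools.product([-1, 1], repeat=n),
               key=lambda s: ising_energy(s, J))

# Hypothetical 3-spin couplings (upper triangular): positive J favors
# aligned spins, negative J favors anti-aligned spins.
J = [[0, 1, -1],
     [0, 0, 1],
     [0, 0, 0]]
ground = brute_force_ground_state(J)
print(ground, ising_energy(ground, J))
```

Since the search space doubles with every added spin, exhaustive search dies quickly; analog annealers and the CIM are, in effect, physical heuristics for this minimization.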

"We're very excited to have the University of Notre Dame and Professor Toroczkai, a specialist in analog computing, join our growing consortium of researchers engaged in rethinking the limits and possibilities of computing," said NTT Research PHI Lab Director Yoshihisa Yamamoto. "We see his work at the intersection of hard optimization problems and the analog computing systems that can efficiently solve them as very promising."

The agreement identifies research subjects and project milestones between 2020 and 2024. It anticipates Dr. Toroczkai and a graduate student conducting research at Notre Dame, adjacent to South Bend, Indiana, while collaborating with scientists at the PHI Lab in California. Recent work by Dr. Toroczkai related to this topic includes publications in Computer Physics Communications and Nature Communications. Like the PHI Lab itself, he brings to his research both domain expertise and a broad vision.

"I work in the general area of complex systems research, bringing and developing tools from mathematics, equilibrium and non-equilibrium statistical physics, nonlinear dynamics and chaos theory to bear on problems in a range of disciplines, including the foundations of computing," said Dr. Toroczkai, who is also a concurrent professor in the Department of Computer Science and Engineering and co-director of the Center for Network and Data Science. "This project with NTT Research is an exciting opportunity to engage in basic research that will bear upon the future of computing."

The NTT Research PHI Lab has now reached nine joint research projects as part of its long-range goal to radically redesign artificial neural networks, both classical and quantum. To advance that goal, the PHI Lab has established joint research agreements with six other universities, one government agency and one quantum computing software company. Those universities are California Institute of Technology (CalTech), Cornell University, Massachusetts Institute of Technology (MIT), Stanford University, Swinburne University of Technology and the University of Michigan. The government entity is NASA Ames Research Center in Silicon Valley, and the private company is 1QBit in Canada. In addition to its PHI Lab, NTT Research has two other research labs: its Cryptography and Information Security (CIS) Lab and Medical and Health Informatics (MEI) Lab.

About NTT Research

NTT Research opened its Palo Alto offices in July 2019 as a new Silicon Valley startup to conduct basic research and advance technologies that promote positive change for humankind. Currently, three labs are housed at NTT Research: the Physics and Informatics (PHI) Lab, the Cryptography and Information Security (CIS) Lab, and the Medical and Health Informatics (MEI) Lab. The organization aims to upgrade reality in three areas: 1) quantum information, neuroscience and photonics; 2) cryptography and information security; and 3) medical and health informatics. NTT Research is part of NTT, a global technology and business solutions provider with an annual R&D budget of $3.6 billion.

NTT and the NTT logo are registered trademarks or trademarks of NIPPON TELEGRAPH AND TELEPHONE CORPORATION and/or its affiliates. All other referenced product names are trademarks of their respective owners. © 2020 NIPPON TELEGRAPH AND TELEPHONE CORPORATION

See the rest here:
NTT Research and University of Notre Dame Collaborate to Explore Continuous-Time Analog Computing - Business Wire

D-Wave Appoints Daniel Ley as Senior VP of Sales and Allison Schwartz as Global Government Relations and Public Affairs Leader – AiThority

Strategic hires will drive global partnerships and expansion, and power adoption of D-Wave's next-generation quantum technology

D-Wave Systems Inc., the leader in quantum computing systems, software, and services, announced that Daniel Ley and Allison Schwartz have joined the company as Senior Vice President of Sales and Global Government Relations and Public Affairs Leader, respectively. Ley's experience in technology sales and executive leadership, and Schwartz's strong background in technology policy and public affairs, make them ideal candidates as D-Wave continues to expand its global customer and partner base and government engagement, while demonstrating business value with quantum computing today.


Daniel Ley brings over 25 years of experience in the technology and software industries. Prior to joining D-Wave, Ley was Vice President of Global Sales for the Routing Optimization and Assurance product line at the Blue Planet Software Division of Ciena, which acquired his previous company, Packet Design, in 2018. At Packet Design, a leading network analytics and management company, Ley served as Senior Vice President of Global Sales. Before that, Daniel was Vice President of Solutions Sales at CA Technologies where he oversaw product sales and sales team integration for the Hyperformix product line, and subsequently led product sales in the Virtualization and Automation Business Unit.

"D-Wave sits at the intersection of business and innovation. I know how transformative the right technology can be for enterprise success, and I'm eager to bring my expertise to an ecosystem that's evolving as quickly as quantum computing is right now," said Ley. "I have devoted my career to strategic technology sales, and I am thrilled to now play a crucial role in expanding D-Wave's customer and partner base, while driving global adoption of quantum computing via the cloud."


Allison Schwartz brings over 25 years of public policy experience to D-Wave, with a proven track record in technology policy. Most recently, as Vice President of Government Affairs at ANDE, she worked to provide a better understanding of how Rapid DNA analysis technology can prevent human trafficking and unite families. Prior to ANDE, Schwartz served as Global Government Relations Leader for Dun & Bradstreet, where she was responsible for the company's public policy and government relations efforts across the globe, and won a company-wide award for innovation in 2018.

"I've dedicated my career to working with global stakeholders on strategic worldwide issues from human rights to data analytics," said Schwartz. "Quantum computing is a technology that will soon change our world in profound ways. D-Wave is breaking down the access barriers for governments, corporations, academics and NGOs through their cloud initiatives. I'm excited to join a team where expanding public-private partnerships and relationships with policy makers and influencers can move this industry into an unprecedented period of growth while tackling significant problems facing our global community."


Read more from the original source:
D-Wave Appoints Daniel Ley as Senior VP of Sales and Allison - AiThority

Centralized databases with personal information are a looming threat to mobile ID security – Biometric Update

By Kevin Freiburger, Director of Identity Programs, Valid

The ID verification market is projected to hit $12.8 billion by 2024. Several states have joined the mobile driver's license movement, and other markets, like higher education, are adopting mobile IDs for physical access control for campus facilities, logical access to network and computer resources, and payment-card functionality.

This rapid adoption and the many use cases in the public sector have made the data security that underpins mobile ID technology a hot topic. Many implementations rely on a centralized data store managed by the ID credential issuer that protects the sensitive, personally identifiable information (PII) using an advanced, multi-layered approach that includes encryption and other techniques.

However, even these stronger security methods are at risk due to advances in the abilities of bad actors. In fact, 72% of risk managers believe that complex risks are emerging more rapidly than their own skills are advancing, putting the PII of millions in jeopardy.

Centralized, encrypted data may be threatened by quantum computing and other vulnerabilities

Encryption is a pillar of mobile ID data security. Cracking the encryption algorithms to gain access to PII requires enormous compute power, and today's compute resources fall well short of it.

Classical computers think in 1s and 0s, and a bit can hold only one of those states at a time. This caps the computational power of today's machines and makes it expensive to scale up, but the cap also makes encryption safer. It is extremely expensive and difficult to marshal the computational power necessary to break the encryption that protects data housed and stored by government institutions or other identity credential issuers. However, not all encryption is created equal, and quantum computing makes weaker encryption vulnerable.

Quantum computing can hold simultaneous states (1s and 0s at the same time). This enables extremely high levels of computational power. For example, Google researchers claimed that their quantum computer performed, in three minutes and 20 seconds, a calculation that would take other computers approximately 10,000 years to complete. In theory, this level of power could give hackers a real chance at breaking weaker encryption algorithms and gaining access to the systems storing PII.
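The article does not spell out the arithmetic, but the standard reference point for brute-force key search is Grover's algorithm, which searches an unstructured space of N keys in roughly √N queries, effectively halving a symmetric key's bit strength. A back-of-the-envelope sketch (all numbers illustrative, not from the article):

```python
# Rough query counts for brute-forcing an n-bit symmetric key.
def brute_force_queries(key_bits):
    """Classical exhaustive search: ~2^n trial decryptions."""
    return 2 ** key_bits

def grover_queries(key_bits):
    """Grover's algorithm: ~sqrt(2^n) = 2^(n/2) quantum queries."""
    return 2 ** (key_bits / 2)

for n in (128, 256):
    print(f"{n}-bit key: classical ~2^{n}, Grover ~2^{n // 2}")
```

On this accounting a 128-bit key offers only ~64-bit security against a large quantum adversary, which is why weaker or shorter-key encryption is the first casualty; public-key schemes such as RSA face the sharper threat of Shor's algorithm.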

Quantum is a risk in the future, but many other attack vectors exist today that can accidentally expose PII. These vectors include misconfigured networks and firewalls, unpatched servers and software, and insider threats executed by staff within the issuing organization.

How can ID verification systems thwart these existing and emerging threats?

To mitigate today's vulnerabilities and prepare for the emergence of quantum computing (and the inevitability that it ends up in the hands of bad actors), ID verification systems can follow two approaches.

1. Store PII outside of central databases. There are several implementation options that remove issuers as PII managers. However, the blockchain option exclusively allows for decentralized data storage and true decentralized identity, which puts the credential holder in total control. Microsoft is currently working on such a product, and other companies have similar initiatives. This approach decentralizes issuers, verifiers, credential holders and even Microsoft within the ecosystem. The credential owner alone manages the credentials and sensitive PII.

Credential verifiers (TSA, law enforcement, retailers and more) can trust the presented credential because of digital certificate technology and blockchain hashing. Verifiers can ensure the ID is authentic if the issuer uses a digital certificate, which acts as a unique signature or fingerprint that signs each piece of data. Mobile ID holders manage the sensitive data on a secure device like a mobile wallet and only share it with the verifiers they choose. A credential owner shares their data with a verifier, and the verifier can authenticate the owner's digital certificate and the issuer's digital certificate using proven technology, called public key infrastructure, that has existed for years. It's seamless to the mobile credential holder, the verifier and the mobile credential issuer.
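Real deployments rely on PKI and digital certificates, but the "blockchain hashing" idea above can be sketched with nothing more than a standard hash function: chain each record to its predecessor so that tampering with any record invalidates every later hash. A minimal stdlib-only illustration (record contents hypothetical, not a production design):

```python
import hashlib
import json

def record_hash(record, prev_hash):
    """Hash a credential record together with the previous link's hash."""
    payload = json.dumps(record, sort_keys=True).encode() + prev_hash
    return hashlib.sha256(payload).hexdigest().encode()

def build_chain(records):
    """Anchor each record to its predecessor, blockchain-style."""
    h = b"genesis"
    hashes = []
    for rec in records:
        h = record_hash(rec, h)
        hashes.append(h)
    return hashes

def verify_chain(records, hashes):
    """Recompute the chain; any tampered record breaks every later hash."""
    h = b"genesis"
    for rec, expected in zip(records, hashes):
        h = record_hash(rec, h)
        if h != expected:
            return False
    return True

creds = [{"holder": "alice", "doc": "mobile-id"},
         {"holder": "bob", "doc": "mobile-id"}]
hashes = build_chain(creds)
print(verify_chain(creds, hashes))   # the untouched chain verifies
creds[0]["doc"] = "forged"
print(verify_chain(creds, hashes))   # any edit is detected
```

A digital certificate adds the missing piece this sketch omits: a signature proving *who* produced each record, not merely that it was not altered afterwards.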

2. Authenticate credentials with biometrics. Storing PII off the chain solves just one set of problems. But how do you securely authenticate the credential holder presenting the digital credential? You add biometrics to the process.

One use case allows the owner to add extra security to protect the digital credential. For example, digital wallets could require that the credential holder present a fingerprint or face verification to unlock the wallet before sharing any credentials.

Another use case adds trustworthiness for verifiers. Issuers can include a photo in the digital credential upon issuing it and sign the photo with a digital certificate. Verifiers can capture a photo of the person presenting the credential and compare it to the photo that was issued with the digital credential. If the biometrics match, the person presenting the credential is verified. And with AI now imitating more than just the human response to CAPTCHAs, mobile ID data security may begin using physiological biometrics as well: methods like heartbeat or voice that bots cannot imitate.
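Production face matching uses trained recognition models, but the final comparison step they all share can be sketched simply: each photo is reduced to an embedding vector, and two photos "match" when the vectors are close enough. A minimal illustration, assuming embeddings are already extracted (the vectors and threshold below are hypothetical):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def faces_match(enrolled, presented, threshold=0.9):
    """Declare a match when the embeddings are sufficiently similar."""
    return cosine_similarity(enrolled, presented) >= threshold

enrolled = [0.12, 0.87, 0.45, 0.33]          # embedding from the issued photo
same_person = [0.11, 0.85, 0.47, 0.30]       # hypothetical live capture
impostor = [0.90, 0.10, 0.05, 0.70]

print(faces_match(enrolled, same_person))    # True
print(faces_match(enrolled, impostor))       # False
```

The threshold is the operational knob: raising it reduces false accepts at the cost of more false rejects, a tradeoff every verifier has to tune for its risk tolerance.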

Mobile IDs are gaining popularity and will continue to spread as adoption is normalized. But as with all novel technologies, data security should be a top priority for those responsible for rolling the technology out to the public. Encryption is critical, but AI and quantum threats are emerging and other vulnerabilities already exist. It is more important than ever to consider other solutions to protect sensitive PII, beginning with removing PII from centralized databases.

About the author

Kevin Freiburger is Director of Identity Programs and Product Management at Valid where he leads a team that builds and delivers large-scale identity management and biometric matching solutions to public and private enterprises.

DISCLAIMER: Biometric Update's Industry Insights are submitted content. The views expressed in this post are those of the author, and don't necessarily reflect the views of Biometric Update.


Go here to read the rest:
Centralized databases with personal information are a looming threat to mobile ID security - Biometric Update

Innovation Inc: The CEOs of Chewy and Honeywell talk digital transformation – Business Insider

Chewy and Honeywell.

At first glance, the two companies couldn't be more different. One specializes in online pet products, while the other is an industrial giant in the midst of pivoting to software. But after getting the chance to talk to the CEOs of both organizations recently, I believe that they may have more in common than one would initially assume.

Underscoring each company's strategy is a relentless pursuit of new markets while finding ways to better serve existing customers. While that's not an entirely novel concept for multi-billion-dollar corporations, what Honeywell's Darius Adamczyk and Chewy's Sumit Singh also have in common is an appetite to double down on technology to achieve that goal.

For Honeywell, that looks like new office automation tools and a gamble on quantum computing, while for Chewy, it's the company's first fully-automated factory.

These aren't, of course, the only initiatives underway at either firm. But they're notable, largely because the efforts encapsulate so well the broader push to make digital technologies a focus, particularly as the coronavirus pandemic continues to force companies to innovate faster than ever before.

And for both Honeywell and Chewy, it's as much a cultural focus as a technological one.

Singh, for example, is making Chewy a place that embraces risk-taking and failure so that "every person inside the organization is an evangelist for inventiveness."

And at Honeywell, Adamczyk has left his mark on the organization by elevating Forge, the firm's software division, from a background player that worked across different verticals to its own business unit. It's one of the most visible aspects of the company's transition to a software business, and a signal to employees that Forge is an important part of Honeywell's future.

Singh and Adamczyk aren't the only execs we've talked to about digital transformation lately, either: Below are a few other stories that you may have missed from the last two weeks. And as always: If you're interested in receiving this biweekly newsletter and other updates from our ongoing Innovation Inc. series, please be sure to sign up here.

Go here to see the original:
Innovation Inc: The CEOs of Chewy and Honeywell talk digital transformation - Business Insider - Business Insider

Assistant Professor in Computer Science job with Indiana University | 286449 – The Chronicle of Higher Education

The Luddy School of Informatics, Computing, and Engineering at Indiana University (IU) Bloomington invites applications for a tenure-track assistant professor position in Computer Science to begin in Fall 2021. We are particularly interested in candidates with research interests in formal models of computation, algorithms, information theory, and machine learning with connection to quantum computation, quantum simulation, or quantum information science. The successful candidate will also be a Quantum Computing and Information Science Faculty Fellow, supported in part for the first three years by an NSF-funded program that aims to grow academic research capacity in the computing and information science fields to support advances in quantum computing and/or communication over the long term. For additional information about the NSF award please visit: https://www.nsf.gov/awardsearch/showAward?AWD_ID=1955027&HistoricalAwards=false. The position allows the faculty member to collaborate actively with colleagues from a variety of outside disciplines, including the departments of physics, chemistry, mathematics and intelligent systems engineering, under the umbrella of the Indiana University-funded "quantum science and engineering center" (IU-QSEc). We seek candidates prepared to contribute to our commitment to diversity and inclusion in higher education, especially those with experience in teaching or working with diverse student populations. Duties will include research, teaching multi-level courses both online and in person, participating in course design and assessment, and service to the School. Applicants should have a demonstrable potential for excellence in research and teaching and a PhD in Computer Science or a related field expected before August 2021. Candidates should review application requirements, learn more about the Luddy School and apply online at: https://indiana.peopleadmin.com/postings/9841. For full consideration, submit an online application by December 1, 2020.

Applications will be considered until the positions are filled. Questions may be sent to sabry@indiana.edu. Indiana University is an equal employment and affirmative action employer and a provider of ADA services. All qualified applicants will receive consideration for employment without regard to age, ethnicity, color, race, religion, sex, sexual orientation, gender identity or expression, genetic information, marital status, national origin, disability status or protected veteran status.

Go here to see the original:
Assistant Professor in Computer Science job with Indiana University | 286449 - The Chronicle of Higher Education

Key Players and Initiatives in the Quantum Technology Market 2020 – PRNewswire

DUBLIN, Sept. 10, 2020 /PRNewswire/ -- The "Quantum Computing - A New Paradigm Nears the Horizon" report has been added to ResearchAndMarkets.com's offering.

This study looks into the current state of development of quantum computing, as well as its future outlook.

The study proposes an accessible description of the new computing paradigm brought by quantum technology and presents the potential applications and benefits that the new approach would bring. It also focuses on the potential consequences for cybersecurity and telecommunications.

Alongside the perspective it offers on the current quantum computing ecosystem, the study outlines a vision of the current state of development of the technology. This includes an analysis of the positioning of key players (IBM, Microsoft, D-Wave) and of the investment programmes of some 12 key nations, including the USA, China, parts of the EU, Russia, Japan and South Korea.

Finally, the study analyses the likely development of the technology and its foreseeable impacts.

Key Topics Covered:

1. Executive Summary

2. Quantum Technology Definitions
2.1. Quantum computing glossary
2.2. Quantum properties and principles for a quantum computer

3. Quantum Computing Technologies
3.1. Scope of the study
3.2. The two main approaches to quantum computing
3.3. Analog-quantum computing (AQC)
3.4. Gate-based quantum computing
3.5. Qubit: state of the art

4. State of Play of Quantum Technology
4.1. Quantum computing foreseen benefits
4.2. Quantum foreseen limitations
4.3. The quantum computing race
4.4. How to compare performance in quantum computing?
4.5. Milestones and limitations for quantum computing
4.6. Quantum supremacy: another milestone reached in 2019?

5. Quantum Technology Applications
5.1. Quantum computing potential applications
5.2. Most important QC potential applications
5.3. Quantum computing: potential applications
5.4. Focus: Quantum computing impact on cryptography
5.5. The solution to build a post-quantum secure system

6. Key Players and Initiatives
6.1. Private Player Profiles

6.2. Public Initiatives

7. Analysis and Perspectives
7.1. Technology Perspective: Common misconceptions on quantum computing
7.2. Ecosystems analysis
7.3. Cybersecurity perspective
7.4. Perspectives of development
7.5. The vision of future development

For more information about this report visit https://www.researchandmarkets.com/r/y1svm5

About ResearchAndMarkets.com

ResearchAndMarkets.com is the world's leading source for international market research reports and market data. We provide you with the latest data on international and regional markets, key industries, the top companies, new products and the latest trends.

Research and Markets also offers Custom Research services providing focused, comprehensive and tailored research.

Media Contact:

Research and Markets
Laura Wood, Senior Manager
[emailprotected]

For E.S.T Office Hours Call +1-917-300-0470
For U.S./CAN Toll Free Call +1-800-526-8630
For GMT Office Hours Call +353-1-416-8900

U.S. Fax: 646-607-1907
Fax (outside U.S.): +353-1-481-1716

SOURCE Research and Markets

http://www.researchandmarkets.com

Follow this link:
Key Players and Initiatives in the Quantum Technology Market 2020 - PRNewswire