
Category Archives: Quantum Computing

IBM quantum computing: From healthcare to automotive to energy, real use cases are in play – TechRepublic

Posted: September 4, 2021 at 6:17 am

Companies including Anthem, Daimler-Benz, BP and ExxonMobil have big plans to deploy IBM quantum computers this decade.

Image: PhonlamaiPhoto, Getty Images/iStockphoto

Quantum computers have been receiving a lot of attention because of their potential to solve computationally difficult problems that classical computers cannot. Among those problems are the abilities to help companies reduce their carbon footprint and protect the world from the next pandemic.

SEE: The CIO's guide to quantum computing (free PDF) (TechRepublic)

Since announcing the IBM Quantum Network in 2017 with 12 initial organizations, today IBM said its commercial quantum computing program has grown to more than 150 Fortune 500 companies, academic institutions, startups and national research labs. Some 360,000 users have run nearly a trillion circuits, according to the company.

Last spring, IBM rolled out Qiskit Runtime, which can speed up quantum programs on the cloud by 120x, and reiterated its plan to deliver a 1,000+ qubit system by 2023.

In addition to increasing speed, Qiskit Runtime changes how IBM is able to offer quantum computing to clients and make it more widely available, the company said.

"Before we got to Runtime, clients were doing research using simulators," and now they can investigate applications for finance, machine learning and chemistry using real hardware, said Jay Gambetta, IBM fellow and vice president of quantum computing.

"To me, this is fundamentally important because a simulator can never mimic quantum computing, so you need to do your research and development on the hardware and that's what's getting enabled," Gambetta said. "I see this year as when this fully comes out of beta and will be the new way of using quantum" to ask questions such as whether quantum will scale in the way clients can use apps.

In the meantime, customers are incorporating quantum into their plans for the future. At healthcare provider Anthem, quantum computing is "an integral part of our digital platform for health," and is being used for "computationally intense and expensive tasks such as identifying anomalies, where there's tons of data and interactions," said John Utz, staff vice president of digital product management.

Quantum computers are better at that than classical computers, Utz said. Anthem is running different models on IBM's quantum cloud. Right now, company officials are building a roadmap around how Anthem wants to deliver its platform using quantum technology, so "I can't say quantum is ready for primetime yet," Utz said. "The plan is to get there over the next year or so and have something working in production."

SEE: Expert: Now is the time to prepare for the quantum computing revolution (TechRepublic)

A good place to start with anomaly detection is in finding fraud, he said. "Classical computers will tap out at some point and can't get to the same place as quantum computers."

Other use cases are around longitudinal population health modeling, meaning that as Anthem looks at providing more of a digital platform for health, one of the challenges is that there is "almost an infinite number of relationships," he said. This includes different health conditions, providers patients see, outcomes and figuring out where there are outliers, he said.

"There's only so much a classical system can do there, so we're looking for more opportunities to improve healthcare for our members and the population at large," and the ability to proactively predict risk, Utz said. Quantum computers are better at driving outcomes from the models Anthem is building, he said.

At BP, the ability "to model and predict the physical world has always been constrained by available compute power," said Morag Watson, chief digital innovation officer. As the energy company focuses on sustainability and net-zero carbon energy, the need for even higher performance computing will increase, she said.

Further, "tackling some of the world's toughest problems, including climate change, will require orders of magnitude increases of compute power over today's conventional capabilities," Watson said. BP sees quantum computing as critical to enabling it to pursue its ambition to become a net-zero company by 2050 or sooner and to help the world get to net zero, Watson said.

Specifically, BP is working on proofs of concept using quantum systems in several fields, including chemistry, materials science, optimization, machine learning and partial differential equations, she said. Echoing Anthem's Utz, Watson added, "These are problems that can't be solved on classical computers today."

Daimler AG, the parent company of Mercedes-Benz, is studying how to develop energy-dense batteries such as the lithium-sulfur battery. But going from the drawing board to a commercially viable Li-S battery is "essentially a mammoth chemistry experiment," the company said.

Engineers are testing quantum systems to distill some very abstract physics theory into a new kind of computing power that can handle what IBM calls "once-insoluble complexity." Because each quantum bit, or qubit, added to a system doubles its computational state space, quantum hardware promises a substantial boost to the algorithms used to speed the simulation process and test the feasibility of the battery, the company said.
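The "doubling" here refers to state-space growth: each qubit added to a register doubles the number of complex amplitudes it can represent, which is why modest hardware growth expands simulation capacity so quickly. A quick generic illustration (not tied to IBM's or Daimler's tooling):

```python
# Each additional qubit doubles the quantum state space: a register of
# n qubits is described by 2**n complex amplitudes.
def state_space_size(n_qubits: int) -> int:
    """Number of basis states spanned by an n-qubit register."""
    return 2 ** n_qubits

for n in (1, 10, 50):
    print(f"{n} qubits -> {state_space_size(n):,} basis states")
# 50 qubits already span 1,125,899,906,842,624 basis states -- more
# than a classical simulator can comfortably hold in memory.
```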

Energy challenges are expected to increase as the global population grows from 7.5 billion today to a projected 9.2 billion by 2040, according to ExxonMobil. This has created what the company refers to as the "dual challenge" of providing reliable and affordable energy to a rising population while also reducing environmental impacts and the risks of climate change.

One way to tackle that challenge in the near term is to use natural gas, which emits up to 60% less greenhouse gas than coal, according to Dr. Vijay Swarup, vice president of research and development at ExxonMobil, in a statement. Natural gas, however, creates its own issues with production and transportation, he said.

SEE: Startup claims new "quantum analog computer" solved the traveling salesman problem for 128 cities (TechRepublic)

Using natural gas at scale requires efficient liquefied natural gas (LNG) shipping, but finding optimal routes for a fleet of LNG ships to transport critical fuel supplies is a "mind-bendingly complex optimization problem." It involves accounting for each ship's position every day of the year along with the LNG requirements of each delivery site.

This type of problem cannot be solved exactly with classical computing, IBM said. So ExxonMobil, in tandem with IBM Research, is using a combination of classical and quantum computers to address the complexity. Teams are modeling maritime inventory routing on quantum devices, analyzing the strengths and tradeoffs of different strategies for vehicle and inventory routing, and laying the foundation for constructing practical solutions for their operations, IBM said.
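The combinatorial blow-up behind that claim can be seen in miniature. The sketch below brute-forces the best visiting order for just four delivery sites using toy distances invented purely for illustration (nothing here reflects ExxonMobil's actual model); each added site multiplies the search space, since n sites admit n! orderings.

```python
from itertools import permutations
from math import factorial

# Toy symmetric distance matrix between a depot (index 0) and four
# delivery sites. The numbers are invented for illustration only.
dist = [
    [0, 4, 9, 7, 3],
    [4, 0, 5, 8, 6],
    [9, 5, 0, 2, 8],
    [7, 8, 2, 0, 5],
    [3, 6, 8, 5, 0],
]

def route_cost(route):
    """Total distance for depot -> sites in the given order -> depot."""
    legs = [0, *route, 0]
    return sum(dist[a][b] for a, b in zip(legs, legs[1:]))

# Exhaustive search over all visiting orders: n sites -> n! candidates.
best = min(permutations(range(1, 5)), key=route_cost)
print(best, route_cost(best))  # → (1, 2, 3, 4) 19

# Exhaustion stops scaling almost immediately: 20 sites already give
# ~2.4 quintillion possible orderings.
print(factorial(20))
```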

Swarup said ExxonMobil's goal is to increase its ability to tackle more complex optimizations and previously insoluble routing problems as IBM's quantum hardware scales from small prototype systems to larger devices.



Posted in Quantum Computing | Comments Off on IBM quantum computing: From healthcare to automotive to energy, real use cases are in play – TechRepublic

Fast Tool Developed for Quantum Computing and Communication – HPCwire

Posted: at 6:17 am

JOHANNESBURG, Sept. 1, 2021 Isaac Nape, an emerging South African talent in the study of quantum optics, is part of a crack team of Wits physicists who led an international study that revealed the hidden structures of quantum entangled states. The study was published in the renowned scientific journal, Nature Communications, on Aug. 27, 2021.

Nape is pursuing his Ph.D. at Wits University and focuses on harnessing structured patterns of light for high-dimensional information encoding and decoding for use in quantum communication.

Earlier this year he scooped up two awards at the South African Institute of Physics (SAIP) conference to add to his growing collection of accolades in the field of optics and photonics. He won the award for Best Ph.D. oral presentation in applied physics, and jointly won the award for Best Ph.D. oral presentation in photonics.

In May, he was also awarded the prestigious 2021 Optics and Photonics Education Scholarship from the SPIE, the international society for optics and photonics, for his potential contributions to the field of optics, photonics or related field.

Faster and More Secure Computing

Now Nape and his colleagues at Wits, together with collaborators from Scotland and Taiwan, offer a new and fast tool for quantum computing and communication. Quantum states that are entangled in many dimensions are key to emerging quantum technologies, where more dimensions mean a higher quantum bandwidth (faster) and better resilience to noise (more secure), crucial for fast, secure communication and for speedups in error-free quantum computing.

"What we have done here is to invent a new approach to probing these high-dimensional quantum states, reducing the measurement time from decades to minutes," Nape explains.

Nape worked with Distinguished Professor Andrew Forbes, lead investigator on this study and Director of the Structured Light Laboratory in the School of Physics at Wits, as well as postdoctoral fellow Dr. Valeria Rodriguez-Fajardo, visiting Taiwanese researcher Dr. Hasiao-Chih Huang, and Dr. Jonathan Leach and Dr. Feng Zhu from Heriot-Watt University in Scotland.

Are You Quantum or Not?

In their paper titled "Measuring dimensionality and purity of high-dimensional entangled states," the team outlined a new approach to quantum measurement, testing it on a 100-dimensional quantum entangled state.

With traditional approaches, the time of measurement increases unfavorably with dimension, so that unraveling a 100-dimensional state by full quantum state tomography would take decades. Instead, the team showed that the salient information of the quantum system (the number of dimensions entangled and their level of purity) could be deduced in just minutes. The new approach requires only simple projections that could easily be done in most laboratories with conventional tools. Using light as an example, the team used an all-digital approach to perform the measurements.

The problem, explains Nape, is that while high-dimensional states are easily made, particularly with entangled particles of light (photons), they are not easy to measure: the existing toolbox for measuring and controlling them is almost empty.

"You can think of a high-dimensional quantum state like the faces of a die. A conventional die has six faces, numbered one through six, for a six-dimensional alphabet that can be used for computing or for transferring information in communication. To make a high-dimensional die means creating dice with many more faces: 100 dimensions equals 100 faces, a rather complicated polyhedron."

"In our everyday world, it would be easy to count the faces to know what sort of resource we had available to us, but not so in the quantum world. In the quantum world, you can never see the whole die, so counting the faces is very difficult. The way we get around this is to do a tomography, as they do in the medical world, building up a picture from many, many slices of the object," explains Nape.

But the information in quantum objects can be enormous, so the time for this process is prohibitive. A faster approach is a Bell measurement, a famous test to tell if what you have in front of you is entangled, like asking it: "Are you quantum or not?" But while this confirms quantum correlations of the die, it doesn't say much about the number of faces it has.
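A rough back-of-the-envelope comparison makes the scale concrete. It assumes, purely for illustration, that full tomography of a d-dimensional entangled pair needs on the order of d^4 measurement settings while a projection-based probe needs on the order of d^2; the exact exponents depend on the scheme and are not figures from the paper.

```python
# Back-of-the-envelope comparison of measurement counts. The scalings
# below are illustrative assumptions, not figures from the paper:
# full state tomography of a d-dimensional entangled pair is taken to
# need ~d**4 settings, a projection-based probe ~d**2.
def tomography_settings(d: int) -> int:
    return d ** 4

def projection_settings(d: int) -> int:
    return d ** 2

for d in (2, 10, 100):
    print(f"d={d}: tomography ~{tomography_settings(d):,}, "
          f"projections ~{projection_settings(d):,}")
# At d=100: ~100,000,000 settings versus ~10,000 -- the kind of gap
# that separates "decades" from "minutes" of laboratory time.
```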

Chance Discovery

"Our work circumvented the problem by a chance discovery: that there is a set of measurements that is not a tomography and not a Bell measurement, but that holds important information of both," says Nape. "In technical parlance, we blended these two measurement approaches to do multiple projections that look like a tomography, but measured the visibilities of the outcome, as if they were Bell measurements. This revealed the hidden information that could be extracted from the strength of the quantum correlations across many dimensions."

First and Fast

The combination of speed from the Bell-like approach and information from the tomography-like approach meant that key quantum parameters such as dimensionality and the purity of the quantum state could be determined quickly and quantitatively, the first approach to do so.

"We are not suggesting that our approach replace other techniques," says Forbes. "Rather, we see it as a fast probe to reveal what you are dealing with, and then use this information to make an informed decision on what to do next. A case of horses-for-courses."

For example, the team see their approach as changing the game in real-world quantum communication links, where a fast measurement of how noisy a quantum state has become, and what this has done to the useful dimensions, is crucial.

To read the study in Nature Communications, visit: DOI: 10.1038/s41467-021-25447-0.



Leading Chinese researchers are looking at the coming quantum revolution – The Press Stories

Posted: at 6:17 am

Quantum technologies refer to engineered systems that use the quantum properties of photons, electrons, atoms or molecules. Lasers, magnetic resonance imaging and the global positioning system, for example, are closely linked to quantum technology, China Radio reports.

Zhao Bowen, a recent graduate of the University of Science and Technology of China and a doctoral candidate in optical engineering, left a well-paying job with an annual salary of 600,000 yuan ($92,300) to start his own business built on quantum sensing, a foundation of quantum computing technology. "For example, when a new object is discovered, we can assess whether the equipment is working properly by detecting any change in the electromagnetic field around it," he explains.

Currently, common methods for checking the quality of electrical equipment, such as temperature measurement and ultraviolet radiation, are good for detecting large thermal defects, but are not as effective for small ones or for predicting potential problems that may develop rapidly.

According to Zhao Bowen, China already holds advantages over leading countries in some areas of quantum technology, but lags behind in others, such as the sensors used for facial recognition.

Another young expert who has linked his career with quantum technology is Ha Yu from Sichuan Province. Before completing his doctorate, he launched a new product in the field of quantum sensors, which have great potential for detecting microscopic structures such as cells and protein molecules. In 2016, he founded CIQTEK, a company specializing in quantum computing that has attracted considerable interest from investors. "We have investors like iFlytek and Hillhouse Capital behind us, and the government has given us some support," he said.

Zhao Bowen and Ha Yu are part of a growing army of experts in quantum physics, quantum computing and related industries in China. According to statistics, more than 4,000 companies related to quantum technology were created in the first half of this year, an increase of 652% year over year.



Bull Of The Day: AMD (AMD) – Yahoo Finance

Posted: at 6:17 am

AMD AMD has taken the world of advanced chip technology by storm, with revolutionary CEO Lisa Su transforming this discount semiconductor enterprise into a leading-edge innovator. Since Su took the helm in 2014, AMD shares have skyrocketed an astonishing 3,200% (a $1,000 investment would have yielded $32,000 in returns).

The pandemic's digitalizing economic impact pulled forward an enormous amount of demand for AMD's innovation-driven chips, demand that will only grow from here. This semiconductor powerhouse has produced record top- and bottom-line results for the past 4 consecutive quarters, blowing past analysts' estimates each time. AMD achieved record profit margins, and management raised its guidance for the remainder of 2021. Now, analysts across the board are driving up their EPS estimates, propelling AMD into a Zacks Rank #1 (Strong Buy).

The last time AMD reached a Zacks Rank #1, it shot up 67.5% in just 1.5 months (July 17 to September 1, 2020). AMD's August consolidation looks to be presenting us with an excellent entry point as demand for next-generation chip technology continues to soar, providing AMD with pricing power and an incentive to push the boundaries of innovation.

AMD's Sights Set To The Future

AMD is already pushing the limits of possibilities with its latest patent filing, which unveiled a quantum-computing processor that would utilize teleportation. This patent addresses the stability and scalability issues that current quantum-computing frameworks have been struggling with and could revolutionize the world of computing if achieved. The technology may still be years away from commercial viability, but this patent filing illustrates AMD's focus on the 4th Industrial Revolution.

Quantum computing is a nascent space, but there is an enormous amount of capital flowing into its development due to the astronomical competitive advantage it would provide. In 2019, Google's GOOGL quantum-computer Sycamore proved its ability to solve a complex mathematical equation 158 million times faster than the world's fastest binary supercomputer (IBM's Summit). If AMD could attain a competitive edge in the quantum-computing space, the profit potential would be boundless.


As for near-term goals, AMD is expected to release its 5nm 'Zen 4' high-performance CPU in the second quarter of 2022, which will sustain this chip designer's high-performance leadership in the space. This next-generation computer processor will be up to 40% faster than the currently available 'Zen 3,' and will almost certainly be the go-to CPU for data centers (EPYC) and consumer desktops & mobile processors (RYZEN) alike, as Intel lags the innovative curve.

AMD Takeover

While Intel INTC has seemingly fallen asleep at the wheel with its once-leading CPUs, AMD was provided with the rare opportunity to jump into the driver's seat of a market that had been monopolized for half a century. Intel's inability to match Taiwan Semi's TSM third-party manufacturing capabilities (AMD's preferred fabricator) with its own in-house operations, combined with other systemic supply chain issues, has propelled AMD at least 3 years ahead of it (on a generous scale).

Following a strongly worded letter from an activist investor group, Intel decided enough was enough and brought Pat Gelsinger on as its new CEO in February of this year. The company will be hard-pressed in this game of innovative catch-up to maintain its long-standing corporate relationships, with AMD's CPU technology showing clear performance advantages.

According to PassMark, AMD now controls 40% of the total CPU space, while Intel sits at 60%. AMD has more than doubled its market share in the past 5 years and will progressively control more in the coming years as Intel attempts to restore its leadership. TSMC's accelerating capabilities will be the backbone to AMD's future success, and I don't see Intel's in-house manufacturing catching up to TSMC anytime soon.

AMD is also a leader in the graphics processing unit (GPU) duopoly with NVIDIA NVDA. However, it has not been as successful in competing with this revolutionary chipmaker, which has been taking a growing portion of market share in the space. Still, the GPU segment gives AMD a more diversified product portfolio and a hedge against the volatile chip business cycle.

The Financials

AMD has demonstrated accelerating revenue growth, with sales swelling by 99% in the past quarter, which flowed down to margin-expanded profits up 350% from a year prior. This chip innovator is expected to see double-digit annualized growth on both its top and bottom line for years to come.

AMD's balance sheet is a fortress, with more liquid capital than total liabilities, meaning the risk of default is effectively 0, especially when factoring in its exponentially appreciating quarterly cash-flows.

AMD is a seemingly expensive stock with a forward P/E of 38.4x, far above the semiconductor industry average of 22x. However, when you factor growth into this valuation multiple (PEG), the company is trading at a discount to both the chip sector and its own 3-year average.

Final Thoughts

My bet on AMD is a bet on Lisa Su. She has been AMD's innovation catalyst, invigorating this discount chipmaker into a high-performance, high-growth market leader. I am confident that she will continue to drive this technological backbone above and beyond expectations.

17 out of 25 analysts call AMD a buy today (0 sell ratings), with recent price targets being raised as high as $150 a share (over 35% upside from here). The 4th Industrial Revolution is upon us, and it's time to start investing in it.




Top 10 Data Center Stories of the Month: August 2021 – Data Center Knowledge

Posted: at 6:17 am

It May Be Too Early to Prepare Your Data Center for Quantum Computing: Before some fundamental questions are answered, it's hard to predict what shape quantum computing will take at scale.

Google, Amazon, Microsoft Share New Security Efforts After White House Summit: The news arrives after tech company leaders met with President Biden to discuss the public-private partnership needed to address security threats.

What Has to Happen for Quantum Computing to Hit Mainstream? Data Center World keynote: It's still early days for quantum computing, where the fundamental technology remains unsettled, and the nature of workloads is fuzzy.

Open Compute Project: Redefining Open Source for the Data Center: OCP expanded the meaning of "open source" beyond software to address the same problems open source software is meant to address.

Taking a Close Look at the $2B for Cybersecurity in the $1T US Infrastructure Bill: The $1 trillion spending package includes funds for bolstering cybersecurity posture in critical digital infrastructure.

The Intersection of Colocation and Hybrid Cloud Remains in Flux: All colo providers recognize a business opportunity in the hybrid cloud trend. How they're going after it differs widely.

How Much Does Hard Disk Temperature Matter? Tracking hard disk temperature can help avoid disk failure, and the consequences of disk failure.

Digital Realty's Hybrid Cloud Strategy Rests On Connectivity, Partnerships: The company's focus is on making connectivity easier for customers, while partners enable hybrid architecture solutions.

Pilot in Austin to Offer Early Look at Edge Computing at Scale: A group is deploying dozens of nodes that combine compute, connectivity, and sensors in a uniform fashion.

Nvidia Gives Upbeat Forecast Even as Supplies Remain Tight: Its data center unit, which sells GPU accelerators for supercomputers and AI, had sales of $2.37 billion in the quarter, up 35% from a year earlier.

Will Cloudflare's Zero-Carbon Pledge Make a Real Impact? Its commitment to 100% renewable energy operations and removing historic emissions is laudable, but complex challenges limit its ambitions compared with hyperscalers.



Quantum computers could read all your encrypted data. This ‘quantum-safe’ VPN aims to stop that – ZDNet

Posted: August 28, 2021 at 11:49 am


To protect our private communications from future attacks by quantum computers, Verizon is trialing the use of next-generation cryptography keys to protect the virtual private networks (VPNs) that are used every day by companies around the world to prevent hacking.

Verizon implemented what it describes as a "quantum-safe" VPN between one of the company's labs in London in the UK and a US-based center in Ashburn, Virginia, using encryption keys generated with post-quantum cryptography methods, meaning that they are robust enough to withstand attacks from a quantum computer.

According to Verizon, the trial successfully demonstrated that it is possible to replace current security processes with protocols that are quantum-proof.

VPNs are a common security tool used to protect connections made over the internet, by creating a private network from a public internet connection. When a user browses the web with a VPN, all of their data is redirected through a specifically configured remote server run by the VPN host, which acts as a filter that encrypts the information.

This means that the user's IP address and any of their online activities, from sending emails to paying bills, come out as gibberish to potential hackers even on insecure networks like public WiFi, where eavesdropping is much easier.

Especially in the last few months, which have seen many employees switching to full-time working from home,VPNs have become an increasingly popular tool to ensure privacy and security on the internet.

The technology, however, is based on cryptography protocols that are not unhackable. To encrypt data, VPN hosts use encryption keys that are generated by well-established algorithms such as RSA (Rivest-Shamir-Adleman). The difficulty of cracking the key, and therefore of reading the data, is directly linked to the algorithm's ability to create as complicated a key as possible.

In other words, encryption protocols as we know them are essentially a huge math problem for hackers to solve. With existing computers, cracking the equation is extremely difficult, which is why VPNs, for now, are still a secure solution. But quantum computers are expected to bring about huge amounts of extra computing power and with that, the ability to hack any cryptography key in minutes.
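That "huge math problem" can be made concrete with a toy. Recovering an RSA private key amounts to factoring a large semiprime, and the naive classical attack, trial division, scales with the square root of the modulus. The sketch below uses the tiny textbook modulus 3233; real RSA moduli are 2048 bits or more, far beyond any brute-force approach.

```python
# Toy illustration of the math problem behind RSA-style keys:
# recovering the private key amounts to factoring a semiprime N, and
# naive trial division must test on the order of sqrt(N) candidates.
# 3233 = 53 * 61 is the classic textbook example; real moduli are
# astronomically larger.
def factor_semiprime(n: int) -> tuple:
    """Return the two factors of a semiprime by brute-force trial division."""
    f = 2
    while n % f:
        f += 1
    return (f, n // f)

print(factor_semiprime(3233))  # → (53, 61)
```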

"A lot of secure communications rely on algorithms which have been very successful in offering secure cryptography keys for decades," Venkata Josyula, the director of technology at Verizon, tells ZDNet. "But there is enough research out there saying that these can be broken when there is a quantum computer available at a certain capacity. When that is available, you want to be protecting your entire VPN infrastructure."

One approach that researchers are working on consists of developing algorithms that can generate keys that are too difficult to hack, even with a quantum computer. This area of research is known as post-quantum cryptography, and is particularly sought after by governments around the world.

In the US, for example, the National Institute of Standards and Technology (NIST) launched a global research effort in 2016 calling on researchers to submit ideas for algorithms that would be less susceptible to a quantum attack. A few months ago, the organization selected a group of 15 algorithms that showed the most promise.

"NIST is leading a standardization process, but we didn't want to wait for that to be complete because getting cryptography to change across the globe is a pretty daunting task," says Josyula. "It could take 10 or even 20 years, so we wanted to get into this early to figure out the implications."

Verizon has significant amounts of VPN infrastructure and the company sells VPN products, which is why the team started investigating how to start enabling post-quantum cryptography right now and in existing services, Josyula adds.

One of the 15 algorithms identified by NIST, called Saber, was selected for the test. Saber generated quantum-safe cryptography keys that were delivered to the endpoints in London and Ashburn of a typical IPsec VPN through an extra layer of infrastructure, which was provided by a third-party vendor.

Whether Saber makes it to the final rounds of NIST's standardization process, in this case, doesn't matter, explains Josyula. "We tried Saber here, but we will be trying others. We are able to switch from one algorithm to the other. We want to have that flexibility, to be able to adapt in line with the process of standardization."

In other words, Verizon's test has shown that it is possible to implement post-quantum cryptography candidates on infrastructure links now, with the ability to migrate as needed between different candidates for quantum-proof algorithms.
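The algorithm-swapping flexibility Josyula describes is often called crypto-agility. A minimal sketch of the idea, with invented names and a dummy stand-in algorithm (this is not Verizon's implementation, and a real deployment would wrap a vetted post-quantum library rather than hand-rolled primitives), keeps the key-establishment flow independent of whichever candidate is plugged in:

```python
from dataclasses import dataclass
from typing import Callable, Tuple

# Generic key-encapsulation interface: the tunnel logic below never
# mentions a specific algorithm, so Saber (or any other candidate)
# could be swapped in behind this interface. All names are invented
# for illustration.
@dataclass
class KeyExchange:
    name: str
    generate_keypair: Callable[[], Tuple[bytes, bytes]]  # -> (public, secret)
    encapsulate: Callable[[bytes], Tuple[bytes, bytes]]  # public -> (ciphertext, shared)
    decapsulate: Callable[[bytes, bytes], bytes]         # (secret, ciphertext) -> shared

def establish_tunnel_key(kem: KeyExchange) -> bytes:
    """One key-establishment round, independent of the chosen algorithm."""
    public, secret = kem.generate_keypair()
    ciphertext, client_shared = kem.encapsulate(public)
    server_shared = kem.decapsulate(secret, ciphertext)
    assert client_shared == server_shared, "endpoints must agree on the key"
    return server_shared

# Dummy stand-in algorithm so the flow can be exercised end to end.
toy_kem = KeyExchange(
    name="toy-demo",
    generate_keypair=lambda: (b"public", b"secret"),
    encapsulate=lambda public: (b"ciphertext", b"shared-key"),
    decapsulate=lambda secret, ciphertext: b"shared-key",
)

print(establish_tunnel_key(toy_kem))  # → b'shared-key'
```

Migrating between candidates then means registering a different `KeyExchange` instance, not rewriting the VPN logic.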

This is important because, although a large-scale quantum computer could be more than a decade away, there is still a chance that the data that is currently encrypted with existing cryptography protocols is at risk.

The threat is known as "harvest now, decrypt later" and refers to the possibility that hackers could collect huge amounts of encrypted data and sit on it while they wait for a quantum computer to come along that could read all the information.

"If it's your Amazon shopping cart, you may not care if someone gets to see it in ten years," says Josyula. "But you can extend this to your bank account, personal number, and all the way to government secrets. It's about how far into the future you see value for the data that you own and some of these have very long lifetimes."

For this type of data, it is important to start thinking about long-term security now, which includes the risk posed by quantum computers.

A quantum-safe VPN could be a good start even though, as Josyula explains, many elements still need to be smoothed out. For example, Verizon still relied on standard mechanisms in its trial to deliver quantum-proof keys to the VPN end-points. This might be a sticking point, if it turns out that this phase of the process is not invulnerable to quantum attack.

The idea, however, is to take proactive steps to prepare, instead of waiting for the worst-case scenario to happen. Connecting London to Ashburn was a first step, and Verizon is now looking at extending its quantum-safe VPN to other locations.



Expert: Now is the time to prepare for the quantum computing revolution – TechRepublic

Posted: at 11:49 am

Though quantum computing is likely five to 10 years away, waiting until it happens will put your organization behind. Don't play catch-up later.

TechRepublic's Karen Roby spoke with Christopher Savoie, CEO and co-founder of Zapata Computing, a quantum application company, about the future of quantum computing. The following is an edited transcript of their conversation.

SEE: The CIO's guide to quantum computing (free PDF) (TechRepublic)

Christopher Savoie: There are two types of quantum-computing algorithms, if you will. There are those that will require what we call a fault-tolerant computing system, one that, for all intents and purposes, doesn't have error, that's corrected for error, which is the way most classical computers are now. They don't make errors in their calculations, or at least we hope they don't, not at any significant rate. And eventually we'll have these fault-tolerant quantum computers. People are working on it. We've proven that it can happen already, so that is down the line. But it's in the five- to 10-year range that it's going to take until we have that hardware available. But that's where a lot of the promise for these exponentially faster algorithms lies. So, these are the algorithms that will use these fault-tolerant computers to basically look at all the options available in a combinatorial matrix.

So, if you have something like Monte Carlo simulation, you can try essentially all the different variables that are possible and look at every possible combination and find the best optimal solution. That's practically impossible on today's classical computers. You have to choose what variables you're going to use and reduce things and take shortcuts. But with these fault-tolerant computers, for essentially all of the possible solutions in the solution space, we can look at all of the combinations. So, you can imagine an almost infinite or exponential number of variables that you can try out to see what your best solution is. In things like CCAR [Comprehensive Capital Analysis and Review] and Dodd-Frank [Dodd-Frank Wall Street Reform and Consumer Protection Act] compliance, these things where you have to do these complex simulations, we rely on Monte Carlo simulation.
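The classical Monte Carlo workflow Savoie refers to can be sketched in a few lines. The following is a hypothetical illustration (the model, parameters and function name are ours, not Zapata's): it estimates 95% value-at-risk by sampling random return scenarios, the kind of computation quantum algorithms aim to accelerate.

```python
import random

def monte_carlo_var(n_paths=100_000, mu=0.05, sigma=0.2, seed=42):
    """Estimate the 95% value-at-risk of a single asset by sampling
    many random one-year return scenarios (classical Monte Carlo)."""
    rng = random.Random(seed)
    # Draw simulated annual returns from a simple normal model.
    returns = sorted(rng.gauss(mu, sigma) for _ in range(n_paths))
    # The 5th percentile of simulated returns bounds the worst 5% of cases;
    # its negation is the 95% VaR, expressed as a fractional loss.
    return -returns[int(0.05 * n_paths)]

var_95 = monte_carlo_var()  # roughly 0.28 for these illustrative parameters
```

The brute-force character is visible in `n_paths`: accuracy improves only with the square root of the number of samples, which is exactly the scaling that quantum amplitude-estimation techniques are hoped to improve on.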

So, trying all of the possible scenarios: that's not possible today, but fault tolerance will allow us to try essentially all of the different combinations, which will hopefully give us the ability to predict the future in a much better way, which is important in these financial applications. But we don't have those computers today. They will be available sometime in the future. I hate putting a date on it, but think about it on the decade time horizon. On the other hand, there are these nearer-term algorithms that run on these noisy, so not error-corrected, noisy intermediate-scale quantum devices. We call them NISQ for short. And these are more heuristic types of algorithms that are tolerant to noise, much like neural networks are today in classical computing and [artificial intelligence] AI. You can deal a little bit with sparse data and maybe some error in the data or other areas of your calculation, because it's an approximate type of calculation, like what neural networks do. It's not looking at the exact answers, all of them, and figuring out which one is definitely the best. This is an approximate algorithm that iterates and tries to get closer and closer to the right answer.
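The iterate-and-improve loop Savoie describes can be mimicked classically. The sketch below is a toy stand-in, assuming a made-up cost landscape in place of measurements from a noisy quantum device; real NISQ variational algorithms share this outer structure.

```python
import math
import random

def variational_minimize(cost, theta0, iters=300, step=0.1, seed=0):
    """Toy heuristic loop: randomly perturb parameters, keep improvements.
    NISQ-era variational algorithms follow the same outer pattern, except
    `cost` would be estimated from measurements on noisy quantum hardware."""
    rng = random.Random(seed)
    theta, best = list(theta0), cost(theta0)
    for _ in range(iters):
        trial = [t + rng.uniform(-step, step) for t in theta]
        c = cost(trial)
        if c < best:  # accept only improvements; tolerant of a noisy cost
            theta, best = trial, c
    return theta, best

# Stand-in cost landscape (in a real variational run this would be a
# measured expectation value returned by the quantum device).
landscape = lambda th: (th[0] - 1.0) ** 2 + math.sin(th[1]) ** 2
params, value = variational_minimize(landscape, [0.0, 2.0])
```

The point of the sketch is the division of labor: a classical optimizer steers the parameters, and only the cost evaluation would move onto quantum hardware.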

SEE: Hiring Kit: Video Game Designer (TechRepublic Premium)

But we know that neural networks work this way, deep neural networks. AI, in its current state, uses this type of algorithm, these heuristics. Most of what we do in computation nowadays in finance is heuristic in its nature and statistical in its nature, and it works well enough to do some really good work. In algorithmic trading, in risk analysis, this is what we use today. And these quantum versions of that will also be able to give us some advantage, and maybe an advantage over the purely classical version of that, as we've been able to show in recent work. So, we'll have some quantum-augmented AI, quantum-augmented [machine learning] ML. We call it quantum-enhanced ML or quantum-enhanced optimization that we'll be able to do.

So, people think of this as a dichotomy: we have these NISQ machines, and they're faulty, and then one day we'll wake up and we'll have this fault tolerance. But it's really not that way. These faulty algorithms, if you will, these heuristics that are approximate, will still work, and they may work better than the fault-tolerant algorithms for some problems and some datasets, so this really is a gradient. It really is. You could have a false sense of solace, maybe, thinking, "Oh well, if that's 10 years down the road, we can just wait until we wake up and have fault tolerance." But really, the algorithms are going to be progressing. And the things that we develop now will still be useful in that fault-tolerant regime. And the patents will all be good for the stuff that we do now.

So, suppose you think, "OK, this is a 10-year time horizon for those fault-tolerant computers. Our organization is just going to wait." Well, if you do, a couple of things happen. You're not going to have the workforce in place to be able to take advantage of this. You're probably not going to have the infrastructure in place to be able to take advantage of this. And meanwhile, all of your competitors and their vendors have acquired a portfolio of patents on these methodologies that are good for 20 years. So, if you wait five years and there's a patent four years down the line, that's good until 24 years from now. So there really is, I think, an incentive for organizations to start working now, even in this NISQ, this noisier regime that we're in today.

Karen Roby: You get a little false sense of security, as you mentioned, when you hear something is 10 years down the line. But really, with this, you don't have the luxury of catching up if you wait too long. This is something people need to be focused on now for what is down the road.

SEE: Quantum entanglement-as-a-service: "The key technology" for unbreakable networks (TechRepublic)

Christopher Savoie: Yes, absolutely. And in finance, if you have a better ability to detect risk than your competitors, you're at a huge advantage in being able to find alpha in the market. If you can do that better than others, you're going to be at a huge advantage. And if you're blocked by people's patents, or blocked by the fact that your workforce doesn't know how to use these things, you're really behind the eight ball. And we've seen this time and time again with different technology evolutions and revolutions: with big data and our use of big data, with that infrastructure, with AI and machine learning. The organizations that have waited have generally found themselves behind the eight ball, and it's really hard to catch up because this stuff is changing daily, weekly, and new inventions are happening. And if you don't have a workforce that's up and running and an infrastructure ready to accept this, it's really hard to catch up with your competitors.

Karen Roby: You've touched on this a little bit, but really for the finance industry, this can be transformative, really significant what quantum computing can do.

Christopher Savoie: Absolutely. At the end of the day, finance is math, and we can do better math and more accurate math on large datasets with quantum computing. There is no question about that. It's no longer an "if." Google has, with their experiment, proven that at some point we're going to have a machine that is definitely going to be better at doing some types of math than classical computers. With that premise, if you're in a field that depends on math, that depends on numbers and statistics, which is finance, it doesn't matter what side you're on. Whether you're on the risk side or the investing side, you're going to need the best tools. And that doesn't mean you have to be an algorithmic trader necessarily; even for looking at tail risk and creating portfolios and this kind of thing, you're dependent on being able to quickly ascertain what that risk is, and computing is the only way to do that.

SEE: The quantum decade: IBM predicts the 2020s will see quantum begin to solve real problems (TechRepublic)

And on the regulatory side, I mentioned CCAR. I think as these capabilities emerge, it allows the regulators to ask for even more scenarios to be simulated, those things that are a big headache for a lot of companies. But it's important because our global financial system depends on stability and predictability, and to be able to have a computational resource like quantum that's going to allow us to see more variables or more possibilities or more disaster scenarios. It can really help. "What is the effect of, say, a COVID-type event on the global financial system?" To be more predictive of that and more accurate at doing that is good for everybody. I think all boats rise, and quantum is definitely going to give us that advantage as well.

Karen Roby: Most definitely. And Christopher, before I let you go, if you would just give us a quick snapshot of Zapata Computing and the work that you guys do.

Christopher Savoie: We have two really important components for trying to make this stuff a reality. On the one hand, we've got over 30 of the brightest young minds working on algorithms, particularly on how to write them for these near-term devices. We've written some of the fundamental algorithms that are out there to be used on quantum computers. On the other hand, how do you make those things work? That's a software engineering thing. That's not really quantum science. How do you make the big data work? And that's all the boring stuff of ETL and data transformation and digitalization and cloud and multicloud, all this boring but very important stuff. So basically, Zapata is a company that has the best of the algorithms, but also best-of-breed means of actually engineering that software in a modern, multicloud environment. Finance companies and banks in particular are regulated companies with a lot of data that is sensitive, private and proprietary. So, you need to be able to work in a safe and secure multicloud environment, and that's what our software engineering side allows us to do. We have the best of both worlds there.


Image: sakkmesterke, Getty Images/iStockphoto

Continue reading here:

Expert: Now is the time to prepare for the quantum computing revolution - TechRepublic


Who will dominate the tech arms race? – The Jerusalem Post

Posted: at 11:49 am

It is almost impossible to overstate what a quantum computer will be able to do, Christopher Monroe told the Magazine in a recent interview.

Monroe, a professor at both the University of Maryland and Duke University as well as co-founder of the quantum computing company IonQ, discussed how quantum computing will change the face of the planet, even if this might take some more time.

The Magazine also interviewed four other experts in the quantum field and visited seven of their labs at the University of Maryland.


These labs, the full likes of which do not yet exist in Israel, hosted all kinds of qubits (the basis of quantum computers), lasers blasting targets to release plasma that forms distinctive films, infrared lasers, furnaces reaching 2,000°C, a tetra-arc furnace for growing silicon crystals, special dilution refrigerators to achieve cryostorage (deep freezing) and a variety of vacuum chambers that would seem like an alternate reality to the uninitiated.

Before entering each lab, there needed to be a conversation about whether this reporter should be wearing the special goggles that were handed out to avoid getting blinded.

One top quantum official at Maryland, Prof. Dr. Johnpierre Paglione, assured the Magazine that the ultrahazardous materials warning on many of the lab doors was not a concern at that moment.

From cracking the Internet as we know it, to military and economic dominance, to changing the way people manage their lives, quantum computers are predicted to make mincemeat of today's supercomputers. Put simply, they are made out of, and operate from, a completely different kind of material and set of principles connected to qubits and quantum mechanics, with computing potential that dwarfs classical computers' capabilities.

But let's say the US wins the race: who in the US would win it? Would it be giants like Google, Microsoft, Amazon, IBM and Honeywell? Or might it be a lean and fast, solely quantum-focused challenger like Monroe's IonQ?

At first glance, Google has no real challenger. In 2019, Google said it achieved quantum supremacy when its quantum computer became the first to perform a calculation that would be practically impossible for a classical machine, by checking the outputs from a quantum random-number generator.

The search-engine giant has already built a 54-qubit computer, whereas IonQ's largest quantum computer has only 32 qubits. Google has also promised to achieve the holy grail of quantum computing, a system large enough to revolutionize the Internet and military and economic issues, by 2029. Although China recently reproduced Google's experiment, Google is still regarded as ahead of the game.

Why is a 32-qubit quantum computer better than a 54-qubit one?

So why is Monroe so confident that his company will finish the race long before Google?

First, he takes a shot at the Google 2019 experiment.

"It was a fairly academic exercise. The problem they attacked was one of those rare problems where you can prove something and you can prove the supercomputer cannot do it. Quantum mechanics works. It is not a surprise. The problem Google tackled was utterly useless. The system was not flexible enough to program to hit other problems. So a big company did a big academic demonstration," he said with a sort of whoop-dee-do tone and expression on his face.

"Google had to repeat its experiment millions of times. The signal went down by orders of magnitude. There are special issues to get the data. There are general problems where it cannot maintain [coherence]. The Google experiment and qubits decayed by seven times the constant. We gauge on one time for the constant, and we can do 100 operations with IonQ's quantum computers."

In radioactive decay, the time constant is related to the decay constant and essentially represents the average lifetime of a decaying system, such as an atom. Some of the tactics for potentially overcoming decay go back to the lasers, vacuum chambers and cryostorage refrigerators mentioned above.

Monroe said that from a business perspective, the experiment was "a big distraction, and you will hear this from Google computer employees. They had to run simulations to prove how hard it would be to do what they were doing with old computers, instead of building better quantum computers and solving useful algorithms."

"We believe quantum computers work; now it is time to build them," he stressed.

Describing IonQ's quantum computers, Monroe said, "The 32-qubit computer is fifth generation. The third and fourth generations are available to [clients of] Microsoft, Amazon and Google Cloud. It is 11 qubits, which is admittedly small, but it still runs more than any IBM machine can run. An 11-qubit computer is very clean operationally. It can run 100 or so ops [operations] before the laser noise causes coherence to be lost [before the qubits stop working]. That is many more ops than superconductors. If [a computer] has one million qubits but can only run a few ops, it is boring. But with trapped ions, adding more qubits at the same time makes things cheaper."

He added, "The 32-qubit computer is not yet on the cloud. We are working in private with customers, financials," noting that a future publication will discuss the baby version of an algorithm which "could be very interesting when you start to scale it up. Maybe in the next generation, we can engineer it to solve an optimization problem, something we don't get from the cloud, where we don't get any telemetry, which would be an unusual benefit for clients."

According to Monroe, being able to build a 1,000-qubit computer by 2025, practically tomorrow in the sphere of new inventions, will in and of itself be game-changing. This is true even if it is not yet capable of accomplishing all the extreme miracles that much larger quantum computers may someday accomplish.

A major innovation, or risk (depending on your worldview), by Monroe is how he treats the paramount challenge of quantum computers: error correction. This is basically the idea that, for quantum computers to work, some process must be conceived to prevent qubits from decaying at the rate they currently do; otherwise, crucial calculations get interrupted midway.

Here, Monroe both critiques the Google approach and responds to criticism from some of his academic colleagues about his own approach to error correction. Google, he said, is "trying to get to one million qubits that do not work well together."

In contrast, a special encoding process could allow IonQ to create what Monroe called a single sort of "super qubit," which would eliminate 99.9% of native errors. This, he said, is the easiest way to get better at quantum computing, as opposed to the quantity-over-quality path Google is pursuing.

But he has to defend himself from others poking holes in his approach as unrealistic, including some of his colleagues at the University of Maryland (all sides still express great respect for each other). Confronted by this criticism, he responded that their line of attack was based on the theory of error correction: "It implies that you will do indefinitely long computations, [but] no one will ever need this high a standard to do business."

"We do not use error correction on our CPU [central processing unit] because silicon is so stable. We call it OK if it fails in one year, since that is more than enough time to be economically worthwhile." Instead of trying to eliminate errors, his strategy is to gradually add more qubits, which achieves slightly more substantial results. His goal is to work around the error-correction problem.

Part of the difference between Monroe and his academic colleagues relates to his having crossed over into a mix of business and academia. Monroe's view on this issue? "Industry and academia do not always see things the same way. Academics are trained to prove everything we do. But if a computer works better to solve a certain problem, we do not need to prove it."

For example, if a quantum computer doubled the value of a financial portfolio compared to a super computers financial recommendations, the client is thrilled even if no one knows how.

He said that when shortcuts solve problems, and certain things cannot be proven but quantum computing still finds value, "academics hate it. They are trained to be pessimists. I do believe quantum computers will find narrow applications within five years."

Besides error correction, another question is what the qubits themselves, the basis of different kinds of quantum computers, should be made out of. The technique that many of his competitors are using to make computers out of a particular kind of qubit has the benefit of being not hard to do, inexpensive and representing beautiful physics.

However, he warned, "No one knows where to find it, if it exists. So stay in solid-state physics and build computers out of solid-state systems. Google, Amazon and others are all invested in solid-state computers. But I don't see it happening without fundamental physics breakthroughs. If you want to build and engineer a device, if you want to have a business, you should not be reliant on physics breakthroughs."

Instead of following the path of his competitors, Monroe emphasized working with natural quantum atoms, tricking and engineering them to act how he wants, using low pressure instead of low temperatures.

"I work with charged atoms, or ions. We levitate them inside a vacuum chamber, which is getting smaller every year. We have a silicon chip. Just electrodes, electric force fields are holding up these atoms. There are no solids and no air in the vacuum chamber, which means the atoms remain extremely well isolated. They are the most perfect atoms we know, so we can scale without worrying about the top of the noise [the threshold where qubits decay]. We can pick qubit levels that do not yet decay."

"Why aren't Google and IBM investing in natural qubits? Because they have a blind spot. They have been first in solid-state physics and engineering for 50 years. If there is a silicon solid-state quantum computer, Intel will make it, but I don't see how it will be scaled," he declared.

MONROE IS far from the full quantum show at Maryland.

Paglione has been a professor at the University of Maryland for 13 years and the director of the Maryland Quantum Materials Center for the last five.

In 1986, the center was working on high-temperature superconductors, Paglione said, noting that work on quantum computers is a more recent development. The development has not merely altered the focus of the center's research. According to Paglione, it has also helped grow the center from around seven staff members 30 years ago to around 100 today, when all of the affiliate members, students and administrative staff are taken into account.

Similarly, Dr. Gretchen Campbell, director of the Joint Quantum Institute, told the Magazine that a big part of her institution's role, and her personal role, has been first to bring together people from atomic physics and condensed-matter physics ("even within physics, we do not always talk to each other"), and then to connect these experts with computer science experts.

Campbell explained it was crucial to explore the interaction between the quantum realm and quantum algorithms, for which they needed more math and computer science backgrounds, and to keep moving from laboratories to real-world applications, "translating into technology and interacting more with industry."

She also guided the Magazine, goggles donned, through a lab with a digital micromirror device and laser beams relating to atom clouds and light projectors.

Add in some additional departments at Maryland, as well as a partnership with the National Institute of Standards and Technology (NIST), and the number of staff swells way past 100. What are their many different teams working on? The lab studies and experiments are as varied as the disciplines, with Paglione talking about possibilities for sensitive magnetic sensors built from superconducting quantum interference devices, or SQUIDs.

Paglione said magnetometer systems could use SQUIDs to sense the magnetic field of samples. These could be used as detectors in water. If they were made sensitive enough, they could sense changes in a magnetic field, such as when a passing submarine alters the local magnetic field.

This has drawn attention from the US Department of Defense.

A multidisciplinary mix of Paglione's team recently captured the most direct evidence to date of a quantum quirk that permits particles to tunnel through a barrier as if it were not even there. The upshot could be assisting engineers in designing more uniform components for building both future quantum computers and quantum sensors (reported applications could detect not only submarines but aircraft).

Paglione's team, headed by Ichiro Takeuchi, a professor of materials science and engineering at Maryland, successfully carried out a new experiment in which they observed Klein tunneling. In the quantum world, tunneling enables particles, such as electrons, to pass through a barrier even if they lack sufficient energy to climb over it. Usually, a taller barrier makes climbing over harder, and fewer particles are able to cross through. In Klein tunneling, however, the barrier becomes completely transparent, opening up a portal that particles can traverse regardless of the barrier's height.

Scientists and engineers from Maryland's Center for Nanophysics and Advanced Materials, the Joint Quantum Institute and the Condensed Matter Theory Center, along with the Department of Materials Science and Engineering and the Department of Physics, succeeded in making the most compelling measurements of the phenomenon to date.

Given that Klein tunneling was initially predicted to occur in the world of high-energy quantum particles moving close to the speed of light, observing the effect was viewed as impossible. That was until scientists revealed that some of the rules governing fast-moving quantum particles can also apply to the comparatively sluggish particles traveling near the surface of some highly unusual materials.

"It was a piece of serendipity that the unusual material and an elemental relative of sorts shared the same crystal structure," said Paglione. "However, the multidisciplinary team we have was one of the keys to this success. Having experts on topological physics, thin-film synthesis, spectroscopy and theoretical understanding really got us to this point."

Bringing this back to quantum computing, the idea is that interactions between superconductors and other materials are central ingredients in some quantum computer architectures and precision-sensing devices. Yet there has always been a problem: the junction, or crossover spot, where they interact is slightly different in every device. Takeuchi said this leads to sucking up countless amounts of time and energy in tuning and calibrating to reach the best performance.

Takeuchi said Klein tunneling could eliminate this variability, which has played havoc with device-to-device interactions.

AN ENTIRELY separate quantum application is physics department chairman Prof. Steve Rolston's work on establishing a quantum communications network. Rolston explained that when pairs of photons are quantum entangled, you can achieve quantum encryption over a communications network, using the entangled particles to create secure keys that cannot be hacked. There are varying paths to achieving such a quantum network, and Rolston is skeptical of others in the field who could be seen as cutting corners.

He is also underwhelmed by China's achievements in this area. According to Rolston, no one has figured out how to extend a secure quantum network over any space sizable enough to make the network usable and marketable in practical terms.

Rather, he said, existing quantum networks are either limited to very small spaces, or must employ gimmicks to extend their range that usually impair their security. Because of these limitations, Rolston went as far as to say that the US National Security Agency views the issue as a distraction.

In terms of export trade barriers or issues with China, he said he opposes controls and believes cooperation in the quantum realm should continue, especially since all of his centers research is made public anyway.

Rolston also lives up to Monroe's framing of the difference between academics and industry-focused people. He said that even Monroe would have to admit that no one is close to the true holy grail of quantum computing, computers with a massive number of qubits, and that the IonQ founder is instead banking on interesting optimization problems being solvable for industry to an extent that will justify the hype.

In contrast, Rolston remained pessimistic that such smaller quantum computers would achieve sufficient superiority at optimization issues in business to justify a rushed prediction that transforming the world is just around the corner.

In Rolston's view, the longer, more patient and steadier path is the one that will eventually reap rewards.

For the moment, we do not know whether Google or IonQ, or those like Monroe or Rolston, will eventually be able to declare they were right. We do know that whoever is right, and whoever is first, will radically change the world as we know it.

View post:

Who will dominate the tech arms race? - The Jerusalem Post


IBM partners with the University of Tokyo on quantum computer – Illinoisnewstoday.com

Posted: at 11:49 am

TOKYO: IBM and the University of Tokyo have announced one of Japan's most powerful quantum computers.

According to IBM, IBM Quantum System One is part of the Japan-IBM quantum partnership between the University of Tokyo and IBM, advancing Japan's quest for quantum science, business and education.

IBM Quantum System One is currently in operation for researchers at both Japanese scientific institutions and companies, and access is controlled by the University of Tokyo.

IBM is committed to growing the global quantum ecosystem and facilitating collaboration between different research communities, said Dr. Dario Gil, director of IBM Research.

According to IBM, quantum computers combine quantum resources with classical processing to provide users with access to reproducible and predictable performance from high-quality qubits and precision control electronics. Users can safely execute algorithms that require iterative quantum circuits in the cloud.

see next: IBM partners with Atos on contract with Dutch Ministry of Defense

IBM Quantum System One in Japan is IBM's second system built outside the United States. In June, IBM unveiled an IBM Quantum System One, managed by the scientific research institute Fraunhofer-Gesellschaft, in Munich, Germany.

IBM's commitment to quantum is aimed at advancing quantum computing and fostering a skilled quantum workforce around the world.

"We are thrilled to see Japan's contributions to research by world-class academics, the private sector, and government agencies," Gil said.

"Together, we can take a big step toward accelerating scientific progress in different areas," Gil said.

Teruo Fujii, president of the University of Tokyo, said, "In the field of rapidly changing quantum technology, it is very important not only to develop elements and systems related to quantum technology, but also to develop the next generation of human resources, in order to achieve a high degree of social implementation."

"Our university has a wide range of research capabilities and has always promoted high-level quantum education from the undergraduate level. Now, with IBM Quantum System One, we will develop the next generation of quantum-native skill sets and further refine them."

In 2020, IBM and the University of Tokyo launched the Quantum Innovation Initiative Consortium (QIIC), which aims to strategically accelerate quantum computing research and development in Japan by bringing together the academic talent of universities, research groups and industries nationwide.

Last year, IBM also announced partnerships focusing on quantum information science and technology with several organizations: the Cleveland Clinic, the UK's Science and Technology Facilities Council, and the University of Illinois at Urbana-Champaign.

see next: Public cloud computing provider

See original here:

IBM partners with the University of Tokyo on quantum computer - Illinoisnewstoday.com


Why Quantum Resistance Is the Next Blockchain Frontier – Tech Times

Posted: at 11:49 am


As decentralized networks secured by potentially thousands of miners and/or nodes, blockchains are widely considered to be an incredibly secure example of distributed ledger technology.

On the back of this, they also have dozens of potential applications - ranging from decentralized content storage networks to medical records databases and supply chain management. But to this day, they're most commonly thought of as the ideal platform for hosting the financial infrastructure of tomorrow - such as decentralized exchanges and payment settlement networks.

But there's a problem. While the blockchains of today are practically unhackable - due to the type of encryption they use to secure private keys and transactions - this might not be the case for much longer. This is due to the advent of so-called "quantum computers", that is, computers that can leverage the properties of quantum mechanics to solve problems that would be impossible with traditional computers... such as breaking the cryptography that secures current generation blockchains.

Many blockchains of today use at least two types of cryptographic algorithms - asymmetric key algorithms and hash functions.

The first kind, also known as public-key cryptography, is used to produce pairs of private and public keys that are provably cryptographically linked. In Bitcoin, this private key is used to spend UTXOs - thereby transferring value from one person to another. The second kind - the hash function - is used to securely process raw transaction data into a block in a way that is practically irreversible.
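As a rough illustration of the second kind, Bitcoin's hashing step can be sketched with Python's standard library (the transaction bytes below are invented for the example; real Bitcoin transactions are serialized binary structures):

```python
import hashlib

# Bitcoin hashes raw transaction data with double SHA-256 (illustrative sketch;
# the transaction bytes here are made up, not a real serialized transaction).
def double_sha256(data: bytes) -> bytes:
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

tx = b"alice pays bob 0.5 BTC"
digest = double_sha256(tx)
print(digest.hex())  # a 32-byte digest

# Any change to the input yields an unrelated digest (the avalanche effect),
# which is why inverting the hash is considered impractical classically.
assert double_sha256(b"alice pays bob 0.6 BTC") != digest
```

The one-way property shown here is what "practically irreversible" means: computing the digest is cheap, but recovering the input from the digest is not known to be feasible on classical hardware.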

As you might imagine, a sufficiently powerful quantum computer capable of breaking either of these security mechanisms could have devastating consequences for susceptible blockchains - since they could be used to potentially derive private keys or even mine cryptocurrency units much faster than the expected rate (leading to supply inflation).

So, just how far away from this are we? Well, according to recent estimates, a quantum computer with roughly 4,000 qubits could be the minimum necessary to break the public-key cryptography that secures Bitcoin user funds. A sufficiently flexible quantum computer with this processing power could, theoretically, take over the funds contained in any Bitcoin p2pk address - that's a total of around 2 million BTC (circa $67 billion at today's rates).
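Those two figures are internally consistent; a quick back-of-the-envelope check (all numbers taken from the article, not live market data):

```python
# Sanity check of the article's figures (assumed values, not live data).
btc_at_risk = 2_000_000             # BTC sitting in exposed p2pk addresses
total_usd = 67_000_000_000          # "circa $67 billion at today's rates"
implied_price = total_usd / btc_at_risk
print(f"implied BTC price: ${implied_price:,.0f}")  # implied BTC price: $33,500
```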

Fortunately, this isn't an immediate concern. As it stands, the world's most powerful quantum computer - the Zuchongzhi quantum computer - currently clocks in at an impressive (albeit insufficient) 66 qubits. However, given the rapid pace of development in the quantum computing sector, some experts predict that Bitcoin's Elliptic Curve Digital Signature Algorithm (ECDSA) could meet its quantum match within a decade.


The algorithm that could potentially be used to break ECDSA has already been developed: Peter Shor's polynomial-time quantum algorithm. If generalized and run on a powerful enough quantum computer, it is widely thought that Shor's algorithm would be able to attack the Bitcoin blockchain, while similar algorithms could be applied to other forms of traditional encryption.
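The number-theoretic core of Shor's algorithm can be sketched classically: factoring n reduces to finding the multiplicative order r of a base a modulo n, then taking a gcd. The quantum speedup lies entirely in the order-finding step, which this toy sketch does by brute force for a tiny modulus:

```python
from math import gcd

# Classical sketch of the reduction at the heart of Shor's algorithm.
# A quantum computer finds the order r exponentially faster via the quantum
# Fourier transform; here we brute-force it, which only works for toy sizes.
def find_order(a: int, n: int) -> int:
    """Smallest r >= 1 with a**r == 1 (mod n)."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n: int, a: int) -> tuple:
    r = find_order(a, n)
    if r % 2:
        raise ValueError("odd order - pick another base a")
    # gcd(a**(r/2) - 1, n) yields a nontrivial factor for suitable a.
    f = gcd(pow(a, r // 2) - 1, n)
    return f, n // f

print(shor_classical(15, 7))  # (3, 5)
```

Breaking ECDSA uses the same machinery aimed at the discrete-logarithm problem rather than factoring, but the quantum order-finding subroutine is the common ingredient.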

But this might not be a concern for much longer, thanks to the introduction of what many consider to be the world's first truly quantum-resistant blockchain. The platform, known as QANplatform, is built to resist all known quantum attacks by using lattice-based cryptography. QAN achieves quantum resistance while simultaneously tackling the energy concerns raised by some other blockchains through its highly efficient consensus mechanism, known as Proof-of-Randomness (PoR).

Unlike some other so-called quantum-resistant blockchains, QAN is unusual in that it also supports decentralized applications (DApps) - allowing developers to launch quantum-resistant DApps within minutes using its free developer tools.

Besides platforms like QAN, the development communities behind several popular blockchains are already beginning to consider implementing their own quantum-resistance solutions, such as the recently elaborated commit-delay-reveal scheme - which could be used to transition Bitcoin to a quantum-resistant state. Nonetheless, the future of post-quantum cryptography remains up in the air, as none of the top ten blockchains by user count have yet committed to a specific quantum-resistant signature scheme.
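The hash-based commit/reveal primitive underlying that scheme is simple to sketch (this shows only the basic primitive under the assumption of a salted SHA-256 commitment; the actual Bitcoin proposal adds a mandatory delay period and further machinery):

```python
import hashlib
import secrets

# Minimal commit/reveal sketch: publish a hash that binds you to a value now,
# then reveal the value (and salt) later to prove it hasn't changed.
def commit(value: bytes) -> tuple:
    nonce = secrets.token_bytes(16)  # random salt hides low-entropy values
    return hashlib.sha256(nonce + value).hexdigest(), nonce

def reveal_ok(commitment: str, nonce: bytes, value: bytes) -> bool:
    return hashlib.sha256(nonce + value).hexdigest() == commitment

c, n = commit(b"quantum-safe-pubkey")
assert reveal_ok(c, n, b"quantum-safe-pubkey")       # honest reveal verifies
assert not reveal_ok(c, n, b"forged-pubkey")         # substitution is caught
```

Because only a hash is published during the commit phase, an attacker - even one holding a quantum computer capable of breaking ECDSA - sees no public key to attack until the reveal step.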

2021 TECHTIMES.com All rights reserved. Do not reproduce without permission.


See more here:

Why Quantum Resistance Is the Next Blockchain Frontier - Tech Times

Posted in Quantum Computing | Comments Off on Why Quantum Resistance Is the Next Blockchain Frontier – Tech Times
