Category Archives: Quantum Computing

Is teleportation possible? Yes, in the quantum world – University of Rochester

Posted: June 24, 2020 at 6:30 am

Quantum teleportation is an important step in improving quantum computing.

"Beam me up" is one of the most famous catchphrases from the Star Trek series. It is the command issued when a character wishes to teleport from a remote location back to the Starship Enterprise.

While human teleportation exists only in science fiction, teleportation is possible in the subatomic world of quantum mechanics, albeit not in the way typically depicted on TV. In the quantum world, teleportation involves the transportation of information, rather than the transportation of matter.

Last year scientists confirmed that information could be passed between photons on computer chips even when the photons were not physically linked.

Now, according to new research from the University of Rochester and Purdue University, teleportation may also be possible between electrons.

In a paper published in Nature Communications and one to appear in Physical Review X, the researchers, including John Nichol, an assistant professor of physics at Rochester, and Andrew Jordan, a professor of physics at Rochester, explore new ways of creating quantum-mechanical interactions between distant electrons. The research is an important step in improving quantum computing, which, in turn, has the potential to revolutionize technology, medicine, and science by providing faster and more efficient processors and sensors.

Quantum teleportation is a demonstration of what Albert Einstein famously called "spooky action at a distance," also known as quantum entanglement. In entanglement, one of the basic concepts of quantum physics, the properties of one particle affect the properties of another, even when the particles are separated by a large distance. Quantum teleportation involves two distant, entangled particles, in which the state of a third particle is instantly "teleported" to the entangled pair.

Quantum teleportation is an important means for transmitting information in quantum computing. While a typical computer consists of billions of transistors, which store bits, quantum computers encode information in quantum bits, or qubits. A bit has a single binary value, which can be either 0 or 1, but a qubit can be both 0 and 1 at the same time. The ability of individual qubits to occupy multiple states simultaneously underlies the great potential power of quantum computers.
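
The superposition described above can be made concrete with a few lines of linear algebra. The following minimal sketch (plain NumPy, not tied to any hardware mentioned in the article) represents a qubit as a two-component complex vector, puts it into an equal superposition with a Hadamard gate, and applies the Born rule to recover the 50/50 measurement probabilities:

```python
import numpy as np

# A classical bit is 0 or 1; a qubit is a unit vector in C^2.
ket0 = np.array([1, 0], dtype=complex)  # the state |0>
ket1 = np.array([0, 1], dtype=complex)  # the state |1>

# The Hadamard gate maps |0> to an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
psi = H @ ket0

# Born rule: measurement probabilities are the squared amplitudes.
probs = np.abs(psi) ** 2
print(probs)  # → [0.5 0.5]
```

A classical bit simulated this way could only ever be exactly ket0 or ket1; the vector psi genuinely carries weight on both outcomes at once until it is measured.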

Scientists have recently demonstrated quantum teleportation by using electromagnetic photons to create remotely entangled pairs of qubits.

Qubits made from individual electrons, however, are also promising for transmitting information in semiconductors.

"Individual electrons are promising qubits because they interact very easily with each other, and individual electron qubits in semiconductors are also scalable," Nichol says. "Reliably creating long-distance interactions between electrons is essential for quantum computing."

Creating entangled pairs of electron qubits that span long distances, which is required for teleportation, has proved challenging, though: while photons naturally propagate over long distances, electrons usually are confined to one place.

In order to demonstrate quantum teleportation using electrons, the researchers harnessed a recently developed technique based on the principles of Heisenberg exchange coupling. An individual electron is like a bar magnet with a north pole and a south pole that can point either up or down. The direction of the pole (whether the north pole is pointing up or down, for instance) is known as the electron's magnetic moment or quantum spin state. If certain kinds of particles have the same magnetic moment, they cannot be in the same place at the same time. That is, two electrons in the same quantum state cannot sit on top of each other. If they did, their states would swap back and forth in time.
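
The back-and-forth swapping described above is exactly what Heisenberg exchange coupling produces. As an illustrative sketch (plain NumPy, with the coupling strength J and ħ set to 1; these are not the device parameters from the experiment), one can evolve two spins that start in the up-down state under the exchange Hamiltonian and watch them trade states after half an exchange period:

```python
import numpy as np

# Pauli matrices
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

# Heisenberg exchange Hamiltonian for two spins (hbar = 1, J = 1)
J = 1.0
H = (J / 4) * (np.kron(sx, sx) + np.kron(sy, sy) + np.kron(sz, sz))

# Start in |up, down>
up = np.array([1, 0], dtype=complex)
down = np.array([0, 1], dtype=complex)
psi0 = np.kron(up, down)

# Evolve for t = pi/J (half an exchange oscillation) by diagonalizing H
t = np.pi / J
evals, evecs = np.linalg.eigh(H)
U = evecs @ np.diag(np.exp(-1j * evals * t)) @ evecs.conj().T
psi = U @ psi0

# The spins have traded states: all the weight is now on |down, up>
p_swapped = abs(np.vdot(np.kron(down, up), psi)) ** 2
print(round(p_swapped, 6))  # 1.0
```

Letting the evolution run for twice as long brings the pair back to the original up-down configuration, which is the oscillation the article alludes to.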

The researchers used the technique to distribute entangled pairs of electrons and teleport their spin states.

"We provide evidence for 'entanglement swapping,' in which we create entanglement between two electrons even though the particles never interact, and 'quantum gate teleportation,' a potentially useful technique for quantum computing using teleportation," Nichol says. "Our work shows that this can be done even without photons."
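
For intuition, the textbook gate-teleportation protocol that this work builds on can be traced through with a small statevector simulation. This is a generic sketch of standard quantum teleportation in NumPy, not a model of the spin-qubit hardware in the paper; for brevity, the measurement outcome is fixed to (0, 0), the branch that needs no Pauli correction:

```python
import numpy as np

# Single-qubit gates and projectors
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
P0 = np.diag([1, 0]).astype(complex)  # |0><0|
P1 = np.diag([0, 1]).astype(complex)  # |1><1|

def chain(mats):
    """Kronecker product of a list of 2x2 matrices."""
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

def op(gate, q, n=3):
    """Embed a single-qubit gate on qubit q into an n-qubit operator."""
    return chain([gate if k == q else I2 for k in range(n)])

def cnot(c, t, n=3):
    """CNOT = |0><0|_c x I  +  |1><1|_c x X_t."""
    return (chain([P0 if k == c else I2 for k in range(n)]) +
            chain([P1 if k == c else (X if k == t else I2) for k in range(n)]))

# Message state on qubit 0; Bell pair on qubits 1 and 2
alpha, beta = 0.6, 0.8
msg = np.array([alpha, beta], dtype=complex)
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
state = np.kron(msg, bell)

# Teleportation circuit: CNOT(0 -> 1), then Hadamard on qubit 0
state = op(H, 0) @ (cnot(0, 1) @ state)

# Project onto measurement outcome (0, 0) of qubits 0 and 1, renormalize;
# in this branch qubit 2 needs no correction
q2 = state.reshape(2, 2, 2)[0, 0]
q2 = q2 / np.linalg.norm(q2)

print(np.allclose(q2, msg))  # True: the message state now lives on qubit 2
```

Note that only the two classical measurement bits travel from sender to receiver; the message amplitudes alpha and beta are never transmitted directly, which is the sense in which the state is "teleported."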

The results pave the way for future research on quantum teleportation involving spin states of all matter, not just photons, and provide more evidence for the surprisingly useful capabilities of individual electrons as qubits in semiconductors.

Teleportation Is Indeed Possible – At Least in the Quantum World – SciTechDaily

Posted: at 6:30 am

A quantum processor semiconductor chip is connected to a circuit board in the lab of John Nichol, an assistant professor of physics at the University of Rochester. Nichol and Andrew Jordan, a professor of physics, are exploring new ways of creating quantum-mechanical interactions between distant electrons, promising major advances in quantum computing. Credit: University of Rochester photo / J. Adam Fenster

References:

"Conditional teleportation of quantum-dot spin states" by Haifeng Qiao, Yadav P. Kandel, Sreenath K. Manikandan, Andrew N. Jordan, Saeed Fallahi, Geoffrey C. Gardner, Michael J. Manfra and John M. Nichol, 15 June 2020, Nature Communications. DOI: 10.1038/s41467-020-16745-0

"Coherent multi-spin exchange in a quantum-dot spin chain" by Haifeng Qiao, Yadav P. Kandel, Kuangyin Deng, Saeed Fallahi, Geoffrey C. Gardner, Michael J. Manfra, Edwin Barnes and John M. Nichol, accepted 12 May 2020, Physical Review X. arXiv: 2001.02277

Atos takes the most powerful quantum simulator in the world to the next level with Atos QLM E – GlobeNewswire

Posted: at 6:30 am

Paris, 23 June 2020: Atos, a global leader in digital transformation, extends its portfolio of quantum solutions with Atos QLM Enhanced (Atos QLM E), a new GPU-accelerated range of its Atos Quantum Learning Machine (Atos QLM) offer, the world's highest-performing commercially available quantum simulator. Offering up to 12 times more computation speed, Atos QLM E paves the way to optimized digital quantum simulation on the first intermediate-scale quantum computers to be commercialized in the next few years (NISQ: Noisy Intermediate-Scale Quantum).

By promising to apply, in the near-term, computation capabilities that are beyond the reach of even the most powerful existing computers to solve complex, real-life problems, NISQ devices will play an important role in determining the commercial potential of quantum computing. Herein lies a double challenge for the industry: developing NISQ-optimized algorithms is as important as building the machines, since both are required to identify concrete applications.

Integrating NVIDIA's V100S PCIe GPUs, Atos QLM E has been optimized to drastically reduce the simulation time of hybrid classical-quantum algorithms, leading to quicker progress in application research. It will allow researchers, students and engineers to leverage some of the most promising variational algorithms (such as VQE or QAOA) to further explore models for discovering new drugs, tackling pollution with innovative materials, or better anticipating climate change and severe weather phenomena.
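
Variational algorithms such as VQE are hybrid loops: a quantum device or simulator evaluates an energy for given circuit parameters, and a classical optimizer updates those parameters. The following toy sketch is generic NumPy, not Atos QLM's actual API; it runs that loop for a single qubit with Hamiltonian H = Z and ansatz Ry(theta)|0>, using the parameter-shift rule to obtain gradients from two extra "circuit" evaluations:

```python
import numpy as np

# Toy VQE: minimize <psi(theta)| H |psi(theta)> for H = Z on one qubit.
Z = np.diag([1.0, -1.0])

def ansatz(theta):
    # Ry(theta)|0> = [cos(theta/2), sin(theta/2)]
    return np.array([np.cos(theta / 2), np.sin(theta / 2)])

def energy(theta):
    psi = ansatz(theta)
    return float(psi @ Z @ psi)  # expectation value <Z> = cos(theta)

# Classical outer loop: gradient descent with the parameter-shift rule
theta, lr = 0.5, 0.4
for _ in range(100):
    grad = (energy(theta + np.pi / 2) - energy(theta - np.pi / 2)) / 2
    theta -= lr * grad

print(round(energy(theta), 6))  # -1.0, the ground-state energy of Z
```

In a real VQE run the energy() call is the expensive part, executed on quantum hardware or a large simulator; speeding up exactly that inner evaluation is where GPU acceleration of the simulator pays off.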

Bob Sorensen, Chief Analyst for Quantum Computing at Hyperion Research, said: "Atos continues to play a key role in the advancement of the quantum computing sector by offering yet another world-class digital quantum simulator with increasingly powerful capabilities, this time through the inclusion of leading-edge NVIDIA GPUs. This latest Atos QLM offering uses a quantum hardware-agnostic architecture that is well suited to support faster development of new quantum systems and related architectures as well as new and innovative quantum algorithms and use cases. Since launching the first commercially available quantum system in 2017, Atos has concentrated its efforts on helping an increasing base of users better explore a wide range of practical business and scientific applications, a critical requirement for the overall advancement and long-term viability of the quantum computing sector writ large. The launch of the Atos QLM E is an exciting step for Atos but also for its clients and potential new end users, both of whom could benefit from access to these leading-edge digital quantum simulation capabilities."

Agnès Boudot, Senior Vice President, Head of HPC & Quantum at Atos, explained: "We are proud to help imagine tomorrow's quantum applications. As we are entering the NISQ era, the search for concrete problems that can be solved by quantum computing technologies becomes critical, as it will determine the role they will play in helping society shape a better future. Combining unprecedented simulation performance with a programming and execution environment for hybrid algorithms, Atos QLM E represents a major step towards achieving near-term breakthroughs."

Atos QLM E is available in six configurations, ranging from 2 to 32 NVIDIA V100S PCIe GPUs. Atos QLM customers can upgrade to Atos QLM E at any time.

The Atos QLM user community continues to grow. Launched in 2017, the platform is being used in numerous countries worldwide, including Austria, Finland, France, Germany, India, Italy, Japan, the Netherlands, Senegal, the UK and the United States, empowering major research programs in sectors such as industry and energy. Atos' ambitious program to anticipate the future of quantum computing, the Atos Quantum program, was launched in November 2016. As a result of this initiative, Atos was the first organization to offer a quantum noisy simulation module within its Atos QLM offer.

***

About Atos

Atos is a global leader in digital transformation with 110,000 employees in 73 countries and annual revenue of €12 billion. European number one in Cloud, Cybersecurity and High-Performance Computing, the Group provides end-to-end Orchestrated Hybrid Cloud, Big Data, Business Applications and Digital Workplace solutions. The Group is the Worldwide Information Technology Partner for the Olympic & Paralympic Games and operates under the brands Atos, Atos|Syntel, and Unify. Atos is an SE (Societas Europaea), listed on the CAC40 Paris stock index.

The purpose of Atos is to help design the future of the information space. Its expertise and services support the development of knowledge, education and research in a multicultural approach and contribute to the development of scientific and technological excellence. Across the world, the Group enables its customers and employees, and members of societies at large to live, work and develop sustainably, in a safe and secure information space.

Press contact

Marion Delmas | marion.delmas@atos.net | +33 6 37 63 91 99 |

Docuseries takes viewers into the lives and labs of scientists – UChicago News

Posted: at 6:30 am

The camera crew was given full access to Earnest-Noble's research. In several scenes, Earnest-Noble is suited up in white PPE in the Pritzker Nanofabrication Facility in the Eckhardt Research Center. His scientific process and the breakthrough he seeks are depicted with animations and close-up footage of the state-of-the-art facilities. The filmmakers capture Earnest-Noble in the midst of a failed attempt or among his "graveyard" of failed quantum devices. As he embraces his doubts and is propelled by tenacity, viewers witness an emotional depiction of real science.

Earnest-Noble's lively interviews focus on the experience rather than the result of his labors, providing a realistic portrayal of graduate studies and enabling viewers to follow him toward his goal of identifying the ideal qubit for superposition, a phenomenon in quantum mechanics in which a particle can exist in several states at once.

"When we were filming, I was trying to explain a qubit or something, and how much I was using jargon words was eye-opening to me. It helped me appreciate the challenge of making science understandable," said Earnest-Noble, who is now a quantum computing researcher at IBM. "Science is a process far more than a series of facts. That became clear to me from working on this project."

"Science communications typically takes a very long struggle of discovery and wraps it up into a pretty package," said Schuster. "But something I found very special in this story is that you got to follow Nate for a couple of years. It accurately captured what Nate's experience was like. And it focused on his experience, and not on the result, which is pretty amazing."

STAGE's director of science, Sunanda Prabhu-Gaunkar, originally joined the STAGE lab as a postdoc and taught herself filmmaking in order to create the series. "The scientific process inspires our filmmaking," she said. "The workflow embraces failure, remains receptive to discoveries through iteration, and allows for risk-taking, all within a highly collaborative process."

Ellen Askey, the pilot episode's co-director, joined the project as a first-year student at UChicago with prior filmmaking experience. She worked on the series across her college career, graduating in June with a degree in cinema and media studies. "Showing a story develop over time can be powerful," she said. "We hope to get it out there to a lot of people who are and who are not yet interested in science."

Interested attendees can register through Eventbrite.

Adapted from an article by Maureen McMahon posted on the Physical Sciences Division website.

Corporate innovation weekly: the rise of hydrogen – Sifted

Posted: at 6:30 am

Automotive

Battery breakthroughs

Volkswagen invested a further $200m into QuantumScape, a US startup developing solid-state batteries. Volkswagen had already invested $100m in the company, and owned a 5% stake, but is now making a bigger bet on the technology, which promises to last longer and charge faster than the current generation of batteries.

Solid-state batteries are more expensive than current lithium-ion ones, but Volkswagen and QuantumScape have plans to set up a pilot factory to investigate how to ramp up industrial-scale production.

Meanwhile, researchers at Brown University have discovered a way to use graphene to increase the toughness of solid-state batteries.

Enel, Europe's largest utility, is set to launch a hydrogen business next year. The move comes just as Germany earmarked €9bn to expand its hydrogen capacity and the EU is expected this week to launch a hydrogen strategy with aims to turn this into a €140bn industry by 2030. There is a good summary of the state of play in this Petroleum Economist article.

Santander InnoVentures led the $40m Series D funding round for Upgrade Inc, the US mobile bank.

Spanish insurer Mutua Madrileña and Liberty Mutual Strategic Ventures, the corporate venture capital arm of Liberty Mutual Insurance, took part in the €4.47m Series A funding round for SingularCover, a startup offering personalised insurance for small businesses and freelancers.

Not all healthcare startups are booming. Proteus Digital Health, whose investors include Novartis Venture Fund, filed for bankruptcy. The startup makes pills with sensors that can monitor whether patients have taken them or not.

Medication adherence, i.e. getting patients to take pills as prescribed, is a huge problem in medicine. But it seems Proteus' solution was expensive and patients didn't react well to the idea of swallowing monitoring devices.

Boehringer Ingelheim Venture Fund was one of the investors in the $4.5m seed funding round for DiogenX, a French biotech company developing a drug that could potentially regenerate insulin-producing pancreatic beta cells in type-1 diabetics.

It is one of many approaches being pushed forward to help manage the disease better. We recently wrote about a nanofibre teabag that could be used to implant pancreatic cells in diabetic patients.

Boehringer Ingelheim Venture Fund was also an investor in Belgian biotech eTheRNA, which is developing a vaccine for Covid-19 and recently picked up a €34m Series B funding round. The company uses messenger RNA to develop the vaccine, an approach that is emerging as a front-runner in the race to develop a vaccine for the novel coronavirus.

Healthtech initial public offerings are still going strong. Repare Therapeutics, a precision oncology company backed by Bristol-Myers Squibb and Celgene, and Forma Therapeutics, a Novartis and Eli Lilly-backed company developing therapeutics to treat rare hematologic diseases and cancers, both launched successful IPOs.

C4 Therapeutics, which is backed by Roche, Novartis and Kraft Group, raised $170m to start human testing for its treatments that use cells' own natural recycling mechanisms to combat diseases.

BNP Paribas Développement was one of the investors putting a total of $9m into Linkfluence, the French social media listening company. Linkfluence uses natural language processing and image recognition to monitor consumer responses to brands.

Swedish music sampling service Tracklib has raised $4.5m from investors including the Sony Innovation Fund. Artists who have used Tracklib's catalogue of 100,000+ sample-friendly tracks to create their songs include Lil Wayne, DJ Khaled, Phantogram, Mary J. Blige and J. Cole.

IonQ, the quantum computing startup, extended its Series B funding round with an additional $7m, bringing the total invested in the company to $84m. Lockheed Martin and Robert Bosch Venture Capital were among the new investors.

IonQ is developing a version of quantum computing based on ions suspended in a vacuum by electromagnets. Many of the other quantum computing projects have gone for superconducting qubits that need to be cooled to near absolute zero, but IonQ's trapped-ion technology has the advantage that it can run at room temperature.

Bosch and Lockheed Martin have both made a number of investments in quantum computing. Bosch invested in quantum software company Zapata last year.

Purplebricks and Axel Springer were among investors putting a further €40m into Homeday, the German real estate broker.

Microsoft's M12 and Siemens' Next47 investment arm took part in the $30m Series B funding round for Wandelbots, the German robotics startup that allows industrial robots to be taught tasks without the need to write code.

The Covid-19 pandemic is causing demand for robots to surge, and the cost of robots has been coming down dramatically over the past decade. However, the time and effort needed for coding robots has still put them out of reach for many users. A no-code robot could pave the way for much broader adoption.

SLAMcore, a UK startup developing spatial awareness algorithms for robots and drones, raised $5m from investors including Toyota AI Ventures.

Red Eléctrica de España, which runs the national grid in Spain, was one of the investors in the $5m funding round for CounterCraft, a US-based cybersecurity company that turns the tables and counterattacks cyber attackers.

Kristin Aamodt is leaving Equinor Technology Ventures

Nerida Scott has been appointed Johnson & Johnson's new head of innovation for EMEA, based in London

Innovation manager, a growing fast-moving consumer goods (FMCG) business specialising in the petcare sector, Hertfordshire, UK

Director of consulting Global Hybrid Agency, London, UK

VP strategy, Huge (agency), London, UK

Strategy director, R/GA, Berlin, Germany

Strategic business developer, New Solutions & Propositions, Vattenfall (energy), Stockholm, Sweden

Director of AI, Nordic countries, Huawei, Stockholm, Sweden

A fascinating look at how the way we cook and eat is changing, by Fast Company. A huge number of new terms will have to be coined for this. Welcome to "edu-cooking" and the "groceraunt."

Research from McKinsey suggests that innovation is indeed drying up at companies as other priorities take up executive brainspace. Focus on innovation is down in every industry except for healthcare.

17 ways technology could change the world by 2025 – The European Sting

Posted: at 6:30 am

This article is brought to you thanks to the collaboration of The European Sting with the World Economic Forum.

Author: Saemoon Yoon, Community Lead, Technology Pioneers, World Economic Forum Geneva

We asked our 2020 intake of Technology Pioneers for their views on how technology will change the world in the next five years.

1. AI-optimized manufacturing

Paper-and-pencil tracking, luck, significant global travel and opaque supply chains are part of today's status quo, resulting in large amounts of wasted energy, materials and time. Accelerated in part by the long-term shutdown of international and regional travel by COVID-19, companies that design and build products will rapidly adopt cloud-based technologies to aggregate, intelligently transform, and contextually present product and process data from manufacturing lines throughout their supply chains. By 2025, this ubiquitous stream of data and the intelligent algorithms crunching it will enable manufacturing lines to continuously optimize towards higher levels of output and product quality, reducing overall waste in manufacturing by up to 50%. As a result, we will enjoy higher-quality products, produced faster, at lower cost to our pocketbooks and the environment.

Anna-Katrina Shedletsky, CEO and Founder of Instrumental

2. A far-reaching energy transformation

In 2025, carbon footprints will be viewed as socially unacceptable, much like drink driving is today. The COVID-19 pandemic will have focused the public's attention on the need to take action to deal with threats to our way of life, our health and our future. Public attention will drive government policy and behavioural changes, with carbon footprints becoming a subject of worldwide scrutiny. Individuals, companies and countries will seek the quickest and most affordable ways to achieve net-zero: the elimination of their carbon footprint. The creation of a sustainable, net-zero future will be built through a far-reaching energy transformation that significantly reduces the world's carbon emissions, and through the emergence of a massive carbon management industry that captures, utilizes and eliminates carbon dioxide. We'll see a diversity of new technologies aimed at both reducing and removing the world's emissions, unleashing a wave of innovation to compare with the industrial and digital revolutions of the past.

Steve Oldham, CEO of Carbon Engineering

3. A new era of computing

By 2025, quantum computing will have outgrown its infancy, and a first generation of commercial devices will be able to tackle meaningful, real-world problems. One major application of this new kind of computer will be the simulation of complex chemical reactions, a powerful tool that opens up new avenues in drug development. Quantum chemistry calculations will also aid the design of novel materials with desired properties, for instance better catalysts for the automotive industry that curb emissions and help fight climate change. Right now, the development of pharmaceuticals and performance materials relies massively on trial and error, which means it is an iterative, time-consuming and terribly expensive process. Quantum computers may soon be able to change this. They will significantly shorten product development cycles and reduce the costs for R&D.

Thomas Monz, Co-Founder and CEO of Alpine Quantum Technologies

4. Healthcare paradigm shift to prevention through diet

By 2025, healthcare systems will adopt more preventative health approaches based on the developing science behind the health benefits of plant-rich, nutrient-dense diets. This trend will be enabled by AI-powered and systems biology-based technology that exponentially grows our knowledge of the role of specific dietary phytonutrients in specific human health and functional outcomes. After the pandemic of 2020, consumers will be more aware of the importance of their underlying health and will increasingly demand healthier food to help support their natural defences. Armed with a much deeper understanding of nutrition, the global food industry can respond by offering a broader range of product options to support optimal health outcomes. The healthcare industry can respond by promoting earth's plant intelligence for more resilient lives and by incentivizing people to take care of themselves in an effort to reduce unsustainable costs.

Jim Flatt, Co-Founder and CEO of Brightseed

5. 5G will enhance the global economy and save lives

Overnight, we've experienced a sharp increase in delivery services, with a need for day-of goods from providers like Amazon and Instacart, but it has been limited. With 5G networks in place, tied directly into autonomous bots, goods could be delivered safely within hours.

Wi-Fi can't scale to meet higher capacity demands. Sheltering in place has moved businesses and classrooms to video conferencing, highlighting poor-quality networks. Low-latency 5G networks would resolve this lack of network reliability and even allow for more high-capacity services like telehealth, telesurgery and ER services. Businesses can offset the high cost of mobility with economy-boosting activities including smart factories, real-time monitoring and content-intensive, real-time edge-compute services. 5G private networks make this possible and change the mobile services economy.

The roll-out of 5G creates markets we can already imagine, like self-driving bots and a mobility-as-a-service economy, and others we can't yet imagine, enabling next generations to invent thriving markets and prosperous causes.

Maha Achour, Founder and CEO of Metawave

6. A new normal in managing cancer

Technology drives data, data catalyzes knowledge, and knowledge enables empowerment. In tomorrow's world, cancer will be managed like any chronic health condition: we will be able to precisely identify what we may be facing and be empowered to overcome it.

In other words, a new normal will emerge in how we can manage cancer. We will see more early and proactive screening with improved diagnostics innovation, such as better genome sequencing technology or liquid biopsy, which promise easier testing, higher accuracy and, ideally, affordable cost. Early detection and intervention in common cancer types will not only save lives but reduce the financial and emotional burden of late discovery.

We will also see a revolution in treatment propelled by technology. Gene editing and immunotherapy that bring fewer side effects will have made greater headway. With advances in early screening and treatment going hand in hand, cancer will no longer be the cursed C word that inspires such fear among people.

Sizhen Wang, CEO of Genetron Health

7. Robotics will reinvent grocery retail

Historically, robotics has turned around many industries, while a few select sectors, such as grocery retail, have remained largely untouched. With the use of a new robotics application called microfulfillment, grocery retailing will no longer look the same. The use of robotics downstream at a hyper-local level (as opposed to the traditional upstream application in the supply chain) will disrupt this 100-year-old, $5 trillion industry, and all its stakeholders will experience significant change. Retailers will operate at a higher order of magnitude on productivity, which will in turn result in positive and enticing returns in the online grocery business (unheard of at the moment). This technology also unlocks broader access to food and a better customer proposition for consumers at large: speed, product availability and cost. Microfulfillment centers are located in existing (and typically less productive) real estate at the store level and can operate 5-10% more cheaply than a brick-and-mortar store. We predict that value will be captured equally by retailers and consumers as online grocery grows.

Jose Aguerrevere, Co-Founder, Chairman and CEO of Takeoff Technologies

8. A blurring of physical and virtual spaces

One thing the current pandemic has shown us is how important technology is for maintaining and facilitating communication, not simply for work purposes, but for building real emotional connections. In the next few years we can expect to see this progress accelerate, with AI technology built to connect people at a human level and drive them closer to each other, even when physically they're apart. The line between physical and virtual space will forever be blurred. We'll start to see capabilities for global events, from SXSW to the Glastonbury Festival, to provide fully digitalized alternatives, beyond simple live streaming and into full experiences. However, it's not as simple as just providing these services: data privacy will have to be prioritised in order to create confidence among consumers. At the beginning of the COVID-19 pandemic we saw a lot in the news about concerns over the security of video conferencing companies. These concerns aren't going anywhere, and as digital connectivity increases, brands simply can't afford to give users anything less than full transparency and control over their data.

Tugce Bulut, CEO of Streetbees

9. Putting individuals not institutions at the heart of healthcare

By 2025, the lines separating culture, information technology and health will be blurred. Engineering biology, machine learning and the sharing economy will establish a framework for decentralising the healthcare continuum, moving it from institutions to the individual. Propelling this forward are advances in artificial intelligence and new supply chain delivery mechanisms, which require the real-time biological data that engineering biology will deliver as simple, low-cost diagnostic tests to individuals in every corner of the globe. As a result, morbidity, mortality and costs will decrease in acute conditions, such as infectious diseases, because only the most severe cases will need additional care. Fewer infected people will leave their homes, dramatically altering disease epidemiology while decreasing the burden on healthcare systems. A corresponding decrease in costs and increase in the quality of care follows, as inexpensive diagnostics move expenses and power to the individual, simultaneously increasing the cost-efficiency of care. Inextricable links between health, socio-economic status and quality of life will begin to loosen, and tensions that exist by equating health with access to healthcare institutions will dissipate. From daily care to pandemics, these converging technologies will alter economic and social factors to relieve many pressures on the global human condition.

Rahul Dhanda, Co-Founder and CEO of Sherlock Biosciences

10. The future of construction has already begun

Construction will become a synchronized sequence of manufacturing processes, delivering control, change and production at scale. It will be a safer, faster and more cost-effective way to build the homes, offices, factories and other structures we need to thrive in cities and beyond. As rich datasets are created across the construction industry through the internet of things, AI and image capture, to name a few, this vision is already coming to life. Using data to deeply understand industry processes is profoundly enhancing the ability of field professionals to trust their instincts in real-time decision making, enabling learning and progress while gaining trust and adoption.

Actionable data sheds light where we could not see before, empowering leaders to manage projects proactively rather than reactively. Precision in planning and execution enables construction professionals to control the environment, instead of it controlling them, and creates repeatable processes that are easier to control, automate, and teach.

That's the future of construction. And it's already begun.

Meirav Oren, CEO and Co-Founder of Versatile

11. Gigaton-scale CO2 removal will help to reverse climate change

A scale-up of negative emission technologies, such as carbon dioxide removal, will remove climate-relevant amounts of CO2 from the air. This will be necessary in order to limit global warming to 1.5°C. While humanity will do everything possible to stop emitting more carbon into the atmosphere, it will also do everything it can to remove historic CO2 from the air permanently. By becoming widely accessible, the demand for CO2 removal will increase and costs will fall. CO2 removal will be scaled up to the gigaton level, and will become the responsible option for removing unavoidable emissions from the air. It will empower individuals to have a direct and climate-positive impact on the level of CO2 in the atmosphere. It will ultimately help to prevent global warming from reaching dangerous levels and give humanity the potential to reverse climate change.

Jan Wurzbacher, Co-Founder and co-CEO of Climeworks

12. A new era in medicine

Medicine has always been on a quest to gather more knowledge and understanding of human biology for better clinical decision-making. AI is the new tool that will enable us to extract insights at an unprecedented level from all the medical big data that has never been fully taken advantage of in the past. It will shift the world of medicine and how it is practiced.

Brandon Suh, CEO of Lunit

13. Closing the wealth gap

Improvements in AI will finally put access to wealth creation within reach of the masses. Financial advisors, who are knowledge workers, have been the mainstay of wealth management: using customized strategies to grow a small nest egg into a larger one. Since knowledge workers are expensive, access to wealth management has often meant you already need to be wealthy to preserve and grow your wealth. As a result, historically, wealth management has been out of reach of those who needed it most. Artificial intelligence is improving at such a speed that the strategies employed by these financial advisors will be accessible via technology, and therefore affordable for the masses. Just like you don't need to know how near-field communication works to use Apple Pay, tens of millions of people won't have to know modern portfolio theory to be able to have their money work for them.

Atish Davda, Co-Founder and CEO of Equityzen

14. A clean energy revolution supported by digital twins

Over the next five years, the energy transition will reach a tipping point. The cost of new-build renewable energy will be lower than the marginal cost of fossil fuels. A global innovation ecosystem will have provided an environment in which problems can be addressed collectively, and allowed for the deployment of innovation to be scaled rapidly. As a result, we will have seen an astounding increase in offshore wind capacity. We will have achieved this through an unwavering commitment to digitalization, which will have gathered a pace that aligns with Moore's law to mirror solar's innovation curve. The rapid development of digital twins (virtual replicas of physical devices) will support a systems-level transformation of the energy sector. The scientific machine learning that combines physics-based models with big data will lead to leaner designs, lower operating costs and ultimately clean, affordable energy for all. The ability to monitor structural health in real time and fix things before they break will result in safer, more resilient infrastructure and everything from wind farms to bridges and unmanned aerial vehicles being protected by a real-time digital twin.

Thomas Laurent, CEO of Akselos

15. Understanding the microscopic secrets hidden on surfaces

Every surface on Earth carries hidden information that will prove essential for avoiding pandemic-related crises, both now and in the future. The built environment, where humans spend 90% of their lives, is laden with naturally occurring microbiomes comprised of bacterial, fungal and viral ecosystems. Technology that accelerates our ability to rapidly sample, digitalize and interpret microbiome data will transform our understanding of how pathogens spread. Exposing this invisible microbiome data layer will identify genetic signatures that can predict when and where people and groups are shedding pathogens, which surfaces and environments present the highest transmission risk, and how these risks are impacted by our actions and change over time. We are just scratching the surface of what microbiome data insights offer and will see this accelerate over the next five years. These insights will not only help us avoid and respond to pandemics, but will influence how we design, operate and clean environments like buildings, cars, subways and planes, in addition to how we support economic activity without sacrificing public health.

Jessica Green, Co-Founder and CEO of Phylagen

16. Machine learning and AI expedite decarbonization in carbon-heavy industries

Over the next five years, carbon-heavy industries will use machine learning and AI technology to dramatically reduce their carbon footprint. Traditionally, industries like manufacturing and oil and gas have been slow to implement decarbonization efforts as they struggle to maintain productivity and profitability while doing so. However, climate change, as well as regulatory pressure and market volatility, are pushing these industries to adjust. For example, oil and gas and industrial manufacturing organizations are feeling the pinch of regulators, who want them to significantly reduce CO2 emissions within the next few years. Technology-enabled initiatives were vital to boosting decarbonization efforts in sectors like transportation and buildings, and heavy industries will follow a similar approach. Indeed, as a result of increasing digital transformation, carbon-heavy sectors will be able to utilize advanced technologies, like AI and machine learning, using real-time, high-fidelity data from billions of connected devices to efficiently and proactively reduce harmful emissions and decrease carbon footprints.

David King, CEO of FogHorn Systems

17. Privacy is pervasive and prioritized

Despite the accelerating regulatory environments we've seen surface in recent years, we are now just seeing the tip of the privacy iceberg, both from a regulatory and consumer standpoint. Five years from now, privacy and data-centric security will have reached commodity status, and the ability for consumers to protect and control sensitive data assets will be viewed as the rule rather than the exception. As awareness and understanding continue to build, so will the prevalence of privacy-preserving and enhancing capabilities, namely privacy-enhancing technologies (PET). By 2025, PET as a technology category will become mainstream. They will be a foundational element of enterprise privacy and security strategies rather than an added-on component integrated only to meet a minimum compliance threshold. While the world will still lack a global privacy standard, organizations will embrace a data-centric approach to security that provides the flexibility necessary to adapt to regional regulations and consumer expectations. These efforts will be led by cross-functional teams representing the data, privacy and security interests within an organization.

Ellison Anne Williams, Founder and CEO of Enveil

How will technology change the world in the next five years?

It is very exciting to see the pace and transformative potential of today's innovative technologies being applied to solve the world's most pressing problems, such as feeding a global and growing population; improving access to and quality of healthcare; and significantly reducing carbon emissions to arrest the negative effects of climate change. The next five years will see profound improvements in addressing these challenges as entrepreneurs, the investment community and the world's largest enterprise R&D organizations focus on developing and deploying solutions that will deliver tangible results.

While the COVID-19 pandemic has provided a difficult lesson in just how susceptible our world is today to human and economic turmoil, it has also, perhaps for the first time in history, necessitated global collaboration, data transparency and speed at the highest levels of government in order to minimize an immediate threat to human life. History will be our judge, but despite the heroic resolve and resiliency on a country-by-country basis, as a world we have underperformed. As a global community and through platforms like the World Economic Forum, we must continue to bring visibility to these issues while recognizing and supporting the opportunities for technology and innovation that can best and most rapidly address them.

Continue reading here:

17 ways technology could change the world by 2025 - The European Sting


AI fuels boom in innovation, investment and jobs in Canada: U of T report – News@UofT

Posted: at 6:30 am

Canada's artificial intelligence sector is fuelling innovation, job creation and private sector investment, and University of Toronto researchers and entrepreneurs are playing a central role in that success, according to a report based on data compiled by Ottawa's Global Advantage Consulting Group.

The report, prepared by U of T, found that Canada's unique combination of public investment, private capital, research capacity and talent has generated over 50,000 jobs and attracted nearly $3 billion in investment since 2010, with the number of active AI firms in Canada doubling to more than 670 since 2015.

U of T alone has produced 81 active AI startups, according to Global Advantage, a research and analytics firm that provides ecosystem mapping and analysis services to private and public sector clients. In total, AI-powered startups connected to U of T have raised $183 million in funding and created over 600 jobs in the last five years, the report says.

Vivek Goel, U of T's vice-president, research and innovation, and strategic initiatives, says the report offers further evidence of the success of the federal government's Pan-Canadian AI Strategy, launched in 2017 with a $125-million commitment over five years.

"Canada, and Toronto in particular, have long been recognized as global hubs of AI research thanks to the pioneering work of people like [U of T University Professor Emeritus] Geoffrey Hinton, but in the past many people trained in U of T's machine learning group ended up going abroad to work for big tech companies," says Goel.

"This report shows that the Pan-Canadian AI Strategy has helped create the conditions necessary to retain that talent in Canada, presenting opportunities to be involved in further research and training so we can have the talent supply needed to fuel Canadian research, innovation and application in business sectors."

As an example, Goel cites the impact of the Vector Institute for Artificial Intelligence, launched three years ago with $50 million in support from the government of Ontario and another $80 million from industry partners, in "taking what was happening in the university, connecting it with the business community and getting it out into the marketplace before people in other countries could do it."

The investment in Canada's AI research foundation is now yielding important applications and advances in the fight to contain and treat the COVID-19 virus. A few of the research projects detailed in the report, titled "Canada's AI Ecosystem: Government Investment Propels Private Sector Growth," include:

Health-related AI applications have drawn particular interest from private sector investors, with over 15 per cent of AI-related private investment between 2015 and 2019 going to companies operating in health care and related areas like cloud computing and cybersecurity.

Deep Genomics, an AI-powered drug discovery startup co-founded by U of T Professor Brendan Frey of the Edward S. Rogers Sr. department of electrical and computer engineering in the Faculty of Applied Science & Engineering, is one of many AI startups in the health-care space. The company has so far raised $61 million, including a recent $40-million Series B financing, as it works to develop a drug candidate for Wilson disease, a rare and potentially fatal genetic disorder, based on calculations performed by its systems.

Brendan Frey of the Faculty of Applied Science & Engineering co-founded Deep Genomics, which has so far raised $61 million as it works to develop a drug candidate for a rare and potentially fatal genetic disorder (photo by Johnny Guatto)

U of T startups are also applying AI to a variety of other problems, from medical imaging to quantum computing and consumer research. The report underlines that Canada is an innovation leader in the AI sector, producing the most AI patents per million people among the G7 countries and China, while Toronto has attracted the densest cluster of AI startups in the world.

Blue J Legal, a Toronto startup co-founded by three members of U of Ts Faculty of Law, uses AI to predict the outcomes of tax and employment law cases. The company launched its Canadian tax law product in 2017 and employment law offering in 2018. In 2019, it expanded into the U.S. tax law market, where it has already secured more than a dozen law firms as clients.

Both Blue J Legal and Deep Genomics emerged from U of T's expansive entrepreneurship ecosystem, having received early support from CDL and UTEST, two of U of T's many startup accelerators.

Several of U of T's AI startups will be in focus this week during Collision at Home, the online edition of one of the world's fastest-growing tech conferences, which draws speakers, entrepreneurs, inventors, investors and business leaders from around the world. The event is being held virtually this year because of the COVID-19 pandemic. But U of T entrepreneurs, researchers and students will still have a major presence, with more than two dozen U of T startups scheduled to participate.

A rendering of the Schwartz Reisman Innovation Centre on College Street (rendering by Weis/Manfredi)

In all, U of T's AI programs attracted $244 million in research funding between 2015 and 2019, a period that saw substantial increases in funding for AI research from the federal government. This, in turn, has allowed Canada to outperform many countries in two key metrics: field-weighted citation impact of AI research publications and academic-corporate collaborations, the report says.

The report also notes that the expanding AI ecosystem around U of T is attracting philanthropists in addition to investors. That includes a $100-million gift to U of T from business leaders Heather Reisman and Gerald Schwartz. The money is being used to construct the Schwartz Reisman Innovation Centre, a 750,000-square-foot complex that will anchor U of T's cluster of AI and biomedical researchers, as well as entrepreneurs and their startups. It will also be home to the Schwartz Reisman Institute for Technology and Society, which will explore the social implications of AI and other emerging technologies.

"If you look at the most successful innovation ecosystems, they always have an anchor academic institution, a leading global university; they always have anchor multinational corporations and they have a thriving startup ecosystem," Goel says.

"We have all those pieces coming together around the AI and tech sector here."

Read more here:

AI fuels boom in innovation, investment and jobs in Canada: U of T report - News@UofT


To live up to the hype, quantum computers must repair their error problems – Science News

Posted: June 22, 2020 at 2:45 pm

Astronaut John Glenn was wary about trusting a computer.

It was 1962, early in the computer age, and a room-sized machine had calculated the flight path for his upcoming orbit of Earth the first for an American. But Glenn wasnt willing to entrust his life to a newfangled machine that might make a mistake.

The astronaut requested that mathematician Katherine Johnson double-check the computer's numbers, as recounted in the book Hidden Figures. "If she says they're good," Glenn reportedly said, "then I'm ready to go." Johnson determined that the computer, an IBM 7090, was correct, and Glenn's voyage became a celebrated milestone of spaceflight (SN: 3/3/62, p. 131).

A computer that is even slightly error-prone can doom a calculation. Imagine a computer with 99 percent accuracy. Most of the time the computer tells you 1+1=2. But once every 100 calculations, it flubs: 1+1=3. Now, multiply that error rate by the billions or trillions of calculations per second possible in a typical modern computer. For complex computations, a small probability for error can quickly generate a nonsense answer. If NASA had been relying on a computer that glitchy, Glenn would have been right to be anxious.
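The compounding effect is easy to check numerically; here is a minimal sketch (the 99 percent figure is the article's illustrative number, not any real machine's specification):

```python
# Probability that a computation finishes without a single error,
# given a fixed per-operation success rate.
def p_error_free(per_op_accuracy: float, n_ops: int) -> float:
    return per_op_accuracy ** n_ops

# At 99 percent accuracy per operation, reliability collapses quickly:
print(p_error_free(0.99, 100))     # ~0.37: only a third of short runs survive
print(p_error_free(0.99, 10_000))  # ~2e-44: long runs essentially never do
```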

Luckily, modern computers are very reliable. But the era of a new breed of powerful calculator is dawning. Scientists expect quantum computers to one day solve problems vastly too complex for standard computers (SN: 7/8/17, p. 28).

Current versions are relatively wimpy, but with improvements, quantum computers have the potential to search enormous databases at lightning speed, or quickly factor huge numbers that would take a normal computer longer than the age of the universe. The machines could calculate the properties of intricate molecules or unlock the secrets of complicated chemical reactions. That kind of power could speed up the discovery of lifesaving drugs or help slash energy requirements for intensive industrial processes such as fertilizer production.

But there's a catch: unlike today's reliable conventional computers, quantum computers must grapple with major error woes. And the quantum calculations scientists envision are complex enough to be impossible to redo by hand, as Johnson did for Glenn's ambitious flight.

If errors aren't brought under control, scientists' high hopes for quantum computers could come crashing down to Earth.

Conventional computers, which physicists call classical computers to distinguish them from the quantum variety, are resistant to errors. In a classical hard drive, for example, the data are stored in bits, 0s or 1s that are represented by magnetized regions consisting of many atoms. That large group of atoms offers a built-in redundancy that makes classical bits resilient. Jostling one of the bit's atoms won't change the overall magnetization of the bit and its corresponding value of 0 or 1.

But quantum bits, or qubits, are inherently fragile. They are made from sensitive substances such as individual atoms, electrons trapped within tiny chunks of silicon called quantum dots, or small bits of superconducting material, which conducts electricity without resistance. Errors can creep in as qubits interact with their environment, potentially including electromagnetic fields, heat or stray atoms or molecules. If a single atom that represents a qubit gets jostled, the information the qubit was storing is lost.

Additionally, each step of a calculation has a significant chance of introducing error. As a result, for complex calculations, "the output will be garbage," says quantum physicist Barbara Terhal of the research center QuTech in Delft, Netherlands.

Before quantum computers can reach their much-hyped potential, scientists will need to master new tactics for fixing errors, an area of research called quantum error correction. The idea behind many of these schemes is to combine multiple error-prone qubits to form one more reliable qubit. The technique battles what seems to be a natural tendency of the universe: quantum things eventually lose their quantumness through interactions with their surroundings, a relentless process known as decoherence.

"It's like fighting erosion," says Ken Brown, a quantum engineer at Duke University. But quantum error correction provides a way to control the seemingly uncontrollable.


Quantum computers gain their power from the special rules that govern qubits. Unlike classical bits, which have a value of either 0 or 1, qubits can take on an intermediate state called a superposition, meaning they hold a value of 0 and 1 at the same time. Additionally, two qubits can be entangled, with their values linked as if they are one entity, despite sitting on opposite ends of a computer chip.
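Both properties can be illustrated with a toy state-vector simulation in plain Python. This is the textbook Bell-state construction, not the behavior of any particular machine, and it only mimics measurement statistics:

```python
import random
from math import sqrt

# Two-qubit state over the basis |00>, |01>, |10>, |11>.
# A Bell state: an equal superposition of |00> and |11>, i.e. two
# entangled qubits whose values are linked.
bell = [1 / sqrt(2), 0.0, 0.0, 1 / sqrt(2)]

def measure(state):
    """Sample one outcome; probabilities are squared amplitudes."""
    r, acc = random.random(), 0.0
    for i, amp in enumerate(state):
        acc += abs(amp) ** 2
        if r < acc:
            return format(i, "02b")
    return format(len(state) - 1, "02b")

# Each qubit alone is random, but the two always agree:
outcomes = {measure(bell) for _ in range(1000)}
print(outcomes)  # only '00' and '11' ever appear, never '01' or '10'
```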

These unusual properties give quantum computers their game-changing method of calculation. Different possible solutions to a problem can be considered simultaneously, with the wrong answers canceling one another out and the right one being amplified. That allows the computer to quickly converge on the correct solution without needing to check each possibility individually.

The concept of quantum computers began gaining steam in the 1990s, when MIT mathematician Peter Shor, then at AT&T Bell Laboratories in Murray Hill, N.J., discovered that quantum computers could quickly factor large numbers (SN Online: 4/10/14). That was a scary prospect for computer security experts, because the fact that such a task is difficult is essential to the way computers encrypt sensitive information. Suddenly, scientists urgently needed to know if quantum computers could become reality.

Shor's idea was theoretical; no one had demonstrated that it could be done in practice. Qubits might be too temperamental for quantum computers to ever gain the upper hand. "It may be that the whole difference in the computational power depends on this extreme accuracy, and if you don't have this extreme accuracy, then this computational power disappears," says theoretical computer scientist Dorit Aharonov of Hebrew University of Jerusalem.

But soon, scientists began coming up with error-correction schemes that theoretically could fix the mistakes that slip into quantum calculations and put quantum computers on more solid footing.

For classical computers, correcting errors, if they do occur, is straightforward. One simple scheme goes like this: If your bit is a 1, just copy that three times for 111. Likewise, 0 becomes 000. If one of those bits is accidentally flipped (say, 111 turns into 110), the three bits will no longer match, indicating an error. By taking the majority, you can determine which bit is wrong and fix it.
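The scheme described above is the classical three-bit repetition code; a minimal sketch:

```python
def encode(bit: int) -> list[int]:
    # Triplicate the bit: 1 -> [1, 1, 1], 0 -> [0, 0, 0].
    return [bit] * 3

def correct(bits: list[int]) -> int:
    # Majority vote recovers the value if at most one bit flipped.
    return 1 if sum(bits) >= 2 else 0

codeword = encode(1)      # [1, 1, 1]
codeword[2] = 0           # a single accidental bit flip: [1, 1, 0]
print(correct(codeword))  # 1: the mismatch reveals and fixes the error
```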

But for quantum computers, the picture is more complex, for several reasons. First, a principle of quantum mechanics called the no-cloning theorem says that it's impossible to copy an arbitrary quantum state, so qubits can't be duplicated.

Secondly, making measurements to check the values of qubits wipes their quantum properties. If a qubit is in a superposition of 0 and 1, measuring its value will destroy that superposition. It's like opening the box that contains Schrödinger's cat. This imaginary feline of quantum physics is famously both dead and alive when the box is closed, but opening it results in a cat that's entirely dead or entirely alive, no longer in both states at once (SN: 6/25/16, p. 9).

So schemes for quantum error correction apply some work-arounds. Rather than making outright measurements of qubits to check for errors (opening the box on Schrödinger's cat), scientists perform indirect measurements, which "measure what error occurred, but leave the actual information [that] you want to maintain untouched and unmeasured," Aharonov says. For example, scientists can check if the values of two qubits agree with one another without measuring their values. It's like checking whether two cats in boxes are in the same state of existence without determining whether they're both alive or both dead.

And rather than directly copying qubits, error-correction schemes store data in a redundant way, with information spread over multiple entangled qubits, collectively known as a logical qubit. When individual qubits are combined in this way, the collective becomes more powerful than the sum of its parts. It's a bit like a colony of ants. Each individual ant is relatively weak, but together, they create a vibrant superorganism.

Those logical qubits become the error-resistant qubits of the final computer. If your program requires 10 qubits to run, that means it needs 10 logical qubits, which could require a quantum computer with hundreds or even hundreds of thousands of the original, error-prone physical qubits. To run a really complex quantum computation, millions of physical qubits may be necessary, more plentiful than the ants that discovered a slice of last night's pizza on the kitchen counter.

Creating that more powerful, superorganism-like qubit is the next big step in quantum error correction. Physicists have begun putting together some of the pieces needed, and hope for success in the next few years.

Massive excitement accompanied last year's biggest quantum computing milestone: quantum supremacy. Achieved by Google researchers in October 2019, it marked the first time a quantum computer was able to solve a problem that is impossible for any classical computer (SN Online: 10/23/19). But the need for error correction means there's still a long way to go before quantum computers hit their stride.

Sure, Google's computer was able to solve a problem in 200 seconds that the company claimed would have taken the best classical computer 10,000 years. But the task, related to the generation of random numbers, wasn't useful enough to revolutionize computing. And it was still based on relatively imprecise qubits. That won't cut it for the most tantalizing and complex tasks, like faster database searches. "We need a very small error rate to run these long algorithms, and you only get those with error correction in place," says physicist and computer scientist Hartmut Neven, leader of Google's quantum efforts.

So Neven and colleagues have set their sights on an error-correction technique called the surface code. The most buzzed-about scheme for error correction, the surface code is ideal for superconducting quantum computers, like the ones being built by companies including Google and IBM (the same company whose pioneering classical computer helped put John Glenn into space). The code is designed for qubits that are arranged in a 2-D grid in which each qubit is directly connected to neighboring qubits. That, conveniently, is the way superconducting quantum computers are typically laid out.

As in an ant colony with workers and soldiers, the surface code requires that different qubits have different jobs. Some are data qubits, which store information, and others are helper qubits, called ancillas. Measurements of the ancillas allow for checking and correcting of errors without destroying the information stored in the data qubits. The data and ancilla qubits together make up one logical qubit with, hopefully, a lower error rate. The more data and ancilla qubits that make up each logical qubit, the more errors that can be detected and corrected.

In 2015, Google researchers and colleagues performed a simplified version of the surface code, using nine qubits arranged in a line. That setup, reported in Nature, could correct a type of error called a bit-flip error, akin to a 0 going to a 1. A second type of error, a phase flip, is unique to quantum computers, and effectively inserts a negative sign into the mathematical expression describing the qubits state.
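The bit-flip scheme in that 2015 demonstration is a repetition code. A purely classical cartoon of a three-qubit version is sketched below (illustrative only: real qubits hold quantum amplitudes that cannot be copied or read out directly, which is why the hardware measures only parities, never the data itself):

```python
def measure_syndrome(data):
    """Parity checks between neighboring data bits; in hardware, these
    parities are what the ancilla qubits report, without exposing the data."""
    return (data[0] ^ data[1], data[1] ^ data[2])

def correct_bit_flip(data):
    """Map each syndrome to the single bit most likely to have flipped."""
    flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(measure_syndrome(data))
    if flip is not None:
        data[flip] ^= 1
    return data

# A logical 0 stored as [0, 0, 0] survives any single bit flip:
print(correct_bit_flip([0, 1, 0]))  # -> [0, 0, 0]
```

The same repeat-the-data idea underlies the surface code, which additionally arranges the parity checks on a 2-D grid so that phase flips can be caught too.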

Now, researchers are tackling both types of errors simultaneously. Andreas Wallraff, a physicist at ETH Zurich, and colleagues showed that they could detect bit- and phase-flip errors using a seven-qubit computer. They could not yet correct those errors, but they could pinpoint cases where errors occurred and would have ruined a calculation, the team reported in a paper published June 8 in Nature Physics. That's an intermediate step toward fixing such errors.

But to move forward, researchers need to scale up. The minimum number of qubits needed to do the real-deal surface code is 17. With that, a small improvement in the error rate could be achieved, theoretically. But in practice, it will probably require 49 qubits before there's any clear boost to the logical qubit's performance. That level of error correction should noticeably extend the time before errors overtake the qubit. With the largest quantum computers now reaching 50 or more physical qubits, quantum error correction is almost within reach.

IBM is also working to build a better qubit. In addition to the errors that accrue while calculating, mistakes can occur when preparing the qubits, or reading out the results, says physicist Antonio Córcoles of IBM's Thomas J. Watson Research Center in Yorktown Heights, N.Y. He and colleagues demonstrated that they could detect errors made when preparing the qubits, the process of setting their initial values, the team reported in 2017 in Physical Review Letters. Córcoles looks forward to a qubit that can recover from all these sorts of errors. "Even if it's only a single logical qubit, that will be a major breakthrough," Córcoles says.

In the meantime, IBM, Google and other companies still aim to make their computers useful for specific applications where errors aren't deal breakers: simulating certain chemical reactions, for example, or enhancing artificial intelligence. But the teams continue to chase the error-corrected future of quantum computing.

It's been a long slog to get to the point where doing error correction is even conceivable. Scientists have been slowly building up the computers, qubit by qubit, since the 1990s. "One thing is for sure: Error correction seems to be really hard for anybody who gives it a serious try," Wallraff says. "Lots of work is being put into it and creating the right amount of progress seems to take some time."

For error correction to work, the original, physical qubits must stay below a certain level of flakiness, called a threshold. "Above this critical number, error correction is just going to make life worse," Terhal says. Different error-correction schemes have different thresholds. One reason the surface code is so popular is that it has a high threshold for error. It can tolerate relatively fallible qubits.

Imagine you're really bad at arithmetic. To sum up a sequence of numbers, you might try adding them up several times, and picking the result that came up most often.

Let's say you do the calculation three times, and two out of three of your calculations agree. You'd assume the correct solution was the one that came up twice. But what if you were so error-prone that you accidentally picked the one that didn't agree? Trying to correct your errors could then do more harm than good, Terhal says.
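That repeat-and-vote strategy is easy to sketch. In the toy Python below (illustrative only: the error_rate parameter and the plus-or-minus-one slip are made-up stand-ins for real arithmetic mistakes), voting helps only while errors stay rare:

```python
import random
from collections import Counter

def noisy_sum(numbers, error_rate):
    """Add the numbers, but occasionally botch the result by one."""
    total = sum(numbers)
    if random.random() < error_rate:
        total += random.choice([-1, 1])  # an arithmetic slip
    return total

def vote(numbers, tries=3, error_rate=0.1):
    """Repeat the noisy calculation and keep the most common answer."""
    results = [noisy_sum(numbers, error_rate) for _ in range(tries)]
    return Counter(results).most_common(1)[0][0]
```

With error_rate near zero, the vote almost always returns the true sum; push the rate high enough and the vote itself starts picking a wrong answer, which is exactly Terhal's warning about trying to correct errors with components that are too flaky.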


The error-correction method scientists choose must not introduce more errors than it corrects, and it must correct errors faster than they pop up. But according to a concept known as the threshold theorem, discovered in the 1990s, below a certain error rate, error correction can be helpful. It wont introduce more errors than it corrects. That discovery bolstered the prospects for quantum computers.

"The fact that one can actually hope to get below this threshold is one of the main reasons why people started to think that these computers could be realistic," says Aharonov, one of several researchers who developed the threshold theorem.

The surface code's threshold demands qubits that err a bit less than 1 percent of the time. Scientists recently reached that milestone with some types of qubits, raising hopes that the surface code can be made to work in real computers.
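The payoff of sitting below threshold shows up in the standard rule of thumb for surface-code performance, in which the logical error rate falls off exponentially with the code distance d (the 0.1 prefactor below is an illustrative placeholder, not a measured value):

```python
def logical_error_rate(p, p_th=0.01, d=3):
    """Rule-of-thumb surface-code scaling: below the threshold p_th,
    a bigger code distance d suppresses errors; above it, growing the
    code only makes things worse."""
    return 0.1 * (p / p_th) ** ((d + 1) / 2)

for d in (3, 5, 7):
    below = logical_error_rate(0.005, d=d)  # p < p_th: shrinks as d grows
    above = logical_error_rate(0.02, d=d)   # p > p_th: grows as d grows
    print(d, below, above)
```

This is why the roughly 1 percent threshold matters so much: on one side of it, piling on more physical qubits buys exponentially better logical qubits; on the other side, it buys nothing.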

But the surface code has a problem: To improve the ability to correct errors, each logical qubit needs to be made of many individual physical qubits, like a populous ant colony. And scientists will need many of these superorganism-style logical qubits, meaning millions of physical qubits, to do many interesting computations.

Since quantum computers currently top out at fewer than 100 qubits (SN: 3/31/18, p. 13), the days of million-qubit computers are far in the future. So some researchers are looking at a method of error correction that wouldnt require oodles of qubits.

"Everybody's very excited, but there's these questions about, 'How long is it going to take to scale up to the stage where we'll have really robust computations?'" says physicist Robert Schoelkopf of Yale University. "Our point of view is that actually you can make this task much easier, but you have to be a little bit more clever and a little bit more flexible about the way you're building these systems."

Schoelkopf and colleagues use small, superconducting microwave cavities that allow particles of light, or photons, to bounce back and forth inside. The numbers of photons within the cavities serve as qubits that encode the data. For example, two photons bouncing around in the cavity might represent a qubit with a value of 0, and four photons might indicate a value of 1. In these systems, the main type of error that can occur is the loss of a photon. Superconducting chips interface with those cavities and are used to perform operations on the qubits and scout for errors. Checking whether the number of photons is even or odd can detect that type of error without destroying the data.

Using this method, Schoelkopf and colleagues reported in 2016 in Nature that they could perform error correction that reaches the break-even point. The qubit is just beginning to show signs that it performs better with error correction.

"To me," Aharonov says, "whether you actually can correct errors is part of a bigger issue." The physics that occurs on small scales is vastly different from what we experience in our daily lives. Quantum mechanics seems to allow for a totally new kind of computation. Error correction is key to understanding whether that dramatically more powerful type of calculation is truly possible.

Scientists believe that quantum computers will prove themselves to be fundamentally different from the computer that helped Glenn make it into orbit during the space race. This time, the moon shot is to show that hunch is right.

See the original post:

To live up to the hype, quantum computers must repair their error problems - Science News


Honeywell Says It Has Built The World's Most Powerful Quantum Computer – Forbes

Posted: at 2:45 pm

Honeywell says its new quantum computer is twice as fast as any other machine.

In the race to the future of quantum computing, Honeywell has just secured a fresh lead.

The North Carolina-based conglomerate announced Thursday that it has produced the world's fastest quantum computer, at least twice as powerful as the existing computers operated by IBM and Google.

The machine, located in a 1,500-square-foot high-security facility in Boulder, Colorado, consists of a stainless steel chamber about the size of a basketball that is cooled by liquid helium to a temperature just above absolute zero, the point at which atomic motion all but stops. Within that chamber, individual atoms floating above a computer chip are targeted with lasers to perform calculations.

While the potential of quantum computing, that is, of building machines able to complete calculations beyond the limits of classical computers and supercomputers, has been studied for decades, the sector has until recently been limited to the intrigue of research groups at tech companies such as IBM and Google.

But in the past year, the race between those companies to claim supremacy and provide a commercial use in the quantum race has become heated. Honeywell's machine has achieved a Quantum Volume of 64, a metric devised by IBM that measures the capability of the machine and its error rates, but is also difficult to decipher (and, as quantum computing expert Scott Aaronson wrote in March, is potentially possible to game). By comparison, IBM announced in January that it had achieved a Quantum Volume of 32 with its newest machine, Raleigh.
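Quantum Volume is reported as a power of two: roughly speaking, QV = 2**n means the machine can successfully run "square" random circuits that are n qubits wide and n layers deep (the full IBM definition also involves a statistical heavy-output test, omitted in this sketch):

```python
import math

def qv_circuit_size(quantum_volume):
    """Recover the square-circuit size n from a reported QV = 2**n."""
    return int(math.log2(quantum_volume))

print(qv_circuit_size(64))  # Honeywell's claim: 6-qubit, depth-6 circuits
print(qv_circuit_size(32))  # IBM's Raleigh: 5-qubit, depth-5 circuits
```

Seen this way, doubling Quantum Volume corresponds to reliably running circuits just one qubit wider and one layer deeper, which is why the single-number metric can be hard to interpret on its own.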

Google has also spent significant resources on developing its quantum capabilities, and in October said it had developed a machine that completed, in just 200 seconds, a calculation that would have taken a supercomputer 10,000 years to process. (IBM disputed Google's claim, saying the calculation would have taken only 2.5 days to complete.)

Honeywell has been working toward this goal for the past decade, since it began developing the technology to produce cryogenics and laser tools. In the past five years, the company assembled a team of more than 100 technologists entirely dedicated to building the machine, and in March, Honeywell announced it would do so within three months, a goal it was able to meet even as the Covid-19 pandemic turned its workforce upside down and forced some employees to work remotely. "We had to completely redesign how we work in the facilities, had to limit who was coming on the site, and put in place physical barriers," says Tony Uttley, president of Honeywell Quantum Solutions. "All of that happened at the same time we were planning on being on this race."

The advancement also means that Honeywell is opening its computer to companies looking to execute their own unimaginably large calculations, a service that can cost about $10,000 an hour, says Uttley. While it won't disclose how many customers it has, Honeywell did say that it has a contract with JPMorgan Chase, which has its own quantum experts who will use the machine to execute gargantuan tasks, such as building fraud detection models. Companies without in-house quantum experts can submit queries through the intermediary quantum firms Zapata Computing and Cambridge Quantum Computing.

With greater access to the technology, Uttley says, quantum computers are nearing the point where they graduate from objects of fascination to tools for solving problems like climate change and pharmaceutical development. Going forward, Uttley says Honeywell plans to increase the Quantum Volume of its machine by a factor of 10 every year for the next five years, reaching a figure of 640,000, a capability far beyond anything available today.

Read more here:

Honeywell Says It Has Built The Worlds Most Powerful Quantum Computer - Forbes


Two-Electron Qubit Points the Way to Scaling up Quantum Computers, According to RIKEN Research – HPCwire

Posted: at 2:45 pm

June 22, 2020 — The high-accuracy, resonant operation in silicon of a new type of qubit, the basic unit of data in quantum computers, has been demonstrated for the first time by an all-RIKEN team. This qubit overcomes a problem with conventional qubits in silicon that has been a roadblock to scaling up quantum computers.

Quantum computers promise to revolutionize computing as they will be able to perform certain types of calculations much faster than conventional computers.

There are various competing technologies for realizing quantum computers, all with their advantages and disadvantages. One of the most promising is the use of electron spins in silicon. It has the huge head start of being able to apply the semiconductor manufacturing techniques used today for conventional electronics.

But in all these diverse technologies, quantum computers are based on qubits, the quantum equivalent of bits in conventional computers, and use them to store information and perform calculations.

In silicon-based quantum computers, the simplest qubit is the spin of a single electron, which can be in a superposition of two possible states: up and down. However, these qubits require high-frequency microwave pulses to control them, which are hard to focus down so that they only control one qubit without disrupting its neighbors.

Now, Seigo Tarucha, Kenta Takeda and three co-workers, all at the RIKEN Center for Emergent Matter Science, have realized high-accuracy operation using a qubit that employs the spins of two electrons, which can exist in a superposition of the two states "up, down" and "down, up".

Compared to qubits based on single electrons, this qubit can be controlled by much lower frequency microwave pulses, which are easier to restrict to narrow areas. "The big advantage of our qubit is that it doesn't require high-frequency control pulses, which are usually difficult to localize and can be a problem when scaling a system up," explains Takeda. "The crosstalk caused by high-frequency signals can unintentionally rotate qubits near the target one."

While these two-electron qubits had been realized in previous studies, this is the first time the operation has been performed with an accuracy as high as 99.6%.

"Previous demonstrations of these qubits suffered from both nuclear and charge noise," Takeda notes. "In this study, we used an improved device and operation scheme to mitigate the issues and show that the control fidelity of the qubit can exceed the 99% threshold for quantum error correction."

The team now intends to make their device even more accurate by rendering the nuclear noise negligible through employing a special type of silicon that contains only one isotope.

About RIKEN

RIKEN is Japans largest comprehensive research institution renowned for high-quality research in a diverse range of scientific disciplines. Founded in 1917 as a private research foundation in Tokyo, RIKEN has grown rapidly in size and scope, today encompassing a network of world-class research centers and institutes across Japan.

Source: RIKEN

Read the original here:

Two-Electron Qubit Points the Way to Scaling up Quantum Computers, According to RIKEN Research - HPCwire

