Muquans and Pasqal partner to advance quantum computing – Quantaneo, the Quantum Computing Source

This partnership is an opportunity to leverage unique industrial and technological expertise in the design, integration and validation of advanced quantum solutions, expertise that has been applied for more than a decade to quantum gravimeters and atomic clocks. It will speed up the development of Pasqal's processors and bring them to an unprecedented maturity level.

Muquans will supply several key technological building blocks and technical assistance to Pasqal, which will in turn offer an advanced computing and simulation capability aimed at quantum advantage for real-life applications.

"We have the strong belief that the neutral-atom technology developed by Pasqal has a unique potential, and this agreement is a wonderful opportunity for Muquans to participate in the great adventure of quantum computing. It will also help us find new opportunities for our technologies. We expect this activity to grow significantly in the coming years, and this partnership will allow us to become a key stakeholder in the supply chain of quantum computers," said Bruno Desruelle, CEO of Muquans.

"Muquans' laser solutions combine extreme performance, advanced functionalities and industrial reliability. When you develop the next generation of quantum computers, you need to rely on strong foundations and build trust with your partners. Being able to embed this technology in our processors will be a key factor for our company to consolidate our competitive advantage and bring quantum processors to the market," said Georges-Olivier Reymond, CEO of Pasqal.

View post:
Muquans and Pasqal partner to advance quantum computing - Quantaneo, the Quantum Computing Source

Quantum computing and blockchain, is this our bold future? – Irish Tech News

By Theodora Lau and Bradley Leimer, with some interesting musings on Quantum computing and blockchain

Everything that happens is connected to everything else.

There are, then, moments in time that act as trigger points for a series of interconnected events that result in significant human progress, whether through a new technology or a period of transformative societal change. This view rejects both the conventional linear and teleological views of history, which focus on the procession toward a result rather than the threaded causation of historical progression, and looks instead for sparks of connected ingenuity that further the thrust of human advancement.

And so begins the heralded documentary series Connections, created by science historian James Burke. Throughout the series, Burke demonstrates why we cannot view the development of any portion of our contemporary world in isolation. He asserts that advances in the modern world are the result of a series of interconnected events and moments of progress, whether an invention of necessity or a curious progression of culture, arising from the seemingly disjointed motivations of people who had no concept of, and perhaps little intention toward, the final result of their activities.

Human progress flies blind until everything becomes very transparent. The interaction of these isolated events drives our history, our innovation, our progress.

Evolution feels slow until a sudden series of tremors makes it all feel far too real.

This is how we often feel in our very modern world.

We are lost in the world of the dim light of glass, until we are awoken from our slumber of scrolling by something personally transformative to our lives.

The promise of technology is that it will improve our society, or at least make our lives more efficient, freeing up our time to pursue some of life's pleasures, whether that be leisure, like reading and writing and expressing ourselves through art, or more time spent solving life's more pressing problems through the output of our work.

Certain technologies, especially recent improvements in computing such as faster processors, cloud storage and advanced quantum computing, combine with others to create opportunities to alleviate significant challenges like climate change, water scarcity and global poverty. Others, like blockchain (distributed ledger technology), hold the promise of reining in the problem of defining the source of truth within certain forms of data, some of which are life-defining.

The creation of trust through technology is an interesting thread to pull. From the source of goods and services traveling through our supply chain to the authenticity of our elections, new technologies hold the potential to rapidly improve the future and the advancement of humanity. Closer to our focus on financial services, quantum computing addresses market risk, credit risk, digital annealing, dynamic portfolio selection, ATM replenishment and more. Blockchain technology has focused on AML/KYC, trade finance, remittance, central bank backed digital currency, security tokens, and has the capacity for continued innovation in the financial space.

What if these two elemental forces were viewed together? What if we channeled our inner James Burke and looked for connections between these two transformative technologies? This is exactly what our partner Arunkumar Krishnakumar did in his new book Quantum Computing and Blockchain in Business: Exploring the applications, challenges and collision of quantum computing and blockchain. Though it carries a seemingly impenetrable title, we can more than assure you it's worth a read to understand where the future is headed.

Arun's book dissects the genesis of these twin technologies and how they intersect. Much as James Burke rejects the conventional linear threading of historical events, the first-time author writes about the impacts of these technologies on the healthcare and pharmaceutical industries, governance, elections, smart cities, the environment, chemistry, logistics, and much more. We are left with the question of whether there is anything that a blockchain powered by quantum computing cannot do. Fortunately, the book answers that as well.

As the book discusses in its last few chapters, viewed through Arun's critical lens, there are also darker sides to these technologies: they could threaten nation states or launch a new cyber arms race. He details the dangers of these technologies and how they might impact every life. He also concludes with some blue-sky ideas, both dreams and realized aspirations, derived from the power of these complementary tools of knowledge, and explains how writing this book provided him with a sense of hope for the future of humanity in an age of rapidly developing and highly interdependent technologies.

Perhaps it is fitting, then, that Arun uses a quote from the opening of the Charles Dickens novel A Tale of Two Cities to tell his story. The conflict between good and evil, between light and darkness, can be won. Technology is just another means to this end.

There is a lot of hype, but somewhere amid all the hype, there is still hope.

How we write the next chapter and the future of the human race is entirely up to us.

The sky is indeed blue.

We must never lose hope.

Listen in via iTunes and Spotify as Theo and Bradley of Unconventional Ventures have a conversation with our partner and co-host Arunkumar Krishnakumar, as he talks about his new book Quantum Computing and Blockchain in Business: Exploring the applications, challenges and collision of quantum computing and blockchain, and how he is finding solace in this summer of COVID-19. Listen to this, and every episode of One Vision, on your favorite player.

More about Irish Tech News and Business Showcase here

FYI, the ROI for you is this: Irish Tech News now gets over 1.5 million monthly views, and up to 900k monthly unique visitors, from over 160 countries. We have over 860,000 relevant followers on Twitter across our various accounts and were recently described as "Ireland's leading online tech news site" and "Ireland's answer to TechCrunch," so we can offer you a good audience!

Since introducing desktop notifications a short time ago, which notify readers directly in their browser of new articles being published, over 30,000 people have signed up to receive them, ensuring they are instantly kept up to date on all our latest content. Desktop notifications offer a unique method of serving content directly to verified readers and bypass the issue of content getting lost in people's crowded news feeds.

Drop us a line if you want to be featured, guest post, suggest a possible interview, or just let us know what you would like to see more of in our future articles. We're always open to new and interesting suggestions for informative and different articles. Contact us by email, Twitter or whatever social media works for you, and hopefully we can share your story too and reach our global audience.

Irish Tech News

If you would like to have your company featured in the Irish Tech News Business Showcase, get in contact with us at [emailprotected] or on Twitter: @SimonCocking

Read the original:
Quantum computing and blockchain, is this our bold future? - Irish Tech News

Quantum Computing will host April 28 webinar on its technical strategy – Proactive Investors USA & Canada

Steve Reinhardt, vice president of product development, will discuss the company's Mukai middleware while demonstrating its QCI NetworkX for solving graph problems

Quantum Computing Inc, an advanced technology company developing quantum-ready applications and tools, will host a webinar to discuss its technical strategy and Mukai middleware while demonstrating its QCI NetworkX for solving graph problems.

The webinar, scheduled for 12 pm ET on April 28, will be hosted by Steve Reinhardt, vice president of product development.

Quantum Computing's Mukai middleware, announced in January, is quantum-ready application-development middleware designed to help users and application developers solve extremely complex discrete optimization problems, which are at the heart of some of the most difficult computing challenges in industry, government and academia.

The Mukai software stack enables developers to create and execute quantum-ready applications on classical computers, often with superior performance, while being ready to run on quantum computers when those systems can achieve performance advantages.

The Leesburg, Virginia-based company said it has already demonstrated superior performance for some applications built on Mukai and running on classical computers.

Discrete combinatorial optimization is one high-value class of problems expected to benefit greatly from quantum computers, and techniques for exploiting quantum computers for optimization have been deeply explored, as evidenced by the work on quantum annealers by early D-Wave researchers and on gate-model QCs by researchers of the Quantum Approximate Optimization Algorithm (QAOA).

The company's Mukai software stack is centered on the quadratic unconstrained binary optimization (QUBO) formulation well known to quantum annealing users.

The Mukai software product includes two primary user/developer interfaces: the QCI NetworkX graph-analysis package and the QCI qbsolv QUBO solver. Modeled on the D-Wave NetworkX package targeting quantum annealing, QCI NetworkX implements a set of extremely compute-intensive (NP-hard, to mathematicians) graph kernels that are expected to benefit the most from QCs; the kernels use the QUBO formulation.
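To make the QUBO formulation concrete, here is a minimal sketch in plain Python/NumPy, not the actual QCI NetworkX or qbsolv APIs, of how one such graph problem (max-cut) can be encoded as a QUBO matrix. The brute-force solver at the end merely stands in for the quantum annealer or QAOA circuit that a Mukai-style stack would invoke; all names are illustrative.

```python
import itertools
import numpy as np

# Max-cut as a QUBO: for each edge (i, j), the term 2*x_i*x_j - x_i - x_j
# equals -1 when the edge is cut (x_i != x_j) and 0 otherwise, so
# minimizing x^T Q x over binary x maximizes the number of cut edges.
edges = [(0, 1), (0, 2), (1, 2), (2, 3)]  # a toy 4-node graph
n = 4

Q = np.zeros((n, n))
for i, j in edges:
    Q[i, i] -= 1
    Q[j, j] -= 1
    Q[i, j] += 2  # upper-triangle entry encodes the pairwise interaction

# Brute-force search over all 2^n assignments; only viable for tiny n,
# which is exactly the part a QUBO solver or annealer would replace.
best = min(itertools.product([0, 1], repeat=n),
           key=lambda x: np.array(x) @ Q @ np.array(x))
cut_size = -int(np.array(best) @ Q @ np.array(best))
print("best assignment:", best, "cut size:", cut_size)
```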

Quantum Computing said the April 28 webinar will conclude with a question and answer session. Questions regarding QCI's strategy, Mukai, and QCI NetworkX can be sent in advance to [emailprotected]

To attend the webinar, free registration is required. To register for and attend the webinar, use this URL: https://zoom.us/s/98851767288.

Contact the author: [emailprotected]

Follow him on Twitter @PatrickMGraham

More:
Quantum Computing will host April 28 webinar on its technical strategy - Proactive Investors USA & Canada

The future of quantum computing in the cloud – TechTarget

AWS, Microsoft and other IaaS providers have jumped on the quantum computing bandwagon as they try to get ahead of the curve on this emerging technology.

Developers use quantum computing to encode problems as qubits, which compute multiple combinations of variables at once rather than exploring each possibility discretely. In theory, this could allow researchers to quickly solve problems involving different combinations of variables, such as breaking encryption keys, testing the properties of different chemical compounds or simulating different business models. Researchers have begun to demonstrate real-world examples of how these early quantum computers could be put to use.

However, this technology is still being developed, so experts caution that it could take more than a decade for quantum computing to deliver practical value. In the meantime, there are a few cloud services, such as Amazon Braket and Microsoft Quantum, that aim to get developers up to speed on writing quantum applications.

Quantum computing in the cloud has the potential to disrupt industries in a similar way as other emerging technologies, such as AI and machine learning. But quantum computing is still being established in university classrooms and career paths, said Bob Sutor, vice president of IBM Quantum Ecosystem Development. Similarly, major cloud providers are focusing primarily on education at this early stage.

"The cloud services today are aimed at preparing the industry for the soon-to-arrive day when quantum computers will begin being useful," said Itamar Sivan, co-founder and CEO of Quantum Machines, an orchestration platform for quantum computing.

There's still much to iron out regarding quantum computing and the cloud, but the two technologies appear to be a logical fit, for now.

Cloud-based quantum computing is more difficult to pull off than AI, so the ramp up will be slower and the learning curve steeper, said Martin Reynolds, distinguished vice president of research at Gartner. For starters, quantum computers require highly specialized room conditions that are dramatically different from how cloud providers build and operate their existing data centers.

Reynolds believes practical quantum computers are at least a decade away. The biggest drawback lies in aligning the quantum state of qubits in the computer with a given problem, especially since quantum computers still haven't been proven to solve problems better than traditional computers.

Coders also must learn new math and logic skills to utilize quantum computing. This makes it hard for them since they can't apply traditional digital programming techniques. IT teams need to develop specialized skills to understand how to apply quantum computing in the cloud so they can fine tune the algorithms, as well as the hardware, to make this technology work.

Current limitations aside, the cloud is an ideal way to consume quantum computing, because quantum computing has low I/O but deep computation, Reynolds said. Because cloud vendors have the technological resources and a large pool of users, they will inevitably be some of the first quantum-as-a-service providers and will look for ways to provide the best software development and deployment stacks.

Quantum computing could even supplement the general compute and AI services cloud providers currently offer, said Tony Uttley, president of Honeywell Quantum Solutions. In that scenario, quantum machines would integrate with classical computing cloud resources in a co-processing environment.

The cloud plays two key roles in quantum computing today, according to Hyoun Park, CEO and principal analyst at Amalgam Insights. The first is to provide an application development and test environment for developers to simulate the use of quantum computers through standard computing resources.

The second is to offer access to the few quantum computers that are currently available, in the way mainframe leasing was common a generation ago. This improves the financial viability of quantum computing, since multiple users can increase machine utilization.

It takes significant computing power to simulate quantum algorithm behavior from a development and testing perspective. For the most part, cloud vendors want to provide an environment to develop quantum algorithms before loading these quantum applications onto dedicated hardware from other providers, which can be quite expensive.

However, classical simulations of quantum algorithms that use large numbers of qubits are not practical. "The issue is that the size of the classical computer needed will grow exponentially with the number of qubits in the machine," said Doug Finke, publisher of the Quantum Computing Report. So, a classical simulation of a 50-qubit quantum computer would require a classical computer with roughly 1 petabyte of memory. This requirement will double with every additional qubit.
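The doubling is easy to check with a few lines of arithmetic. This sketch assumes one complex amplitude per basis state at 16 bytes each, a common convention, so the absolute numbers are merely indicative, but the exponential growth is the point:

```python
# Memory needed to store a full state vector of n qubits:
# 2**n complex amplitudes, at 16 bytes each (two 64-bit floats).
BYTES_PER_AMPLITUDE = 16

def state_vector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * BYTES_PER_AMPLITUDE

for n in (30, 40, 50, 51):
    gib = state_vector_bytes(n) / 2**30
    print(f"{n} qubits: {gib:,.0f} GiB")
# 30 qubits fit in 16 GiB of RAM; 50 qubits need ~16 million GiB,
# and every additional qubit doubles the requirement.
```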

But classical simulations for problems using a smaller number of qubits are useful both as a tool to teach quantum algorithms to students and also for quantum software engineers to test and debug algorithms with "toy models" for their problem, Finke said. Once they debug their software, they should be able to scale it up to solve larger problems on a real quantum computer.

In terms of putting quantum computing to use, organizations can currently use it to support last-mile optimization, encryption and other computationally challenging issues, Park said. This technology could also aid teams across logistics, cybersecurity, predictive equipment maintenance, weather predictions and more. Researchers can explore multiple combinations of variables in these kinds of problems simultaneously, whereas a traditional computer needs to compute each combination separately.

However, there are some drawbacks to quantum computing in the cloud. Developers should proceed cautiously when experimenting with applications that involve sensitive data, Finke said. To address this concern, many organizations prefer to install quantum hardware in their own facilities despite the operational hassles.

Also, a machine may not be immediately available when a quantum developer wants to submit a job through quantum services on the public cloud. "The machines will have job queues and sometimes there may be several jobs ahead of you when you want to run your own job," Finke said. Some of the vendors have implemented a reservation capability so a user can book a quantum computer for a set time period to eliminate this problem.

IBM was first to market with its Quantum Experience offering, which launched in 2016 and now has over 15 quantum computers connected to the cloud. Over 210,000 registered users have executed more than 70 billion circuits through the IBM Cloud and published over 200 papers based on the system, according to IBM.

IBM also started the Qiskit open source quantum software development platform and has been building an open community around it. According to GitHub statistics, it is currently the leading quantum development environment.

In late 2019, AWS and Microsoft introduced quantum cloud services offered through partners.

Microsoft Quantum provides a quantum algorithm development environment, and from there users can transfer quantum algorithms to Honeywell, IonQ or Quantum Circuits Inc. hardware. Microsoft's Q# scripting offers a familiar Visual Studio experience for quantum problems, said Michael Morris, CEO of Topcoder, an on-demand digital talent platform.

Currently, this transfer involves the cloud providers installing a high-speed communication link from their data center to the quantum computer facilities, Finke said. This approach has many advantages from a logistics standpoint, because it makes things like maintenance, spare parts, calibration and physical infrastructure a lot easier.

Amazon Braket similarly provides a quantum development environment and, when generally available, will provide time-based pricing to access D-Wave, IonQ and Rigetti hardware. Amazon says it will add more hardware partners as well. Braket offers a variety of different hardware architecture options through a common high-level programming interface, so users can test out the machines from the various partners and determine which one would work best with their application, Finke said.

Google has done considerable core research on quantum computing in the cloud and is expected to launch a cloud computing service later this year. Google has been more focused on developing its in-house quantum computing capabilities and hardware rather than providing access to these tools to its cloud users, Park said. In the meantime, developers can test out quantum algorithms locally using Google's Cirq programming environment for writing apps in Python.

In addition to the larger offerings from the major cloud providers, there are several alternative approaches to implementing quantum computers that are being provided through the cloud.

D-Wave is the furthest along, with a quantum annealer well suited to many optimization problems. Other alternatives include QuTech, which is working on a cloud offering of its small quantum machine built on its spin-qubit technology, and Xanadu, which is developing a quantum machine based on photonic technology.

Researchers are pursuing a variety of approaches to quantum computing -- using electrons, ions or photons -- and it's not yet clear which approaches will pan out for practical applications first.

"Nobody knows which approach is best, or which materials are best. We're at the Edison light bulb filament stage, where Edison reportedly tested thousands of ways to make a carbon filament until he got to one that lasted 1,500 hours," Reynolds said. In the meantime, recent cloud offerings promise to enable developers to start experimenting with these different approaches to get a taste of what's to come.

More here:
The future of quantum computing in the cloud - TechTarget

Orquestra, an end-to-end, unified Quantum Operating Environment is now in early access – Neowin

Zapata, a firm whose primary focus is quantum computing and software, launched early access to Orquestra today. Orquestra, billed as a novel end-to-end, unified Quantum Operating Environment (QOE), is meant for designing, manipulating, optimizing, and running quantum circuits. These quantum circuits can then be generalized to run across different quantum computers, simulators, and HPC resources.

Orquestra enables advanced technology, R&D and academic teams to accelerate quantum solutions for complex computational problems in optimization, machine learning and simulation across a variety of industries.

Some of the noteworthy features of Orquestra are as follows. First, it provides an extensive library supplying optimized open-source (VQE, QAOA) and proprietary (VQF) algorithms. The environment also allows users to combine modules written in different libraries, including Cirq, Qiskit, PennyLane and PyQuil.

In addition, it also offers hardware-interoperable layering and is the only quantum platform that goes beyond hardware-agnostic capabilities. This allows users to compare various devices in the context of particular computational problems and benchmark how workflows perform across them.

Users can also submit these workflows to the Orquestra Quantum Engine (OQE) servers with command-line tools and orchestrate workflow tasks across a variety of backends, including gate-model devices, quantum annealers, quantum simulators, and HPC resources. Automated parallelization through container orchestration and management of complex records is offered as well.

Orquestra is currently in early access and is aimed at users with backgrounds in software engineering, machine learning, physics, computational chemistry or quantum information theory. To join the program and request further information, you can send an e-mail to Zapata.

See the original post:
Orquestra, an end-to-end, unified Quantum Operating Environment is now in early access - Neowin

The Force is With Physicist Andy Howell as He Discusses Star Trek Science With Cast and Crew – Noozhawk

In the most recent episode of his YouTube series Science vs. Cinema, UC Santa Barbara physicist Andy Howell takes on Star Trek: Picard, exploring how the CBS offering's presentation of supernovae and quantum computing stacks up against real-world science.

For Howell, the series, which reviews the scientific accuracy and the portrayal of scientists in Hollywood's top sci-fi films, is as much an excuse to dive into exciting scientific concepts and cutting-edge research.

"Science fiction writers are fond of grappling with deep philosophical questions," he said. "I was really excited to see that UCSB researchers were thinking about some of the same things in a more grounded way."

For the Star Trek episode, Howell spoke with series creators Alex Kurtzman and Michael Chabon, as well as a number of cast members, including Patrick Stewart. Joining him to discuss quantum science and consciousness were John Martinis, a quantum expert at UCSB and chief scientist of the Google quantum computing hardware group, and fellow UCSB physics professor Matthew Fisher. Fisher's group is studying whether quantum mechanics plays a role in the brain, a topic taken up in the new Star Trek series.

Howell also talked supernovae and viticulture with friend and colleague Brian Schmidt, vice chancellor of the Australian National University. Schmidt won the 2011 Nobel Prize in Physics for helping to discover that the expansion of the universe is accelerating.

"We started Science vs. Cinema to use movies as a jumping-off point to talk science Howell said. Star Trek Picard seemed like the perfect fit. Star Trek has a huge cultural impact and was even one of the things that made me want to study astronomy.

Previous episodes of Science vs. Cinema have separated fact from fiction in films such as Star Wars, The Current War, Ad Astra, Arrival and The Martian. The success of prior episodes enabled Howell to get early access to the show and interview the cast and crew.

"What most people think about scientific subjects probably isn't what they learned in a university class, but what they saw in a movie, Howell remarked. That makes movies an ideal springboard for introducing scientific concepts. And while I can only reach dozens of students at a time in a classroom, I can reach millions on TV or the internet.

Our professional journalists are working round the clock to make sure you have the news and information you need in these uncertain times.

If you appreciate Noozhawk's coronavirus coverage, and the rest of the local Santa Barbara County news we deliver to you 24/7, please become a member of our Hawks Club today.

You need us more than ever, and we need your support.

We provide special member benefits to show how much we appreciate your confidence.

See the article here:
The Force is With Physicist Andy Howell as He Discusses Star Trek Science With Cast and Crew - Noozhawk

These 25 Technology Trends Will Define The Next Decade – Forbes

We may not be living on Mars or traveling to work using jet packs, but there's no doubt the coming decade will bring many exciting technological advances. In this article, I want to outline the 25 key technology trends that I believe will shape the 2020s.

1. Artificial intelligence (AI) and machine learning. The increasing ability of machines to learn and act intelligently will absolutely transform our world. It is also the driving force behind many of the other trends on this list.

2. The Internet of Things (IoT). This refers to the ever-growing number of smart devices and objects that are connected to the internet. Such devices are constantly gathering and transmitting data, further fueling the growth in Big Data and AI.

3. Wearables and augmented humans. What started with fitness trackers has now exploded into a whole industry of wearable technology designed to improve human performance and help us live healthier, safer, more efficient lives. In the future, we may even see humans merge with technology to create "augmented humans" or "transhumans."

4. Big Data and augmented analytics. Big Data refers to the exponential growth in the amount of data being created in our world. Thanks to augmented analytics (highly advanced data analytics, often fueled by AI techniques), we can now make sense of and work with enormously complex and varied streams of data.

5. Intelligent spaces and smart places. Closely linked to the IoT, this trend is seeing physical spaces like homes, offices, and even whole cities becoming increasingly connected and smart.

6. Blockchains and distributed ledgers. This super-secure method of storing, authenticating, and protecting data could revolutionize many aspects of business, particularly when it comes to facilitating trusted transactions.

7. Cloud and edge computing. Cloud computing, where data is stored on other computers and accessed via the internet, has helped to open up data and analytics to the masses. Edge computing, where data is processed on smart devices (like phones), will take this to the next level.

8. Digitally extended realities. Encompassing virtual reality, augmented reality, and mixed reality, this trend highlights the move towards creating more immersive digital experiences.

9. Digital twins. A digital twin is a digital copy of an actual physical object, product, process, or ecosystem. This innovative technology allows us to try out alterations and adjustments that would be too expensive or risky to try out on the real physical object.

10. Natural language processing. This technology, which allows machines to understand human language, has dramatically changed how humans interact with machines, in particular giving rise to...

11. Voice interfaces and chatbots. Alexa, Siri, chatbots: many of us are now quite used to communicating with machines by simply speaking or typing our requests. In the future, more and more businesses will choose to interact with their customers via voice interfaces and chatbots.

12. Computer vision and facial recognition. Machines can talk, so why shouldn't they "see" as well? This technology allows machines to visually interpret the world around them, with facial recognition being a prime example. Although we will no doubt see greater regulatory control over the use of facial recognition, this technology isn't going anywhere.

13. Robots and cobots. Today's robots are more intelligent than ever, learning to respond to their environment and perform tasks without human intervention. In certain industries, the future of work is likely to involve humans working seamlessly with robot colleagues, hence the term "cobot," or "collaborative robot."

14. Autonomous vehicles. The 2020s will be the decade in which autonomous vehicles of all kinds, including cars, taxis, trucks, and even ships, become truly autonomous and commercially viable.

15. 5G. The fifth generation of cellular network technology will give us faster, smarter, more stable wireless networking, thereby driving advances in many other trends (e.g., more connected devices and richer streams of data).

16. Genomics and gene editing. Advances in computing and analytics have driven incredible leaps in our understanding of the human genome. Now, we're progressing to altering the genetic structure of living organisms (for example, correcting DNA mutations that can lead to cancer).

17. Machine co-creativity and augmented design. Thanks to AI, machines can do many things, including creating artwork and designs. As a result, we can expect creative and design processes to shift towards greater collaboration with machines.

18. Digital platforms. Facebook, Uber, and Airbnb are all household-name examples of digital platforms: networks that facilitate connections and exchanges between people. This trend is turning established business models on their head, leading many traditional businesses to transition to or incorporate a platform-based model.

19. Drones and unmanned aerial vehicles. These aircraft, which are piloted either remotely or autonomously, have changed the face of military operations. But the impact doesn't stop there: search and rescue missions, firefighting, law enforcement, and transportation will all be transformed by drone technology. Get ready for passenger drones (drone taxis), too!

20. Cybersecurity and resilience. As businesses face unprecedented new threats, the ability to avoid and mitigate cybersecurity threats will be critical to success over the next decade.

21. Quantum computing. Quantum computers, unimaginably fast computers capable of solving seemingly unsolvable problems, will make our current state-of-the-art technology look like something out of the Stone Age. As yet, work in quantum computing is largely restricted to labs, but we could see the first commercially available quantum computer this decade.

22. Robotic process automation. This technology is used to automate structured and repetitive business processes, freeing up human workers to concentrate on more complex, value-adding work. This is part of a wider shift towards automation that will impact every industry.

23. Mass personalization and micro-moments. Mass personalization is, as you might expect, the ability to offer highly personalized products or services on a mass scale. Meanwhile, the term "micro-moments" essentially means responding to customer needs at the exact right moment. Both are made possible by technologies like AI, Big Data, and analytics.

24. 3D and 4D printing and additive manufacturing. Although this may seem low-tech compared to some of the other trends, 3D and 4D printing will have very wide applications and will be particularly transformative when combined with trends like mass personalization.

25. Nanotechnology and materials science. Our increasing ability to understand materials and control matter on a tiny scale is giving rise to exciting new materials and products, such as bendable displays.

Read more about these 25 key technology trends, including practical examples from a wide range of industries, in my new book, Tech Trends in Practice: The 25 Technologies That Are Driving The 4th Industrial Revolution.

See the article here:
These 25 Technology Trends Will Define The Next Decade - Forbes

Quantum computing – Wikipedia

Quantum computing is the use of quantum-mechanical phenomena such as superposition and entanglement to perform computation. Computers that perform quantum computation are known as quantum computers.[1]:I-5 Quantum computers are believed to be able to solve certain computational problems, such as integer factorization (which underlies RSA encryption), significantly faster than classical computers. The study of quantum computing is a subfield of quantum information science.

Quantum computing began in the early 1980s, when physicist Paul Benioff proposed a quantum mechanical model of the Turing machine.[2] Richard Feynman and Yuri Manin later suggested that a quantum computer had the potential to simulate things that a classical computer could not.[3][4] In 1994, Peter Shor developed a quantum algorithm for factoring integers that had the potential to decrypt RSA-encrypted communications.[5] Despite ongoing experimental progress since the late 1990s, most researchers believe that "fault-tolerant quantum computing [is] still a rather distant dream".[6] In recent years, investment into quantum computing research has increased in both the public and private sector.[7][8] On 23 October 2019, Google AI, in partnership with the U.S. National Aeronautics and Space Administration (NASA), published a paper in which they claimed to have achieved quantum supremacy.[9] While some have disputed this claim, it is still a significant milestone in the history of quantum computing.[10]

Quantum computing is modeled by quantum circuits. Quantum circuits are based on the quantum bit, or "qubit", which is somewhat analogous to the bit in classical computation. Qubits can be in a 1 or 0 quantum state, or they can be in a superposition of the 1 and 0 states. However, when qubits are measured the result is always either a 0 or a 1; the probabilities of these two outcomes depend on the quantum state that they were in immediately prior to the measurement. Computation is performed by manipulating qubits with quantum logic gates, which are somewhat analogous to classical logic gates.

There are currently two main approaches to physically implementing a quantum computer: analog and digital. Analog approaches are further divided into quantum simulation, quantum annealing, and adiabatic quantum computation. Digital quantum computers use quantum logic gates to do computation. Both approaches use quantum bits or qubits.[1]:213

Any computational problem that can be solved by a classical computer can also, in principle, be solved by a quantum computer. Conversely, quantum computers obey the Church–Turing thesis; that is, any computational problem that can be solved by a quantum computer can also be solved by a classical computer. While this means that quantum computers provide no additional power over classical computers in terms of computability, they do in theory provide additional power when it comes to the time complexity of solving certain problems. Notably, quantum computers are believed to be able to quickly solve certain problems that no classical computer could solve in any feasible amount of time, a feat known as "quantum supremacy". The study of the computational complexity of problems with respect to quantum computers is known as quantum complexity theory.

The prevailing model of quantum computation describes the computation in terms of a network of quantum logic gates.[11]

A memory consisting of n bits of information has 2^n possible states. A vector representing all memory states thus has 2^n entries (one for each state). This vector is viewed as a probability vector and represents the fact that the memory is to be found in a particular state.

In the classical view, one entry would have a value of 1 (i.e. a 100% probability of being in this state) and all other entries would be zero. In quantum mechanics, probability vectors are generalized to density operators. This is the technically rigorous mathematical foundation for quantum logic gates, but the intermediate quantum state vector formalism is usually introduced first because it is conceptually simpler. This article focuses on the quantum state vector formalism for simplicity.

We begin by considering a simple memory consisting of only one bit. This memory may be found in one of two states: the zero state or the one state. We may represent the state of this memory using Dirac notation, so that |0⟩ = (1, 0) and |1⟩ = (0, 1).

The state of this one-qubit quantum memory can be manipulated by applying quantum logic gates, analogous to how classical memory can be manipulated with classical logic gates. One important gate for both classical and quantum computation is the NOT gate, which can be represented by the matrix X = [[0, 1], [1, 0]]; applying it exchanges the |0⟩ and |1⟩ amplitudes.

The mathematics of single-qubit gates can be extended to operate on multi-qubit quantum memories in two important ways. One way is simply to select a qubit and apply that gate to the target qubit whilst leaving the remainder of the memory unaffected. Another way is to apply the gate to its target only if another part of the memory is in a desired state. These two choices can be illustrated using another example. The possible states of a two-qubit quantum memory are |00⟩, |01⟩, |10⟩ and |11⟩; the controlled NOT (CNOT) gate, for example, flips the second qubit exactly when the first qubit is |1⟩.
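As an illustration of the state-vector formalism described above, here is a minimal NumPy sketch, not tied to any particular quantum SDK, that applies a NOT gate to one qubit and a CNOT gate to a two-qubit register:

```python
import numpy as np

# Single-qubit basis states in the state-vector formalism.
ket0 = np.array([1, 0], dtype=complex)   # |0>
ket1 = np.array([0, 1], dtype=complex)   # |1>

# Quantum NOT gate (Pauli-X): swaps the |0> and |1> amplitudes.
X = np.array([[0, 1],
              [1, 0]], dtype=complex)
print(X @ ket0)  # -> |1>, i.e. [0, 1]

# Two-qubit register |10> as a tensor (Kronecker) product.
state = np.kron(ket1, ket0)

# CNOT: flips the second (target) qubit only when the first
# (control) qubit is |1>.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
print(CNOT @ state)  # -> |11>, i.e. amplitude 1 in the last entry
```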

In summary, a quantum computation can be described as a network of quantum logic gates and measurements. Any measurement can be deferred to the end of a quantum computation, though this deferment may come at a computational cost. Because of this possibility of deferring a measurement, most quantum circuits depict a network consisting only of quantum logic gates and no measurements. More information can be found in the following articles: universal quantum computer, Shor's algorithm, Grover's algorithm, Deutsch–Jozsa algorithm, amplitude amplification, quantum Fourier transform, quantum gate, quantum adiabatic algorithm and quantum error correction.

Any quantum computation can be represented as a network of quantum logic gates from a fairly small family of gates. A choice of gate family that enables this construction is known as a universal gate set. One common such set includes all single-qubit gates as well as the CNOT gate from above. This means any quantum computation can be performed by executing a sequence of single-qubit gates together with CNOT gates. Though this gate set is infinite, it can be replaced with a finite gate set by appealing to the Solovay-Kitaev theorem.
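As a concrete instance of universality in action, a SWAP gate is not in the set above but can be built from three CNOT gates. The following NumPy check, an illustrative verification rather than anything from the original article, confirms the identity:

```python
import numpy as np

# CNOT with qubit 0 as control and qubit 1 as target...
CNOT01 = np.array([[1, 0, 0, 0],
                   [0, 1, 0, 0],
                   [0, 0, 0, 1],
                   [0, 0, 1, 0]])
# ...and with the roles reversed (qubit 1 controls qubit 0).
CNOT10 = np.array([[1, 0, 0, 0],
                   [0, 0, 0, 1],
                   [0, 0, 1, 0],
                   [0, 1, 0, 0]])
SWAP = np.array([[1, 0, 0, 0],
                 [0, 0, 1, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1]])

# SWAP = CNOT01 . CNOT10 . CNOT01: three CNOTs exchange two qubits.
assert np.allclose(CNOT01 @ CNOT10 @ CNOT01, SWAP)
print("SWAP decomposes into three CNOT gates")
```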

Integer factorization, which underpins the security of public key cryptographic systems, is believed to be computationally infeasible with an ordinary computer for large integers if they are the product of few prime numbers (e.g., products of two 300-digit primes).[12] By comparison, a quantum computer could efficiently solve this problem using Shor's algorithm to find its factors. This ability would allow a quantum computer to break many of the cryptographic systems in use today, in the sense that there would be a polynomial time (in the number of digits of the integer) algorithm for solving the problem. In particular, most of the popular public key ciphers are based on the difficulty of factoring integers or the discrete logarithm problem, both of which can be solved by Shor's algorithm. In particular, the RSA, DiffieHellman, and elliptic curve DiffieHellman algorithms could be broken. These are used to protect secure Web pages, encrypted email, and many other types of data. Breaking these would have significant ramifications for electronic privacy and security.

However, other cryptographic algorithms do not appear to be broken by those algorithms.[13][14] Some public-key algorithms are based on problems other than the integer factorization and discrete logarithm problems to which Shor's algorithm applies, like the McEliece cryptosystem based on a problem in coding theory.[13][15] Lattice-based cryptosystems are also not known to be broken by quantum computers, and finding a polynomial time algorithm for solving the dihedral hidden subgroup problem, which would break many lattice-based cryptosystems, is a well-studied open problem.[16] It has been proven that applying Grover's algorithm to break a symmetric (secret key) algorithm by brute force requires time equal to roughly 2^(n/2) invocations of the underlying cryptographic algorithm, compared with roughly 2^n in the classical case,[17] meaning that symmetric key lengths are effectively halved: AES-256 would have the same security against an attack using Grover's algorithm that AES-128 has against classical brute-force search (see Key size).

Quantum cryptography could potentially fulfill some of the functions of public key cryptography. Quantum-based cryptographic systems could, therefore, be more secure than traditional systems against quantum hacking.[18]

Besides factorization and discrete logarithms, quantum algorithms offering a more than polynomial speedup over the best known classical algorithm have been found for several problems,[19] including the simulation of quantum physical processes from chemistry and solid state physics, the approximation of Jones polynomials, and solving Pell's equation. No mathematical proof has been found that shows that an equally fast classical algorithm cannot be discovered, although this is considered unlikely.[20] However, quantum computers offer polynomial speedup for some problems. The most well-known example of this is quantum database search, which can be solved by Grover's algorithm using quadratically fewer queries to the database than are required by classical algorithms. In this case, the advantage is not only provable but also optimal: it has been shown that Grover's algorithm gives the maximal possible probability of finding the desired element for any number of oracle lookups. Several other examples of provable quantum speedups for query problems have subsequently been discovered, such as for finding collisions in two-to-one functions and evaluating NAND trees.

Problems that can be addressed with Grover's algorithm have the following properties:

For problems with all these properties, the running time of Grover's algorithm on a quantum computer will scale as the square root of the number of inputs (or elements in the database), as opposed to the linear scaling of classical algorithms. A general class of problems to which Grover's algorithm can be applied[21] is the Boolean satisfiability problem. In this instance, the database through which the algorithm iterates is that of all possible answers. An example, and a possible application, of this is a password cracker that attempts to guess the password or secret key for an encrypted file or system. Symmetric ciphers such as Triple DES and AES are particularly vulnerable to this kind of attack.[citation needed] This application of quantum computing is a major interest of government agencies.[22]
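To put numbers on that square-root scaling, here is a short illustrative calculation (an addition for clarity, not from the original article): the optimal number of Grover iterations is roughly (pi/4) * sqrt(N) for a search space of N items, versus about N/2 expected classical guesses.

```python
import math

def grover_queries(n_items: int) -> int:
    # Optimal number of Grover iterations is roughly (pi/4) * sqrt(N).
    return math.floor(math.pi / 4 * math.sqrt(n_items))

for bits in (20, 40, 56):
    n = 2 ** bits  # size of the search space, e.g. a `bits`-bit key
    print(f"{bits}-bit space: ~{n // 2:.2e} classical guesses "
          f"vs ~{grover_queries(n):.2e} Grover queries")
```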

Since chemistry and nanotechnology rely on understanding quantum systems, and such systems are impossible to simulate in an efficient manner classically, many believe quantum simulation will be one of the most important applications of quantum computing.[23] Quantum simulation could also be used to simulate the behavior of atoms and particles at unusual conditions such as the reactions inside a collider.[24]

Quantum annealing, or adiabatic quantum computation, relies on the adiabatic theorem to undertake calculations. A system is placed in the ground state of a simple Hamiltonian, which is slowly evolved into a more complicated Hamiltonian whose ground state represents the solution to the problem in question. The adiabatic theorem states that if the evolution is slow enough, the system will stay in its ground state at all times throughout the process.

The Quantum algorithm for linear systems of equations, or "HHL Algorithm", named after its discoverers Harrow, Hassidim, and Lloyd, is expected to provide speedup over classical counterparts.[25]

John Preskill has introduced the term quantum supremacy to refer to the hypothetical speedup advantage that a quantum computer would have over a classical computer in a certain field.[26] Google announced in 2017 that it expected to achieve quantum supremacy by the end of the year, though that did not happen. IBM said in 2018 that the best classical computers will be beaten on some practical task within about five years, and views the quantum supremacy test only as a potential future benchmark.[27] Although skeptics like Gil Kalai doubt that quantum supremacy will ever be achieved,[28][29] in October 2019 a Sycamore processor created in conjunction with Google AI Quantum was reported to have achieved quantum supremacy,[30] with calculations more than 3,000,000 times as fast as those of Summit, generally considered the world's fastest computer.[31] Bill Unruh doubted the practicality of quantum computers in a paper published back in 1994.[32] Paul Davies argued that a 400-qubit computer would even come into conflict with the cosmological information bound implied by the holographic principle.[33]

There are a number of technical challenges in building a large-scale quantum computer.[34] Physicist David DiVincenzo has listed the following requirements for a practical quantum computer:[35]

Sourcing parts for quantum computers is also very difficult. Many quantum computers, like those constructed by Google and IBM, need Helium-3, a nuclear research byproduct, and special superconducting cables that are only made by a single company in Japan.[36]

One of the greatest challenges involved with constructing quantum computers is controlling or removing quantum decoherence. This usually means isolating the system from its environment as interactions with the external world cause the system to decohere. However, other sources of decoherence also exist. Examples include the quantum gates, and the lattice vibrations and background thermonuclear spin of the physical system used to implement the qubits. Decoherence is irreversible, as it is effectively non-unitary, and is usually something that should be highly controlled, if not avoided. Decoherence times for candidate systems in particular, the transverse relaxation time T2 (for NMR and MRI technology, also called the dephasing time), typically range between nanoseconds and seconds at low temperature.[37] Currently, some quantum computers require their qubits to be cooled to 20 millikelvins in order to prevent significant decoherence.[38]

As a result, time-consuming tasks may render some quantum algorithms inoperable, as maintaining the state of qubits for a long enough duration will eventually corrupt the superpositions.[39]

These issues are more difficult for optical approaches as the timescales are orders of magnitude shorter and an often-cited approach to overcoming them is optical pulse shaping. Error rates are typically proportional to the ratio of operating time to decoherence time, hence any operation must be completed much more quickly than the decoherence time.

As described in the quantum threshold theorem, if the error rate is small enough, it is thought to be possible to use quantum error correction to suppress errors and decoherence. This allows the total calculation time to be longer than the decoherence time if the error correction scheme can correct errors faster than decoherence introduces them. An often-cited figure for the required error rate in each gate for fault-tolerant computation is 10^-3, assuming the noise is depolarizing.

Meeting this scalability condition is possible for a wide range of systems. However, the use of error correction brings with it the cost of a greatly increased number of required qubits. The number required to factor integers using Shor's algorithm is still polynomial, and thought to be between L and L^2, where L is the number of qubits in the number to be factored; error correction algorithms would inflate this figure by an additional factor of L. For a 1000-bit number, this implies a need for about 10^4 bits without error correction.[40] With error correction, the figure would rise to about 10^7 bits. Computation time is about L^2 or about 10^7 steps and at 1 MHz, about 10 seconds.
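A back-of-the-envelope check of those orders of magnitude (an illustrative calculation only; the factor-of-10 constants are assumptions chosen to match the figures quoted above):

```python
L = 1000  # number of bits in the integer to be factored

qubits_no_ec = 10 * L              # ~10^4 qubits without error correction
qubits_with_ec = qubits_no_ec * L  # error correction adds a factor of ~L
steps = 10 * L ** 2                # ~10^7 gate operations
seconds = steps / 1e6              # at 1 MHz per operation
print(qubits_no_ec, qubits_with_ec, steps, f"{seconds:.0f} s")
# -> 10000 10000000 10000000 10 s
```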

A very different approach to the stability-decoherence problem is to create a topological quantum computer with anyons, quasi-particles used as threads and relying on braid theory to form stable logic gates.[41][42]

Physicist Mikhail Dyakonov has also expressed skepticism of quantum computing.

There are a number of quantum computing models, distinguished by the basic elements in which the computation is decomposed. The four main models of practical importance are the quantum gate array (computation decomposed into a sequence of few-qubit quantum gates), the one-way quantum computer (computation decomposed into a sequence of one-qubit measurements applied to a highly entangled initial cluster state), adiabatic quantum computation (computation decomposed into a slow continuous transformation of an initial Hamiltonian into a final Hamiltonian whose ground state contains the solution) and the topological quantum computer (computation decomposed into the braiding of anyons in a two-dimensional lattice).

The quantum Turing machine is theoretically important but the physical implementation of this model is not feasible. All four models of computation have been shown to be equivalent; each can simulate the other with no more than polynomial overhead.

For physically implementing a quantum computer, many different candidates are being pursued, distinguished by the physical system used to realize the qubits.

A large number of candidates demonstrates that quantum computing, despite rapid progress, is still in its infancy.

Any computational problem solvable by a classical computer is also solvable by a quantum computer.[62] Intuitively, this is because it is believed that all physical phenomena, including the operation of classical computers, can be described using quantum mechanics, which underlies the operation of quantum computers.

Conversely, any problem solvable by a quantum computer is also solvable by a classical computer; or, more formally, any quantum computer can be simulated by a Turing machine. In other words, quantum computers provide no additional power over classical computers in terms of computability. This means that quantum computers cannot solve undecidable problems like the halting problem, and the existence of quantum computers does not disprove the Church–Turing thesis.[63]

While quantum computers cannot solve any problems that classical computers cannot already solve, it is suspected that they can solve many problems faster than classical computers. For instance, it is known that quantum computers can efficiently factor integers, while this is not believed to be the case for classical computers. However, the capacity of quantum computers to accelerate classical algorithms has rigid upper bounds, and the overwhelming majority of classical calculations cannot be accelerated by the use of quantum computers.[64]

The class of problems that can be efficiently solved by a quantum computer with bounded error is called BQP, for "bounded error, quantum, polynomial time". More formally, BQP is the class of problems that can be solved by a polynomial-time quantum Turing machine with error probability of at most 1/3. As a class of probabilistic problems, BQP is the quantum counterpart to BPP ("bounded error, probabilistic, polynomial time"), the class of problems that can be efficiently solved by probabilistic Turing machines with bounded error.[65] It is known that BPP ⊆ BQP and widely suspected, but not proven, that BQP ⊈ BPP, which intuitively would mean that quantum computers are more powerful than classical computers in terms of time complexity.[66]

The exact relationship of BQP to P, NP, and PSPACE is not known. However, it is known that P ⊆ BQP ⊆ PSPACE; that is, the class of problems that can be efficiently solved by quantum computers includes all problems that can be efficiently solved by deterministic classical computers but does not include any problems that cannot be solved by classical computers with polynomial space resources. It is further suspected that BQP is a strict superset of P, meaning there are problems that are efficiently solvable by quantum computers that are not efficiently solvable by deterministic classical computers. For instance, integer factorization and the discrete logarithm problem are known to be in BQP and are suspected to be outside of P. On the relationship of BQP to NP, little is known beyond the fact that some NP problems are in BQP (integer factorization and the discrete logarithm problem are both in NP, for example). It is suspected that NP ⊈ BQP; that is, it is believed that there are efficiently checkable problems that are not efficiently solvable by a quantum computer. As a direct consequence of this belief, it is also suspected that BQP is disjoint from the class of NP-complete problems (if an NP-complete problem were in BQP, then it follows from NP-hardness that all problems in NP are in BQP).[68]

The relationship of BQP to the basic classical complexity classes can be summarized as P ⊆ BPP ⊆ BQP ⊆ PSPACE.

It is also known that BQP is contained in the complexity class #P (or more precisely in the associated class of decision problems P^#P),[68] which is a subclass of PSPACE.

It has been speculated that further advances in physics could lead to even faster computers. For instance, it has been shown that a non-local hidden variable quantum computer based on Bohmian mechanics could implement a search of an N-item database in at most O(∛N) steps, a slight speedup over Grover's algorithm, which runs in O(√N) steps (however, neither search method would allow quantum computers to solve NP-complete problems in polynomial time).[69] Theories of quantum gravity, such as M-theory and loop quantum gravity, may allow even faster computers to be built. However, defining computation in these theories is an open problem due to the problem of time; that is, within these physical theories there is currently no obvious way to describe what it means for an observer to submit input to a computer at one point in time and then receive output at a later point in time.[70][71]

See the original post here:
Quantum computing - Wikipedia

What Is Quantum Computing? The Next Era of Computational …

When you first stumble across the term quantum computer, you might pass it off as some far-flung science fiction concept rather than a serious current news item.

But with the phrase being thrown around with increasing frequency, it's understandable to wonder exactly what quantum computers are, and just as understandable to be at a loss as to where to dive in. Here's the rundown on what quantum computers are, why there's so much buzz around them, and what they might mean for you.

All computing relies on bits, the smallest unit of information that is encoded as an on state or an off state, more commonly referred to as a 1 or a 0, in some physical medium or another.

Most of the time, a bit takes the physical form of an electrical signal traveling over the circuits in the computer's motherboard. By stringing multiple bits together, we can represent more complex and useful things like text, music, and more.
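
To make the idea concrete, here is a minimal Python sketch (ours, not the article's) of how a string of classical bits can encode text:

```python
# Encode a short text as classical bits: each character becomes the
# 8-bit binary form of its byte value, strung together in sequence.
text = "Hi"
bits = "".join(format(byte, "08b") for byte in text.encode("ascii"))
print(bits)  # 0100100001101001

# Decode by splitting back into 8-bit chunks.
decoded = "".join(chr(int(bits[i:i + 8], 2)) for i in range(0, len(bits), 8))
print(decoded)  # Hi
```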

The two key differences between quantum bits and classical bits (from the computers we use today) are the physical form the bits take and, correspondingly, the nature of data encoded in them. The electrical bits of a classical computer can only exist in one state at a time, either 1 or 0.

Quantum bits (or qubits) are made of subatomic particles, namely individual photons or electrons. Because these subatomic particles conform more to the rules of quantum mechanics than classical mechanics, they exhibit the bizarre properties of quantum particles. The most salient of these properties for computer scientists is superposition. This is the idea that a particle can exist in multiple states simultaneously, at least until that state is measured and collapses into a single state. By harnessing this superposition property, computer scientists can make qubits encode a 1 and a 0 at the same time.
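
Superposition is easier to see with a toy state-vector simulation. The sketch below is our own illustration using NumPy, not anything from the article; it represents a qubit as two complex amplitudes and simulates a measurement:

```python
import numpy as np

# A single qubit is described by two complex amplitudes, one for |0>
# and one for |1>. Equal amplitudes give a 50/50 superposition.
qubit = np.array([1, 1], dtype=complex) / np.sqrt(2)

# The squared magnitude of each amplitude is the probability of
# measuring that basis state.
probs = np.abs(qubit) ** 2
print(probs)  # [0.5 0.5]

# Measuring collapses the superposition to a single classical bit.
outcome = np.random.choice([0, 1], p=probs)
print(outcome)
```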

The other quantum mechanical quirk that makes quantum computers tick is entanglement, a linking of two quantum particles or, in this case, two qubits. When the two particles are entangled, the change in state of one particle will alter the state of its partner in a predictable way, which comes in handy when it comes time to get a quantum computer to calculate the answer to the problem you feed it.
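
Extending the same toy simulation to two qubits shows what entanglement looks like numerically; this Bell-state example is likewise our own illustrative sketch:

```python
import numpy as np

# Four amplitudes cover the two-qubit basis states |00>, |01>, |10>, |11>.
# The Bell state (|00> + |11>)/sqrt(2) is maximally entangled.
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

probs = np.abs(bell) ** 2
outcomes = np.random.choice(["00", "01", "10", "11"], size=10, p=probs)
print(outcomes)  # only '00' and '11' ever appear: the two bits always agree
```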

A quantum computer's qubits start in their 1-and-0 hybrid state as the computer initially starts crunching through a problem. When the solution is found, the qubits in superposition collapse to a stable configuration of 1s and 0s that encodes the solution.

Aside from the fact that they are far beyond the reach of all but the most elite research teams (and will likely stay that way for a while), most of us don't have much use for quantum computers. They don't offer any real advantage over classical computers for the kinds of tasks we do most of the time.

However, even the most formidable classical supercomputers have a hard time cracking certain problems due to their inherent computational complexity. This is because some calculations can only be achieved by brute force: guessing until the answer is found. Such problems can have so many possible solutions that it would take thousands of years for all the world's supercomputers combined to find the correct one.

The superposition property exhibited by qubits can allow quantum computers to cut this guessing time down precipitously. Classical computing's laborious trial-and-error computations can only ever make one guess at a time, while the dual 1-and-0 state of a quantum computer's qubits lets it make multiple guesses at the same time.
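
The scale of that speedup can be sketched with some back-of-the-envelope arithmetic. Grover's algorithm (mentioned above) needs on the order of √N steps to search N items, versus roughly N/2 classical guesses on average; this little comparison is ours, not the article's:

```python
import math

# Compare guess counts for searching an unstructured list of N items:
# classical brute force averages N/2 guesses, while Grover's algorithm
# needs about (pi/4) * sqrt(N) quantum steps.
for n in (10**6, 10**9, 10**12):
    classical = n // 2
    grover = math.ceil(math.pi / 4 * math.sqrt(n))
    print(f"N = {n:>14,}  classical ~ {classical:>15,}  Grover ~ {grover:>9,}")
```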

So, what kind of problems require all this time-consuming guesswork calculation? One example is simulating atomic structures, especially when they interact chemically with those of other atoms. With a quantum computer powering the atomic modeling, researchers in materials science could create new compounds for use in engineering and manufacturing. Quantum computers are well suited to simulating similarly intricate systems like economic market forces, astrophysical dynamics, or genetic mutation patterns in organisms, to name only a few.

Amidst all these generally inoffensive applications of this emerging technology, though, there are also some uses of quantum computers that raise serious concerns. By far the most frequently cited harm is the potential for quantum computers to break some of the strongest encryption algorithms currently in use.

In the hands of an aggressive foreign government adversary, quantum computers could compromise a broad swath of otherwise secure internet traffic, leaving sensitive communications susceptible to widespread surveillance. Work is currently under way to mature encryption ciphers based on calculations that remain hard even for quantum computers, but these schemes are not all ready for prime time, or widely adopted at present.
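
A toy example makes the stakes clearer. RSA-style encryption rests on the difficulty of factoring a large number into its two secret prime factors, which Shor's algorithm would let a quantum computer do efficiently. The snippet below is our own illustration, with a deliberately tiny modulus; it shows the classical brute-force approach that becomes infeasible at real key sizes:

```python
from math import isqrt

# n is a toy RSA-style public modulus: the product of two secret primes.
# Security rests on factoring being slow for classical machines; Shor's
# algorithm on a large quantum computer would make it fast.
n = 3233  # = 61 * 53, far too small to be secure in practice

# Classical brute force: trial division up to sqrt(n).
p = next(d for d in range(2, isqrt(n) + 1) if n % d == 0)
print(p, n // p)  # 53 61 -- the secret primes recovered
```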

A little over a decade ago, actual fabrication of quantum computers was barely in its incipient stages. Starting in the 2010s, though, development of functioning prototype quantum computers took off. A number of companies have assembled working quantum computers as of a few years ago, with IBM going so far as to allow researchers and hobbyists to run their own programs on its machines via the cloud.

Despite the strides that companies like IBM have undoubtedly made to build functioning prototypes, quantum computers are still in their infancy. Currently, the quantum computers that research teams have constructed require a lot of overhead for executing error correction. For every qubit that actually performs a calculation, there are several dozen whose job it is to compensate for that one's mistakes. The aggregate of all these qubits makes up what is called a logical qubit.
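
As a rough sense of scale, here is a back-of-the-envelope sketch; the 2·d² overhead figure assumes a surface-code-style scheme and a small code distance, both illustrative assumptions of ours rather than anything stated in the article:

```python
# Illustrative assumption: a surface-code-style scheme in which a
# distance-d logical qubit uses roughly 2 * d**2 physical qubits
# (data qubits plus measurement qubits).
def physical_qubits(logical_qubits, distance):
    return logical_qubits * 2 * distance ** 2

print(physical_qubits(1, 5))    # 50 physical qubits per logical qubit
print(physical_qubits(100, 5))  # 5,000 for a 100-logical-qubit machine
```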

Long story short, industry and academic titans have gotten quantum computers to work, but they do so very inefficiently.

Fierce competition between quantum computing researchers, big and small players alike, is still raging. Among those who have working quantum computers are the traditionally dominant tech companies one would expect: IBM, Intel, Microsoft, and Google.

As exacting and costly a venture as creating a quantum computer is, there are a surprising number of smaller companies and even startups that are rising to the challenge.

The comparatively lean D-Wave Systems has spurred many advances in the field, and proved it was not out of contention by answering Google's momentous announcement with news of a huge deal with Los Alamos National Labs. Still, smaller competitors like Rigetti Computing are also in the running to establish themselves as quantum computing innovators.

Depending on who you ask, you'll get a different frontrunner for the most powerful quantum computer. Google certainly made its case recently with its achievement of quantum supremacy, a metric that Google itself more or less devised. Quantum supremacy is the point at which a quantum computer is first able to outperform a classical computer at some computation. Google's Sycamore prototype, equipped with 54 qubits, was able to break that barrier by zipping through a problem in just under three and a half minutes that would take the mightiest classical supercomputer 10,000 years to churn through.

Not to be outdone, D-Wave boasts that the devices it will soon be supplying to Los Alamos weigh in at 5,000 qubits apiece, although it should be noted that the quality of D-Wave's qubits has been called into question before. IBM hasn't made the same kind of splash as Google and D-Wave in the last couple of years, but it shouldn't be counted out yet, either, especially considering its track record of slow and steady accomplishments.

Put simply, the race for the worlds most powerful quantum computer is as wide open as it ever was.

The short answer to this is "not really," at least for the near-term future. Quantum computers require an immense volume of equipment and finely tuned environments to operate. The leading architecture requires cooling to mere degrees above absolute zero, meaning they are nowhere near practical for ordinary consumers to ever own.

But as the explosion of cloud computing has proven, you don't need to own a specialized computer to harness its capabilities. As mentioned above, IBM is already offering daring technophiles the chance to run programs on a small subset of its Q System One's qubits. In time, IBM and its competitors will likely sell compute time on more robust quantum computers for those interested in applying them to otherwise inscrutable problems.
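
For a taste of what programming such a machine looks like, here is a minimal circuit built with IBM's open-source Qiskit library. This sketch is ours, and actually submitting it to IBM's cloud hardware requires account and backend setup not shown here:

```python
from qiskit import QuantumCircuit

# A two-qubit Bell-state circuit: a Hadamard gate puts qubit 0 into
# superposition, then a CNOT entangles qubit 0 with qubit 1.
circuit = QuantumCircuit(2)
circuit.h(0)
circuit.cx(0, 1)
circuit.measure_all()

print(circuit.draw())  # prints an ASCII diagram of the circuit
```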

But if you aren't researching the kinds of exceptionally tricky problems that quantum computers aim to solve, you probably won't interact with them much. In fact, quantum computers are in some cases worse at the sort of tasks we use computers for every day, purely because quantum computers are so hyper-specialized. Unless you are an academic running the kind of modeling where quantum computing thrives, you'll likely never get your hands on one, and never need to.

Go here to read the rest:
What Is Quantum Computing? The Next Era of Computational ...

Quantum computer chips demonstrated at the highest temperatures ever – New Scientist News

By Leah Crane

Credit: Luca Petit for QuTech

Quantum computing is heating up. For the first time, quantum computer chips have been operated at a temperature above −272°C, or 1 kelvin. That may still seem frigid, but it is just warm enough to potentially enable a huge leap in quantum computers' capabilities.

Quantum computers are made of quantum bits, or qubits, which can be made in several different ways. One that is receiving attention from some of the field's big players consists of electrons on a silicon chip.

These systems only function at extremely low temperatures, below 100 millikelvin (−273.05°C), so the qubits have to be stored in powerful refrigerators. The electronics that power them won't run at such low temperatures, and also emit heat that could disrupt the qubits, so they are generally kept outside the refrigerators, with each qubit connected by a wire to its electronic controller.

"Eventually, for useful quantum computing, we will need to go to something like a million qubits, and this sort of brute-force method, with one wire per qubit, won't work any more," says Menno Veldhorst at QuTech in the Netherlands. "It works for two qubits, but not for a million."

Veldhorst and his colleagues, along with another team led by researchers at the University of New South Wales in Australia, have now demonstrated that these qubits can be operated at higher temperatures. The latter team showed they were able to control the state of two qubits on a chip at temperatures up to 1.5 kelvin, and Veldhorst's group used two qubits at 1.1 kelvin in what is called a logic gate, which performs the basic operations that make up more complex calculations.

Now that we know the qubits themselves can function at higher temperatures, the next step is incorporating the electronics onto the same chip. "I hope that after we have that circuit, it won't be too hard to scale to something with practical applications," says Veldhorst.

Those quantum circuits will be similar in many ways to the circuits we use for traditional computers, so they can be scaled up relatively easily compared with other kinds of quantum computers, he says.

Journal references: Nature, DOI: 10.1038/s41586-020-2170-7 and DOI: 10.1038/s41586-020-2171-6

See the original post:
Quantum computer chips demonstrated at the highest temperatures ever - New Scientist News