
Category Archives: Quantum Computing

‘It’s very powerful’: The promise and potential of quantum computers – AOL

Posted: January 4, 2024 at 3:27 am

TAMPA, Fla. - Quantum computers are still in development, but early results show how this emerging technology can transform our world in ways we can't even fully predict.

"Honestly, they're not 100% sure what exactly they're going to be able to use it for yet except that its very powerful and can generate very complex numbers," said Toms Hardware Editor Tom Freedman.

To understand what a quantum computer is and how it works, let's start with traditional computers.


The computers we use today work by transmitting and receiving rapid pulses of electricity. Those electrical pulses carry intricate codes in a string of zeroes and ones that flow in and out of the chips (or brains) of our computers.

The chips coordinate, interpret and transmit the codes to our monitors to form images, to apps to perform calculations, etc.

A quantum computer uses subatomic particles within tiny circuits called qubits. Those qubits can be entangled (or linked together), so they connect and function in tandem. And, as strange as it sounds, the laws of quantum mechanics tell us those subatomic particles can also be in multiple states at the same time.

"When you have a traditional computer, its on or off. It uses these things called bits: 1-0, on-off, yes-no," Freedman noted. "Quantum computing is both on and off at the same time. Its this weird head space. They'll stack these things called cubits together. And in really cold rooms, they can use them to measure multiple values at once using quantum mechanics."


In other words, a qubit can multitask in ways a traditional computer cannot.
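To make that contrast concrete, here is a minimal numerical sketch (in Python with NumPy, purely for illustration; it is not from the article) of a classical bit next to a qubit held in an equal superposition:

```python
import numpy as np

# A classical bit is definitely one of two values.
classical_bit = 1

# A qubit is described by two amplitudes, one for |0> and one for |1>.
# An equal superposition puts weight 1/sqrt(2) on each, so the qubit is
# "both at once" until it is measured.
qubit = np.array([1, 1]) / np.sqrt(2)

# Born rule: each outcome's probability is the squared magnitude of its amplitude.
print("P(measure 0), P(measure 1):", np.abs(qubit) ** 2)   # [0.5, 0.5]
```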

Scientists hope these exponentially faster and more powerful machines could help us predict the timing and location of natural disasters more precisely, develop far more advanced medicines, solve our traffic woes, take the next giant leaps in space and rein in the effects of climate change.

Read more here:

'It's very powerful': The promise and potential of quantum computers - AOL

Posted in Quantum Computing | Comments Off on ‘It’s very powerful’: The promise and potential of quantum computers – AOL

We’re on the brink of the biggest changes to computing’s DNA and it’s not just quantum that’s coming – PC Gamer

Posted: at 3:27 am

This article was originally published on 30th June this year and we are republishing it today as part of a series celebrating some of our favourite pieces from the past 12 months.


Computers are built around logic: performing mathematical operations using circuits. Logic is built around things such as adders (not the snake): the basic circuit that adds two numbers together. This is as true of today's microprocessors as of all those going back to the very beginning of computing history. You could go back to an abacus and find that, at some fundamental level, it does the same thing as your shiny gaming PC. It's just much, much less capable.
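As a rough illustration of what such a circuit does (a sketch in Python rather than actual gate-level hardware, and not something from the article), here is a one-bit full adder chained into a simple ripple-carry adder:

```python
# A one-bit full adder built from basic logic operations: it adds two bits
# plus a carry-in, producing a sum bit and a carry-out. Chaining several of
# these together adds whole numbers, one bit at a time.
def full_adder(a: int, b: int, carry_in: int) -> tuple[int, int]:
    sum_bit = a ^ b ^ carry_in                   # XOR gives the sum
    carry_out = (a & b) | (carry_in & (a ^ b))   # AND/OR give the carry
    return sum_bit, carry_out

def add_numbers(x: int, y: int, width: int = 8) -> int:
    """Add two integers bit by bit, like a ripple-carry adder."""
    result, carry = 0, 0
    for i in range(width):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result

print(add_numbers(13, 29))  # -> 42
```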

Nowadays, processors can do a lot of mathematical calculations using any number of complex circuits in a single clock cycle. And a lot more than just add two numbers together, too. But to get to your shiny new gaming CPU, there has been a process of iterating on the classical computers that came before, going back centuries.

As you might imagine, building something entirely different to that is a little, uh, tricky, but that's what some are striving to do, with technologies like quantum and neuromorphic computing: two distinct concepts that could change computing for good.

"Quantum computing is a technology that, at least by name, we have become very accustomed to hearing about and is always mentioned as 'the future of computing'," says Carlos Andrs Trasvia Moreno, software engineering coordinator at CETYS Ensenada.

Quantum computers utilise qubits, or quantum bits. Unlike a classical bit, which can only exist in one of two states, a qubit can exist in either of those states or in a superposition of the two. It's zero, one, or both zero and one at the same time. And if that sounds awfully confusing, that's because it is, but it also has immense potential.

Quantum computers are expected to be powerful enough to break modern-day 'unbreakable' encryption, accelerate medicine discovery, re-shape how the global economy transports goods, explore the stars, and pretty much revolutionise anything involving massive number crunching.

The problem is, quantum computers are immensely difficult to make, and maybe even more difficult to run.


"One of the main drawbacks of quantum computing is its high-power consumption, since it works with algorithms of far greater complexity than that of any current CPU," Moreno continues. "Also, it requires an environment of near absolute zero temperatures, which worsens the power requirements of the system. Lastly, they are extremely sensitive to environmental disturbances such as heat, light and vibrations.


"Any of these can alter the current quantum states and produce unexpected outcomes."

And while you can sort of copy the function of classical logic with qubits (we're not starting entirely from zero in developing these machines), exploiting a quantum computer's power requires new and complex quantum algorithms that we're only just getting to grips with.

IBM is one company investing heavily in quantum computing, aiming to create a quantum computer with 4,158 or more qubits by 2025. Google also has its fingers in quantum.

Admittedly, we're still a long way off ubiquitous 'quantum supremacy', the moment when a quantum computer outperforms today's top classical supercomputers. Google did claim to have achieved just that back in 2019, though it may have turned out to be something of a niche achievement, albeit an impressive one. Either way, in practical terms, we're just not there yet.

They're a real pain to figure out, to put it scientifically. But that's never stopped a good engineer yet.

"I do think that we're scratching the surface there with quantum computing. And again, just like we broke the laws of physics with silicon over and over and over again, I think we break the laws of physics here, too," Marcus Kennedy, general manager of gaming at Intel, tells me.


There's more immediate potential for the future of computing in artificial intelligence, your favourite 2023 buzzword. But it really is a massive and life-changing development for many, and I'm not just talking about that clever-sounding, slightly-too-argumentative chatbot in your browser. We're only scratching the surface of AI's uses today, and to unlock those deeper, more impactful uses there's a whole new type of chip in the works.

"Neuromorphic computing is, in my mind, the most viable alternative [to classical computing]," Moreno says.

"In a sense, we could say that neuromorphic computers are biological neural networks implemented on hardware. One would think it's simply translating a perceptron to voltages and gates, but it's actually a closer imitation on how brains work, on how actual neurons communicate amongst each other through synapsis."

What is neuromorphic computing? The answer's in the name: neuro, meaning related to the nervous system. A neuromorphic computer aims to imitate the greatest computer, and most complex creation, ever known to man: the brain.

"I think we'll get to a place where the processing capability of those neuromorphic chips far outstrips the processing capability of a monolithic die based on an x86 architecture, a traditional kind of architecture. Because the way the brain operates, we know it has the capacity and the capability that far outstrips anything else," Kennedy says.

"The most effective kind of systems tend to look very much like things that you see in nature."

Neuromorphic chips are yet to reach their breakthrough moment, but they're coming. Intel has a couple of neuromorphic chips in development today, Loihi and Loihi 2.

And what is a neuromorphic chip, really? Well, it's a brain, with neurons and synapses. But since they're still crafted from silicon, think of them as a sort of hybrid of a classical computer chip and the biology of the brain.

And not necessarily a big brain: Loihi 2 has 1 million neurons and 120 million synapses, which is many orders of magnitude smaller than a human brain with roughly 86 billion neurons and trillions of synapses. It's hard to count them all, as you might imagine, so we don't really know precisely, but we have big ol' brains. You can brag about that all you want to your smaller-brained animal companions.

A cockroach is estimated to have as many synapses as Loihi 2, for a better understanding of the grey matter scale we're talking about here.

"We claim you don't need to be that complex that the brain has its function, but if you're going to do computing, you just need some of the basic functions of a neuron and synapse to actually make it work," Dr. Mark Dean told me in 2021.


Neuromorphic computing has a lot of room to grow, and with a rapidly growing interest in AI, this nascent technology may prove to be the key to powering those ever-more-impressive AI models you keep reading about.


You might think that AI models are running just fine today, which is primarily thanks to Nvidia's graphics cards running the show. But the reason neuromorphic computing is so tantalising to some is "that it can heavily reduce the power consumption of a processor, whilst still managing the same computational capabilities of modern chips," Moreno says.

"In comparison, the human brain is capable of hundreds of teraflops of processing power with only 20 watts of energy consumption, whilst a modest graphics card can output 40-50 teraflops of power with an energy consumption of 450 watts."

Basically, "If a neuromorphic processor were to be developed and implemented in a GPU, the amount of processing power would surpass any of the existing products with just a fraction of the energy."

Sound appealing? Yeah, of course it does. Lower energy consumption isn't only massive for the potential computing power it could bring about; using less energy also has knock-on effects for cooling, too.
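Taking Moreno's quoted figures at face value (they are rough estimates, not measured benchmarks), the back-of-the-envelope arithmetic behind that claim looks something like this:

```python
# Efficiency comparison using the figures quoted above; these are Moreno's
# rough estimates, not benchmarks.
brain_tflops, brain_watts = 200, 20   # "hundreds of teraflops ... 20 watts"
gpu_tflops, gpu_watts = 45, 450       # "40-50 teraflops ... 450 watts"

brain_eff = brain_tflops / brain_watts   # ~10 TFLOPS per watt
gpu_eff = gpu_tflops / gpu_watts         # ~0.1 TFLOPS per watt

print(f"Brain: {brain_eff:.1f} TFLOPS/W, GPU: {gpu_eff:.2f} TFLOPS/W")
print(f"That's roughly a {brain_eff / gpu_eff:.0f}x gap in efficiency")
```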

"Changing the architecture of computing would also require a different programming paradigm to be implemented, which in its own will also be an impressive feat," Moreno continues.

Building a neuromorphic chip is one thing; programming for it is something else. That's one reason why Intel's neuromorphic computing framework is open-source: you need a lot of hands on deck to get this sort of project off the ground.

"The thing that we haven't cracked yet is the software behind how to leverage the structure," Kennedy says. "And so you can create a chip that looks very much like a brain, the software is really what makes it function like a brain. And to date, we haven't cracked that nut."

It'll take some time before we entirely replace AI accelerators with something that resembles a brain, or adders and binary functions, which are as old as computing itself, with quantum computers. Yet experimental attempts to replace classical computing as we know it have already begun.

A recent breakthrough claimed by Microsoft has the company very bullish on quantum's future, and IBM has recently predicted that quantum computers will outperform classical ones at important tasks within two years.

In the words of Intel's Kennedy, "I think we're getting there."

More:

We're on the brink of the biggest changes to computing's DNA and it's not just quantum that's coming - PC Gamer

Posted in Quantum Computing | Comments Off on We’re on the brink of the biggest changes to computing’s DNA and it’s not just quantum that’s coming – PC Gamer

Quantum Computing Explained: A Simple Dive into the Future of Tech – ELE Times

Posted: at 3:27 am

What is Quantum Computing?

Quantum computing is a paradigm shift in computing that uses the ideas of quantum mechanics to carry out computations. Quantum computers employ quantum bits, or qubits, which can exist in multiple states concurrently due to the laws of superposition and entanglement. This is in contrast to classical computers, which use bits as binary units (0 or 1).

Quantum Computing History

Physicist Richard Feynman first introduced the idea of quantum computing in the early 1980s as a way to simulate quantum systems. David Deutsch later formalised the concept with his description of a universal quantum computer in 1985. But it was not until the mid-1990s that the first quantum algorithms, like Grover's and Shor's, showed off the potential capabilities of quantum computing.

Types of Quantum Computing

Although there are many ways to build quantum computers, gate-based quantum computing and quantum annealing are the two primary varieties. Gate-based quantum computers, like those made by Google and IBM, use quantum gates to manipulate qubits. Quantum annealers, such as those made by D-Wave, use quantum annealing to solve optimization problems.

How Does Quantum Computing Work?

Quantum computing uses superposition and entanglement to carry out intricate calculations. Because qubits can exist in several states at once, quantum computers can explore many possibilities simultaneously. Quantum gates manipulate the qubits to perform operations, and the outcome is read out by measuring the final state.
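As a purely illustrative sketch of that gate-then-measure flow (using Python and NumPy, not any particular vendor's toolkit), here is a single qubit pushed into superposition by a Hadamard gate and then measured:

```python
import numpy as np

ket0 = np.array([1.0, 0.0])                  # a qubit initialised to |0>

# The Hadamard gate is one of the basic quantum gates: it turns |0> into an
# equal superposition of |0> and |1>.
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)

state = H @ ket0                             # apply the gate
probs = np.abs(state) ** 2                   # Born rule: outcome probabilities

outcome = np.random.choice([0, 1], p=probs)  # measuring picks one definite value
print("State after H:", state)               # [0.707..., 0.707...]
print("Outcome probabilities:", probs)       # [0.5, 0.5]
print("One measured outcome:", outcome)
```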

Quantum Computing Applications

Quantum computing holds promise for a wide range of applications.

Quantum Computing Technology

Technologies for quantum computing are being actively developed by several businesses and academic institutes. IBM, Google, Microsoft, Rigetti, IonQ, and D-Wave are some of the major participants. Quantum computers are usually housed in specially designed facilities kept at extremely low temperatures to minimize interference from outside sources.

Quantum Computing Advantages

Quantum Computing Disadvantages

Future of Quantum Computing

Quantum computing has immense potential for revolutionary developments in the future. As the technology advances, researchers are working to address issues like error rates and scalability. Quantum computers have already achieved quantum supremacy, surpassing classical computers in specific tasks. Further advancements in quantum hardware and algorithms are expected to open up new avenues and shape the direction of computing in the future.

In conclusion, quantum computing is an exciting new area of technology that has the potential to completely transform several different sectors. Even though quantum computing is still in its infancy, its rapid development and growing interest from academia and industry point to a bright future. As long as researchers keep overcoming obstacles, we can expect a quantum leap in computational power and efficiency.

Read more:

Quantum Computing Explained: A Simple Dive into the Future of Tech - ELE Times

Posted in Quantum Computing | Comments Off on Quantum Computing Explained: A Simple Dive into the Future of Tech – ELE Times

Quantum Computing and AI: Basics and Quantum Algorithms | by Aamir Aftab | Dec, 2023 – Medium

Posted: at 3:27 am

The intersection of quantum computing and artificial intelligence (AI) represents one of the most intriguing frontiers in modern technology. While both fields have made significant strides independently, their convergence promises revolutionary advances in computation, data processing, and problem-solving. In this blog post, we will delve into the basics of quantum computing, explore its relevance to AI, and discuss some quantum algorithms that hold promise for the future.

Quantum Computing: A Brief Overview

To understand the potential synergy between quantum computing and AI, it's essential first to grasp the fundamentals of quantum computing. Traditional computers use bits, binary units of 0s and 1s, to store and process information. In contrast, quantum computers leverage quantum bits, or qubits, which can exist in multiple states simultaneously due to the principles of superposition and entanglement.

Superposition allows a qubit to represent both 0 and 1 simultaneously, exponentially increasing computational power. Entanglement, on the other hand, enables qubits to be correlated in such a way that the state of one qubit instantaneously influences the state of another, regardless of distance.
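To give a concrete (if toy) picture of entanglement, here is a small NumPy sketch, not drawn from the post itself, that builds the standard two-qubit Bell state and shows that the two measurement outcomes always agree:

```python
import numpy as np

# Two-qubit states are length-4 vectors: amplitudes for |00>, |01>, |10>, |11>.
ket00 = np.array([1.0, 0.0, 0.0, 0.0])

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)    # Hadamard on the first qubit
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# H on the first qubit, then CNOT, yields the Bell state (|00> + |11>)/sqrt(2).
bell = CNOT @ np.kron(H, I) @ ket00

# Sample measurements: the two qubits always agree, however far apart they are.
probs = np.abs(bell) ** 2                        # [0.5, 0, 0, 0.5]
outcomes = np.random.choice(["00", "01", "10", "11"], size=10, p=probs)
print(bell)       # [0.707, 0, 0, 0.707]
print(outcomes)   # only "00" and "11" ever appear
```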

Quantum Computing and AI: A Symbiotic Relationship

The marriage between quantum computing and AI is not merely a theoretical concept but a practical necessity. As AI algorithms grow more complex, the computational demands skyrocket, often exceeding the capabilities of classical computers. Quantum computing offers the potential to overcome these limitations, enabling more efficient algorithms, faster computations, and groundbreaking applications.

Quantum Algorithms for AI

Challenges and Considerations

While the potential benefits of integrating quantum computing and AI are tantalizing, several challenges and considerations warrant attention.

Conclusion

The convergence of quantum computing and AI heralds a new era of technological innovation, offering transformative opportunities across various industries, from healthcare and finance to cybersecurity and logistics. While the path forward is fraught with challenges and uncertainties, the potential rewards justify the investment and exploration of this burgeoning field.

As researchers, scientists, and innovators continue to push the boundaries of quantum computing and AI, collaboration, knowledge-sharing, and interdisciplinary expertise will be crucial. By fostering a symbiotic relationship between quantum computing and AI, we can unlock unprecedented capabilities, solve complex problems, and shape a future defined by innovation, discovery, and progress.

Quantum algorithms hold promise for revolutionizing AI applications, enhancing computational efficiency, and driving advancements in science, technology, and society. As we navigate this exciting frontier, embracing curiosity, collaboration, and creativity will be key to realizing the full potential of quantum computing and AI.

More:

Quantum Computing and AI: Basics and Quantum Algorithms | by Aamir Aftab | Dec, 2023 - Medium

Posted in Quantum Computing | Comments Off on Quantum Computing and AI: Basics and Quantum Algorithms | by Aamir Aftab | Dec, 2023 – Medium

The Emergence of Quantum Computing: Advancements, Challenges, and Future Prospects – Medriva

Posted: at 3:27 am

The Emergence of Quantum Computing

Quantum computing is an emergent computational paradigm that uses quantum bits or qubits as the basic units of information. This unique approach allows for massive parallelism and complex computation through quantum effects and entanglement. Unlike traditional bits that can be either a 0 or a 1, qubits can be in a state of superposition, being both 0 and 1 simultaneously. This feature, along with entanglement, where qubits become interconnected and the state of one can instantly affect the state of another, is what enables quantum speedups.

A notable achievement in quantum computing is the demonstration of quantum supremacy, where a quantum computer performs a task faster than any classical computer. This has been achieved in experimental setups for specific, contrived problems such as random circuit sampling. Meanwhile, algorithms like Shor's landmark factoring algorithm promise valuable real-world implications in areas like cryptography, and quantum simulation is another promising domain where quantum computing can have a significant advantage.

While quantum computing poses exciting possibilities, there are tangible challenges to overcome. Fragile qubit coherence times, the engineering scalability of qubit arrays, and operational errors are among the difficulties faced in the field. However, steady experimental progress and cutting-edge technological advancements, such as IBM's 433-qubit Osprey processor, are paving the way towards more robust and efficient quantum processors.

As quantum computing evolves, the risk it poses to existing encryption systems becomes increasingly apparent. The computational prowess of quantum machines threatens to render current cryptographic defenses obsolete. However, initiatives are underway to develop quantum-resistant cryptography and quantum key distribution to safeguard digital communications. Post-quantum algorithms are also being developed, based on mathematical problems for which no efficient quantum attacks are known, to ensure long-term security in the quantum era.

Quantum computing also holds implications for blockchain technology. It has the potential to optimize blockchains by accelerating mining and the execution of smart contracts, and by enhancing security with post-quantum algorithms. However, the transition to quantum-safe solutions poses challenges in terms of development, implementation, and maintaining the scalability and efficiency of blockchain transactions.

Despite the challenges and threats, the potential of quantum computing is immense. It promises to solve problems currently deemed insurmountable by classical computing. Experts argue that the future of quantum computing lies in small, steady improvements rather than revolutionary leaps. Once integrated effectively, these improvements could lead to the construction of increasingly larger and more powerful quantum systems, revolutionizing numerous fields of study and transforming the world as we know it.

In conclusion, while quantum computing is surrounded by hype, it's not just an illusion. It's a rapidly evolving field with significant challenges to overcome, but its potential to reshape our world is undeniable.

Read more:

The Emergence of Quantum Computing: Advancements, Challenges, and Future Prospects - Medriva

Posted in Quantum Computing | Comments Off on The Emergence of Quantum Computing: Advancements, Challenges, and Future Prospects – Medriva

What Quantum Computing Will Mean for the Future Artificial Intelligence – Medium

Posted: at 3:27 am

Today's artificial intelligence (AI) systems are only as good as the data they're trained on. The AI industry is currently taking advantage of large datasets to train AI models and make them more useful. However, as these datasets become limited, researchers are exploring other ways to improve AI algorithms. One such way is quantum computing, a new frontier of computer science that will enable better AI algorithms in the near future.

Atoms make up our world, and they and their constituents have baffling yet interesting properties. For example, an electron's spin can be either up or down, and electrons can occupy any of a range of discrete energy levels. These properties determine the quantum states of atoms. At a subatomic level, everything exists as quantum states rather than as traditional logical on or off values. This phenomenon gave rise to quantum computing, which has the potential to change how we see artificial intelligence forever.

Quantum computing is an entirely different way of studying the world around us. It does not just focus on the properties of atoms and molecules; it looks at the subatomic properties of particles that are actually in superposition. That is, they exist in multiple states at the same time. This is one of the principles of quantum mechanics, which also allows subatomic particles to exist as both particles and waves at the same time.

These principles are strange and counterintuitive. According to them, a computing system does not have to store and process data only in binary bits, 0s and 1s, or, in more electronic-engineering terms, the off and on states of an electronic switch. It can also store and process data in superposed states that are neither simply on nor off but a combination of the two. By harnessing these principles, quantum computers can solve complex problems much faster than traditional computers.

Quantum computers are a new class of computing machine based on quantum mechanics. They use the laws of quantum mechanics to process information, which means they can find patterns in big data that are almost impossible to find with conventional computers. In this way, they are fundamentally different from the computers we use today.

When it comes to artificial intelligence, quantum computing can analyze a wider variety of data and come to better conclusions than today's computers. Conventional computers can only process information as either 1s or 0s. Quantum computers process information using qubits, which can hold multiple states at once, and that lets them analyze a wider variety of data and draw better conclusions.

Artificial intelligence has come a long way in the past few years: it can now generate realistic 3D images and videos. It is also beginning to embrace quantum computing, which has given rise to quantum AI, artificial intelligence that leverages quantum computers. Their full integration would be a technological revolution of the century.

There are several benefits to using quantum AI in creative industries. As noted above, it can handle large datasets faster and more efficiently than traditional AI technologies. It can also identify patterns that are difficult for regular computers to spot. Furthermore, it can combine and rearrange existing ideas, creating new ideas in ways no human could imagine possible.

One of the biggest hurdles for artificial intelligence today is training the machine to do something useful. For example, we might have a model that can correctly identify a dog in a photo. But the model will need to be trained with tens of thousands of images for it to recognize the subtle differences between a beagle, a poodle, and a Great Dane. This process is what AI researchers call training. They use it to teach AI algorithms to make predictions in new situations.

Quantum computing can make this training process faster and more accurate. It will allow AI researchers to use more data than they have ever used before. Because it can process large amounts of data in 1s, 0s, and combinations thereof, quantum computing can reach more accurate conclusions than traditional computers. In other words, AI researchers can use larger datasets to train AI models to be more accurate and better at decision-making.

One of the most exciting predictions for quantum computing in artificial intelligence is the potential to break through language barriers. AI models can currently understand one language: the language used to train them. So if we need AI to understand a different language, we have to teach it from scratch. However, quantum computing could help AI models break through language barriers, allowing us to train models in one language and translate them into a different language effortlessly.

That will enable AI to understand and interpret different languages simultaneously, creating a global AI that can speak multiple languages. Another exciting prediction for the future of AI with quantum computing is the potential to build models with more accurate decision-making skills. Quantum computing will allow larger datasets to be used to train models, so AI will be able to make more accurate decisions. That will be especially helpful for financial models, which often have a high rate of inaccuracy because of the limited data used to train them.

Artificial intelligence is already improving the performance of quantum computers, and this trend will only continue in the future.

The potential of quantum computing is limitless, but its integration into artificial intelligence will produce a technology far more powerful than anything we have today. The new technology will enable machines to learn and self-evolve, making them exponentially better at solving complex problems and at developing self-learning algorithms that will drive efficiency in sectors such as finance and healthcare.

Quantum AI systems will be able to process large amounts of information quickly and accurately. That will open up a new world of possibilities for businesses and individuals. They will also be able to solve complex problems that are impossible for even the most advanced conventional computer systems.

Nevertheless, we must remember that these technologies are relatively new; we are still discovering new ways to use quantum computing. Therefore, we must be aware of the latest technology to take advantage of new opportunities as they come along.

The rise of quantum computing will change the way we interact with AI in the future. That means we must stay informed so we can prepare for the changes and make the most of this exciting technology.

Go here to read the rest:

What Quantum Computing Will Mean for the Future Artificial Intelligence - Medium

Posted in Quantum Computing | Comments Off on What Quantum Computing Will Mean for the Future Artificial Intelligence – Medium

The Quantum Leap: Revolutionizing Computing and Its Impact – Medium

Posted: at 3:27 am

Supercomputers have their limitations. Fortunately, a new technology is emerging: the quantum computer, which exploits phenomena at the atomic and subatomic levels. [Photo: a quantum computer at Chalmers. Anita Fors (Chalmers), CC BY-SA 4.0, via Wikimedia Commons]

Our civilization today largely runs on computers and the data they process. However, when significant computational power is required, existing silicon-based technology falls short. Hence, companies like IBM, Google, Microsoft, Alibaba, and a few others are working on prototypes of a new kind of machine: the quantum computer.

Major companies understand that whoever first masters quantum computations will gain a significant advantage over competitors. Computers based on this technology will be able to swiftly sift through massive amounts of data. They will also enable modeling complex physical or biochemical phenomena.

Quantum computers perform computations not on bits, which can hold values of 0 or 1, but on so-called qubits. These can hold different values simultaneously. Scientists leverage the principles governing the world of elementary particles to create computational machines.

Conventional computers conduct calculations on sequences of bits: zeros and ones. Quantum computers employ quantum bits, or qubits, which can assume both of these values simultaneously; this is called superposition, and it is what enables the increase in computational power. Quantum computers can perform operations in one fell swoop that would take classical machines an enormous amount of time. Qubits can be constructed from individual elementary particles like electrons or atoms, or from slightly larger entities such as loops of superconductor through which current flows incessantly.

In the realm of quantum physics, there exists a strange and not entirely understood relationship between elementary particles such as electrons. When we entangle them (for example, by bringing them close together), their fates become closely intertwined. If we then alter the properties of one, the state of the other is instantly affected, no matter the distance between them.

See original here:

The Quantum Leap: Revolutionizing Computing and Its Impact - Medium

Posted in Quantum Computing | Comments Off on The Quantum Leap: Revolutionizing Computing and Its Impact – Medium

Scientists think they’ve created the world’s 1st ‘practical’ quantum-secure algorithm – Livescience.com

Posted: December 28, 2023 at 11:52 pm

Scientists think they've created the first practical cryptographic algorithm that could protect data and communications from quantum computers.

However, other experts in the field remain skeptical, saying algorithms backed by a cutting-edge U.S.-government-funded lab have a better chance of being used widely.

Cryptography tools, like WhatsApp's end-to-end encryption, protect data such as messages sent between two people by scrambling it into a secret code that only a unique digital key can unlock. If hackers intercept an encrypted message, all they'll see is jumbled-up nonsense. The hacker could try to guess the cryptographic key and decipher the message, but it would take the most powerful supercomputer millions of years to try every possible combination, which these machines would test essentially one at a time.

Quantum computers, on the other hand, can perform several calculations at once. They aren't powerful enough to break cryptography yet, but scientists plan to develop increasingly powerful machines that could one day bypass this essential security layer within seconds.
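For a sense of scale behind that "millions of years" figure, here is a back-of-the-envelope calculation under assumed numbers: a 128-bit key and a hypothetical classical machine testing a trillion keys per second. The exact rate doesn't change the conclusion much.

```python
# Back-of-the-envelope numbers behind the brute-force claim, under assumed
# figures: a 128-bit key and a hypothetical machine testing 1e12 keys/second.
keyspace = 2 ** 128                # possible 128-bit keys (~3.4e38)
guesses_per_second = 1e12          # assumed brute-force rate
seconds_per_year = 3.156e7

years = keyspace / guesses_per_second / seconds_per_year
print(f"~{years:.1e} years to try every key")   # on the order of 1e19 years
```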

Now, researchers say they've developed the most efficient quantum-safe proposal to date, based on existing so-called verifiable random function (VRF) technology, which they dub "LaV." They described their research in a paper, which has not yet been peer-reviewed, published Nov. 14 in the Cryptology ePrint Archive, a cryptology research preprint database.

VRF takes a series of inputs, computes them, and churns out a random number that can be cryptographically verified to be random. It's usually an add-on to encryption that boosts the security of digital platforms. It's an essential part of WhatsApp's key transparency protocol, as well as some blockchain systems.
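The paper's actual construction is not reproduced here, but the generic shape of the VRF interface the paragraph describes looks roughly like this (the function names and signatures are hypothetical placeholders, not LaV's API):

```python
# Generic shape of a verifiable random function (VRF), as described above.
# These names and signatures are illustrative stand-ins, not the LaV scheme.
from typing import Tuple

def vrf_keygen() -> Tuple[bytes, bytes]:
    """Return (secret_key, public_key)."""
    ...

def vrf_evaluate(secret_key: bytes, message: bytes) -> Tuple[bytes, bytes]:
    """Return (output, proof): a pseudorandom value plus a proof that it was
    computed correctly from the message and the secret key."""
    ...

def vrf_verify(public_key: bytes, message: bytes,
               output: bytes, proof: bytes) -> bool:
    """Anyone with the public key can check the output is genuine and was not
    cherry-picked, without ever learning the secret key."""
    ...
```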

But LaV is a quantum-safe version of VRF. Unlike its predecessor, it could theoretically provide end-to-end security from quantum computers, said lead researcher Muhammed Esgin, an information technology lecturer at Monash University in Australia.


"Our algorithm is designed to withstand theoretical and practical attacks even by large-scale quantum computers (that can break today's classical cryptographic algorithms)," Esgin told Live Science in an email. "So it can protect against today's supercomputers as well as tomorrow's powerful quantum computers."

LaV can be accessed through the open-source platform GitLab. Its creators claim it's a practical solution, as opposed to four candidates backed by the National Institute of Standards and Technology (NIST), which has been hunting for quantum-safe encryption standards for years. However, some experts disagree.

LaV may not be the best solution to the impending quantum threat, Edward Parker, a physical scientist with The RAND Corporation, told Live Science.

"There are several existing quantum-secure cryptography algorithms that already exist," he said, and NIST is standardizing these tools, "essentially giving those four algorithms the U.S. government's stamp of approval for widespread use."

"It's widely expected that these four algorithms will become the backbone of future quantum-secure cryptography, rather than LaV or any of the dozens of other quantum-secure algorithms that have been proposed," he added. "The four algorithms that NIST selected have undergone several years of very careful vetting, and we can be very confident that they are indeed secure."

Jonathan Katz, a computer scientist at the University of Maryland's Institute for Advanced Computer Studies (UMIACS), also backs NIST's efforts. "The cryptography research community has been working on quantum-safe algorithms for well over two decades, and the NIST post-quantum cryptography standardization effort began in 2017," he told Live Science in an email.

However, Parker added that "it's certainly possible that LaV may be somewhat more efficient than other quantum-secure algorithms."

Vlatko Vedral, a professor of quantum information science at the University of Oxford, told Live Science he suspects LaV may not be the first algorithm of its type, though it may be the first released publicly.

"The industry is getting closer and closer to making a large-scale quantum computer, and it is only natural that various protections against its negative uses are being explored," Vedral said. "Code making and code breaking have always been locked into an arms race against each other."

Read more here:

Scientists think they've created the world's 1st 'practical' quantum-secure algorithm - Livescience.com

Posted in Quantum Computing | Comments Off on Scientists think they’ve created the world’s 1st ‘practical’ quantum-secure algorithm – Livescience.com

Quantum Computers Begin to Measure Up | Research & Technology | Dec 2023 – Photonics.com

Posted: at 11:52 pm

WAKO, Japan, Dec. 27, 2023 - Much of the progress so far in quantum computing has been made on so-called gate-based quantum computers. These devices use physical components, most notably superconducting circuits, to host and control the qubits. The approach bears similarity to conventional, device-based classical computers. The two computing architectures are thus relatively compatible and could be used together in hybrid systems. Furthermore, future quantum computers could be fabricated by harnessing existing technologies used to fabricate conventional computers.

But the Optical Quantum Computing Research Team at the RIKEN Center for Quantum Computing has been taking a very different approach. Instead of optimizing gate-based quantum computers, Atsushi Sakaguchi, Jun-ichi Yoshikawa and team leader Akira Furusawa have been developing measurement-based quantum computing.

Measurement-based quantum computers process information in a complex quantum state known as a cluster state, which consists of three (or more) qubits linked together by a non-classical phenomenon called entanglement.

Measurement-based quantum computers work by making a measurement on the first qubit in the cluster state. The outcome of this measurement determines what measurement to perform on the second entangled qubit, a process called feedforward. This then determines how to measure the third. In this way, any quantum gate or circuit can be implemented through the appropriate choice of the series of measurements.

"Importantly, measurement-based quantum computation offers programmability in optical systems. We can change the operation by just changing the measurement," said Sakaguchi. "This is much easier than changing the hardware, as gate-based systems require in optical systems."

But feedforward is essential. "Feedforward is a control methodology in which we feed the measurement results to a different part of the system as a form of control," Sakaguchi said. "In measurement-based quantum computation, feedforward is used to compensate for the inherent randomness in quantum measurements. Without feedforward operations, measurement-based quantum computation becomes probabilistic, while practical quantum computing will need to be deterministic."
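Schematically, and leaving the optics entirely aside, the feedforward control flow being described can be sketched like this; measure() and choose_next_basis() are hypothetical stand-ins, not RIKEN's actual system:

```python
# Sketch of the measurement-then-feedforward loop described above. Only the
# control flow is meant to mirror the article; the "hardware" here is random.
import random

def measure(qubit_index: int, basis: float) -> int:
    """Stand-in for measuring one qubit of the cluster state; outcomes are random."""
    return random.randint(0, 1)

def choose_next_basis(previous_outcome: int, program_angle: float) -> float:
    """Feedforward: the next measurement setting depends on the previous outcome,
    compensating for the inherent randomness of quantum measurement."""
    correction = 3.14159 if previous_outcome == 1 else 0.0
    return program_angle + correction

program = [0.0, 0.5, 1.0]   # the "program" is just the planned measurement angles
outcome = 0
for i, angle in enumerate(program):
    basis = angle if i == 0 else choose_next_basis(outcome, angle)
    outcome = measure(i, basis)   # this result feeds forward into the next step
    print(f"qubit {i}: outcome {outcome} measured in basis {basis:.2f}")
```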

The Optical Quantum Computing Research Team and their co-workers from The University of Tokyo, Palacký University in the Czech Republic, the Australian National University and the University of New South Wales, Australia have now demonstrated a more advanced form of feedforward: nonlinear feedforward. Nonlinear feedforward is required to implement the full range of potential gates in optics-based quantum computers.

Optical quantum computers use qubits made of wave packets of light. At other institutions, some of the current RIKEN team had previously constructed the large optical cluster states needed for measurement-based quantum computation. Linear feedforward has also been achieved to construct simple gate operations, but more advanced gates need nonlinear feedforward.

A theory for the practical implementation of nonlinear quadrature measurement was proposed in 2016. But this approach presented two major practical difficulties: generating a special ancillary state (which the team achieved in 2021) and performing a nonlinear feedforward operation.

The key advantages of this nonlinear feedforward technique are its speed and flexibility. The process needs to be fast enough that the output can be synchronized with the optical quantum state.

"Now that we have shown that we can perform nonlinear feedforward, we want to apply it to actual measurement-based quantum computation and quantum error correction using our previously developed system," Sakaguchi said. "And we hope to be able to further increase the speed of our nonlinear feedforward for high-speed optical quantum computation."

"But the key message is that, although superconducting circuit-based approaches may be more popular, optical systems are a promising candidate for quantum-computer hardware," he added.

The research was published in Nature Communications (www.doi.org/10.1038/s41467-023-39195-w).

Originally posted here:

Quantum Computers Begin to Measure Up | Research & Technology | Dec 2023 - Photonics.com

Posted in Quantum Computing | Comments Off on Quantum Computers Begin to Measure Up | Research & Technology | Dec 2023 – Photonics.com

Experts warn quantum computers are overhyped and far away – Fudzilla

Posted: at 11:52 pm

Neither dead nor alive yet

While quantum computing companies have said their machines could be doing amazing things in just a few years, some top experts say they don't believe the hype.

Meta's AI boss, Yann LeCun, made a splash after saying quantum computers are not that great. Speaking at a media event to mark ten years of Meta's AI team, he said the technology is "a fascinating scientific topic". Still, he was unsure of "the possibility of actually making useful quantum computers."

LeCun is not a quantum computing expert; other big names in the field also raise doubts. Oskar Painter, head of quantum hardware for Amazon Web Services, says there is a "tremendous amount of hype" in the industry right now and "it can be hard to tell the hopeful from the hopeless."

A big problem for today's quantum computers is that they make many mistakes. Some have said these so-called "noisy intermediate-scale quantum" (NISQ) machines could still work well. But Painter says that's not likely, and quantum error-correction tricks will be needed to make practical quantum computers.

The main idea is to spread information over many unreliable physical qubits to make "logical qubits." But this could need as many as 1,000 dodgy qubits for each good one. Some have said that quantum error correction could be impossible, but that's a minority view. Either way, making these tricks work at the size and speed needed is a long way off, Painter says.
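To put Painter's rough 1,000-to-1 overhead in perspective, here is the simple arithmetic, with illustrative logical-qubit counts rather than anyone's roadmap:

```python
# Simple arithmetic behind the error-correction overhead described above:
# roughly 1,000 noisy physical qubits per error-corrected "logical" qubit.
# The logical-qubit counts below are illustrative only.
physical_per_logical = 1_000

for logical in (100, 1_000, 10_000):
    physical = logical * physical_per_logical
    print(f"{logical:>6} logical qubits -> ~{physical:,} physical qubits")
# Today's largest announced superconducting chips sit at roughly a thousand
# physical qubits, which is why estimates run to a decade or more.
```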

"Given the remaining technical challenges in making a fault-tolerant quantum computer that can run billions of gates over thousands of qubits, it's hard to say when it will happen, but I would guess at least ten years away," he said.

In May, top Microsoft boffin Matthias Troyer penned a paper saying that quantum computers could only do better than regular computers in a few areas.

"We discovered over the last ten years that many things people have suggested don't work. And then we found some straightforward reasons for that."

The main point of quantum computing is to solve problems much faster than regular computers, but how much quicker depends on the problem. There are two things where quantum tricks seem to give a tremendous speed-up, said Troyer.

One is breaking big numbers into smaller ones, which could crack the codes that keep the Internet safe. The other is simulating quantum systems, which could help with chemistry and materials.

Quantum tricks have been suggested for optimisation, drug design, and fluid dynamics. But the speed-ups don't always work out: sometimes they are only quadratic, meaning the time it takes the quantum trick to solve a problem is roughly the square root of the time taken by the normal one.

Troyer says these speed-ups can quickly disappear because of the enormous overhead quantum computers need. Running a qubit is much more complicated and slower than flipping a switch. This means that for smaller problems an average computer will always be faster, and the crossover point where the quantum computer takes the lead is pushed further out the faster the classical machine is.

Troyer and his mates compared a single Nvidia A100 GPU against a made-up future fault-tolerant quantum computer with 10,000 "logical qubits" and gate times much faster than today's machines.

They found that a quantum trick with only a modest (quadratic) speed-up would have to run for hundreds or thousands of years before it could beat a normal one on problems significant enough to matter.
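The shape of that calculation can be sketched with a toy model; the per-step times below are made-up illustrative values (cheap classical steps, much slower error-corrected quantum steps), not the paper's figures:

```python
# Toy model of the crossover argument. Per-step times are assumed, not measured:
# classical steps are cheap, logical quantum steps ~a million times slower.
import math

classical_step = 1e-9    # seconds per classical operation (assumed)
quantum_step = 1e-3      # seconds per logical quantum operation (assumed)

def classical_time(n: float) -> float:
    return n * classical_step            # classical algorithm: ~n steps

def quantum_time(n: float) -> float:
    return math.sqrt(n) * quantum_step   # quadratic speed-up: ~sqrt(n) steps

for n in (1e9, 1e12, 1e15, 1e18):
    winner = "quantum" if quantum_time(n) < classical_time(n) else "classical"
    print(f"n = {n:.0e}: classical {classical_time(n):.1e} s, "
          f"quantum {quantum_time(n):.1e} s -> {winner} wins")
# With these assumed numbers the quantum machine only wins beyond roughly
# n = 1e12, echoing the point that modest speed-ups rarely pay off in practice.
```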

Troyer said quantum computers will only work on small-data problems with huge speed-ups. "All the rest is nice theory but will not be useful," he said.

All this pours cold water on the idea that quantum computers will be here soon, or that the Internet is in danger of having its codes broken by thieves or spooks using the technology.

It would appear that, for now, the cat is still only potentially dead or alive.

Go here to see the original:

Experts warn quantum computers are overhyped and far away - Fudzilla

Posted in Quantum Computing | Comments Off on Experts warn quantum computers are overhyped and far away – Fudzilla
