
Category Archives: Quantum Computing

Sandia Designs a Faster and More Accurate Quantum Computing Benchmark – HPCwire

Posted: December 22, 2021 at 12:32 am

Dec. 21, 2021 -- What does a quantum computer have in common with a top draft pick in sports? Both have attracted lots of attention from talent scouts. Quantum computers, experimental machines that can perform some tasks faster than supercomputers, are constantly evaluated, much like young athletes, for their potential to someday become game-changing technology.

Now, scientist-scouts have their first tool to rank a prospective technology's ability to run realistic tasks, revealing its true potential and limitations.

A new kind of benchmark test, designed at Sandia National Laboratories, predicts how likely it is that a quantum processor will run a specific program without errors.

The so-called mirror-circuit method, published today in Nature Physics, is faster and more accurate than conventional tests, helping scientists develop the technologies that are most likely to lead to the world's first practical quantum computer, which could greatly accelerate research for medicine, chemistry, physics, agriculture and national security.

Until now, scientists have been measuring performance on obstacle courses of random operations.

But according to the new research, conventional benchmark tests underestimate many quantum computing errors. This can lead to unrealistic expectations of how powerful or useful a quantum machine is. Mirror circuits offer a more accurate testing method, according to the paper.

A mirror circuit is a computer routine that performs a set of calculations and then reverses it.

"It is standard practice in the quantum computing community to use only random, disordered programs to measure performance, and our results show that this is not a good thing to do," said computer scientist Timothy Proctor, a member of Sandia's Quantum Performance Laboratory who participated in the research.

The new testing method also saves time, which will help researchers evaluate increasingly sophisticated machines. Most benchmark tests check for errors by running the same set of instructions on a quantum machine and a conventional computer. If there are no errors, the results should match.

However, because quantum computers perform certain calculations much faster than conventional computers, researchers can spend a long time waiting for the regular computers to finish.

With a mirror circuit, however, the output should always be the same as the input, or some intentional modification of it. So instead of waiting, scientists can immediately check the quantum computer's result.
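The idea can be sketched in a few lines of ordinary Python by simulating one qubit with 2x2 gate matrices. This is an illustrative toy, not Sandia's actual benchmark: the gate choices (`H`, `S`) and the four-gate sequence are arbitrary assumptions, and the real method runs randomized many-qubit circuits on hardware.

```python
# Minimal sketch of a mirror circuit on one simulated qubit:
# run a circuit, then run its exact inverse, and check that the
# state returns to where it started.

def dagger(m):
    """Conjugate transpose: the inverse of a unitary gate."""
    return [[m[j][i].conjugate() for j in range(2)] for i in range(2)]

def apply(gate, state):
    """Apply a 2x2 gate matrix to a single-qubit state vector."""
    return [gate[0][0] * state[0] + gate[0][1] * state[1],
            gate[1][0] * state[0] + gate[1][1] * state[1]]

H = [[2 ** -0.5, 2 ** -0.5], [2 ** -0.5, -(2 ** -0.5)]]  # Hadamard gate
S = [[1, 0], [0, 1j]]                                    # phase gate

forward = [H, S, H, S]                           # the circuit under test
mirror = [dagger(g) for g in reversed(forward)]  # its reversed inverse

state = [1, 0]  # start in |0>
for gate in forward + mirror:
    state = apply(gate, state)

# On an ideal (error-free) device the mirror half undoes the forward
# half, so the qubit must return to |0>; on real hardware, any deviation
# from probability 1 measures the device's errors.
print(abs(state[0]) ** 2)  # ≈ 1.0
```

Because the expected answer is known in advance (the input itself), no classical simulation is needed to verify the run, which is the time saving the article describes.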

New method reveals flaws in conventional performance ratings

Proctor and his colleagues found that randomized tests miss or underestimate the compound effects of errors. When an error is compounded, it grows worse as the program runs, like a wide receiver who runs the wrong route, straying farther and farther from where they are supposed to be as the play goes on.

By mimicking functional programs, Sandia found final results often had larger discrepancies than randomized tests showed.

"Our benchmarking experiments revealed that the performance of current quantum computers is much more variable on structured programs than was previously known," Proctor said.

The mirror-circuit method also gives scientists greater insight into how to improve current quantum computers.

"By applying our method to current quantum computers, we were able to learn a lot about the errors that these particular devices suffer, because different types of errors affect different programs to a different degree," Proctor said. "This is the first time these effects have been observed in many-qubit processors. Our method is the first tool for probing these error effects at scale."

More information: Timothy Proctor, "Measuring the capabilities of quantum computers," Nature Physics (2021). DOI: 10.1038/s41567-021-01409-7. www.nature.com/articles/s41567-021-01409-7.

Journal information: Nature Physics.

Source: Sandia National Laboratories


Quantum computing now has an out-of-this-world problem: Cosmic rays – ZDNet

Posted: at 12:32 am

A new academic paper reveals a worrisome tendency for cosmic rays to disrupt quantum computer processors in a way that may be nearly impossible for current error correction techniques to reliably counteract.

One of the biggest obstacles faced by quantum computers is dealing with error correction. Traditionally, this has been most commonly handled by grouping together multiple qubits, the quantum equivalent of traditional computing's bits, into a sort of committee within quantum processing units. Rather than the system relying on a single qubit, which may or may not be correct, it instead relies on the consensus provided by an entire group of qubits. This strips away erroneous outliers and greatly reduces the error rate to a point where it's extremely unlikely that it will interfere with an ongoing processing job.
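The "committee" described above is, at its simplest, a repetition code read out by majority vote. The sketch below is a hypothetical classical analogy of that consensus step, not an actual quantum error-correction routine:

```python
from collections import Counter

def majority_vote(readings):
    """Return the consensus value reported by a group of noisy readings."""
    return Counter(readings).most_common(1)[0][0]

# One member of a five-qubit "committee" flips erroneously;
# the consensus still recovers the intended value, stripping
# away the erroneous outlier.
print(majority_vote([0, 0, 1, 0, 0]))  # -> 0
```

The protection holds only as long as errors stay in the minority of the group, which is precisely the assumption the cosmic-ray findings below undermine.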

Unfortunately, in a very sci-fi-sounding turn of events, it appears that an unseen enemy from outer space may be threatening the viability of this error-correcting technology.

Cosmic rays are invisible, microscopic particle beams that constantly bombard the Earth from sources as far away as other galaxies. They typically collide harmlessly with the planet's atmosphere as well as objects within it. In fact, you'll likely be hit by several of them while reading this article. Luckily, for our peace of mind, they generally go completely unnoticed and do absolutely no harm before continuing on their cosmic journey. Unfortunately for quantum computing developers, it appears that quantum processors may be far, far more sensitive to these typically unnoticeable intruders than they realized.

In a paper published in Nature Physics and covered by Ars Technica, it's been revealed that one of these typically harmless rays could cause a major problem when it hits an operating quantum CPU. According to the findings of several researchers working at Google Quantum AI, a cosmic ray strike on an operating quantum computer core can result in the formation of a quasiparticle called a phonon.

These phonons have the capacity to disrupt operations by inverting the quantum state of not only a single qubit, but an entire entangled set of qubits as they proliferate across the processor. This means a strike could distribute errors across an entire qubit set, essentially nullifying the protection provided by the committee-like error correction mentioned above.

In an experiment detailed within the paper, Google researchers tested a set of 26 qubits that were known to be amongst their most reliable. This set was then left in an idle state for 100 microseconds. While idling, reliable qubits should generally remain in their current state. To use a traditional, binary computing analogy, a 1 should remain a 1, a 0 should remain a 0.

On average, the 26-qubit set in question displayed an error rate of about 4 qubits that erroneously flipped their state within the 100-microsecond test period. This is well within the built-in error correction's ability to compensate by relying on the remaining majority of 22 qubits. However, during confirmed cosmic ray strikes, 24 of the 26 qubits were found to have erroneously flipped to the opposite state. This result is well beyond traditional error correction's ability to compensate for. Such an outcome would place the entire group in error and could throw the entire processing job's continuity into question.
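The arithmetic behind those two outcomes is just a majority threshold. A minimal sketch (real decoders are more sophisticated than a bare majority vote, so treat this as an illustration of the counting argument only):

```python
def committee_survives(total_qubits, flipped):
    """Majority voting recovers the right answer only while
    the erroneously flipped qubits remain a minority."""
    return flipped < total_qubits / 2

print(committee_survives(26, 4))   # True: 22 correct qubits outvote 4 errors
print(committee_survives(26, 24))  # False: the flipped qubits form the majority
```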

Cosmic ray interference is nothing new. As Ars noted, they can also interact with traditional CPUs by messing with the electrical charges they rely on to complete their logic operations. However, the unique and still-developing structure of quantum processors makes them far more prone to such interference, with Google's research indicating that a cosmic ray-induced error happens as often as every 10 seconds. This means the hours-long processing jobs most quantum CPUs are being tasked with could include hundreds, if not thousands of errors littered throughout their results.
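The "hundreds, if not thousands" figure follows directly from one strike every ~10 seconds. A quick check, with the three-hour job length being an assumed round number rather than a figure from the paper:

```python
seconds_per_event = 10   # Google's observed cosmic-ray error interval
job_hours = 3            # assumed length of a long quantum processing job

events = job_hours * 3600 / seconds_per_event
print(events)  # 1080.0 potential error events over the run
```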

Making matters worse is the fact that the processor these researchers used for their testing was rather small. As processing demands increase, so too must the size of the quantum processor. But the larger the processor, the more surface area there is to potentially suffer a cosmic ray collision. It appears the threat of forced errors is only going to become more dire as quantum CPUs continue making their way towards practical applications.

Unfortunately, there is no practical way to reliably block these problematic, intergalactic travelers. They are moving at almost the speed of light, after all. However, as pointed out by Ars Technica, some clever workarounds have already been developed to help devices like astronomical imaging equipment cope with cosmic ray interference. While the paper does not specifically explore the viability of these potential solutions, such workarounds do suggest that the problem of cosmic ray interference is a surmountable one.


10 technology trends that could prove to be real game-changers – Mint

Posted: at 12:32 am

Smarter algorithms and machine learning: AI has been the driving force for most products, applications and even devices that we use today. On 22 November, Gartner predicted that total revenue in the AI software market is expected to hit $62.5 billion in 2022, an increase of 21.3% from 2021. "The AI software market is picking up speed, but its long-term trajectory will depend on enterprises advancing their AI maturity," said Alys Woodward, senior research director at Gartner.

AI deployment in 2022 will be in knowledge management, virtual assistants, autonomous vehicles, digital workplaces and crowdsourced data, Gartner said. In addition, companies like Google are developing newer language models like LaMDA (Language Model for Dialogue Applications) which, the company claims, can hold their own in natural conversations.

Faster networks with bigger bandwidth: 5G has been in the works for what seems like years now, but 2022 may finally be the year we see these next-generation networks rolling out to the public. India has already approved trial spectrum for telcos such as Bharti Airtel and Reliance Jio, and the 5G spectrum auctions are expected in the first half of next year. In short, 5G means lower latency, which is what users perceive as speed. The new networks will allow new use cases for enterprises, enable smart city implementations and more.

Intelligent cloud and edge computing: The new use cases that 5G networks enable are heavily dependent on the cloud. For instance, in September, Airtel tested India's first cloud-gaming session in a 5G environment at its Manesar facility. The company's chief technology officer, Randip Sekhon, said cloud gaming would be "among the biggest use cases" for 5G networks. The dependency on the cloud will only increase among enterprises.

Moreover, edge computing is finally set to flourish. It is helping enterprises bring the data and computing requirements closer to the user's device. This trend will help make products like driverless or autonomous vehicles more efficient.

More interconnected devices that talk to each other: Earlier this month, Airtel, Invest India and the National Investment Promotion and Facilitation Agency announced a Startup Innovation Challenge. The challenge asks early-stage startups to create new use cases involving IoT. As data flows faster and computing power comes from large server farms using the cloud, more devices can start connecting and working as one. A June report by Gartner said the IoT endpoint electronics and communications market will touch $21.3 billion in 2022, increasing its forecast by 22% against the 2021 predictions. This is driven by governments using IoT for surveillance, enterprises using connected devices for everything from banking to communication, and delivering new products.

Privacy gaining ground: After about two years of deliberation, the joint parliamentary committee (JPC) on the Data Protection Bill was finally able to table its report on the bill during the ongoing winter session of Parliament. The JPC recommended that India have one bill to regulate personal and non-personal data and stop companies from profiling children's accounts and using targeted ads for them. The bill also gives consumers rights over their data. But India isn't the only country looking into such data regulations. India's bill borrows heavily from the European General Data Protection Regulation (GDPR), and governments worldwide are also considering such regulations. Big Tech firms are fighting lawsuits against government bodies, competition regulations and more. The outcome of all these cases will impact how our data is used in the future.

Mixing and blending realities: In 1964, an animated science-fiction franchise called Jonny Quest imagined a virtual world called QuestWorld. The protagonists would put on futuristic virtual reality (VR) headsets and fight battles in a virtual world. It was futuristic then, but VR and augmented reality (AR) headsets are all too familiar now. In fact, they have been for almost a decade. But in 2021, Facebook launched a product called Ray-Ban Stories, partnering with eyeglass maker Ray-Ban for a pair of smart glasses that look and feel almost exactly like regular spectacles. Tech firms aim to make these devices ubiquitous and reach the economies of scale that come from selling millions of devices worldwide.

Immutable and interconnected ledgers: If AI was the key change maker over the past decade, blockchain might well enable the next step in the technology. According to many estimates, India has become one of the top players in cryptocurrency adoption worldwide, but what's seen as a trading asset today has more significant implications. Cryptocurrencies are powered by blockchain technology, and in April, the International Data Corp. said that organizations would spend as much as $6.6 billion on blockchain solutions in 2021 alone, a 50% increase from 2020. The market researcher also predicted an annual average growth rate of 48% between 2020 and 2024. India's second crypto unicorn, CoinSwitch Kuber, has said that it aims to support other blockchain firms in India. Industry stakeholders and experts understand that blockchains will power cross-border payments, banking and much more in the future. Even the Reserve Bank of India's upcoming Central Bank Digital Currency, or digital rupee, will be powered by blockchain technologies.

The third generation of the internet: The hit HBO show Silicon Valley imagined a new internet free of dominance by Big Tech firms, governments and more. The idea may sound utopian, but that's exactly what companies building apps for the third generation of the internet (web3) are working towards today. Companies like Google, Apple, Facebook and others benefit greatly from the fact that most of the world's data flows through their servers. With web3, however, the power is handed back to the users in a way. It runs without centralized servers, depends on a network of phones, computers and other devices, and bars any one person or entity on the network from wielding control over data: in a word, decentralization. For instance, Noida-based Ayush Ranjan has built what is claimed to be the world's first decentralized video chat app. Unlike Google Meet or Zoom, the Huddle 01 app doesn't require users to create an account, and the company doesn't have its own data centres to store your data in or record calls. Instead, it stores all the data in a decentralized manner and uses computing power from users' devices to power the calls.

Rise of the metaverse: 5G, cloud computing, IoT and web3 are all tools in a larger vision that technologists and technology leaders have right now, and that's called the metaverse. Facebook's Mark Zuckerberg is so confident that the metaverse is coming that he rebranded his company, one of the most valuable in the world, to Meta as an effort to show where his focus is today. Author Neal Stephenson is often credited with coining the term in his 1992 novel Snow Crash, and it has also been explored in contemporary movies like Ready Player One. The metaverse is not a technology; it is a concept. Zuckerberg and others expect that we will do everything from conducting meetings to hosting parties in a virtual space, through very realistic-looking avatars. Instead of shopping on an e-commerce store, the avatar will walk into a virtual store, try on a product and have the physical product delivered to our homes too. However, hardware veterans like Intel's Raja Koduri have warned that the computing power we have today is nowhere close to being sufficient for the metaverse Zuckerberg imagines.

Quantum computing: That brings us to what could be the most transformational trend in technology: quantum computing. Any country with aspirations to be a leader in technology has its sights set on quantum computing. While web3 is a new internet, quantum computing establishes a whole new kind of computer. Our traditional computers can only take in information as 0s and 1s, and their computations are limited by this. Quantum computers, on the other hand, use concepts of quantum physics to enhance the amount of computing power we can use. A practical quantum computer is far from reality right now, and it could be the kind of computing power Koduri says we need for the metaverse. In the 2020 Budget, the government allocated ₹8,000 crore over the next five years for developing quantum computing tech. It has also launched a Quantum Simulator, which allows researchers to build quantum applications without a real quantum computer.



To build the quantum internet, UChicago engineer teaches atoms how to remember – UChicago News

Posted: at 12:32 am

When the quantum internet arrives, researchers predict it will shift the computing landscape on a scale unseen in decades. In their estimation, it will make hacking a thing of the past. It will secure global power grids and voting systems. It will enable nearly limitless computing power and allow users to securely send information across vast distances.

But for Tian Zhong, assistant professor at the Pritzker School of Molecular Engineering (PME) at the University of Chicago, the most tantalizing benefits of the quantum internet have yet to be imagined.

Zhong is a quantum engineer working to create this new global network. In his mind, the full impact of the quantum internet may only be realized after it's been built. To understand his work and why the United States is spending $625 million on the new technology, it helps to consider the science behind it: quantum mechanics.

Quantum mechanics is a theory created to explain fundamental properties of matter, particularly on the subatomic scale. Its roots trace back to the late 19th and early 20th century, when scientists tried to explain the unusual nature of light, which behaves as both a wave and a particle. In the hundred years since then, physicists have learned a great deal, particularly concerning the strange behavior of subatomic particles.

They've learned, for example, that some subatomic particles have the ability to be in two states at the same time, a principle called superposition. Another such principle is entanglement: two particles can be linked so that measuring one instantly determines the state of the other, even when they are separated by hundreds of miles.

Over time, scientists have found ways to manipulate those principles, entangling particles at will or controlling an electron's spin. That new control allows researchers to encode, send, and process information using subatomic particles, laying the foundations of quantum computing and the quantum internet.

At the moment, both technologies are still hampered by certain physical limitations (quantum computers, for example, need to be kept in giant sub-zero freezers), but researchers like Zhong are optimistic those limitations will be resolved in the near future.

"We're at a juncture where this is no longer science fiction," Zhong said. "More and more, it's looking like this technology will emerge from laboratories any day, ready to be adopted by society."

Zhong's research focuses on the hardware needed to make the quantum internet a reality: things like quantum chips that encrypt and decrypt quantum information, and quantum repeaters that relay information across network lines. To create that hardware, Zhong and his team work on the subatomic scale, using individual atoms to hold information and single photons to transmit it through optic cables.

Zhong's current work centers on finding ways to fight quantum decoherence, which is when information stored on a quantum system degrades to the point that it's no longer retrievable. Decoherence is an especially difficult obstacle to overcome because quantum states are extremely sensitive, and any outside force, be it heat, light, radiation, or vibration, can easily destroy them.

Most researchers address decoherence by keeping quantum computers at a temperature near absolute zero. But the instant any quantum state is transmitted outside the freezer, say on a network line, it begins to break down within a few microseconds, severely limiting the potential for expansive interconnectivity.


Deep tech in 2022: the future is looking artificially intelligent – Information Age

Posted: at 12:32 am

Daan de Cloe, co-founder and CTO of AutoFill Technologies, provides his predictions for the deep tech space in 2022 and beyond

Deep tech capabilities powered by AI are set to make waves in the near future.

Whether a buzzword or not, it's safe to say that deep tech is currently one of the hottest areas of interest for technologists, and is set to only gain traction in the coming years.

From artificial intelligence through to robotics, quantum computing, blockchain and biotech (the list goes on), the unique thing about deep tech is that, albeit developed by commercial firms, it is not necessarily targeted at end-consumers. Based on research around engineering innovation with deep tech, scientists and technologists come together to collaborate towards a common goal, be that finding the latest solution to a chronic disease, or a product idea that will positively impact societal challenges like climate change.

Due to its scientific, theoretical nature, a lot has been speculated about how deep tech can help us create tangible societal shifts and build a better future. But what does that future look like? More specifically, what sorts of technologies will make the most impact on the world? I've got my bets on a few things.

When deep tech ties into a computer's tangible components, and trained data sets are piping through a piece of hardware, the possibilities are endless. The increased combination of hardware and software, powered by edge computing, will bring the opportunities and applications of AI to a whole new level, resulting in enormous change to existing operational processes.

At the edge, processors that collect data are embedded within devices, and data is collected at its sources rather than in the cloud or a data centre. This massively accelerates the AI pipeline, unlocking a whole new variety of AI-powered functionalities.

I believe this will be a big focus for companies and industries in 2022, and its no surprise that leading computer systems companies, such as Nvidia, are investing heavily on their edge computing offerings.


Whilst the difficulties in developing practical versions of quantum computers have so far confined them to the lab, it's safe to say that the race to quantum supremacy will only grow tighter in 2022, led by the likes of Google and IBM. It's no longer a matter of if, but rather when quantum computers will become the new norm.

Simply put, quantum computers are able to solve complicated problems incredibly fast and effectively. In a few seconds, they can perform calculations that today's supercomputers would need decades or even millennia to work out. These sorts of problems range from a logistics company trying to determine the greenest route between 50 different cities and 300 addresses in real time to reduce carbon emissions, to a pharmaceutical organisation experimenting with simulated molecules to predict drug interactions with a mutating virus.

By harnessing the strange world of quantum mechanics, quantum computer systems create multidimensional environments in which such large problems can be represented. Later, algorithms that apply quantum wave interference analyse all the different combinations and translate the optimal possibilities back into solutions we can understand and practically use.

This results in significantly higher processing efficiency and time savings. For example, if you wanted to find one item in a list of 1 trillion, and each item took 1 microsecond to check, a classical computer would take about a week to find the item, whereas a quantum computer would take about 1 second.
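That comparison matches the square-root speedup of Grover's search algorithm, which needs on the order of sqrt(N) steps instead of N; attributing the figure to Grover's search is my assumption, as the article does not name the algorithm. The arithmetic checks out:

```python
import math

N = 10 ** 12   # one trillion items
step = 1e-6    # 1 microsecond per check

classical_seconds = (N / 2) * step     # on average, half the list is examined
quantum_seconds = math.sqrt(N) * step  # Grover's search: ~sqrt(N) oracle calls

print(classical_seconds / 86400)  # ~5.8 days, i.e. roughly a week
print(quantum_seconds)            # 1.0 second
```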

That being said, there's no doubt that quantum computing will enable new, more advanced forms of machine learning. That's because the ability to process very large quantities of data in very short periods of time improves the quality and accuracy of predictions and decisions made by artificial intelligence. It becomes intuitively smart and capable of identifying patterns. Better pattern recognition, in turn, allows business leaders to keep a close eye on a chain of events and act proactively to avoid potential issues, instead of having to react to a situation that has already occurred.

So far, artificial intelligence has seen most of its applications in market sectors like professional and financial services, and high-tech telecoms. However, I've observed a recent uptake of AI applications in the automotive, transport and mobility sector, and my guess is that use of the technology in the industry will only increase in 2022.

That's because, as the population expands, governments must develop the infrastructure to support it. This includes better transport networks, in which AI has enormous potential to increase safety and efficiency.

Take AI-powered automated inspections, for instance. AI enables continuous monitoring of even the remotest infrastructure, increases accuracy and objectivity, and triggers preventive maintenance. This ensures cost savings and improves safety, which can really make the difference between rolling out vital transport systems or not.

With traffic logistics optimised, the number of vehicle movements is consequently reduced, allowing for optimised traffic flow. Not only that, but a safe and reliable transport network further stimulates the use of public transport over car usage, supporting global ambitions to cut carbon emissions. Modernising, optimising, and enhancing quality control at scale will ensure that infrastructure remains resilient, inclusive, and sustainable.

Now, if there's one thing edge computing, quantum computing and automated technologies have in common, it is the use of AI as a key driver of innovation. There has been a significant leap in the way society harnesses artificial intelligence as an integral part of our lives. I'm excited to see what the future holds next; from the looks of it, it's increasingly artificially intelligent.


15 Truly Unbelievable Ways Science Changed the World in 2021 – Fatherly

Posted: at 12:32 am

Sometimes you've got to look for the trees in the forest. The good news of 2021 was like a host of saplings: little trees lost in the forest of inflation, the pandemic, and catastrophic weather events. Look closer and you will find numerous reasons to cheer: scientific discoveries and advances that give honest-to-goodness hope for humanity.

Most notably, 2021 saw one of the most effective vaccines ever created, in record time. But that's just the beginning. We witnessed other monster breakthroughs in biology, astronomy, medicine, engineering, computing, genomics, and many more scientific fields.

With so many astounding advances in 2021, it was tough to pick the most significant, but we tried anyway. Here are our 15 favorite moments worth telling the kids about. Prepare for your mind to be blown.

The development, testing, and rollout of COVID vaccines has been called the moonshot of our generation. That might be an understatement. Thanks to devoted medical researchers and tens of thousands of everyday Americans who participated in clinical trials, the Food and Drug Administration (FDA) granted the Pfizer and Moderna COVID vaccines emergency use authorization for adults last December, followed by Johnson & Johnson's single-shot vaccine this February. Since then, the vaccine has become available for children as young as 5. That's a vaccine rollout available to 94 percent of the population (children under 5 are excepted so far) in little over a year. Previously, the fastest vaccine to go from development to deployment was the mumps vaccine in the 1960s, which took about four years. Although we're still struggling with COVID variants and breakthrough cases, this feat of inoculation has saved countless lives and holds promise for a future where we can keep up with viral outbreaks in real time.

Lots of animals can regrow a torn-off tail or a leg lost to a predator, but sea slugs have the coolest regeneration trick by a long shot. As a Japanese scientist discovered this year, these slimy creatures can behead themselves on purpose and grow a whole new body within weeks. The severed head survives on its own while it regenerates vital organs and limbs, likely due to the slugs' plant-like ability to photosynthesize because of all the algae they eat. Even more impressive, the discarded body lives for weeks before eventually dying off. Researchers think sea slugs use this cool maneuver to hoodwink predators and escape unharmed, or possibly to survive parasite infestations of their lower body.

Brain-computer interfaces (BCIs) hold major promise for people with paralysis, allowing them to operate robotic limbs, wheelchairs, keypads, and other gadgets just by thinking about moving their bodies. But so far, BCIs have mostly been relegated to research settings, as they've required bulky cables to connect a person's head to a computer to an external device.

Not anymore. The prestigious BrainGate research team has devised the world's first high-bandwidth, totally wireless BCI that transmits brain signals as quickly and clearly as cabled systems do. In a recent clinical trial, the new device enabled two people with tetraplegia to point, click, and type on a tablet with precision and speed, no wires required. More research is needed, but this is a major step toward taking BCIs out of the lab and into the real world to help people with paralysis regain independence.

Americans' boundless fascination with unidentified flying objects was finally indulged in 2021. In January, by way of the Freedom of Information Act, The Black Vault website posted the CIA's recently declassified database of every UFO sighting reported by a military pilot, dating back to the 1980s. Concurrently, the CIA uploaded dozens of records of UFO sightings from the 1940s to the early 1990s.

Then, in June, the Pentagon issued a long-awaited nine-page report summarizing everything it claims to know about unidentified aerial phenomena, or UAPs, its fancy term for UFOs. Shocker: The government doesn't know much. The report does assert that UAPs are not U.S. military craft, but otherwise, it pretty much plays the inconclusive card. But hey, although the dossier may not clear up many mysteries, the massive data dump should keep UFO-obsessed armchair detectives captivated for years to come.

About 90 percent of human brain development happens by age five. And although neuroscientists have recently learned a lot about how and when various developments occur, especially in utero, there's still a ton they don't know, particularly about the impacts of nature versus nurture. These answers are now coming, courtesy of the largest, most comprehensive trial on early brain development ever, which kicked off this fall.

Through the HEALthy Brain and Child Development Study, researchers nationwide will track a diverse group of 7,500 pregnant people and their children throughout the next decade. Using neuroimaging and psychological assessments, they aim to map out the normal arc of brain development and discover how pre- and postnatal environments and exposures (stress, socioeconomic status, parents' drug use, COVID, etc.) affect it, as well as how kids' brains adapt. This historic study has the potential to unlock prevailing mysteries about autism, dyslexia, and other childhood neurodevelopmental, emotional, and behavioral concerns.

Long before COVID, malaria was, and as of the time of publication still is, one of the most lethal infectious diseases on the planet. This mosquito-borne pathogen kills half a million people annually, predominantly in sub-Saharan Africa. Over half of those killed are children under age 5. Now, after a century of effort, scientists have finally developed a safe, effective malaria vaccine (the first vaccine for any parasitic disease, by the way), which the World Health Organization (WHO) greenlighted for all at-risk kids in October. Assuming nations prioritize vaccine distribution, experts estimate this breakthrough could prevent 5.3 million malaria cases and 24,000 deaths among children under 5 every year.

An estimated 1.6 million Americans live with type 1 diabetes (aka juvenile diabetes), including 200,000 kids and adults under age 20. With no known cure, this life-threatening autoimmune disease, in which the pancreas stops producing insulin to control blood sugar, almost always requires intense 24/7 management.

That may be about to change. To the shock and elation of diabetes experts, an experimental treatment delivered in an ongoing clinical trial appeared to cure a 64-year-old man of type 1 diabetes, which would be a world first. After receiving infusions of insulin-producing cells grown from stem cells, the man's body now makes insulin on its own, giving him "a whole new life," as he told the New York Times.

Because this discovery is part of a five-year study involving 16 other participants, it's still too soon to say with certainty whether the treatment is effective and safe long-term. But it's the most promising development the world has seen in regard to a type 1 diabetes cure, and likely enough to make a parent or child living with type 1 do cartwheels.

In February, almost seven months after launching from Earth, NASA's highly sophisticated Perseverance rover touched down on Mars. The vehicle will spend nearly two years on the red planet, surveying the landscape, searching for evidence of past Martian life, and collecting geological samples to bring back to Earth. Then, in April, NASA's solar-powered Ingenuity Mars Helicopter became the first-ever aircraft to make a controlled flight on another planet. By December 8, Ingenuity had logged 17 successful flights.

Mars wasn't the only celestial body to make news in 2021. In November, NASA scientists validated the existence of 301 new exoplanets, planets that orbit stars other than the Sun, bringing the total exoplanet tally to 4,870. The validation frenzy comes courtesy of NASA's new ExoMiner deep-learning technology, which evaluates data collected by the Kepler spacecraft to distinguish legit exoplanets from convincing fakes.

One of the biggest differences between the COVID pandemic and that of the 1918 Spanish Flu (the last global pandemic) is the way that we track it. It's nearly unimaginable that we once had to follow death and infection rates by local tally and had essentially no way of knowing about new viral variants. Now, led by the WHO, scientists have a colossal global collaboration to monitor the spread and evolution of SARS-CoV-2. Huge amounts of data have been collected and shared across borders in real time, allowing researchers to quickly gain an idea of how a variant like Delta or Omicron spreads and affects case numbers and hospitalizations. We didn't have this sort of technology available at the beginning of the pandemic. As of April 2021, after about 16 months of pandemic, the online GISAID database contained only one million SARS-CoV-2 genomic sequences. In the eight months since, another five million sequences have been added. In other words, genomic sequencing of SARS-CoV-2 has gotten ten times faster since the spring. This accomplishment highlights that one of the biggest challenges of science isn't discovery, but sharing discoveries, and countries across the world are now doing that in a way they've never done before.
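
The "ten times faster" figure is simple rate arithmetic, and it checks out. A quick sketch using the article's numbers (treating the first window as roughly 16 months, from the start of 2020 to April 2021):

```python
# GISAID sequence-sharing rates, from the article's figures.
first_window_sequences = 1_000_000   # total sequences by April 2021
first_window_months = 16             # assumed: ~start of 2020 to April 2021
second_window_sequences = 5_000_000  # added between April and December 2021
second_window_months = 8

early_rate = first_window_sequences / first_window_months    # 62,500 per month
late_rate = second_window_sequences / second_window_months   # 625,000 per month
speedup = late_rate / early_rate
print(speedup)  # 10.0 -- the "ten times faster" claim
```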

With the average cost of a solar panel plummeting 90 percent between 2010 and 2020, it keeps getting cheaper and cheaper to generate power directly from the sun. That's great news, as it helps shift our reliance away from fossil fuels, a key contributor to climate change. Frustratingly, however, solar panels haven't gotten much more efficient in recent years, which has hindered widespread adoption of this form of clean energy.

To solve this issue, engineers have been looking for alternative materials that can outperform the standard silicon used in solar panels and still be inexpensive. They've had high hopes for perovskites: atomically thin, latticed materials that convert sunlight into energy highly efficiently. The only problem? Ultraviolet rays and moisture destroy perovskites in no time, tanking their usefulness.

But this year, Rice University engineers developed and road-tested solar cells made of two-dimensional perovskites. Their invention works much better than earlier models and withstands the elements. The trick with 2D perovskites, the researchers discovered, is that sunlight contracts the spaces between the atomic layers to boost efficiency by up to 18 percent, a huge leap forward in this field. With solar companies worldwide working to commercialize perovskite solar cells, this breakthrough should ultimately accelerate society's conversion to solar energy.

In January, Jean-Michel Dubernard, MD, the same surgeon who performed the first-ever hand, double hand, and partial face transplants, accomplished yet another historic feat: the world's first double arm and shoulder transplant. The operation, performed in France, was a resounding success. The recipient, 49-year-old Felix Gretarsson of Iceland, who'd lost both arms in an electrical accident in 1998, has steadily gained mobility throughout the year, charting his progress on Instagram. He can now flex his biceps, pick up objects, and hug his granddaughter. Experts expect he will make more advancements in the coming years. Sadly, Dubernard died in July, but not before giving Gretarsson an entirely new life.

This year, scientists learned a lot about the massive creatures that inhabited the Earth many millions of years ago. First up, dinosaurs. The fearsome predator Tyrannosaurus rex roamed North America starting nearly 70 million years back, and now biologists have finally estimated how many: 2.5 billion. Terrifying, right? If it's any comfort, that's the total T. rex population spread out over 2.4 million years. So, really, there were only about 20,000 adult T. rexes living at one time.
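
The jump from 2.5 billion total individuals to roughly 20,000 alive at once falls out of a generation-time calculation. Here is a back-of-the-envelope sketch; the ~19-year generation time is an assumed figure of the kind the underlying census used, since the article gives only the totals:

```python
# Back-of-the-envelope T. rex census from the article's totals.
total_individuals = 2.5e9        # cumulative T. rex over the species' run
species_duration_years = 2.4e6   # how long T. rex roamed North America
generation_years = 19            # assumed average generation time

generations = species_duration_years / generation_years  # ~126,000 generations
standing_population = total_individuals / generations    # adults alive at once
print(round(standing_population))  # roughly 20,000
```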

Of course, that last generation of T. rex, along with the entire dinosaur kingdom, got wiped off the planet some 66 million years ago by an asteroid. Or wait, was it a comet? That's the new theory put forth by Harvard astronomers to explain the so-called Chicxulub Impactor, the astronomical body that created a 93-mile-wide, 12-mile-deep crater off the coast of Mexico and, theoretically, killed the dinosaurs. Countering the prevailing asteroid theory, the Harvard astronomers think a comet from the fringes of our solar system got knocked off-orbit by Jupiter's gravitational field and broke into chunks. Then an especially large chunk, the eventual Chicxulub Impactor, slammed into the Earth, wreaking major havoc and wiping out the dinos.

More than 60 million years after the dinosaurs, mammoths were living large, which researchers know because of the extensive fossil record. This year, such fossils yielded an unprecedented discovery: the oldest ancient animal genome ever recovered. In sequencing DNA from three mammoth teeth extracted from the Siberian permafrost, scientists determined the fossils were more than one million years old, obliterating the previous record held by a 560,000-to-780,000-year-old horse leg bone. The DNA also suggests a separate lineage, possibly a different species, of mammoth that scientists weren't aware of before.

An international event that's been decades in the making is finally (hopefully!) happening on December 24*. After multiple delays, the James Webb Space Telescope, the largest and most advanced scientific telescope in the history of space exploration, is scheduled to blast off from French Guiana aboard the Ariane 5 rocket. It will take 30 days to travel nearly 1 million miles to a stable spot in space and another six months to unfold its instruments, align, and calibrate. As it tracks Earth's orbit around the sun for the next several decades, the infrared scope will directly observe parts of the universe previously unseeable, thereby demystifying the origin and evolution of our planet, solar system, and galaxies beyond.

What takes today's best supercomputers several days or weeks to process, quantum computers can knock out within seconds. That's why quantum computing, which leverages the laws of quantum physics for unprecedented processing capabilities, is already considered among the biggest scientific breakthroughs of the 21st century. Eventually, it's supposed to revolutionize manufacturing, meteorology, cybersecurity, national defense, and much more.

Well, 2021 made "eventually" closer than ever. In November, IBM unveiled its 127-qubit Eagle, the most powerful quantum processor yet. Then earlier this month, the company Quantinuum debuted the world's first commercial product built from quantum computing: a cloud-based cybersecurity platform called Quantum Origin. With the world's top tech companies and research institutions racing to advance this next-gen technology, expect quantum computing to make our list again next year, and the next, and the next...

Originally posted here:

15 Truly Unbelievable Ways Science Changed the World in 2021 - Fatherly

Posted in Quantum Computing | Comments Off on 15 Truly Unbelievable Ways Science Changed the World in 2021 – Fatherly

ixFintech Group Limited Announces Launch of ixWallet 2.0 and Plans to Launch New Asset-backed TeaCoin – Business Wire

Posted: at 12:32 am

HONG KONG--(BUSINESS WIRE)--IX Fintech Group Limited (ixFintech) is honoured to announce the successful integration of privacy identity authentication and post quantum computing security into ixWallet to safeguard users' identity against cybersecurity risk. In Q1 2022, the company also aims to launch ixPoint, the company's first-ever reward point scheme, as well as the first asset-backed TeaCoin.

The new version of ixWallet comes with two new indexes: ixBitcoin and ixEthereum. These indexes, in combination with the ixCrypto Index, facilitate easy comparison of the performance of the whole Crypto market versus the performance of Bitcoin and Ethereum. ixWallet 2.0 offers more efficient and flexible KYC (Know Your Customer) processes depending on users' needs and usage.

"The ixWallet 2.0 marks a momentous milestone in our development journey to provide enhanced protection for our customers' valuable digital assets. In addition to serving users who want anonymity while satisfying regulators' requirements on KYC and security, ixWallet 2.0 allows users to choose whether they want to conduct full KYC for regulated activities through its settings," said Irene Wong, the founder and CEO of ixFintech. "This launching event also records a successful collaboration with Polydigi Tech Ltd, for its patent-pending s-Factr solution, the world's first Digital ID authentication against Anti-Authorised Push Payment (Anti-App) scams and phishing, and with IronCAP, a code-based cryptographic technology by 01 Communique Laboratory Inc, as announced in Q2 2021. These technologies are likely to make ixWallet 2.0 the safest digital wallet in the world."

The company is also excited to announce the successful completion of the Proof-Of-Concept Project (PoC Project) on the world's first audited Tea Cake Tokenisation. The Financial Services and the Treasury Bureau (FSTB) launched the Fintech Proof-of-Concept Subsidy Scheme (the PoC Scheme) aiming to encourage traditional financial institutions (the FIs) to partner up with Fintech companies to conduct PoC Projects on innovative financial service products. An asset-backed token is a digital token based on blockchain technology that derives its value from the underlying asset. Coin holders may benefit from the liquidity and price performance of the token in the secondary market. The TeaCoin was designed with the traditional tea cake in mind and is available on ixWallet 2.0 for users to understand asset tokenisation. In the future, investors can store TeaCoins on ixWallet, allowing them to safeguard their assets until they need to transfer such assets to another party or use them to redeem a real physical tea cake from the tea company.

2021 IX Fintech Group Limited. All Rights Reserved.

For more information about IX Fintech, please refer to http://ixfintech.com/

For more information about buying ixPoint using DAEM, please refer to the DAEM section or visit DAEM website http://daemtech.com/

About IX Fintech

IX Fintech Group is a Hong Kong-based, award-winning digital assets company, with honours including the HK FinTech Awards, HK FinTech Impetus Awards, IFTA Awards and TADS Awards. The company's mission is to bridge traditional finance and new digital finance in a secure and compliant way. In the past 3 years, the Group has won awards in different areas including cross-border payment solutions, blockchain technology, wealth management and trading platforms.

IX Fintech Group created a DeFi machine, DAEM (Digital Asset Exchange Machine), and ixWallet, both installed with post-quantum computing security, as well as the Bitcoin Lei See, all world firsts in the market. The whole system is truly decentralized, providing users a new and better experience in which trades are instant and instantly settled into the client's unique wallet, eliminating all middle parties' default risk.

The newly released ixWallet 2.0 is a truly distributed-ledger wallet that enables users to manage not only cryptos but other kinds of digital assets. Transactions are transparent and can be checked on publicly verifiable websites. ixWallet 2.0 is protected from phishing and is equipped with a next-generation OTP solution.

For more information, please visit http://www.daemtech.com or the DAEM showcasing at Cyberport.

Website: https://ixfintech.com/

About ixCrypto Index Series

IX Asia Indexes Company Limited (IX Asia Indexes) is a wholly owned subsidiary of the IX Fintech Group. Aiming to become one of the leading index compilers in Asia, it offers services in both real and digital assets covering index consultancy, index design, index calculation and dissemination, and index education. Its mission is to bring transparency and standardization to the digital asset and tokenisation world by building investment-grade, rules-based benchmarks.

IX Asia Indexes launched the award-winning ixCrypto Index (IXCI) in 2018, followed by two new indexes, ixBitcoin (IXBI) and ixEthereum (IXEI), to complete the ixCrypto Index Series in early 2021. They are currently available in 85 countries via Nasdaq and the IX Asia Indexes Company data feed to Bloomberg, Reuters, banks, institutions and information vendors. Real-time index values are disseminated at 15-second intervals from 9 a.m. to 9 p.m. Hong Kong time. An index advisory committee with representation from different industries ensures the professionalism and impartiality of the index methodologies and operations.

For more information on data dissemination and product licensing, please visit http://www.ix-index.com or contact licensing@ix-index.com

Website: https://ix-index.com

About Polydigi Tech

Following an invitation from the United Kingdom Department of International Trade, Polydigi Tech established its headquarters in Edinburgh, Scotland in 2019. Polydigi Tech is an innovative cybersecurity company that specialises in cutting-edge identity verification technologies. To counter the ever-growing risk of cyber-threats, Polydigi Tech has developed various patented and patent-pending innovative solutions including mobile phone-based multi-factor authentication, biometric authentication, and hardware protection for IoT devices and networks.

For more details about Polydigi Tech please visit our website at https://polydigitech.uk/

About 01 Communique

Established in 1992, 01 Communique (TSX-V: ONE; OTCQB: OONEF) has always been at the forefront of technology. The Company's cybersecurity business unit focuses on post-quantum cybersecurity with the development of its IronCAP technology. IronCAP's patent-pending cryptographic system is an advanced Goppa code-based post-quantum cryptographic technology that can be implemented on classical computer systems as we know them today while at the same time can also safeguard against attacks in the future post-quantum world of computing. The Company's remote access business unit provides its customers with a suite of secure remote access services and products under its I'm InTouch and I'm OnCall product offerings. The remote access offerings are protected in the U.S.A. by its patents #6,928,479 / #6,938,076 / #8,234,701; in Canada by its patents #2,309,398 / #2,524,039 and in Japan by its patent #4,875,094. For more information, visit the Company's website at http://www.ironcap.ca and http://www.01com.com.

Read more:

ixFintech Group Limited Announces Launch of ixWallet 2.0 and Plans to Launch New Asset-backed TeaCoin - Business Wire

Posted in Quantum Computing | Comments Off on ixFintech Group Limited Announces Launch of ixWallet 2.0 and Plans to Launch New Asset-backed TeaCoin – Business Wire

Brits lift lid on need for more investment in technology and skills to help prevent future pandemics and climate change – Leicestershire Live -…

Posted: at 12:32 am

Concerned Brits believe the nation needs to do more to enhance its preparedness for pandemics and combat climate change.

The impact of coronavirus restrictions is continuing to take its toll, and across the UK the public's awareness of the need to improve provision and capacity for dealing with crises is being heightened.

Over four in five of us (83 per cent) reckon it is essential for the country to invest more in emerging technology and skills to improve our ability to address major issues like pandemics and climate change, with almost nine in ten (89 per cent) believing that the use of technology is vital for the future of the economy.

And with a huge volume of our investment being dedicated to young people, almost two-thirds (63 per cent) admit the current level of focus on STEM (Science, Technology, Engineering & Mathematics) in schools is not enough to futureproof the UK and could leave our society vulnerable in years to come.

These new findings come as IBM launches a new campaign to encourage the uptake of STEM subjects amongst young people. The company has released a series of videos for schoolchildren explaining Quantum computing, an emerging technology with the potential to address big societal issues like sustainability and climate change.

Dr James Wootton, a Quantum researcher at IBM, said: Quantum computing is a really good example of an emerging technology that requires STEM education and skills. It promises game-changing applications across almost every sector, such as better batteries for electric cars, quicker discovery of new medicines, or the discovery of new materials for things like solar panels or carbon capture.

But to take advantage of this opportunity, the UK needs to ensure that young people are developing skills in the right areas.

Brits are determined for this trend to change, however, with almost all of us (94 per cent) claiming there are numerous benefits to children having a good STEM education.

Whether it's helping young people understand the world around us (71 per cent), promoting problem-solving skills (69 per cent), improving career prospects (66 per cent) or supporting innovation and future economic growth (57 per cent), the nation is keen to ensure young people are provided with the appropriate skills to help protect the UK in the future.

IBM has extensive and successful programmes including P-TECH and SkillsBuild to address these skills gaps, and in October the company made a commitment to upskill 30 million people globally by 2030.

The new video series, entitled What How Why, invited students from schools and colleges in London, Leeds and Dublin to directly ask IBM Research experts to explain Quantum computing. Introduced by teacher and mathematician Bobby Seagull, these conversations covered what Quantum computing is, why it is important for the skills they will need in future and how it will make a difference to the world they are growing up in.

And Dr Wootton added: The videos helped us explore Quantum computing in terms everyone can understand, the applications and skills needed for the future, and the impact it can make on some of the big challenges we face.

Even though we created these videos for young people, they are a great primer for anyone who wants to demystify quantum computing and understand its potential to change the world.

To watch the new series of videos, visit the IBM UK & Ireland Think blog: https://www.ibm.com/blogs/think/uk-en/what-how-why-quantum-explained/

For more stories from where you live, visit InYourArea.

Here is the original post:

Brits lift lid on need for more investment in technology and skills to help prevent future pandemics and climate change - Leicestershire Live -...

Posted in Quantum Computing | Comments Off on Brits lift lid on need for more investment in technology and skills to help prevent future pandemics and climate change – Leicestershire Live -…

Cloud Computing in 2021: ITPro Today’s Top 10 Stories of the Year – ITPro Today

Posted: at 12:32 am

Like many industries, cloud computing in 2021 was not immune to the continuing COVID-19 pandemic. Between the work-from-home movement caused by COVID and an acceleration of digital transformation efforts, enterprises have been migrating to the cloud in droves, providing opportunities for not only the "Big 3" public cloud providers, but tier 2 providers as well.

Read ITPro Today's 10 most-read stories about cloud computing in 2021 below to see how cloud providers are differentiating themselves, and what the cloud of the future will look like.

1. Microsoft Ignite 2021 Keynote: Nadella Expounds Cloud of the Future

In March, Microsoft CEO Satya Nadella took the stage at Microsoft Ignite 2021 to discuss the traits of the cloud that he believes will foster future innovation. "Now is the time to focus on a cloud that is built for the innovations that will be coming in the next decade," he said. Find out what Nadella thinks the cloud of the future will look like.

2. Survey: Open Source Cloud Technologies Fit Devs Like a Glove

When choosing cloud providers, 70% of IT pros prefer those based on open source cloud technologies. That's according to The Value of Open Source in the Cloud Era survey conducted by IBM and O'Reilly. This article discusses the most significant takeaways from the survey, including what kind of skills can boost a developer's career.

3. Looking Back: Cloud Computing in 2020 Experienced Massive Growth

2020 turned out to be a breakout year for cloud computing, thanks to the COVID-19 pandemic forcing the masses to work from home and enterprises accelerating their digital transformation efforts. The result was massive growth in the public cloud, adoption of hybrid cloud approaches and more open source options. Take a look back at cloud computing in 2020 and then compare it to how cloud computing has fared in 2021.

4. Not Optimizing Cloud Computing Spend Costs Enterprises $24B

New research reveals that organizations are not choosing the best options for cloud cost optimization, resulting in $24 billion in missed savings. Read on to learn about some of the myths and misconceptions about cloud cost optimization and how taking a multicloud approach can help.

5. 'Big 3' Public Cloud Providers Continue Their Explosive Growth

AWS, Google and Microsoft all started 2021 with a bang, continuing the strong growth they experienced in 2020. The beginning of the year also brought a change in leadership at AWS, with CEO Andy Jassy becoming the CEO for all of Amazon and former Tableau CEO Adam Selipsky taking over as AWS CEO. Read more about the AWS leadership change and how the Big 3 public cloud providers were able to keep their momentum going.

6. Need to Automate a Process? Start Offline Before Moving to the Cloud

The cloud should be a key component of organizations digital transformation efforts, but the journey to the public cloud is a complicated one. This report looks at the right and wrong things to do when using the cloud to enable digital transformation.

7. Public Cloud Provider DigitalOcean Looks to Expand with IPO

On the heels of expanding its "droplets", virtual units of compute and memory, DigitalOcean went public in March, becoming "the only pure-play publicly traded cloud company," according to CEO Yancey Spruill. ITPro Today contributor Sean Michael Kerner looks at what the IPO means to DigitalOcean and its SMB customers.

8. Why Conductor Chose HashiCorp Nomad over Kubernetes for VFX

Running visual effects (VFX) rendering is among the most intensive compute tasks, but that is what Conductor Technologies' cloud platform does, and does well, as evidenced by the big-name productions it has been used on, including Blade Runner 2049, Game of Thrones and Stranger Things. This case study examines how Conductor built out its platform to run special effects rendering jobs in the cloud without using Kubernetes.

9. 10th IBM Cloud Multi-Zone Region Opens with More in the Works

IBM continued to grow its cloud operations in 2021, with even more Multi-Zone Regions planned for the near future. This article explores how IBM is differentiating itself from its rivals by adding more MZRs and the role quantum computing will play in the future with IBM Cloud.

10. Why Tier 2 Cloud Providers Could Be Your No. 1 Choice

The Big 3 public cloud providers may be showing no signs of slowing down, but they are coming under increasing pressure from tier 2 cloud vendors, platforms like IBM Cloud, Oracle Cloud and others. Find out why the future is bright for tier 2 cloud providers and why you should consider one over a Big 3 cloud.

What was the most important lesson you learned about cloud computing in 2021? Tell us in the comments below!

Follow this link:

Cloud Computing in 2021: ITPro Today's Top 10 Stories of the Year - ITPro Today

Posted in Quantum Computing | Comments Off on Cloud Computing in 2021: ITPro Today’s Top 10 Stories of the Year – ITPro Today

What Is Quantum Computing? | NVIDIA Blog

Posted: December 19, 2021 at 6:50 pm

Twenty-seven years before Steve Jobs unveiled a computer you could put in your pocket, physicist Paul Benioff published a paper showing it was theoretically possible to build a much more powerful system you could hide in a thimble: a quantum computer.

Named for the subatomic physics it aimed to harness, the concept Benioff described in 1980 still fuels research today, including efforts to build the next big thing in computing: a system that could make a PC look, in some ways, as quaint as an abacus.

Richard Feynman, a Nobel Prize winner whose wit-laced lectures brought physics to a broad audience, helped establish the field, sketching out how such systems could simulate quirky quantum phenomena more efficiently than traditional computers.

So, quantum computing is a sophisticated approach to making parallel calculations, using the physics that governs subatomic particles to replace the more simplistic transistors in today's computers.

Quantum computers calculate using qubits, computing units that can be on, off or any value between, instead of the bits in traditional computers that are either on or off, one or zero. The qubits' ability to live in the in-between state, called superposition, adds a powerful capability to the computing equation, making quantum computers superior for some kinds of math.

Using qubits, quantum computers could buzz through calculations that would take classical computers a loooong time, if they could even finish them.

For example, today's computers use eight bits to represent any number between 0 and 255. Thanks to features like superposition, a quantum computer can use eight qubits to represent every number between 0 and 255, simultaneously.

It's a feature like parallelism in computing: All possibilities are computed at once rather than sequentially, providing tremendous speedups.
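
The eight-qubit claim can be made concrete with a tiny state-vector simulation. This is a sketch in plain NumPy (not any particular quantum SDK): a quantum register is tracked as a vector of amplitudes, and putting every qubit through a Hadamard gate yields an equal superposition over all 256 values at once.

```python
import numpy as np

n = 8  # qubits; a classical 8-bit register holds exactly one of 256 values

# A quantum register is described by 2**n complex amplitudes,
# one per basis state |0>, |1>, ..., |255>.
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0  # start in |00000000>

# The Hadamard gate puts a single qubit into an equal superposition.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

# Apply H to every qubit by building the 256x256 tensor-product operator.
op = np.array([[1.0]])
for _ in range(n):
    op = np.kron(op, H)
state = op @ state

# All 256 amplitudes are now 1/16: every value 0..255 is represented at once.
assert np.allclose(state, 1 / np.sqrt(2**n))
# Measurement probabilities |amplitude|^2 still sum to 1.
assert np.isclose((np.abs(state) ** 2).sum(), 1.0)
```

Note the catch the explicit vector makes visible: simulating n qubits classically takes 2**n numbers, which is exactly why real quantum hardware, which tracks that state with only n physical qubits, is interesting.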

So, while a classical computer steps through long division calculations one at a time to factor a humongous number, a quantum computer can get the answer in a single step. Boom!

That means quantum computers could reshape whole fields, like cryptography, that are based on factoring what are today impossibly large numbers.
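
The cryptography point rests on an asymmetry worth seeing directly: multiplying two primes is one step, while recovering them classically takes work that balloons with the size of the number. A toy sketch, with hypothetical small primes standing in for the hundreds-of-digits values real systems use:

```python
# Toy primes; real cryptographic moduli are hundreds of digits long.
p, q = 10_007, 10_009
n = p * q  # the easy direction: a single multiplication

def factor(n):
    """Classical trial division: work grows with the square root of n."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1  # n itself is prime

print(factor(n))  # (10007, 10009), found only after ~10,000 trial divisions
```

Shor's algorithm attacks exactly this asymmetry, which is why post-quantum cryptography is already an active field.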

That could be just the start. Some experts believe quantum computers will bust through limits that now hinder simulations in chemistry, materials science and anything involving worlds built on the nano-sized bricks of quantum mechanics.

Quantum computers could even extend the life of semiconductors by helping engineers create more refined simulations of the quantum effects they're starting to find in today's smallest transistors.

Indeed, experts say quantum computers ultimately won't replace classical computers; they'll complement them. And some predict quantum computers will be used as accelerators, much as GPUs accelerate today's computers.

Don't expect to build your own quantum computer like a DIY PC with parts scavenged from discount bins at the local electronics shop.

The handful of systems operating today typically require refrigeration that creates working environments just north of absolute zero. They need that computing arctic to handle the fragile quantum states that power these systems.

In a sign of how hard constructing a quantum computer can be, one prototype suspends an atom between two lasers to create a qubit. Try that in your home workshop!

Quantum computing takes nano-Herculean muscles to create something called entanglement. That's when two or more qubits exist in a single quantum state, a condition sometimes measured by electromagnetic waves just a millimeter wide.

Crank up that wave with a hair too much energy and you lose entanglement or superposition, or both. The result is a noisy state called decoherence, the equivalent in quantum computing of the blue screen of death.
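Here's a minimal sketch of what entanglement looks like on paper, using the standard two-qubit Bell state (an illustration added here, not something spelled out in the post): only the |00⟩ and |11⟩ outcomes carry any weight, so measuring one qubit instantly fixes what the other will read.

```python
# Two-qubit Bell state (|00> + |11>) / sqrt(2), in plain Python.
# Basis order: [|00>, |01>, |10>, |11>].
inv_sqrt2 = 2 ** -0.5
bell = [inv_sqrt2, 0.0, 0.0, inv_sqrt2]

# Each qubit alone reads 0 or 1 with equal probability, but the
# outcomes are perfectly correlated: |01> and |10> never occur.
probs = [abs(a) ** 2 for a in bell]
print([round(p, 3) for p in probs])  # [0.5, 0.0, 0.0, 0.5]
```

Decoherence, in this picture, is noise leaking probability into states that should have zero weight, scrambling exactly the correlations the computation depends on.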

A handful of companies such as Alibaba, Google, Honeywell, IBM, IonQ and Xanadu operate early versions of quantum computers today.

Today they provide tens of qubits. But qubits can be noisy, making them sometimes unreliable. To tackle real-world problems reliably, systems need tens or hundreds of thousands of qubits.

Experts believe it could be a couple of decades before we reach a high-fidelity era when quantum computers are truly useful.

Predictions of when we'll reach so-called quantum computing supremacy, the point when quantum computers execute tasks classical ones can't, are a matter of lively debate in the industry.

The good news is that the world of AI and machine learning has put a spotlight on accelerators like GPUs, which can perform many of the types of operations quantum computers would calculate with qubits.

So, classical computers are already finding ways to host quantum simulations with GPUs today. For example, NVIDIA ran a leading-edge quantum simulation on Selene, our in-house AI supercomputer.

In its GTC keynote, NVIDIA announced the cuQuantum SDK to speed quantum circuit simulations running on GPUs. Early work suggests cuQuantum will be able to deliver orders-of-magnitude speedups.

The SDK takes an agnostic approach, providing a choice of tools users can pick to best fit their approach. For example, the state vector method provides high-fidelity results, but its memory requirements grow exponentially with the number of qubits.

That creates a practical limit of roughly 50 qubits on today's largest classical supercomputers. Nevertheless, we've seen great results using cuQuantum to accelerate quantum circuit simulations that use this method.
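That roughly-50-qubit ceiling follows from simple arithmetic: a full state vector holds 2^n complex amplitudes at 16 bytes each in double precision. A quick back-of-the-envelope calculation (the helper name is an illustrative choice, not a cuQuantum API):

```python
def statevector_bytes(n_qubits, bytes_per_amp=16):
    """Memory for a full state vector: 2**n complex amplitudes,
    16 bytes each (double-precision real + imaginary parts)."""
    return (2 ** n_qubits) * bytes_per_amp

# Memory doubles with every added qubit.
for n in (30, 40, 50):
    gib = statevector_bytes(n) / 2**30
    print(n, f"{gib:,.0f} GiB")
# 30 -> 16 GiB, 40 -> 16,384 GiB, 50 -> 16,777,216 GiB (16 PiB)
```

At 50 qubits that's 16 pebibytes, which is why the state vector method tops out near the memory of the largest supercomputers.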

Researchers from the Jülich Supercomputing Centre will provide a deep dive on their work with the state vector method in session E31941 at GTC (free with registration).

A newer approach, tensor network simulation, uses less memory and more computation to perform similar work.
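A toy illustration of the tensor-network trade-off (a deliberately simplified sketch added here, not cuQuantum's actual method): an unentangled n-qubit state can be stored as n two-number factors instead of 2^n amplitudes, and any single amplitude is recovered by contracting, that is, multiplying, one entry from each factor. Real tensor networks generalize this to entangled states via larger bond dimensions.

```python
# A product (unentangled) state of n qubits stored as n small factors
# (2 numbers each) instead of one vector of 2**n amplitudes.
inv_sqrt2 = 2 ** -0.5
n = 8
factors = [[inv_sqrt2, inv_sqrt2]] * n  # each qubit in (|0> + |1>)/sqrt(2)

def amplitude(bits):
    """Contract the network: pick one entry per factor and multiply.
    Memory is 2*n numbers; each amplitude costs n multiplications."""
    amp = 1.0
    for factor, b in zip(factors, bits):
        amp *= factor[b]
    return amp

# Amplitude of |00000000>: (1/sqrt(2))**8 = 1/16.
print(round(amplitude([0] * 8), 6))  # 0.0625
```

The trade-off the post describes is visible here: storage shrinks dramatically, but recovering amplitudes (or samples) costs extra computation per query.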

Using this method, NVIDIA and Caltech accelerated a state-of-the-art quantum circuit simulator with cuQuantum running on NVIDIA A100 Tensor Core GPUs. It generated a sample from a full-circuit simulation of the Google Sycamore circuit in 9.3 minutes on Selene, a task that 18 months ago experts thought would take days using millions of CPU cores.

"Using the Cotengra/Quimb packages, NVIDIA's newly announced cuQuantum SDK and the Selene supercomputer, we've generated a sample of the Sycamore quantum circuit at depth m=20 in record time, less than 10 minutes," said Johnnie Gray, a research scientist at Caltech.

This sets the benchmark for quantum circuit simulation performance and will help advance the field of quantum computing by improving our ability to verify the behavior of quantum circuits, said Garnet Chan, a chemistry professor at Caltech whose lab hosted the work.

NVIDIA expects the performance gains and ease of use of cuQuantum will make it a foundational element in every quantum computing framework and simulator at the cutting edge of this research.

Sign up to show early interest in cuQuantum here.

Link: What Is Quantum Computing? | NVIDIA Blog

