Category Archives: Quantum Computing
AiThority Interview with Dr. Alan Baratz, CEO at D-Wave – AiThority
Posted: December 28, 2023 at 11:51 pm
Hi, welcome to the AiThority Interview Series. Please tell us a bit about yourself and what is D-Wave.
I am Dr. Alan Baratz, President and CEO of D-Wave (NYSE: QBTS).
D-Wave is a leader in quantum computing technology and the world's first commercial supplier of quantum computers. Our technology has been used by some of the world's most advanced organizations, including Volkswagen, Mastercard, Deloitte, Siemens Healthineers, Pattison Food Group Ltd, DENSO, Lockheed Martin, the University of Southern California, and Los Alamos National Laboratory.
The global quantum computing market is growing rapidly, and some market analysts project it will reach upwards of $6 billion by the end of this decade. As 2023 closes, it will be interesting to see how quantum computing influences 2024. The future of quantum computing will largely turn on rapid government adoption, the future of work, and quantum supremacy.
With economists projecting a shallow recession in 2024, organizations will seek new technologies, such as quantum computing, to navigate adversity and bolster business resilience. Quantum technologies can accelerate problem-solving and decision-making for a wide range of common organizational processes, such as supply chain management, manufacturing efficiency, logistical planning, and employee scheduling. Amidst a challenging economic environment, quantum's ability to fuel operational efficiencies is critical.
The industry will achieve a proven, defensible quantum supremacy result in 2024. Ongoing scientific and technical advancements indicate that we are not far from achieving quantum supremacy. 2024 will be the year when quantum definitively outperforms classical, full stop. There will be clear evidence of quantum's ability to solve a complex computational problem previously unsolvable by classical computing, and quantum will solve it faster, better, and with less power consumption.
The breakthrough we've all been pursuing is coming.
The US government's usage of annealing quantum computing will explode, given the anticipated passage of legislation including the National Quantum Initiative and the National Defense Authorization Act. 2024 will see a rapid uptick in quantum sandbox and test-bed programs with directives to use all types of quantum technology, including annealing, hybrid, and gate models. These programs will focus on near-term application development to solve real-world public sector problems, from public vehicle routing to electric grid resilience.
The global quantum race will continue to heat up, as the U.S. and its allies aggressively push for near-term application development. While the U.S. is now starting to accelerate near-term applications, other governments like Australia, Japan, the U.K., and the E.U. have been making expedited moves to bring quantum in to solve public sector challenges. This effort will greatly expand in 2024.
Top public sector areas of focus will likely be sustainability, transportation and logistics, supply chain, and health care.
Quantum computing will show proven value and utility in daily business operations through in-production applications.
As we close 2023, companies are beginning to go into production with quantum-hybrid applications, so it's no stretch of the imagination to see corporations using quantum solutions daily for ubiquitous business challenges such as employee scheduling, vehicle routing, and supply chain optimization. In time, it will become a part of every modern IT infrastructure, starting with the integration of annealing quantum computing.
See the original post here:
AiThority Interview with Dr. Alan Baratz, CEO at D-Wave - AiThority
Posted in Quantum Computing
Comments Off on AiThority Interview with Dr. Alan Baratz, CEO at D-Wave – AiThority
Anticipating the Next Technological Revolution: Trends and Insights – Medium
Posted: at 11:51 pm
Trends and Insights
In the ever-evolving landscape of technology, anticipating the next revolution is both a challenge and an exciting prospect. As we navigate the currents of innovation, identifying emerging trends provides valuable insights into the transformative technologies that will shape our future. This article explores the trends and insights that herald the arrival of the next technological revolution.
Problem:
The pace of technological change can be overwhelming, and industries must adapt to stay relevant. Disruptions caused by unforeseen technological shifts can catch businesses off guard, leading to obsolescence. The challenge lies in deciphering the signals of change and understanding how these trends will impact various sectors.
Solution:
AI and ML continue to dominate the technological landscape, promising transformative changes across industries. From autonomous vehicles to personalized healthcare, the integration of AI is reshaping how we live and work. Insights derived from massive datasets enable more informed decision-making and open new frontiers for innovation.
The rollout of 5G technology represents a quantum leap in connectivity. With faster speeds and lower latency, 5G is set to revolutionize communication, enabling the Internet of Things (IoT), augmented reality, and immersive experiences. Industries, from healthcare to manufacturing, will benefit from the unprecedented connectivity that 5G brings.
Quantum computing is on the cusp of a breakthrough that will redefine computational power. With the ability to process complex calculations at speeds unimaginable with classical computers, quantum computing holds promise for solving previously unsolvable problems in fields like cryptography, drug discovery, and optimization.
Read more:
Anticipating the Next Technological Revolution: Trends and Insights - Medium
Posted in Quantum Computing
Comments Off on Anticipating the Next Technological Revolution: Trends and Insights – Medium
The Holy Grail of Quantum Computing Is Finally Here. Or Is It? – WIRED
Posted: December 22, 2023 at 7:55 pm
Andersen and Lensky of Google disagree. They do not think the experiment demonstrates a topological qubit, because the object cannot reliably manipulate information to achieve practical quantum computing. "It is repeatedly stated explicitly in the manuscript that error correction must be included to achieve topological protection and that this would need to be done in future work," they write to WIRED.
When WIRED spoke with Tony Uttley, the president and COO of Quantinuum, after the company's own announcement in May, he was steadfast. "We created a topological qubit," he said. (Uttley said last month that he was leaving the company.) The company's experiments made non-Abelian anyons out of 27 ions of the metal ytterbium, suspended in electromagnetic fields. The team manipulated the ions to form non-Abelian anyons in a racetrack-shaped trap, and similar to the Google experiment, they demonstrated that the anyons could remember how they had moved. Quantinuum published its results in a preprint study on arXiv, without peer review, two days before Nature published Kim's paper.
Room for Improvement
Ultimately, no one agrees whether the two demonstrations have created topological qubits because they haven't agreed on what a topological qubit is, even if there is widespread agreement that such a thing is highly desirable. Consequently, Google and Quantinuum can perform similar experiments with similar results but end up with two very different stories to tell.
Regardless, Frolov at the University of Pittsburgh says that neither demonstration appears to have brought the field closer to the true technological purpose of a topological qubit. While Google and Quantinuum appear to have created and manipulated non-Abelian anyons, the underlying systems and materials used were too fragile for practical use.
David Pekker, another physicist at Pittsburgh, who previously used an IBM quantum computer to simulate the manipulation of non-Abelian anyons, says that the Google and Quantinuum projects don't showcase any quantum advantage in computational power. The experiments don't shift the field of quantum computing from where it has been for a while: working on systems that are too small-scale to yet compete with existing computers. "My iPhone can simulate 27 qubits with higher fidelity than the Google machine can do with actual qubits," Pekker says.
Still, technological breakthroughs sometimes grow from incremental progress. Delivering a practical topological qubit will require all kinds of studies, large and small, of non-Abelian anyons and the math underpinning their quirky behavior. Along the way, the quantum computing industry's interest is helping further some fundamental questions in physics.
Originally posted here:
The Holy Grail of Quantum Computing Is Finally Here. Or Is It? - WIRED
Posted in Quantum Computing
Comments Off on The Holy Grail of Quantum Computing Is Finally Here. Or Is It? – WIRED
Year of covers: Tech and sport, quantum advances and Gen AI – Technology Magazine
Posted: at 7:55 pm
From groundbreaking breakthroughs in AI and quantum computing to the continued evolution of augmented and virtual reality, 2023 has witnessed a surge of innovation that is poised to revolutionise our world.
AI continues to evolve at an astonishing pace, with advancements in natural language processing (NLP) enabling more natural and intuitive human-computer interactions. Computer vision, another key AI domain, has made strides in image and video analysis, leading to improved object detection, facial recognition, and medical imaging capabilities. AI is also making significant contributions in drug discovery, medical diagnosis, and self-driving car development, further demonstrating its transformative potential.
The immersive worlds of augmented reality (AR) and virtual reality (VR) have taken significant steps forward, blurring the lines between the physical and digital realms. AR applications are becoming increasingly prevalent in gaming, education, and training, enhancing real-world experiences with digital overlays. VR, meanwhile, is gaining momentum in entertainment, healthcare, and remote collaboration, offering users immersive and interactive experiences.
Quantum computing, still in its early stages, holds immense promise for solving problems that are intractable for classical computers. Researchers are making progress in building and optimizing quantum computers, paving the way for breakthroughs in fields like materials science, drug discovery and AI.
All of these topics and more have featured in our magazine over the past 12 months, and the trends we have witnessed are likely to accelerate in the years to come. As 2023 comes to a close, join us for a review of Technology Magazine's covers from 2023.
Excerpt from:
Year of covers: Tech and sport, quantum advances and Gen AI - Technology Magazine
Posted in Quantum Computing
Comments Off on Year of covers: Tech and sport, quantum advances and Gen AI – Technology Magazine
IBM demonstrates useful Quantum computing within 133-qubit Heron, announces entry into Quantum-centric … – Tom’s Hardware
Posted: at 7:55 pm
At its Quantum Summit 2023, IBM took the stage with an interesting spirit: one of almost awe at having things go their way. But the quantum of today, the one that's changing IBM's roadmap so deeply on the back of breakthrough upon breakthrough, was hard enough to consolidate. As IBM sees it, the future of quantum computing will hardly be more permissive. IBM announced cutting-edge devices at the event, including the 133-qubit Heron Quantum Processing Unit (QPU), which is the company's first utility-scale quantum processor, and the self-contained Quantum System Two, a quantum-specific supercomputing architecture. And further improvements to the cutting-edge devices are ultimately required.
Each breakthrough that afterward becomes obsolete is another accelerating bump against what we might call quantum's "plateau of understanding." We've already crested this plateau with semiconductors, so much so that the latest CPUs and GPUs are reaching practical, fundamental design limits where quantum effects start ruining our math. Conquering the plateau means that utility and understanding are now enough for research and development to be somewhat self-sustainable, at least for a Moore's-law-esque while.
IBM's Quantum Summit serves as a bookend of sorts for the company's cultural and operational execution, and its 2023 edition showcased an energized company that feels like it's opening up the doors towards a "quantum-centric supercomputing era." That vision is built on the company's new Quantum Processing Unit, Heron, which showcases scalable quantum utility at a 133-qubit count and already offers things beyond what any feasible classical system could ever do. Breakthroughs and a revised understanding of its own roadmap have led IBM to present its quantum vision in two different roadmaps, prioritizing scalability in tandem with useful, minimum-quality products rather than monolithic, hard-to-validate, high-complexity ones.
The new plateau IBM announced for quantum computing packs in two particular breakthroughs that occurred in 2023. One breakthrough relates to a groundbreaking noise-reduction algorithm (Zero Noise Extrapolation, or ZNE), which we covered back in July: basically, a system through which you can compensate for noise. For instance, if you know a pitcher tends to throw more to the left, you can compensate for that, up to a point. There will always be a moment where you correct too much or cede ground towards other disruptions (such as the opponent exploring the overexposed right side of the court). This is where the concept of qubit quality comes into play: the higher the quality of your qubits, the more predictable both their results and their disruptions, and the better you know their operational constraints, the more useful work you can extract from them.
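To make the extrapolation idea concrete, here is a minimal, hypothetical sketch of zero-noise extrapolation, not IBM's implementation: an observable is measured at several deliberately amplified noise levels, a curve is fitted, and the fit is read off at the zero-noise limit. The measured values below are synthetic stand-ins.

```python
import numpy as np

# Illustrative zero-noise extrapolation (ZNE), with made-up numbers.
# Suppose an observable <O> was measured after scaling the circuit's noise
# by factors lambda = 1, 2, 3 (e.g. via gate folding).
noise_scales = np.array([1.0, 2.0, 3.0])
measured = np.array([0.81, 0.66, 0.54])  # synthetic noisy expectation values

# Fit a low-order polynomial in the noise scale and evaluate it at lambda = 0,
# which estimates the noise-free expectation value.
coeffs = np.polyfit(noise_scales, measured, deg=2)
zero_noise_estimate = np.polyval(coeffs, 0.0)
print(f"Extrapolated zero-noise value: {zero_noise_estimate:.3f}")
```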
The other breakthrough relates to an algorithmic improvement of epic proportions and was first pushed to arXiv on August 15th, 2023. Titled "High-threshold and low-overhead fault-tolerant quantum memory," the paper showcases algorithmic ways to reduce qubit needs for certain quantum calculations by a factor of ten. When what used to cost 1,000 qubits and a complex logic-gate architecture sees a tenfold cost reduction, it's likely you'd prefer to end up with 133-qubit-sized chips: chips that crush problems previously meant for 1,000-qubit machines.
Enter IBM's Heron Quantum Processing Unit (QPU) and the era of useful, quantum-centric supercomputing.
The two-part breakthroughs of error correction (through the ZNE technique) and algorithmic performance (alongside qubit gate architecture improvements) allow IBM to now consider reaching 1 billion operationally useful quantum gates by 2033. It just so happens that it's an amazing coincidence (one born of research effort and human ingenuity) that we only need to keep 133 qubits relatively happy within their own environment for us to extract useful quantum computing from them: computing that we wouldn't classically be able to get anywhere else.
The Development and Innovation roadmaps showcase how IBM is thinking about its superconducting qubits: as we've learned to do with semiconductors already, mapping out the hardware-level improvements alongside the scalability-level ones. Because as we've seen through our supercomputing efforts, there's no such thing as a truly monolithic approach: every piece of supercomputing is (necessarily) efficiently distributed across thousands of individual accelerators. Your CPU performs better by knitting and orchestrating several different cores, registers, and execution units. Even Cerebras' Wafer Scale Engine scales further outside its wafer-level computing unit. No accelerator so far, no unit of computation, has proven powerful enough that we don't need to unlock more of its power by increasing its area or computing density. Our brains and learning ability seem to provide us with the only known exception.
IBM's modular approach, and its focus on introducing more robust intra-QPU and inter-QPU communication for this year's Heron, shows it's aware of the rope it's walking between quality and scalability. The thousands of hardware and scientist hours spent developing the tunable couplers, one of the signature Heron design elements that allow parallel execution across different QPUs, are another sign. Pushing one lever harder means other systems have to be able to keep up; IBM also plans on steadily improving its internal and external coupling technology (already developed with scalability in mind for Heron) throughout further iterations, such as Flamingo's planned four versions, which still only end up scaling to 156 qubits per QPU.
Considering how you're solving scalability problems and the qubit quality x density x ease-of-testing equation, the ticks - the density increases that don't sacrifice quality and are feasible from a testing and productization standpoint - may be harder to unlock. But if one side of development is scalability, the other relates to the quality of whatever you're actually scaling: in this case, IBM's superconducting qubits themselves. Heron itself saw a substantial rearrangement of its internal qubit architecture to improve gate design, accessibility, and quantum processing volumes, not unlike an Intel tock. The planned iterative improvements to Flamingo's design seem to confirm this.
There's a sweet spot for the quantum computing algorithms of today: it seems that algorithms that fit roughly within a 60-gate depth are complex enough to allow for useful quantum computing. Perhaps thinking about Intel's NetBurst architecture, with its Pentium 4 CPUs, is appropriate here: too deep an instruction pipeline is counterproductive, after a point. Branch mispredictions are terrible across computing, be it classical or quantum. And quantum computing, as we still have it in our Noisy Intermediate-Scale Quantum (NISQ) era, is more vulnerable to a more varied disturbance field than semiconductors are (there are world overclocking records where we chill our processors to sub-zero temperatures and pump them with above-standard volts, after all). But perhaps that comparable quantum vulnerability is understandable, given how we're essentially manipulating the essential units of existence, atoms and even subatomic particles, into becoming useful to us.
Useful quantum computing doesn't simply correlate with an increasing number of available in-package qubits (consider announcements of 1,000-qubit products based on neutral-atom technology, for instance). Useful quantum computing is always stretched thin across its limits: if it isn't bumping against one fundamental limit (qubit count), it's bumping against another (instability at higher qubit counts), or contending with issues of entanglement coherence and longevity, entanglement distance and capability, correctness of the results, and still other elements. Some of these scalability issues can be visualized within the same framework of efficient data transit between distributed computing units, such as cores in a given CPU architecture, and can themselves be solved in a number of ways, such as hardware-based information processing and routing techniques (AMD's Infinity Fabric comes to mind, as does Nvidia's NVLink).
This feature of quantum computing already being useful at the 133-qubit scale is also part of the reason why IBM keeps prioritizing quantum computing-related challenges built around useful algorithms occupying a 100-by-100 grid. That quantum is already useful beyond classical, even in gate grids that are comparably small next to what we can achieve with transistors, points to the scale of the transition, and to how different these two computational worlds are.
Then there are also the matters of error mitigation and error correction: of extracting ground-truth-level answers to the questions we want our quantum computer to solve. There are also limitations in our way of utilizing quantum interference to collapse a quantum computation at just the right moment, so that we know we will obtain from it the result we want, or at least something close enough to correct that we can then offset any noise (non-useful computational results, or the difference in value between the correct answer and the not-yet-culled wrong ones) through a clever, groundbreaking algorithm.
The above are just some of the elements currently limiting how useful qubits can truly be, and how those qubits can be manipulated into useful, algorithm-running computation units. This is usually referred to as a qubit's quality, and we can see how it both does and doesn't relate to the sheer number of qubits available. But since many useful computations can already be achieved with 133-qubit-wide Quantum Processing Units (there's a reason IBM settled on a mere 6-qubit increase from Eagle to Heron, and only scales up to 156 units with Flamingo), the company is setting out to keep this optimal qubit width for a number of years of continuous redesigns. IBM will focus on making correct results easier to extract from Heron-sized QPUs by increasing the coherence, stability, and accuracy of these 133 qubits, while surmounting the arguably harder challenge of distributed, highly parallel quantum computing. It's a one-two punch again, and one that comes from the bump in speed at climbing ever-higher stretches of the quantum computing plateau.
But there is an admission that it's a barrier IBM still wants to punch through: it's much better to pair 200 units of a 156-qubit QPU (that of Flamingo) than of a 127-qubit one such as Eagle, so long as efficiency and accuracy remain high. Oliver Dial says that Condor, "the 1,000-qubit product", is locally running, up to a point. It was meant to be the thousand-qubit processor, and was a part of the roadmap for this year's Quantum Summit as much as the actual focus, Heron - but it's ultimately not really a direction the company thinks is currently feasible.
IBM did manage to yield all 1,000 Josephson junctions within its experimental Condor chip, the thousand-qubit halo product that will never see the light of day as a product. It's running within the labs, and IBM can show that Condor yielded computationally useful qubits. One issue is that at that qubit depth, testing such a device becomes immensely expensive and time-consuming. At a basic level, it's harder and more costly to guarantee the quality of a thousand qubits, and their increasingly complex possibility field of interactions and interconnections, than to assure the same requirements in a 133-qubit Heron. Even IBM only means to test around a quarter of the in-lab Condor QPU's area, confirming that the qubit connections are working.
But Heron? Heron is made for quick verification that it's working to spec: that it's providing accurate results, or at least computationally useful results that can then be corrected through ZNE and other techniques. That means you can get useful work out of it already, while it is also a much better time-to-market product in virtually all areas that matter. Heron is what IBM considers the basic unit of quantum computation - good enough and stable enough to outpace classical systems in specific workloads. But that is quantum computing, and that is its niche.
Heron is IBM's entrance into the mass-access era of Quantum Processing Units. Next year's Flamingo builds further into the inter-QPU coupling architecture so that further parallelization can be achieved. The idea is to scale at a base, post-classical utility level and maintain that as a minimum quality baseline. Only at that point will IBM maybe scale density and unlock the appropriate jump in computing capability - when that can be achieved in a similarly productive way, and scalability is almost perfect for maintaining quantum usefulness.
There's simply never been the need to churn out hundreds of QPUs yet; the utility wasn't there. The Canaries, Falcons, and Eagles of IBM's past roadmap were never meant to usher in an age of scaled manufacturing. They were prototypes, scientific instruments, explorations; proofs of concept on the road towards useful quantum computing. We didn't know where usefulness would start to appear. But now we do, because we've reached it.
Heron is the design IBM feels best answers that newly-created need for a quantum computing chip that actually is at the forefront of human computing capability: one that can offer what no classical computing system can (in some specific areas). One that can slice through specific-but-deeper layers of our Universe. That's what IBM means when it calls this new stage the quantum-centric supercomputing one.
Classical systems will never cease to be necessary, both in themselves and in the way they structure our current reality, systems, and society. They also function as a layer that allows quantum computing itself to happen, be it by carrying and storing its intermediate results or by knitting together the final informational state that maps out the correct answer quantum computing provides, one quality step at a time. The quantum-centric bit merely refers to how quantum computing will be the core contributor to developments in fields such as materials science, more advanced physics, chemistry, superconduction, and basically every domain where our classical systems were already presenting a duller and duller edge with which to improve upon our understanding of their limits.
However, with IBM's approach and its choice of transmon superconducting qubits, a certain difficulty lies in commercializing local installations. Quantum System Two, as the company is naming its new, almost wholesale quantum computing system, has been shown working with different QPU installations (both Heron and Eagle). When asked whether scaling Quantum System Two and similar self-contained products would be a bottleneck for technological adoption, IBM's CTO Oliver Dial said that it was definitely a difficult problem to solve, but that he was confident in the company's ability to reduce costs and complexity further in time, considering how successful IBM had already proven in that regard. For now, it's easier for IBM's quantum usefulness to be unlocked at a distance, through the cloud and its quantum computing framework, Qiskit, than it is to achieve it by running local installations.
Qiskit is the preferred medium through which users can actually deploy IBM's quantum computing products in research efforts, just as you could rent a number of Nvidia A100s' worth of processing power through Amazon Web Services, or even a simple Xbox Series X console through Microsoft's xCloud service. On the day of IBM's Quantum Summit, that freedom also meant access to the useful quantum circuits within IBM-deployed Heron QPUs. And it's much easier to scale access at home, serving them through the cloud, than by delivering a box of supercooled transmon qubits ready to be plugged in and played with.
That's one devil of IBM's superconducting qubits approach: not many players have the will, funding, or expertise to put a supercooled chamber into local operation and build the required infrastructure around it. These are complex mechanisms housing kilometers of wiring - another focus of IBM's development and tinkering, culminating in last year's flexible ribbon solution, which drastically simplified connections to and from QPUs.
Quantum computing is a uniquely complex problem, and democratized access to hundreds or thousands of mass-produced Herons in IBM's refrigerator-laden fields will ultimately only require, well, a stable internet connection. Logistics are what they are, and IBM's Quantum Summit also took the necessary steps to address some needs within its Qiskit Runtime platform by introducing its official 1.0 version. Food for thought: the era of useful quantum computing seems to coincide with the beginning of the era of quantum computing as a service. That was fast.
The era of useful, mass-producible, mass-access quantum computing is what IBM is promising. But now there's the matter of scale. And there's the matter of how cost-effective it is to install a Quantum System Two or Five or Ten compared with another qubit approach, be it topological approaches to quantum computing, oxygen-vacancy-based qubits, ion traps, or others that are an entire architecture away from IBM's approach, such as fluxonium qubits. It's likely that a number of qubit technologies will still make it into the mass-production stage, and even then, we can rest assured that everywhere along the road of human ingenuity lie failed experiments, like Intel's recently decapitated Itanium or AMD's out-of-time approach to x86 computing in Bulldozer.
It's hard to see where the future of quantum takes us, and it's hard to say whether it looks exactly like IBM's roadmap, the same roadmap whose running changes we also discussed here. Yet all roadmaps are a permanently-drying painting, both for IBM itself and for the technology space at large. Breakthroughs seem to be happening daily on each side of the fence, and it's a fact of science that the most potential exists the earlier the questions we ask. The promising qubit technologies of today will have to answer actual interrogations on performance, usefulness, ease and cost of manipulation, quality, and scalability, in ways that now need to be at least as good as what IBM is proposing with its transmon-based superconducting qubits, its Herons and scalable Flamingos, and its (still unproven, but hinted-at) ability to eventually mass-produce useful numbers of useful Quantum Processing Units such as Heron. All of that, even as we remain in this noisy, intermediate-scale quantum (NISQ) era.
It's no wonder that Oliver Dial looked and talked so energetically during our interview: IBM has already achieved quantum usefulness and has started to answer the two most important questions, quality and scalability, through its Development and Innovation roadmaps. And it did so through the collaboration of an incredible team of scientists, delivering results years before expected, as Dial happily conceded. In 2023, IBM unlocked useful quantum computing within a 127-qubit Quantum Processing Unit, Eagle, and walked the process of perfecting it towards the revamped Heron chip. That's an incredible feat in and of itself, and it is what allows us to even discuss issues of scalability at this point. It's the reason a roadmap has to shift to accommodate it, and in this quantum computing world, that's a great follow-up question to have.
Perhaps the best question now is: how many things can we improve with a useful Heron QPU? How many locked doors have sprung ajar?
Visit link:
Posted in Quantum Computing
Comments Off on IBM demonstrates useful Quantum computing within 133-qubit Heron, announces entry into Quantum-centric … – Tom’s Hardware
NVIDIA Unveils Breakthrough in Quantum Computing Capabilities – Game Is Hard
Posted: at 7:55 pm
NVIDIA has announced a groundbreaking update to its cuQuantum software development kit (SDK), stating that version 23.10 represents a significant leap in quantum computing capabilities. The new release integrates seamlessly with NVIDIA Tensor Core GPUs, delivering a substantial boost to the speed of quantum circuit simulations.
At the heart of cuQuantum's power lies its ability to accelerate quantum circuit simulations using state vector and tensor network methods. This latest advancement is not just incremental but offers unprecedented speed and efficiency, measured in orders of magnitude.
One of the key highlights of the cuQuantum 23.10 update is the significant enhancements made to NVIDIA's cuTensorNet and cuStateVec. The new version now supports NVIDIA Grace Hopper systems, allowing for a broader range of hardware compatibility. This compatibility ensures that users can leverage the full potential of GPU acceleration for their quantum computing workloads.
cuTensorNet, a crucial component of cuQuantum, offers high-level APIs that simplify quantum simulator development. These APIs enable developers to program intuitively, abstracting away the complexities of tensor network knowledge. Performance-wise, cuTensorNet has demonstrated superior performance compared with existing technologies such as TensorCircuit, PyTorch, and JAX, achieving a 4-5.9x improvement on NVIDIA H100 GPUs.
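As a rough illustration of what a tensor-network simulation does, the NumPy sketch below (not cuTensorNet itself; cuQuantum's Python interface exposes a comparable einsum-style contract() call, an assumption best checked against NVIDIA's documentation) writes a two-qubit Bell-state circuit as a single tensor contraction. cuTensorNet accelerates contractions of exactly this shape, on GPUs and at far larger scale.

```python
import numpy as np

# A two-qubit circuit |00> -> H on qubit 0 -> CNOT(0,1), written as one
# tensor-network contraction. Gates and the initial state are tensors.
ket0 = np.array([1.0, 0.0], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex).reshape(2, 2, 2, 2)  # (out0, out1, in0, in1)

# Contract the network: CNOT_abcd * H_ce * ket0_e * ket0_d -> psi_ab
psi = np.einsum('abcd,ce,e,d->ab', CNOT, H, ket0, ket0)
print(psi.reshape(4))  # ~[0.707, 0, 0, 0.707]: the Bell state (|00> + |11>)/sqrt(2)
```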
Another notable advancement is the addition of experimental support for gradient calculations in quantum machine learning (QML) applications. This feature is expected to significantly accelerate QML and adjoint differentiation-based workflows by utilizing cuTensorNet.
Furthermore, cuStateVec now provides new APIs for host-to-device state vector swap. This development allows simulations to scale effectively by utilizing CPU memory alongside GPUs. For instance, a 40-qubit state vector simulation that previously required 128 NVIDIA H100 80GB GPUs can now be run on just 16 NVIDIA Grace Hopper systems. This reduction not only speeds up computations but also leads to significant cost and energy savings.
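The memory arithmetic behind those figures is easy to check. The sketch below is an illustration only; it assumes single-precision complex amplitudes (8 bytes each), which is consistent with the 128 x 80 GB figure quoted for a 40-qubit state vector.

```python
# Memory for a full state-vector simulation of n qubits: 2**n complex amplitudes.
# Assumes complex64 (8 bytes per amplitude); double precision would double this.
def state_vector_bytes(n_qubits: int, bytes_per_amplitude: int = 8) -> int:
    return (2 ** n_qubits) * bytes_per_amplitude

n = 40
total = state_vector_bytes(n)
gpu_memory = 80e9  # 80 GB per GPU, as in the article
print(f"{n}-qubit state vector: {total / 1e12:.1f} TB")               # ~8.8 TB
print(f"Minimum 80 GB GPUs (memory alone): {total / gpu_memory:.0f}")  # ~110, i.e. 128 in practice
```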
Additionally, cuQuantum 23.10 has undergone API-level and kernel-level optimizations, resulting in enhanced performance. These improvements make Grace Hopper systems more efficient than other CPU and Hopper systems by offering faster runtimes due to improved chip-to-chip interconnects and CPU capabilities.
For those interested in exploring cuQuantum 23.10, NVIDIA provides comprehensive documentation and benchmark suites on GitHub. The company encourages feedback and queries through the GitHub platform to ensure continuous improvement and support for its user base. These updates demonstrate NVIDIA's commitment to pushing the boundaries of quantum computing, making it more accessible and efficient for a broader range of applications.
FAQ:
What is cuQuantum? cuQuantum is a software development kit (SDK) developed by NVIDIA that enhances the speed and efficiency of quantum circuit simulations by integrating with NVIDIA Tensor Core GPUs.
What are the key highlights of the cuQuantum 23.10 update? The cuQuantum 23.10 update includes significant enhancements to cuTensorNet and cuStateVec, compatibility with NVIDIA Grace Hopper systems, experimental support for gradient calculations in quantum machine learning (QML) applications, and new APIs for host-to-device state vector swap.
What is cuTensorNet? cuTensorNet is a component of cuQuantum that offers high-level APIs to simplify quantum simulator development. It allows developers to program intuitively and achieve superior performance compared to other technologies.
What are the benefits of using cuQuantum? Using cuQuantum, users can achieve substantial speed and efficiency improvements in quantum circuit simulations, reduce computational requirements, and save on costs and energy.
Where can I find more information about cuQuantum? NVIDIA provides comprehensive documentation and benchmark suites for cuQuantum on their GitHub page.
Definitions:
Quantum computing: A field that utilizes principles of quantum mechanics to perform computations, offering the potential to solve problems that are currently intractable for classical computers.
SDK: A software development kit is a set of tools, libraries, and documentation that developers use to create software applications for specific platforms.
Tensor Core GPUs: NVIDIA Tensor Core GPUs are specialized graphics processing units that feature hardware acceleration for tensor operations, which are often used in deep learning and scientific computing.
State vector: In quantum mechanics, a state vector represents the state of a quantum system, such as the position or momentum of a particle. It is typically represented as a complex vector.
Tensor network: A tensor network is a mathematical tool used in quantum physics and quantum computing to represent complex systems and manipulate quantum states efficiently.
Quantum machine learning (QML): Quantum machine learning combines principles from quantum computing and machine learning to develop algorithms that can process and analyze quantum data.
Suggested related links:
NVIDIA Official Website
cuQuantum Documentation and Benchmark Suites on GitHub
Link:
NVIDIA Unveils Breakthrough in Quantum Computing Capabilities - Game Is Hard
Posted in Quantum Computing
Comments Off on NVIDIA Unveils Breakthrough in Quantum Computing Capabilities – Game Is Hard
Quantum AI Brings the Power of Quantum Computing to the Public – GlobeNewswire
Posted: at 7:54 pm
Luton, Dec. 20, 2023 (GLOBE NEWSWIRE) -- Quantum AI is set to bring the power of quantum computing to the public and has already reached a stunning quantum volume (QV) score of 14,082 in a year since its inception.
Quantum AI Ltd. was conceived by Finlay and Qaiser Sajjad during their time as students at MIT. They were inspired by the exclusive use of new-age technology by the elites on Wall Street. Recognising the transformative power of this technology, they were determined to make its potential accessible to all. Thus, the platform was born, and it has evolved and flourished in just a short time.
Quantum AI
Often, everyday traders have limited access to such advanced tools.
"We are fueled by the belief that the power of quantum computing should not be confined to the financial giants but should be available to empower amateur traders as well," asserted the founders of the platform. Since its launch in 2022, they have worked to achieve this vision and have become a significant force in the industry.
The platform combines the power of the technology with the strength of artificial intelligence. Using these latest technologies, including machine learning, the team has created algorithms that are more than just lines of code. They harness the potential of quantum mechanics and deep learning to analyse live data in unique ways.
"Our quantum system leverages quantum superposition and coherence, providing a quantum advantage through sophisticated simulation and annealing techniques," added the founders.
Quantum AI has shown exceptional results in a brief period. It has received overwhelmingly positive reviews from customers, highlighting the enhanced speed and accuracy of trading. The transformative and groundbreaking impact the platform has had on trading is evident in its growth to 330,000 active members. Notably, it has nearly 898 million lines of code and an impressive quantum volume score of 14,082. The performance on this benchmark, which IBM established, is a massive testament to the impact Quantum AI has had in a short span of time.
According to the founders, they have bigger plans on the horizon to take the power of the technology to the public. Quantum AI is growing its team of experts and expanding its operations in Australia and Canada. Its goal of democratising the power of technology is well on its way to being realised. With trading being the first thing they cracked to pay the bills, the main focus has now turned to aviation, haulage and even e-commerce.
To learn more about the platform and understand the transformative power of the technology for traders, one can visit https://quantumai.co/.
About Quantum AI
With the aim of democratising the power and potential of quantum computing, the company was founded by Finlay and Qaiser Sajjad during their time at MIT. Since its establishment, it has grown to over 330,000 active members and 18 full-time employees, alongside winning the trust of its customers.
###
Media Contact
Quantum AI
PR Manager: Nadia El-Masri
Email: nadia.el.masri@quantumai.co
Address: Quantum AI Ltd, 35 John Street, Luton, United Kingdom, LU1 2JE
Phone: +442035970878
URL: https://quantumai.co/
Go here to see the original:
Quantum AI Brings the Power of Quantum Computing to the Public - GlobeNewswire
Posted in Quantum Computing
Comments Off on Quantum AI Brings the Power of Quantum Computing to the Public – GlobeNewswire
Siemens collaborates with sureCore and Semiwise to pioneer quantum computing ready cryogenic semiconductor … – Design and Reuse
Posted: at 7:54 pm
Plano, Texas, USA, December 20, 2023 -- Siemens Digital Industries Software announced today its collaboration with sureCore and Semiwise to develop groundbreaking cryogenic CMOS circuits capable of operating at temperatures near absolute zero, a fundamental component of quantum computing systems. The joint effort holds the potential for dramatic advances in both performance and power efficiency for next-generation integrated circuits (ICs) targeting quantum computing, considered the leading edge of high-performance computing (HPC) research and development.
The key to unlocking the potential of quantum computing for HPC and other fast-growing applications lies in the availability of control electronics capable of operating at cryogenic temperatures. Using advanced analog/mixed-signal IC design technology from Siemens, Semiwise has developed cryogenic CMOS circuit designs featuring cryogenic SPICE models as well as SPICE simulator technology that can perform accurate analyses at cryogenic temperatures.
Semiwise is providing this intellectual property (IP), developed using Siemens Analog FastSPICE (AFS), to sureCore for the development of sureCore's revolutionary line of CryoIP, which aims to enable the design of CryoCMOS control chips seen as crucial for unlocking the commercial potential of quantum computing.
In the development of its CryoIP product line, sureCore also used Siemens' Analog FastSPICE platform and Siemens Solido Design Environment software, both of which demonstrated reliable and accurate operation at cryogenic temperatures, empowering sureCore to construct analog circuits, standard cell libraries, and memory designs including SRAM, register files, and ROM, using Semiwise's cryogenic transistor models. Further, Siemens Analog FastSPICE software showcased exceptional capabilities in handling foundry device models at cryogenic conditions, helping deliver efficient analog, mixed-signal, and digital circuit design and verification functionality without convergence issues. The result is a high level of accuracy and performance, setting the stage for potentially groundbreaking advancements in quantum computing.
Professor Asen Asenov, CEO of Semiwise and director for sureCore, highlighted the significance of this achievement: "For the first time, through cryogenic transistor measurements and Technology Computer-Aided Design (TCAD) analyses conducted with Siemens EDA technologies, we have developed process design kit (PDK)-quality compact transistor models, including corners and mismatch, enabling the production-worthy design of cryogenic CMOS circuits."
sureCore is rapidly progressing towards its first CryoIP tapeout, leveraging GlobalFoundries' 22FDX PDK.
Paul Wells, CEO of sureCore, underscored the pivotal role of this partnership. "The critical storage element and the bit cell must essentially be treated as an analog circuit that is highly sensitive to process variability and mismatch," said Wells. "When we develop new memory designs and their associated compilers, we need to run thousands of statistical circuit simulations to guarantee the yield and reliability of our IP. Our partnership with Siemens EDA has enabled us to leverage Siemens' Custom IC verification technology to build robust cryogenic IP cores, specifically tailored for Quantum applications."
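To illustrate the kind of statistical verification Wells describes, here is a toy Monte Carlo yield estimate with entirely hypothetical numbers; a real flow would replace the random draw with thousands of SPICE-level simulations of the bit cell under process variability and mismatch.

```python
import numpy as np

# Toy Monte Carlo yield estimate for a memory bit cell's read margin.
# All values are hypothetical placeholders, not sureCore or Siemens data.
rng = np.random.default_rng(0)
n_trials = 10_000
nominal_margin_mv = 120.0   # hypothetical nominal read margin (mV)
mismatch_sigma_mv = 35.0    # hypothetical spread from variability/mismatch (mV)
spec_mv = 20.0              # hypothetical minimum acceptable margin (mV)

margins = rng.normal(nominal_margin_mv, mismatch_sigma_mv, n_trials)
yield_estimate = np.mean(margins > spec_mv)
print(f"Estimated yield over {n_trials} trials: {yield_estimate:.2%}")
```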
"This partnership symbolizes Siemens' unwavering dedication to advancing the quantum computing domain, said Amit Gupta, general manager and vice president of the Custom IC Verification Division, Siemens Digital Industries Software. The groundbreaking technologies and solutions developed have the potential to redefine the boundaries of high-performance computing."
Siemens' Analog FastSPICE platform, powered by technology from Siemens Analog FastSPICE eXTreme platform, offers cutting-edge circuit verification for nanometer analog, radio frequency (RF), mixed-signal, memory, and custom digital circuits. It holds foundry certifications across all major foundries and is qualified across various process nodes, from mature to the most advanced. Siemens' Analog FastSPICE platform offers a comprehensive use model, including small signal, transient, RF, noise, aging, and multi-sim verification capabilities, with drop-in compatibility with industry-standard SPICE-based flows. This all-encompassing solution boasts high performance, capacity, and flexible features.
Siemens' Solido Design Environment plays a pivotal role by providing a comprehensive cockpit for nominal and variation-aware analysis and encompasses SPICE-level circuit simulation setup, measurements, regressions, waveforms, and statistical results analysis. Powered by AI technology, Solido Design Environment assists users in identifying optimization paths to improve circuit power, performance, and area - facilitating production-accurate statistical yield analysis, reducing runtime compared to brute-force methods.
Siemens Digital Industries Software helps organizations of all sizes digitally transform using software, hardware and services from the Siemens Xcelerator business platform. Siemens' software and the comprehensive digital twin enable companies to optimize their design, engineering and manufacturing processes to turn today's ideas into the sustainable products of the future. From chips to entire systems, from product to process, across all industries, Siemens Digital Industries Software is accelerating transformation.
Read the original post:
Posted in Quantum Computing
Comments Off on Siemens collaborates with sureCore and Semiwise to pioneer quantum computing ready cryogenic semiconductor … – Design and Reuse
Does quantum theory imply the entire Universe is preordained? – Nature.com
Posted: at 7:54 pm
Is cosmic evolution a single track with no choice about the destination? (Credit: Getty)
Was there ever any choice in the Universe being as it is? Albert Einstein could have been wondering about this when he remarked to mathematician Ernst Strauss: "What I'm really interested in is whether God could have made the world in a different way; that is, whether the necessity of logical simplicity leaves any freedom at all."
US physicist James Hartle, who died earlier this year aged 83, made seminal contributions to this continuing debate. Early in the twentieth century, the advent of quantum theory seemed to have blown out of the water ideas from classical physics that the evolution of the Universe is deterministic. Hartle contributed to a remarkable proposal that, if correct, completely reverses a conventional story about determinism's rise with classical physics, and its subsequent fall with quantum theory. A quantum Universe might, in fact, be more deterministic than a classical one, and for all its apparent uncertainties, quantum theory might better explain why the Universe is the one it is, and not some other version.
In physics, determinism means that the state of the Universe at any given time and the basic laws of physics fully determine the Universe's backward history and forward evolution. This idea reached its peak with the strict, precise laws about how the Universe behaves introduced by classical physics. Take Isaac Newton's laws of motion. If someone knew the present positions and momenta of all particles, they could in theory use Newton's laws to deduce all facts about the Universe, past and future. It's only a lack of knowledge (or computational power) that prevents scientists from doing so.
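A toy numerical illustration of that Newtonian determinism (a sketch, not a claim about real physics simulations): integrate a single particle forward under a known force, then run the same law with the time step reversed and recover the starting state, showing that the present fixes both the future and the past.

```python
def verlet_step(x, v, dt, k=1.0, m=1.0):
    """One velocity-Verlet step for a particle on a spring, F(x) = -k*x."""
    a = -k * x / m
    x_new = x + v * dt + 0.5 * a * dt * dt
    a_new = -k * x_new / m
    v_new = v + 0.5 * (a + a_new) * dt
    return x_new, v_new

x, v, dt = 1.0, 0.0, 0.01
for _ in range(1000):      # evolve forward in time
    x, v = verlet_step(x, v, dt)
for _ in range(1000):      # apply the same deterministic law with dt -> -dt
    x, v = verlet_step(x, v, -dt)
print(x, v)                # returns (up to rounding) to x = 1.0, v = 0.0
```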
Along with this distinctive predictive power, determinism underwrites scientific explanations that come close to the principle of sufficient reason most famously articulated by German polymath Gottfried Leibniz: that everything has an explanation. Every state of the Universe (with one obvious exception, which we'll come to) can be completely explained by an earlier one. If the Universe is a train, determinism says that it is running on a track, with no option to switch to any other path, because different tracks never cross.
Physicists have conventionally liked determinism's predictive and explanatory power. Others, including some philosophers, have generally been more divided, not least because of how determinism might seem to preclude human free will: if the laws of physics are deterministic, and our actions are just the summation of particle interactions, there seems to be no room for us to freely choose A instead of B, because the earlier states of the Universe will already have determined the outcome of our choice. And if we are not free, how can we be praised or blamed for our actions? Neuroendocrinologist Robert Sapolsky's 2023 book Determined touches on this fascinating and controversial issue.
The strange behaviours of quantum particles that began to emerge in the twentieth century fundamentally shifted the debate surrounding determinism in physics. The laws of quantum mechanics give only the probabilities of outcomes, which can be illustrated with the thought experiment devised by Austrian physicist Erwin Schrödinger in 1935 (although when he devised it, he was concerned mainly with how the wavefunction represents reality). A cat is trapped in a box with a vial of poison that might or might not have been broken by a random event because of radioactive decay, for example. If quantum mechanics applied to the cat, it would be described by a wavefunction in a superposition of alive and dead. The wavefunction, when measured, randomly jumps to one of the two states, and quantum mechanics specifies only the probability of either possibility occurring. One consequence of the arrival of quantum mechanics was that it seemed to throw determinism out of the window.
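Spelling the cat example out as a worked equation makes the probabilistic rule explicit: in an equal superposition, the measurement probabilities are the squared magnitudes of the amplitudes (the Born rule).

```latex
% Schrödinger's cat as an equal superposition; probabilities from the Born rule.
|\psi\rangle = \tfrac{1}{\sqrt{2}}\,|\text{alive}\rangle + \tfrac{1}{\sqrt{2}}\,|\text{dead}\rangle,
\qquad
P(\text{alive}) = \left|\tfrac{1}{\sqrt{2}}\right|^{2} = \tfrac{1}{2},
\quad
P(\text{dead}) = \tfrac{1}{2}.
```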
But this accepted idea might not be the whole story, as developments in the second half of the twentieth century suggested. The quantum Universe could actually be more deterministic than a classical one, for two reasons. The first is technical. Newton's laws allow situations in which the past does not determine how things will move in the future. For example, the laws do not provide an upper bound on how much an object can be accelerated, so in theory a classical object can reach spatial infinity in finite time. Reverse this process, and you get what have been called "space invaders": objects that come from spatial infinity with no causal connection to anything else in the Universe, and which can't be predicted from any of the Universe's past states.
In practice, this problem is solved by the universal speed limit, the speed of light, introduced by Einstein's special theory of relativity. But unruly infinities also plague Einsteinian relativity, which is a classical theory. The equations of general relativity lead to singularities of infinite curvature, most notoriously in black holes and at the Big Bang at the beginning of the Universe. Singularities are like gaps in space-time where the theory no longer applies; in some cases, anything can come out of them (or disappear into them), threatening determinism.
Many physicists think that quantum theory can come to the rescue by removing such singularities: for example, by converting the Big Bang into a Big Bounce, with a Universe that continues to evolve smoothly on the other side of the singularity. If they are right, a theory of quantum gravity that fully unifies quantum theory, which predicts the behaviour of matter on the smallest scales, and Einstein's relativity, which encapsulates the large-scale evolution of the Universe, will smooth out the gaps in space-time and restore determinism.
Space-time singularities inside black holes could threaten a deterministic cosmic order. (Credit: ESO/SPL)
But there is a deeper reason why the quantum Universe might be more deterministic, to which Hartle's scientific legacies are relevant. With US physicist Murray Gell-Mann, Hartle developed an influential approach to quantum theory, called decoherent histories [1]. This attempted to explain the usefulness of probabilistic statements in quantum physics, and the emergence of a familiar, classical realm of everyday experience from quantum superpositions. In their picture, the wavefunction never randomly jumps. Instead, it always obeys a deterministic law given by Schrödinger's equation, which characterizes the smooth and continuous evolution of quantum states. In this respect, it is similar to US physicist Hugh Everett III's popular "many worlds" interpretation of quantum mechanics, which proposes that the quantum Universe splits into different branches according to the possibilities encoded in the wavefunction whenever anything is measured [2]. In what follows I assume, as Everett did, that the Universe can be completely described by a quantum wavefunction with no hidden variables that operate on a more fundamental level.
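The deterministic law in question is the Schrödinger equation: given the Hamiltonian and the wavefunction at any one time, the wavefunction at every other time follows uniquely (for a time-independent Hamiltonian, via a unitary exponential).

```latex
% Deterministic, unitary evolution of the wavefunction.
i\hbar \frac{\partial}{\partial t}\,|\Psi(t)\rangle = \hat{H}\,|\Psi(t)\rangle,
\qquad
|\Psi(t)\rangle = e^{-i\hat{H}(t - t_0)/\hbar}\,|\Psi(t_0)\rangle.
```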
With Stephen Hawking, Hartle went on to become one of the founders of quantum cosmology, which applies quantum theory to the entire Universe. In a classical Universe, there is freedom in choosing how it all started. Even setting aside the extreme situations mentioned earlier, classical mechanics is deterministic merely in that it lays down many possible evolutionary histories for the Universe, and offers conditional statements about them: if this happens, then that must happen next. To return to the train analogy, a deterministic theory does not, by itself, say why the train is on any one given track out of many: why it is going from A to B via C, rather than from X to Y via Z. We can go back to earlier states to explain the current state, and do that all the way back to the initial state; but this initial state is not explained by anything that precedes it. Ultimately, standard determinism fails to fully satisfy Leibniz's principle of sufficient reason: when it comes to the initial state, something remains without an explanation.
See me here, see me there
This failure is not just philosophical. A complete theory of the Universe should predict the phenomena we observe in it, including its large-scale structure and the existence of galaxies and stars. The dynamic equations we have, whether from Newtonian physics or Einsteinian relativity, cannot do this by themselves. Which phenomena show up in our observations depends sensitively on the initial conditions. We must look at what we see in the Universe around us, and use this information to determine the initial condition that might have given rise to such observations.
A theory that specifies deterministic laws of both the Universe's temporal evolution and its exact initial condition satisfies what English physicist Roger Penrose called 'strong determinism' in his 1989 book The Emperor's New Mind. This is, according to Penrose, "not just a matter of the future being determined by the past; the entire history of the universe is fixed, according to some precise mathematical scheme, for all time". Let us say that a Universe is strongly deterministic if its basic laws of physics fix a unique cosmic history. If determinism provides a set of non-crossing train tracks, without specifying which one is being used, then strong determinism lays down a single track that has no choice even about where it starts.
Strong determinism is hard to implement in classical physics. You might consider doing it by specifying the initial condition of the Universe as a law. But although the dynamical laws of classical physics are simple, the Universe itself is complex and so its initial condition must have been, too. Describing the precise positions and momenta of all the particles involved requires so much information that any statement of the initial condition is too complex to be a law.
Hartle suggested [3] that quantum mechanics can solve this complexity problem. Because a quantum object's wavefunction is spread out across many classical states (cat alive or cat dead, for instance), you could propose a simple initial condition that includes all the complexities as emergent structures in the quantum superposition of these states. All the observed complexities can be regarded as partial descriptions of a simple fundamental reality: the Universe's wavefunction. As an analogy, a perfect sphere can be cut into many chunks with complicated shapes, yet they can be put back together to form a simple sphere.
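A schematic way to picture this (an illustrative sketch, not Hartle's specific construction; the coefficients c_i and the configuration labels are placeholders):

|\Psi_{\text{Universe}}\rangle = \sum_i c_i \, |\text{classical configuration } i\rangle.

The law need only specify the single state on the left-hand side, which can be simple to state, while each enormously complicated classical branch on the right emerges from it rather than having to be written into the law separately.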
In 1983, Hartle and Hawking introduced [4] one of the first (and highly influential) proposals about the quantum Universe's initial state. Their 'no boundary' wavefunction idea suggests that the shape of the Universe is like that of a shuttlecock: towards the past, it rounds off smoothly and shrinks to a single point. As Hawking said in a 1981 talk on the origin of the Universe in the Vatican: "There ought to be something very special about the boundary conditions of the Universe, and what can be more special than the condition that there is no boundary?"
Unique, or not unique?
In this perspective, the quantum Universe has two basic laws: a deterministic one of temporal evolution and a simple one that picks an initial wavefunction for the Universe. Hence, the quantum Universe satisfies strong determinism. The physical laws permit exactly one cosmic history of the Universe, albeit one described by a wavefunction that superposes many classical trajectories. There is no contingency in what the Universe as a whole could have been, and no alternative possibility for how it could have started. Every event, including the first one, is explained; the entire wavefunction of the Universe for all times is pinned down by the laws. The probabilities of quantum mechanics do not exist at the level of the basic physical laws, but can nonetheless be assigned to coarse-grained and partial descriptions of bits of the Universe.
This leads to a more predictive and explanatory theory. For example, the no-boundary proposal makes predictions for a relatively simple early Universe and for the occurrence of inflation, a period of rapid expansion that the Universe seems to have undergone in its first instants.
There are still many wrinkles to this proposal, not least because some studies have shown that, contrary to initial expectations, the theory might not single out a unique wavefunction for the Universe [5,6]. But studies in quantum foundations (research that is mostly independent from that of quantum cosmology) could offer yet another method for implementing strong determinism. Several researchers have considered the controversial idea that quantum states of closed systems, including the Universe, need not be restricted to wavefunctions, but instead can come from a broader category: the space of density matrices [7–10].
Density matrices can be thought of as superpositions of superpositions, and they provide extra options for the initial condition of the Universe. For example, if we have reasons to adopt the 'past hypothesis', the idea, which seems likely, that the Universe began in a low-entropy state (and its entropy has been increasing steadily since), and that this theory corresponds to a set of wavefunctions, then we can choose a simple density matrix that corresponds to the uniform mixture of that set. As I have argued [10], if we regard the density matrix as the initial state of the Universe and accept that it is specified by a law, then this choice, together with the deterministic von Neumann equation (a generalization of Schrödinger's equation), can satisfy strong determinism. However, in this case, the laws fix a cosmic history of a quantum Universe that has many evolving branches: a multiverse.
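A minimal sketch of this construction in standard quantum notation (the set of admissible low-entropy initial wavefunctions ψ_1, ..., ψ_N is whatever the past hypothesis picks out): the uniform-mixture initial state and its deterministic evolution are

\rho(0) = \frac{1}{N} \sum_{i=1}^{N} |\psi_i\rangle\langle\psi_i|, \qquad i\hbar \, \frac{\partial \rho}{\partial t} = [\hat{H}, \rho].

Because the von Neumann equation on the right is deterministic, a law that fixes ρ(0) fixes ρ(t) for all times, which is what strong determinism requires.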
So how deterministic is the Universe? The answer will depend on the final theory that bridges the divide between quantum physics and relativity, and that remains a far-off prospect. But if Hartle is right, the story of the rise and fall of determinism until now might be the reverse of the conventional tale. From a certain perspective, the quantum Universe is more deterministic than a classical one, providing stronger explanations and better predictions. That has consequences for humans, too, because it makes it harder to appeal to quantum theory to defend free will [11]. If the quantum Universe is strongly deterministic, then there is no other way the Universe could have been than the way it is. The ultimate laws of the quantum cosmos might tell us why it is this one.
Link:
Does quantum theory imply the entire Universe is preordained? - Nature.com
Posted in Quantum Computing
Comments Off on Does quantum theory imply the entire Universe is preordained? – Nature.com
IBM Teams with Top Universities for Quantum Education in Japan, South Korea, and the U.S. – AiThority
Posted: at 7:54 pm
IBM announced the company intends to engage with Keio University, The University of Tokyo, Yonsei University, Seoul National University, and The University of Chicago to work together to support quantum education activities in Japan, Korea, and the United States. IBM intends to deliver educational offerings, in combination with contributions from each of the participating universities, to advance the training of up to 40,000 students over the next 10 years to prepare them for the quantum workforce and promote the growth of a global quantum ecosystem.
Quantum computing offers a different approach to computation that may solve problems that are intractable for classical computers. A skilled quantum workforce is critical to growing the quantum industry that will drive economic development through the use of quantum computing technology. Currently, people trained and skilled in quantum computing are needed as more higher-education and research institutions, national labs, and industries adopt quantum computing. To address the increasing demands of a growing quantum workforce, IBM and the five universities in Japan, Korea, and the U.S. intend to collaborate on the education of new and future generations of quantum computing users.
This international initiative may include materials for educators across a broad range of science and technology disciplines, such as physics, computer science, engineering, math, life sciences, and chemistry. To prepare for the era of quantum utility, and the coming era of quantum-centric supercomputing, the universities and IBM are focused on preparing a workforce capable of using the latest quantum computing technologies for scientific discovery and to explore industry applications that create new value in specific domains.
IBM intends to work with the universities to develop a robust quantum curriculum to teach the next generation of computational scientists, who will be able to use quantum computers as a scientific tool. All parties involved, whether individually or collectively, have the resources to engage in educator training, course material development, and community-driven educational events, including mentorships, joint summer programs, exchange programs and distinguished lecture programs.
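As an illustration of the kind of introductory, hands-on exercise such a curriculum might begin with, here is a minimal sketch using Qiskit, IBM's open-source quantum SDK (this is not an official course example; it simply assumes Qiskit is installed):

# Build and inspect a two-qubit Bell state, a typical first exercise.
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(2)
qc.h(0)      # put qubit 0 into an equal superposition
qc.cx(0, 1)  # entangle qubit 1 with qubit 0

state = Statevector.from_instruction(qc)
print(state.probabilities_dict())  # expected: {'00': 0.5, '11': 0.5}

Exercises like this let students verify superposition and entanglement in simulation before moving on to real IBM Quantum hardware.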
"This monumental trilateral collaboration between IBM and some of the world's leading universities in the U.S., Japan, and South Korea is a significant step forward in quantum education, ensuring our continued technical leadership, and epitomizes the spirit of international cooperation and technological progress that are essential in an interconnected world. By fostering a robust quantum workforce and supporting groundbreaking research, we are not only enhancing academic excellence but also contributing to economic development and technological innovation on a global scale." Rahm Emanuel, U.S. Ambassador to Japan.
"Since the Camp David agreements, cooperation between Korea, the U.S. and Japan has expanded to various fields such as security, the economy, high tech, health and cyber. I believe there is a true call for collaboration between the three countries, especially in high-tech fields such as quantum computing. This announcement of the plan to train human resources and establish a research and industrial ecosystem in the quantum field will serve as a meaningful starting point for the trilateral high-tech cooperation. And I trust this will bring about substantial benefits for the people of the three countries through more investment and job creation." Yun Duk-min, South Korean Ambassador to Japan.
"Keio has been a pioneer in quantum research and education for more than 20 years. Now is the right time to rethink how we train the scientists and engineers for careers in this growing field, and we are excited to work with IBM and with other top universities in the creation and use of the next generation of educational materials. Blending our online courses with hands-on exercises using IBM's materials will improve recruitment, the rate of learning, and retention among our quantum-native students." Professor Kohei Itoh, President, Keio University.
"Among the various research fields, quantum computing, which excels at calculating equations containing many complex combinations, is expected to play a key role in the future of an advanced information-oriented and knowledge-intensive society, and it has been one of the most important fields that we emphasize at UTokyo. We believe it is very important to train the quantum professionals of tomorrow, 'quantum natives'. Therefore, we aim to foster quantum natives and develop human capital that will lead quantum research in social implementation, industrial applications, and academic fields, by promoting education on quantum computing throughout the entire university. The education program starts with first-year undergraduate students, using actual quantum computing equipment, including the state-of-the-art IBM Quantum machines, even with those new to quantum mechanics. It extends to senior undergraduate and graduate courses in sciences, engineering, and information science by implementing educational programs that are seamlessly organized through undergraduate and postgraduate courses. In this collaboration in quantum education among the universities in the U.S., Japan, and South Korea, we will make use of our respective strengths to contribute to the further promotion of quantum education and the solution of social issues." Dr. Teruo Fujii, President, The University of Tokyo.
"With the goal to create a robust quantum computing ecosystem, Yonsei University plans to introduce IBM Quantum System One for the first time in Korea in 2024. This collaboration is anticipated to significantly contribute to the foundational framework of both domestic and international quantum computing ecosystems. Simultaneously, it should play a pivotal role in the training of experts and the facilitation of cutting-edge research within the quantum computing domain. The collaboration with IBM is poised to synergize with Yonsei University's existing prowess in education and research, yielding a combined effect that will propel the development in the field of quantum computing." Professor Seoung Hwan Suh, President, Yonsei University.
"Seoul National University is at the center of quantum science and technology in Korea, with over 30 groups working on the core problems of broad scientific and technological issues. More recently, we have been working to build a stronger research community at the SNU campus by bringing them together under a single organization. This new organization will lead our efforts in this fast-developing and vibrant field of quantum science and technologies. Our collaboration with IBM and four other affiliated universities in Korea, Japan, and the U.S. will boost our efforts. We look forward to working with IBM in the coming years." Professor Hong Lim Ryu, President, Seoul National University.
"The University of Chicago was an early pioneer of the field of quantum engineering, and was the first university in the U.S. to award graduate degrees in this emerging area of technology. With other partners in the Chicago region, UChicago has strived to develop a vibrant ecosystem for quantum technologies that is attracting companies and investments from around the world. These developments have underscored the need for a talented workforce. The University of Chicago is excited and proud to partner with IBM, and to build on its long-standing ties to Keio University, Yonsei University, Seoul National University, and The University of Tokyo, to deliver world-class educational programs that will prepare thousands of students for jobs and opportunities in quantum information sciences." Paul Alivisatos, President, The University of Chicago.
"With the recent demonstrations that quantum computers at a scale of more than 100 qubits are capable of being used as scientific tools to deliver insights reaching beyond leading classical approaches, we have an even greater need to educate today's students to join the growing quantum workforce. This effort, which intends to provide Keio University, The University of Tokyo, Yonsei University, Seoul National University, and The University of Chicago with IBM's latest and most advanced quantum education materials, is a crucial step toward exploring useful quantum applications." Darío Gil, Senior Vice President and Director of IBM Research.
The rest is here:
Posted in Quantum Computing
Comments Off on IBM Teams with Top Universities for Quantum Education in Japan, South Korea, and the U.S. – AiThority