The Prometheus League
Breaking News and Updates
Category Archives: Quantum Computing
Quantum and the art of noise – ComputerWeekly.com
Posted: February 15, 2022 at 5:29 am
Noise, huh, what's it good for? Absolutely nothin'. Apart from the geniuses trying to further the advancement of noisy intermediate-scale quantum (NISQ) computing, noise means errors. Lowering the error rate in this emerging area of computing requires significantly more physical qubits for every useful logical qubit.
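To give a feel for the scale of that overhead, here is a rough, illustrative calculation using textbook surface-code assumptions; the error-rate scaling, the threshold ratio and the target logical error rate below are our own illustrative choices, not figures from the article.

```python
# Sketch of physical-to-logical qubit overhead under standard surface-code
# assumptions: logical error rate ~ (p/p_th)^((d+1)/2), and roughly 2*d^2
# physical qubits per logical qubit at code distance d.
def physical_per_logical(p_over_threshold=0.1, target_logical_error=1e-12):
    d = 3
    while p_over_threshold ** ((d + 1) / 2) > target_logical_error:
        d += 2  # surface-code distances are odd
    return d, 2 * d * d

distance, overhead = physical_per_logical()
print(f"distance {distance}: ~{overhead} physical qubits per logical qubit")
# -> on the order of a thousand physical qubits for one good logical qubit
```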
Computer Weekly recently spoke to a number of experts in the field of quantum computing, and a picture is emerging that illustrates the effort going into making something practical out of a technology that few truly understand. It promises so much. Imagine being able to solve problems in a way that is simply impossible with existing high-performance computing. By simulating chemistry at the quantum level, a quantum computer opens up huge opportunities in materials science and offers a way to control chemical reactions in industrial processes, to achieve outcomes such as reducing harmful emissions and waste or improving yield.
One of the new companies trying to make the most of existing tech is Algorithmiq. Its co-founder and CEO, Sabrina Maniscalco, believes that full fault tolerance in quantum computing will require technical advances in manufacturing and may even require fundamental principles to be discovered because, as she says: "The science doesn't exist yet." Her company has just received funding to help it develop algorithms for the pharmaceutical sector that can cope with today's noisy quantum computers.
Many of the quantum computing systems running in labs today need to operate at close to absolute zero (-273 degrees Celsius) to form superconducting qubits. But this level of cooling is not particularly scalable, so one ongoing area of research is how to achieve quantum computing at room temperature. This is the realm of the trapped-ion quantum computer, which requires an entirely different approach. Winfried Hensinger, chief scientist at Universal Quantum, a spin-out from the University of Sussex, believes that trapped-ion quantum computers are more resilient to noise. He says: "The ion is naturally much better isolated from the environment as it just levitates above a chip."
Another startup, Quantum Motion, spun out of UCL, is looking at how to industrialise quantum computing by being able to measure the quantum state of a single electron in a silicon transistor. Significantly, this transistor can be manufactured using the same chip fabrication techniques that are used in the manufacture of microprocessors.
These three examples represent a snapshot of the level of ingenuity that is being poured into quantum computing research. A universal quantum computer may be years off, but something usable and scalable is almost within reach.
Mark Zuckerberg's metaverse will require computing tech no one knows how to build – Protocol
Posted: at 5:29 am
The technology necessary to power the metaverse doesn't exist.
It will not exist next year. It will not exist in 2026. The technology might not exist in 2032, though it's likely we will have a few ideas as to how we might eventually design and manufacture chips that could turn Mark Zuckerberg's fever dreams into reality by then.
Over the past six months, a disconnect has formed between the way corporate America is talking about the dawning concept of the metaverse and its plausibility, based on the nature of the computing power that will be necessary to achieve it. To get there will require immense innovation, similar to the multi-decade effort to shrink personal computers to the size of an iPhone.
Microsoft hyped its $68.7 billion bid for Activision Blizzard last month as a metaverse play. In October, Facebook transformed its entire corporate identity to revolve around the metaverse. Last year, Disney even promised to build its own version of the metaverse to allow storytelling without boundaries.
These ideas hinge on our ability to build the chips, data centers and networking equipment needed to deliver the computing horsepower required. And at the moment, we can't. No one knows how, or where to start, or even whether the devices will still be semiconductors. There aren't enough chips right now to build all the things people want today, let alone what's promised by metaverse preachers.
"The biggest things that we are looking at in supercomputers today still need to be improved in order to be able to deliver [a metaverse] type of experience," Jerry Heinz, the former head of Nvidia's Enterprise Cloud unit, told Protocol.
What we now describe as the metaverse is at least as old as early 20th century speculative fiction.
E.M. Forster's 1909 story The Machine Stops, for example, renders a pre-chip, pre-digital version of the metaverse. Fast forward 70 years, and science-fiction writer William Gibson called this concept "cyberspace" in the 1984 book Neuromancer; Neal Stephenson popularized the word "metaverse" in his 1992 novel Snow Crash; Ernest Cline called it OASIS (an acronym for Ontologically Anthropocentric Sensory Immersive Simulation) in Ready Player One. Few of those stories describe a utopian community.
It's possible that what we now call the metaverse will forever remain the domain of science fiction. But like it or not, Mark Zuckerberg has vaulted the idea into the mainstream.
Zuckerberg's explanation of what the metaverse will ultimately look like is vague, but it includes some of the tropes its boosters roughly agree on: he called it "[an] embodied internet that you're inside of rather than just looking at" that would offer everything you can already do online and some things that don't make sense on the internet today, like dancing.
If the metaverse sounds vague, that's because it is. That description could mutate over time to apply to lots of things that might eventually happen in technology. And arguably, something like the metaverse might already exist in an early form produced by video game companies.
Roblox and Epic Games' Fortnite play host to millions of people, albeit in virtually separated groups of a few hundred, viewing live concerts online. Microsoft Flight Simulator has created a 2.5 petabyte virtual replica of the world that is updated in real time with flight and weather data.
But even today's most complex metaverse-like video games require a tiny fraction of the processing and networking performance we would need to achieve the vision of a persistent world accessed by billions of people, all at once, across multiple devices, screen formats and in virtual or augmented reality.
"For something that is a true mass market, spend-many-hours-a-day doing [kind of activity, we're looking] at generations of compute to leap forward to do that," Creative Strategies CEO Ben Bajarin told Protocol. "What you're going to see over the next few years is an evolution to what you see today, with maybe a bit more emphasis on AR than VR. But it's not going to be this rich, simulated 3D environment."
In the beginning, chips powered mainframes. Mainframes begat servers, home computers and smartphones: smaller, faster and cheaper versions of more or less the same technology that came before.
If the metaverse is next, nobody can describe the system requirements specifically because it will be a distinct departure from prior shifts in computing. But it has become clear that to achieve anything close to the optimistic version, chips of nearly every kind will have to be an order of magnitude more powerful than they are today.
Intel's Raja Koduri took a stab at the question in a recent editorial, writing: "Truly persistent and immersive computing, at scale and accessible by billions of humans in real time, will require even more: a 1,000-times increase in computational efficiency from today's state of the art."
It's difficult to overstate how challenging it will be to reach the goal of a thousandfold increase in computing efficiency. Koduri's estimate might be conservative, and the demands could easily exceed 10 times that amount.
Even assuming those onerous hardware requirements can be met, better communication between all layers of the software stack, from chips at the bottom to end-user applications at the top, will also be required, University of Washington computer science professor Pedro Domingos told Protocol.
"We can get away with [inefficiency] now, but we're not going to get away with it in the metaverse," he said. "The whole [software] stack is going to be more tightly integrated, and this is already happening in areas such as AI and, of course, graphics."
The generational leap toward the metaverse probably won't be quantum computing, or at least not how we think of it today: a largely theoretical platform decades from practical use that requires calculations to be performed at outer-space vacuum temperatures in room-sized computers. But the performance breakthrough promised by something like quantum computing will be necessary.
Google is exploring using algorithms to design more powerful chips, which could help move the needle. Special-purpose processors for AI models exist today, but by creating even more specialized chips, it's possible to eke out more performance, Domingos said. Those designs can circumvent roadblocks to increasing the raw performance of existing silicon, such as making an application-specific integrated circuit that performs physics calculations.
"These companies (the chip-makers, or the providers of the metaverse, or who knows) will make more and more advanced chips for this purpose," Domingos said. "For every level of the stack, from the physics to the software, there are things you can do."
Domingos noted that, in the 1990s, ray tracing in real time would have been considered impossible, yet decades later it's now done in real time with chips that power the PlayStation 5 and Xbox Series X. Google's AI chips, known as tensor processing units, are another example of a specialized type of chip that will only become more abundant in the future, and that will be necessary for the metaverse.
But generational shifts in computing also require equivalent shifts in manufacturing technology. Companies such as TSMC and Intel are already pushing the boundaries of physics with extreme ultraviolet lithography machines to print the most advanced chips.
The latest EUV machines are dedicated to squeezing larger numbers of ever-smaller transistors and features onto each chip, continuing down the path that has been established for decades. But at some point in the future, the chip-making machines will become too costly, or it will be impossible to shrink features any further.
"If you look at where the architecture stands, if you look at where the performance per watt stands, I don't want to say we need a breakthrough, but we're pretty close to needing a breakthrough," Bajarin said. "Sub-one nanometer is roughly four or five years away, and that's not going to solve this problem."
Without a generational leap in computing, a lower-fidelity version of the Zuckerverse is attainable. Assuming users will settle for graphics somewhat better than Second Life was able to achieve a decade ago, it should be possible in the longer run to make something that achieves some of the goals, such as a persistent, internet-connected virtual world. Building that version of the metaverse will require better networking tech, the specialized chips Domingos described and possibly something like artificial intelligence computing in order to handle some of the more complex but mundane workloads.
"There's a lot of scaling up to do, which means that today's data centers are going to look minuscule compared with the ones of tomorrow," Domingos said.
But it's going to take a long time to get there. Zuckerberg's vision of the metaverse could be decades away, and after losing $20 billion on the effort so far, it's not clear Meta will have the cash to turn that vision into reality.
Duke University and IonQ Develop New Quantum Computing Gate – HPCwire
Posted: February 11, 2022 at 6:52 am
DURHAM, N.C. & COLLEGE PARK, Md., Feb. 10, 2022 – Today, the Duke Quantum Center (DQC) at Duke University and IonQ announced the invention of a new quantum computing operation with the potential to accelerate several key quantum computing techniques and contribute to scaling quantum algorithms. The new quantum gate is a novel way to operate on many connected qubits at once and leverages the multi-qubit communication bus available only on IonQ and DQC quantum computers. Full details of the gate technique can be found on the preprint archive arXiv at arXiv:2202.04230.
The new gate family includes the N-qubit Toffoli gate, which flips a select qubit if and only if all the other qubits are in a particular state. Unlike standard two-qubit quantum computing gates, the N-qubit Toffoli gate acts on many qubits at once, leading to more efficient operations. The gate appears naturally in many common quantum algorithms.
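For readers who want to see what such a gate does on a conventional gate-model toolkit, the sketch below builds a 4-control Toffoli using Qiskit's generic multi-controlled X. This is only the standard textbook construction for illustration, not the native single-step N-qubit gate that IonQ and Duke describe; on ordinary hardware the multi-controlled X has to be decomposed into many one- and two-qubit gates, which is exactly the cost a native multi-qubit operation avoids.

```python
from qiskit import QuantumCircuit

n_controls = 4
qc = QuantumCircuit(n_controls + 1)
qc.x(list(range(n_controls)))                # put every control in |1> so the target flips
qc.mcx(list(range(n_controls)), n_controls)  # multi-controlled X: an N-qubit Toffoli
print(qc.count_ops())                        # the single mcx here hides a long two-qubit
                                             # decomposition on standard gate-model hardware
```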
IonQ and Duke's discovery may lead to significant efficiency gains in solving fundamental quantum algorithms, such as Grover's search algorithm, variational quantum eigensolvers (VQEs), and arithmetic operations like addition and multiplication. These use cases are ubiquitous across quantum computing applications, and are core to IonQ's work in quantum chemistry, quantum finance, and quantum machine learning. They are also key components of commonly accepted industry benchmarks for quantum computers, which have already shown IonQ's computers to be industry leaders.
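As a back-of-the-envelope illustration of why Grover's algorithm keeps appearing in such lists (standard Grover scaling, not a figure from the announcement): unstructured search over N items takes roughly N/2 checks classically on average, versus about (π/4)·√N Grover iterations.

```python
import math

N = 2 ** 20                                   # roughly a million database entries
classical_checks = N // 2                     # expected classical lookups
grover_iterations = round(math.pi / 4 * math.sqrt(N))
print(classical_checks, grover_iterations)    # 524288 vs ~804
```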
"This discovery is an example of us continuing to build on the leading technical architecture we've established. It adds to the unique and powerful capabilities we are developing for quantum computing applications," said Peter Chapman, CEO at IonQ.
This research, conducted at Duke by Dr. Or Katz, Prof. Marko Cetina, and IonQ co-founder and Chief Scientist Prof. Christopher Monroe, will be integrated into IonQ's quantum computing operating system for the general public to use. Monroe notes that no other available quantum computing architectures, not even other ion-based quantum computers, are able to utilize this new family of N-qubit gates. This is because IonQ's quantum computers uniquely feature full connectivity and a wide communication bus that allows all qubits to talk to each other simultaneously.
This discovery follows a series of announcements around IonQ's research efforts and preparations for scale. In December, IonQ announced that it plans to use barium ions as qubits in its systems, bringing about a wave of advantages it believes will enable advanced quantum computing architectures. Last year, the team also debuted the industry's first Reconfigurable Multicore Quantum Architecture and Evaporated Glass Trap technology, both of which are expected to contribute to scaling the number of qubits in IonQ's quantum computers.
About IonQ
IonQ, Inc. is a leader in quantum computing, with a proven track record of innovation and deployment. IonQ's next-generation quantum computer is the world's most powerful trapped-ion quantum computer, and IonQ has defined what it believes is the best path forward to scale.
IonQ is the only company with its quantum systems available through the cloud on Amazon Braket, Microsoft Azure, and Google Cloud, as well as through direct API access. IonQ was founded in 2015 by Christopher Monroe and Jungsang Kim based on 25 years of pioneering research. To learn more, visit www.ionq.com.
Source: IonQ
Global $1.6 Billion Quantum Computing Technologies and Markets to 2026 – PRNewswire
Posted: at 6:52 am
DUBLIN, Feb. 10, 2022 /PRNewswire/ -- The "Quantum Computing: Technologies and Global Markets to 2026" report has been added to ResearchAndMarkets.com's offering.
The global quantum computing technologies market should reach $1.6 billion by 2026 from $390.7 million in 2021 at a compound annual growth rate (CAGR) of 33.2% for the forecast period of 2021 to 2026.
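The quoted figures are internally consistent; a quick arithmetic check (illustrative only):

```python
base_2021 = 390.7e6            # market size in 2021, USD
cagr = 0.332                   # 33.2% compound annual growth rate
forecast_2026 = base_2021 * (1 + cagr) ** 5   # five years, 2021 -> 2026
print(f"${forecast_2026 / 1e9:.2f}B")         # ~$1.64B, matching the ~$1.6B forecast
```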
Report Scope
This report provides an overview of the global market for quantum computing and analyzes market trends. Using 2020 as the base year, the report provides estimated market data for the forecast period 2021 through 2026. Revenue forecasts for this period are segmented based on offering, deployment, technology, application, end-user industry and region.
Quantum computing is the gateway to the future. It can revolutionize computation by making certain types of classically stubborn problems solvable. Currently, no quantum computer is mature enough to perform calculations that traditional computers cannot, but great progress has been made in the last few years. Several large and small start-ups are using non-error-corrected quantum computers made up of dozens of qubits, some of which are even publicly accessible via the cloud. Quantum computing helps scientists accelerate their discoveries in related areas, such as machine learning and artificial intelligence.
Early adoption of quantum computers in the banking and financial industries, increased investment in quantum computing technology, and the rise of numerous strategic partnerships and collaborations are the main drivers behind the market growth.
The trend towards strategic approaches such as partnerships and collaborations is expected to continue. As quantum computer vendors move to quantum development, the consumer industries will seek to adopt current and new quantum technologies to gain a competitive advantage. The technological hurdles in the implementation of the quantum systems, as well as the lack of quantum skills, can limit the market growth. However, increasing adoption of quantum technology in healthcare, increasing demand for computing power, and the introduction of cloud-based quantum computing services are expected to open up new market opportunities during the forecast period.
Between 2021 and 2026, many companies with optimization problems may adopt a hybrid approach where some of the problems are handled by classical computing and the rest by quantum computers. The demand for quantum computers is expected to grow from multiple end-user industries, from finance to pharmaceuticals, automobiles to aerospace. Many industries, such as banks, are now using cloud-based quantum services.
There is no doubt that quantum computers will be expensive machines to develop and will be operated by a small number of key players. Companies like Google and IBM plan to double the performance of quantum computers each year. In addition, a small but important cohort of promising start-ups is steadily increasing the number of qubits a computer can process. This creates an immense opportunity for global quantum computing market growth in the coming years.
This report has divided the global quantum computing market based on offering, technology, deployment, application, end-user industry, and region. Based on offering, the market is segmented into systems and services. The services segment held the largest market share, and it is expected to register the highest CAGR during the forecast period. The services segment includes quantum computing as a service (QCaaS) and consulting services.
The report also focuses on the major trends and challenges that affect the market and the competitive landscape. It explains the current market trends and provides detailed profiles of the major players and the strategies they adopt to enhance their market presence. The report estimates the size of the global quantum computing market in 2020 and provides projections of the expected market size through 2026.
Competitive Landscape
Company profiles of the key industry players include
Patent Analysis
For more information about this report visit https://www.researchandmarkets.com/r/o1td8j
Media Contact:
Research and Markets, Laura Wood, Senior Manager
For E.S.T. office hours, call +1-917-300-0470. For U.S./CAN toll-free, call +1-800-526-8630. For GMT office hours, call +353-1-416-8900.
U.S. Fax: 646-607-1907; Fax (outside U.S.): +353-1-481-1716
SOURCE Research and Markets
Germany Expands its Quantum Computing Roadmap with QuaST – Quantum Computing Report
Posted: at 6:52 am
By Carolyn Mathas
Germany aims to become a leader in quantum technologies and is rapidly rolling out its roadmap. The newly launched Quantum-enabling Services and Tools for Industrial Applications (QuaST) consortium will enable rapid quantum adoption without requiring relevant prior knowledge or major investment. End users will simply submit their complex optimization problem, and a solution, including a co-design process, will be automatically generated. Potential applications include logistics optimization, scheduling in production management, health care and drug development, and cases from automotive and cybersecurity.
The project is managed by the Fraunhofer Institute for Cognitive Systems IKS, with the additional involvement of such industry partners as the Fraunhofer Institutes for Applied and Integrated Security (AISEC), for Integrated Circuits (IIS), and for Integrated Systems and Device Technology (IISB), the Leibniz Supercomputing Center, and the Technical University of Munich (TUM), as well as companies DATEV eG, Infineon Technologies AG, IQM and ParityQC. The project sponsor is German Aerospace Center (DLR).
Each member brings guidance, technology, training, or funding to the effort. ParityQC, for example, through its architecture and operating system, offers a new approach to optimization encoding and is developing a solution path that automatically finds ideal algorithmic building blocks to solve a problem and suggests the most efficient way to encode it on a quantum computer.
The QuaST project emerged from the Munich Quantum Valley initiative for the promotion of Quantum Sciences and Quantum Technologies in Bavaria. QuaST will run until the end of 2024. It has so far received 5.5 million euros ($6.3M USD) in funding, and the total volume of the project amounts to 7.7 million euros ($8.8M USD), with funds provided by Germany's Federal Ministry of Economic Affairs and Climate Action. For more information, access the press release here.
Hidden in Plain Sight: The Growing Role of Computation in Science – Research Blog – Duke Today
Posted: at 6:52 am
One of downtown Durham's most memorable landmarks, the Chesterfield building looks like it was aesthetically designed to maintain the country's morale during World War II. On the former cigarette factory's roof rests a brilliant red sign that's visible from miles away.
But don't mistake the building's quaint exterior for antiquity: the Chesterfield Building is home to one of the nation's most powerful quantum computers. Managed by the Duke Quantum Center, the computer is part of Duke's effort to bolster the Scalable Quantum Computing Laboratory (SQLab).
On February 2nd, the lab's director Christopher Monroe joined engineering professor Michael Reiter and English professor Charlotte Sussman to discuss the growing presence of computation at Duke and in research institutions across the country.
Monroe opened by detailing the significance of quantum computing in the modern world. He explained that quantum mechanics is governed by two golden rules: first, that quantum objects are waves and can be in superposition, and second, that the first rule only applies when said objects are not being measured.
The direct impact of quantum mechanics is that electrons can be in two orbits at the same time, which revolutionizes computing. Quantum computers factor numbers exponentially faster than classical computers, converge to more desirable solutions in optimization problems and have been shown to bolster research in fields like biomolecular modeling.
Still, Monroe insists that the future reach of quantum computing is beyond anyone's current understanding. Says Monroe, "quantum computing is an entirely new way of dealing with information, so we don't know all the application areas it will touch." What we do know, he says, is that quantum computers are poised to take over where conventional computers and Moore's Law leave off.
While Monroe discussed computing innovations, Michael Reiter, James B. Duke Professor of Computer Science and Electrical and Computer Engineering, demonstrated the importance of keeping computing systems safe. By pointing to the 2010 Stuxnet virus, a series of cyberattacks against Iranian nuclear centrifuges, and the 2017 Equifax data breach, which stole the records of 148 million people, Dr. Reiter provided evidence to show that modern data systems are vulnerable and attractive targets for cyber warfare.
To show the interdisciplinary responsibilities associated with the nation's cybersecurity needs, Reiter posed two questions to the audience. First, what market interventions are appropriate to achieve more accountability for negligence in cybersecurity defenses? Second, what are the rules of war as it relates to cyber warfare and terrorism?
After Reiter's presentation, Charlotte Sussman transitioned the conversation from the digital world to the maritime world. A professor of English at Duke, Sussman has always been interested in ways to both memorialize and understand the Middle Passage, the route slave-trading ships took across the Atlantic from Africa to the Americas. Through the University's Bass Connections and Data+ research programs, she and a group of students were able to approach this problem through the unlikely lens of data science.
Sussman explained that her Data+ team used large databases to find which areas of the Atlantic Ocean had the highest mortality rates during the slave trade, while the Bass Connections team looked at a single journey to understand one young migrant's path to the bottom of the sea.
Monroe, Reiter, and Sussman all showed that the applications of computing are growing without bound. Both the responsibility to improve computing infrastructures and the ability to leverage computing resources are rapidly expanding to new fields, from medicine and optimization to cybersecurity and history.
With so many exciting paths for growth, one point is clear about the future of computing: it will outperform anyone's wildest expectations. Be prepared to find computing in academia, business, government, and other settings that require advanced information processing.
Many of these areas, like the Chesterfield Building, will probably see the impact of computing before you know it.
Post by Shariar Vaez-Ghaemi, Class of 2025
The worst thought experiments imaginable – The Next Web
Posted: at 6:52 am
While the rest of us are doing good, honest work like podcasting and influencer-ing, there's a group of thinkers out there conducting horrific experiments. They're conjuring pedantic monsters, murdering innumerable cats, and putting humans inside of computers.
Sure, these thought experiments are all in their heads. But that's how it starts. First you don't know whether the cat's dead or alive, and then a demon opens the box and we're all in the Matrix.
Unfortunately, there are only two ways to fight science and philosophy:
Thus, we'll arm ourselves with the collective knowledge of those who've gone before us (ahem, Google Scholar) and critique so snarky it could tank a Netflix Original. And we'll decide once and for all whose big, bright ideas are the worst.
What if I told you there was a box that gave away a free lunch every time it was opened? Some of you are reading this and thinking: "Is Neural suggesting we eat dead cats?"
No. I'm talking about a different box from a different thought experiment. Erwin Schrödinger's cat actually came along some 68 years after James Clerk Maxwell's Demon.
In Maxwell's Demon, we have a box with a gate in the middle separating its contents (a bunch of particles) into two sides. Outside the box, there's what Maxwell calls a "finite being" (who other scientists later inexplicably decided was a demon) who acts as the gatekeeper.
So this demon-being controls which particles go from one side of the box to the other. And, because particle behavior varies at different temperatures, this means the demon's able to exploit physics to harness energy from the universe's tendency towards entropy.
This particular thought experiment is awful. As in: it's awfully good at being awesome!
Maxwell's Demon has managed to stand the test of time and, a century-and-a-half later, it's at the heart of the quantum computing industry. It might be the best scientific thought experiment ever.
The worst is actually Szilard's Engine. But you have to go through Maxwell's Demon to get there. Because in Szilard's box, rather than Maxwell's Demon exploiting the tendencies of the universe, the universe exploits Maxwell's Demon.
Szilard's work imagines a single-molecule engine inside of the box that results in a system where entropy works differently than it does in Maxwell's experiment.
This difference in opinion over the efficacy of entropy caused a kerfuffle.
It all started when scientists came up with the second law of thermodynamics, which basically just says that if you drop an ice cube in a pot of boiling water, it won't make the water hotter.
Well, Maxwell's Demon essentially says "sure, but what if we're talking about really tiny things experiencing somewhat quantum interactions?" This made a lot of sense and has led to numerous breakthroughs in the field of quantum physics.
But then Szilard comes along and says, "Oh yeah, what if the system only had one molecule and, like, the demon was really bored?"
Those probably aren't his exact words. I'm, admittedly, guessing. The point is that Szilard's Engine was tough to swallow back when he wrote it in 1929, and it's only garnered more scrutiny since.
Don't just take my word for it. It's so awful that John D. Norton, a scientist from the department of history and philosophy of science at the University of Pittsburgh, once wrote an entire research paper describing it as the worst thought experiment.
In their criticism, Norton wrote:
"In its capacity to engender mischief and confusion, Szilard's thought experiment is unmatched. It is the worst thought experiment I know in science. Let me count the ways it has misled us."
That's borderline hate-poetry and I love it. The only criticism I have to add is that it's preposterous Szilard didn't reimagine the whole thing as Szilard's Lizard.
The missed opportunity alone gets it our stamp for worst scientific thought experiment.
Honestly, I'd say René Descartes's cogito, ergo sum is the worst thought experiment of all time. But there's not much to discuss.
You ever meet someone who, if they started a sentence with "I think," you'd want to interrupt them to disagree? Imagine that, but at the multiverse level.
Accepting Descartes's premise requires two leaps of faith in just three words, and I'm not prepared to give anyone that much credit.
But, admittedly, that's low-hanging fruit. So let's throw another twist in this article and discuss my favorite paper of all time, because it's also the worst philosophical thought experiment ever.
Nick Bostrom's Simulation Argument lies at the intersection of lazy physics and brilliant philosophy. It's like the Han Solo of thought experiments: you love it because it's so simple, not in spite of it.
It goes like this: "Uh, what if, like, we live inside a computer?"
For the sake of fairness, this is how Bostrom puts it:
This paper argues that at least one of the following propositions is true: (1) the human species is very likely to go extinct before reaching a posthuman stage; (2) any posthuman civilization is extremely unlikely to run a significant number of simulations of their evolutionary history (or variations thereof); (3) we are almost certainly living in a computer simulation.
Think about it for a second.
Done? Good. It doesn't go any deeper. It really is just "what if all of this is just a dream?" But instead of a dream, we're digital entities in a computer simulation.
It's, uh, kinda dumb, right?
But that doesn't mean Bostrom's paper isn't important. I think it's the most influential thought experiment since Descartes's off-putting insistence upon his own existence (self-involved much, D?).
Bostrom's a master philosopher because he understands that the core of explanation lies not in burdening a reader with unessential thought, but in stripping it away. He understands perfection as Antoine de Saint-Exupéry did when he declared it was attained "not when there is nothing more to add, but when there is nothing more to remove."
Bostrom whittled the Simulation Argument down with Occam's Razor until it became a paper capable of pre-empting your biggest "yeah, but what about..." queries before you could think them.
Still, though, you don't have to be the head of Oxford's philosophy department to wonder if life is but a dream.
There's no official name for this one, so we'll just call it "That time the people building the A-bomb had to spend a few hours wondering if they were about to set the atmosphere on fire before deciding the math looked good and everything was going to be fine."
A close runner-up for this prize is "That time the Nazis' most famous quantum physicist was asked if it was possible that Germany's weapons could blow up the Earth by setting all the oceans aflame and he was all like: lol, maybe."
If I can channel our pal John D. Norton from above: these thought experiments are the worst. Allow me to list the ways I hate them.
The Axis and Allies weren't far apart in their respective endeavors to create a weapon of mass destruction during World War II.
Of course we know how things played out: the Germans never got there and the US managed to avoid lighting the planet on fire when it dropped atomic bombs on the civilian populations of Hiroshima and Nagasaki.
In reality, Albert Einstein and company on the Allies' side and Werner Heisenberg and his crew on the Axis side were never concerned with setting off a globally catastrophic chain reaction by detonating an atomic bomb. Both sides had done the math and determined it wasn't really a problem.
Unfortunately, the reason we're aware of this is that both sides were also keen to talk to outsiders. Heisenberg famously joked about it to a German politician; Arthur Compton, who'd worked with Einstein and others on the Manhattan Project, gave a now-infamous interview wherein he made it seem like the possibility of such a tragic event was far greater than it actually was.
This is our selection for the absolute worst thought experiment(s) of all time because it's clear that both the Axis and the Allies were pretty far along in the process of actually building atomic bombs before anyone stopped and thought "hey guys, are we going to blow up the planet if we do this?"
That's Day One stuff right there. That's a question you should have to answer during orientation. You don't start building a literal atom bomb and then hold an all-hands meeting to dig into the whole "killing all life" thing.
Those are all great examples of terrible thought experiments. For scientists and philosophers, anyway. But everyone knows the worst ideas come from journalists.
I think I can come up with a terrible thought experiment that'll trump each of the above. All I have to do is reverse-engineer someone else's work and restate it with added nonsense (hey, it worked for Szilard, right?).
So let's do this. The most important part of any thought experiment is its title. We need to combine the name of an important scientist with a science-y creature if we want to be taken seriously, like Maxwell and his Demon or Schrödinger and his Cat.
And, while substance isn't really what we're going for here, we still need a real problem that remains unsolved, can be addressed with a vapid premise, and is accessible to intellects of any level.
Thus, without further ado, I present: Ogre's Ogre, a thought experiment that uses all the best ideas from the dumb ones mentioned above but contains none of their weaknesses (such as math and the scientific method).
Unlike those theories, Ogre's Ogre doesn't require you to understand or know anything. It's just quietly cajoling you into a natural state of curiosity.
In short, Ogre's Ogre isn't some overeager overachiever like those others. Where Maxwell's Demon demonizes particles by maximizing the tendency toward entropy, and Szilard's Engine engages in entropy in only isolated incidents, Ogre's Ogre egregiously accepts all eventualities.
It goes like this: What if C-A-T really spelled dog?
IBM and SAP Partnership to Help Clients Move Workloads from SAP Solutions to the Cloud – HPCwire
Posted: at 6:52 am
ARMONK, N.Y. and WALLDORF, Germany, Feb. 10, 2022 – IBM today announced it is teaming with SAP to provide technology and consulting expertise to make it easier for clients to embrace a hybrid cloud approach and move mission-critical workloads from SAP solutions to the cloud for regulated and non-regulated industries.
As clients look to adopt hybrid cloud strategies, moving the workloads and applications that are the backbone of their enterprise operation requires a highly secured and reliable cloud environment. With today's launch of the premium supplier option with IBM for RISE with SAP, clients will have the tools to help accelerate the migration of their on-premise SAP software workloads to IBM Cloud, backed by industry-leading security capabilities.[1]
IBM is also unveiling a new program, BREAKTHROUGH with IBM for RISE with SAP, a portfolio of solutions and consulting services that help accelerate and amplify the journey to SAP S/4HANA Cloud. Built on a flexible and scalable platform, the solutions and services use intelligent workflows to streamline operations. They provide an engagement model that helps plan, execute and support holistic business transformation. Clients are also offered the flexibility and choice to migrate SAP solution workloads to the public cloud with the support of deep industry expertise.
Today's announcement of IBM becoming a premium supplier makes IBM the first cloud provider to offer infrastructure, business transformation and application management services as part of RISE with SAP. IBM's premium supplier designation is a continuation of SAP's long-standing efforts to provide choice and optionality to customers, further supporting IBM customers that have a preference for their RISE with SAP package to run on IBM Cloud.
Additionally, migration to SAP S/4HANA on IBM Cloud from on-premise data centers can potentially deliver the following benefits, according to a study by IDC sponsored by IBM [2]:
"We are thrilled to advance our long-standing partnership through RISE with SAP," said John Granger, Senior Vice President, IBM Consulting. "Our shared commitment is to meet our clients, especially those in highly regulated industries, where they are in their digital journey, while giving them choices for migrating or modernizing their mission-critical workloads with a hybrid cloud approach."
"BREAKTHROUGH with IBM is an outstanding complement to RISE with SAP as it lays the foundation for our customers to embark on or advance their business transformation journeys. Further, it reaffirms the value customers recognize from RISE with SAP and the impact and innovation opportunity RISE with SAP offers to organizations that move to the cloud. I have every confidence that the combined expertise and experience SAP and IBM offer will accelerate cloud adoption and business growth for customers across the globe," said Brian Duffy, President of Cloud, SAP.
IBM and SAP have worked with hundreds of clients globally on thousands of individual projects to modernize their systems and business processes based on an open, hybrid cloud approach. Recent examples include Coca-Cola European Partners, Parle Products, Harmont & Blaine, Puravankara Ltd and Virgin Megastore KSA.
Underscoring its commitment to SAP S/4HANA both as an SAP customer and a business partner for 50 years, IBM has also made a significant investment in RISE with SAP to help transform its own infrastructure. IBM is a new premium supplier for the RISE with SAP offering and is using the IBM Hybrid Cloud, including IBM Power-enabled Infrastructure as a Service, to enhance the performance, availability and security of deployments of private editions of SAP S/4HANA Cloud.
To learn more about the BREAKTHROUGH with IBM program for the RISE with SAP offering, please visit: https://www.ibm.com/services/sap/rise-with-sap.
Notes
[1] Based on IBM Hyper Protect Crypto Service, the only service in the industry built on FIPS 140-2 Level 4-certified hardware. FIPS 140-2 Level 4 provides the highest level of security defined in this standard. At this security level, the physical security mechanisms provide a comprehensive envelope of protection around the cryptographic module with the intent of detecting and responding to all unauthorized attempts at physical access.
[2] IDC White Paper, sponsored by IBM, "Business Benefits Possible by Choosing the Right Cloud Provider to Run SAP Workloads," Doc #US47166220, December 2020.
About IBM
IBM is a leading global hybrid cloud and AI, and business services provider. We help clients in more than 175 countries capitalize on insights from their data, streamline business processes, reduce costs and gain the competitive edge in their industries. Nearly 3,000 government and corporate entities in critical infrastructure areas such as financial services, telecommunications and healthcare rely on IBM's hybrid cloud platform and Red Hat OpenShift to affect their digital transformations quickly, efficiently and securely. IBM's breakthrough innovations in AI, quantum computing, industry-specific cloud solutions and business services deliver open and flexible options to our clients. All of this is backed by IBM's legendary commitment to trust, transparency, responsibility, inclusivity and service.
Visit www.ibm.com for more information.
About SAP
SAP's strategy is to help every business run as an intelligent enterprise. As a market leader in enterprise application software, we help companies of all sizes and in all industries run at their best: SAP customers generate 87% of total global commerce. Our machine learning, Internet of Things (IoT), and advanced analytics technologies help turn customers' businesses into intelligent enterprises. SAP helps give people and organizations deep business insight and fosters collaboration that helps them stay ahead of their competition. We simplify technology for companies so they can consume our software the way they want, without disruption. Our end-to-end suite of applications and services enables business and public customers across 25 industries globally to operate profitably, adapt continuously, and make a difference. With a global network of customers, partners, employees, and thought leaders, SAP helps the world run better and improve people's lives. For more information, visit www.sap.com.
Source: IBM, SAP
Quantum computing venture backed by Jeff Bezos will leap into public trading with $1.2B valuation – GeekWire
Posted: February 9, 2022 at 1:38 am
A team member at D-Wave Systems, based in Burnaby, B.C., works on the dilution refrigerator system that cools the processors in the company's quantum computer. (D-Wave Systems Photo / Larry Goldstein)
Burnaby, B.C.-based D-Wave Systems, the quantum computing company that counts Jeff Bezos among its investors and NASA among its customers, has struck a deal to go public with a $1.2 billion valuation.
The deal involves a combination with DPCM Capital, a publicly traded special-purpose acquisition company, or SPAC. It's expected to bring in $300 million in gross proceeds from DPCM's trust account, plus $40 million in gross proceeds from investors participating in a PIPE arrangement. (PIPE stands for private investment in public equity.)
Quantum computing takes advantage of phenomena at the quantum level, processing qubits that can represent multiple values simultaneously as opposed to the one-or-zero paradigm of classical computing. The approach is theoretically capable of solving some types of problems much faster than classical computers.
Founded in 1999, D-Wave has focused on a type of technology called quantum annealing, which uses quantum computing principles and hardware to tackle tasks relating to network optimization and probabilistic sampling.
Physicists have debated whether D-Wave's Advantage system should be considered an honest-to-goodness quantum computer, but the company says that question has been settled by research that, among other things, turned up signatures of quantum entanglement. D-Wave is included among the quantum resources offered by Amazon and Microsoft, and it also has its own cloud-based platform, known as Leap.
The SPAC deal has already been cleared by the boards of directors for D-Wave and DPCM Capital. If the transaction proceeds as expected, with approval by DPCM's stockholders, it should close by midyear. The result would be a combined company called D-Wave Quantum Inc. that would remain headquartered in Burnaby, a suburb of Vancouver, B.C., and trade on the New York Stock Exchange under the QBTS stock symbol.
"Today marks an inflection point signaling that quantum computing has moved beyond just theory and government-funded research to deliver commercial quantum solutions for business," D-Wave CEO Alan Baratz said in a news release.
Among the investors involved in the PIPE transaction are PSP Investments, NEC Corp., Goldman Sachs, Yorkville Advisors and Aegis Group Partners. Other longtime D-Wave investors include Bezos Expeditions as well as In-Q-Tel, a venture capital fund backed by the CIA and other intelligence agencies.
In what was described as an innovative move, the SPAC deal sets aside a bonus pool of 5 million shares for DPCM's non-redeeming public stockholders.
D-Wave says it will use the fresh funding to accelerate its delivery of in-production quantum applications for its customers, and to build on a foundation of more than 200 U.S. patents. The company is aiming to widen its offerings beyond quantum annealing by developing more versatile gate-model quantum computers.
Emil Michael, DPCM Capital's CEO, said the total addressable market for quantum computing services could amount to more than $1 billion in the near term, and rise to $150 billion as applications mature.
"While quantum computing is complex, its value and benefits are quite simple: finding solutions to problems that couldn't be previously solved, or solving problems faster with more optimal results," Michael said. "D-Wave is at the forefront of developing this market, already delivering the significant benefits of quantum computing to major companies across the globe."
Breaking the noise barrier: The startups developing quantum computers – ComputerWeekly.com
Posted: at 1:38 am
Today is the era of noisy intermediate-scale quantum (NISQ) computers. These can solve difficult problems, but they are said to be noisy, which means many physical qubits are required for every logical qubit that can be applied to problem-solving. This makes it hard for the industry to demonstrate a truly practical advantage that quantum computers have over classical high-performance computing (HPC) architectures.
Algorithmiq recently received $4m in seed funding to enable it to deliver what it claims are truly noise-resilient quantum algorithms. The company is targeting one specific application area, drug discovery, and hopes to work with major pharmaceutical firms to develop molecular simulations that are accurate at the quantum level.
Algorithmiq says it has a unique strategy of using standard computers to "un-noise" quantum computers. The algorithms it is developing offer researchers the ability to boost the speed of chemical simulations on quantum computers by a factor of 100 compared with current industry benchmarks.
Sabrina Maniscalco, co-founder and CEO at Algorithmiq and a professor of quantum information, computing and logic at the University of Helsinki, has been studying noise in quantum computers for 20 years. "My main field of research is about extracting noise," she said. "Quantum information is very fragile."
In Maniscalco's experience, full fault tolerance requires technological advances in manufacturing and may even require fundamental principles to be discovered, because the science does not exist yet. But she said: "We can work with noisy devices. There is a lot we can do, but you have to get your hands dirty."
Algorithmiq's approach is about making a mindset shift. Rather than waiting for the emergence of universal fault-tolerant quantum computing, Maniscalco said: "We look for what types of algorithms we can develop with noisy [quantum] devices."
To work with noisy devices, algorithms need to take account of quantum physics in order to model and understand what is going on in the quantum computer system.
The target application area for Algorithmiq is drug discovery. Quantum computing offers researchers the possibility to simulate molecules accurately at the quantum level, something that is not possible in classical computing, as each qubit can map onto an electron.
According to a quantum computing background paper by Microsoft, if an electron had 40 possible states, modelling every state would require 2^40 configurations, as each position can either have or not have an electron. To store the quantum state of the electrons in a conventional computer memory would require more than 130GB of memory. As the number of states increases, the memory required grows exponentially.
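A back-of-the-envelope check of that figure. The one-bit-per-configuration assumption below is ours, made to match the quoted number; a full quantum state would need a complex amplitude per configuration and far more memory.

```python
configurations = 2 ** 40                 # every occupation pattern of 40 states
gigabytes = configurations / 8 / 1e9     # assume one bit per configuration
print(f"{gigabytes:.0f} GB")             # ~137 GB, i.e. "more than 130GB"
```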
This is one of the limitations of using a classical computing architecture for quantum chemistry simulations. According to Scientific American, quantum computers are now at the point where they can begin to model the energetics and properties of small molecules, such as lithium hydride.
In November 2021, a consortium led by Universal Quantum, a University of Sussex spin-out company, was awarded a £7.5m grant from Innovate UK's Industrial Strategy Challenge Fund to build a scalable quantum computer. Its goal is to achieve a million-qubit system.
Many of today's quantum computing systems rely on supercooling to just a few degrees above absolute zero to achieve superconducting qubits. Cooling components to just above absolute zero is required to build the superconducting qubits that are encoded in a circuit. The circuit only exhibits quantum effects when supercooled; otherwise it behaves like a normal electrical circuit.
Significantly, Universal Quantum's technology, based on the principle of a trapped-ion quantum computer, can operate at much more normal temperatures. Explaining why its technology does not require supercooling, co-founder and chief scientist Winfried Hensinger said: "It's the nature of the hardware platform. The qubit is the atom that exhibits quantum effects. The ions levitate above the surface of the chip, so there is no requirement on cooling the chip in order to make a better qubit."
Just as a microprocessor may run at 150W and operate at room temperature, the quantum computer that Universal Quantum is building should not require anything more than is needed in an existing server room for cooling.
The design is also more resilient to noise, which introduces errors in quantum computing. Hensinger added: "In a superconducting qubit, the circuit is on the chip, so it is much harder to isolate from the environment and so is prone to much more noise. The ion is naturally much better isolated from the environment as it just levitates above a chip."
The key reason why Hensinger and the Universal Quantum team believe they are better placed to further the scalability of quantum computers is down to the cooling power of a fridge. According to Hensinger, the cooling needed for superconducting qubits is very difficult to scale to large numbers of qubits.
Another startup, Quantum Motion, a spin-out from University College London (UCL), is looking at a way to achieve quantum computing that can be industrialised. The company is leading a three-year project, Altnaharra, funded by UK Research and Innovation's National Quantum Technologies Programme (NQTP), which combines expertise in qubits based on superconducting circuits, trapped ions and silicon spins.
The company says it is developing fault-tolerant quantum computing architectures. John Morton, co-founder of Quantum Motion and professor of nanoelectronics at UCL, said: "To build a universal quantum computer, you need to scale to millions of qubits."
But because companies like IBM are currently running only 127-qubit systems, the idea of universal quantum computing comprising millions of physical qubits, built using existing processes, is seen by some as a pipe dream. Instead, said Morton: "We are looking at how to take a silicon chip and make it exhibit quantum properties."
Last April, Quantum Motion and researchers at UCL were able to isolate and measure the quantum state of a single electron (the qubit) in a silicon transistor manufactured using a CMOS (complementary metal-oxide-semiconductor) technology similar to that used to make chips in computer processors.
Rather than being at a high-tech campus or university, the company has just opened its new laboratory just off London's Caledonian Road, surrounded by a housing estate, a community park and a gym. But in this lab, it is able to lower the temperature of components to a shade above absolute zero.
James Palles-Dimmock, COO of Quantum Motion, said: "We're working with technology that is colder than deep space and pushing the boundaries of our knowledge to turn quantum theory into reality. Our approach is to take the building blocks of computing, the silicon chip, and demonstrate that it is the most stable, reliable and scalable way of mass manufacturing quantum silicon chips."
The discussions Computer Weekly had with these startups show just how much effort is going into giving quantum computing a clear advantage over HPC. What is clear from these conversations is that these companies are all very different. Unlike classical computing, which settled on the stored-program architecture described by mathematician John von Neumann in the 1940s, there is unlikely to be one de facto standard architecture for quantum computing.