The Prometheus League
Breaking News and Updates
- Abolition Of Work
- Ai
- Alt-right
- Alternative Medicine
- Antifa
- Artificial General Intelligence
- Artificial Intelligence
- Artificial Super Intelligence
- Ascension
- Astronomy
- Atheism
- Atheist
- Atlas Shrugged
- Automation
- Ayn Rand
- Bahamas
- Bankruptcy
- Basic Income Guarantee
- Big Tech
- Bitcoin
- Black Lives Matter
- Blackjack
- Boca Chica Texas
- Brexit
- Caribbean
- Casino
- Casino Affiliate
- Cbd Oil
- Censorship
- Cf
- Chess Engines
- Childfree
- Cloning
- Cloud Computing
- Conscious Evolution
- Corona Virus
- Cosmic Heaven
- Covid-19
- Cryonics
- Cryptocurrency
- Cyberpunk
- Darwinism
- Democrat
- Designer Babies
- DNA
- Donald Trump
- Eczema
- Elon Musk
- Entheogens
- Ethical Egoism
- Eugenic Concepts
- Eugenics
- Euthanasia
- Evolution
- Extropian
- Extropianism
- Extropy
- Fake News
- Federalism
- Federalist
- Fifth Amendment
- Financial Independence
- First Amendment
- Fiscal Freedom
- Food Supplements
- Fourth Amendment
- Free Speech
- Freedom
- Freedom of Speech
- Futurism
- Futurist
- Gambling
- Gene Medicine
- Genetic Engineering
- Genome
- Germ Warfare
- Golden Rule
- Government Oppression
- Hedonism
- High Seas
- History
- Hubble Telescope
- Human Genetic Engineering
- Human Genetics
- Human Immortality
- Human Longevity
- Illuminati
- Immortality
- Immortality Medicine
- Intentional Communities
- Jacinda Ardern
- Jitsi
- Jordan Peterson
- Las Vegas
- Liberal
- Libertarian
- Libertarianism
- Liberty
- Life Extension
- Macau
- Marie Byrd Land
- Mars
- Mars Colonization
- Mars Colony
- Memetics
- Micronations
- Mind Uploading
- Minerva Reefs
- Modern Satanism
- Moon Colonization
- Nanotech
- National Vanguard
- NATO
- Neo-eugenics
- Neurohacking
- Neurotechnology
- New Utopia
- New Zealand
- Nihilism
- Nootropics
- NSA
- Oceania
- Offshore
- Olympics
- Online Casino
- Online Gambling
- Pantheism
- Personal Empowerment
- Poker
- Political Correctness
- Politically Incorrect
- Polygamy
- Populism
- Post Human
- Post Humanism
- Posthuman
- Posthumanism
- Private Islands
- Progress
- Proud Boys
- Psoriasis
- Psychedelics
- Putin
- Quantum Computing
- Quantum Physics
- Rationalism
- Republican
- Resource Based Economy
- Robotics
- Rockall
- Ron Paul
- Roulette
- Russia
- Sealand
- Seasteading
- Second Amendment
- Seychelles
- Singularitarianism
- Singularity
- Socio-economic Collapse
- Space Exploration
- Space Station
- Space Travel
- Spacex
- Sports Betting
- Sportsbook
- Superintelligence
- Survivalism
- Talmud
- Technology
- Teilhard de Chardin
- Terraforming Mars
- The Singularity
- Tms
- Tor Browser
- Trance
- Transhuman
- Transhuman News
- Transhumanism
- Transhumanist
- Transtopian
- Transtopianism
- Ukraine
- Uncategorized
- Vaping
- Victimless Crimes
- Virtual Reality
- Wage Slavery
- War On Drugs
- Waveland
- Ww3
- Yahoo
- Zeitgeist Movement
- Prometheism
- Forbidden Fruit
- The Evolutionary Perspective
Category Archives: Quantum Computing
40 years ago the first IBM PC was presented: this is what it was like and what it could do – Tech Gaming Report
Posted: August 14, 2021 at 1:16 am
It is incredible to think that 40 years have passed since the birth of the first personal computer, launched by IBM and cloned by competitors within its first months of life.
To be exact, 40 years and a day have passed since the IBM PC was launched on August 12, 1981 at the Waldorf Astoria in New York, at the time one of the most renowned hotels in the Big Apple: the 5150, the first personal computer from the great Armonk-based company.
The IBM personal computer was a real revolution at the time, although its cost was of course very high for 1981. In its first months almost 200,000 units were sold, demonstrating how much it was appreciated by the general public.
The novelty was so appreciated that the first clones of the 5150, the so-called IBM-compatible PCs, appeared almost immediately. The 5150 was sold until 1987; it had an x86 microprocessor, the first computer of its kind to have one, and was eventually replaced by the IBM Personal Computer XT.
As a demonstration of how rapidly technology moves, 40 years after the launch of this personal computer, IBM plans to release its first quantum computer with more than 1,000 qubits in the near future.
The design of the Personal Computer has, from 1981 onwards, shaped the architecture of personal computers, although IBM stopped producing such models in 2004.
International Business Machines Corporation (commonly known as IBM and nicknamed Big Blue) is an American company, the oldest and among the largest in the world in the information technology sector. It produces and markets hardware, computer software, middleware, and IT services, offering infrastructure, hosting services, cloud computing, artificial intelligence, quantum computing, and IT and strategic consulting.
IBM is also an important scientific research organization, holding the record for the most US patents issued to a company for 27 consecutive years (as of 2020). It is also active in the field of quantum computing, having produced the first cloud-accessible quantum computer, the IBM Q Experience, and the first truly marketable quantum computer, the IBM Q System One.
IBM announced in October 2020 that it will be split into two separate public companies. The future focus will be on high-margin cloud computing and artificial intelligence, built on the foundation of the Red Hat acquisition in 2019.
The new company, provisionally known as NewCo and yet to be formally named, will be created from the Managed Infrastructure Services unit of Global Technology Services. It will have 90,000 employees and 4,600 clients in 115 countries, with an order book of $60 billion. The IBM spin-off will be larger than any of the company's previous divestitures and is expected to be well received by investors.
Posted in Quantum Computing
Comments Off on 40 years ago the first IBM PC was presented: this is what it was like and what it could do – Tech Gaming Report
Quantum Computing Market 2021 with Top Countries Data Analysis by Industry Trends, Size, Share and Company Overview – Digital Journal
Posted: at 1:16 am
Global Quantum Computing Market Size, Status And Forecast 2021-2025
MarketInsightsReports, a leading global market research firm, is pleased to announce its new report on Quantum Computing market, forecast for 2021-2025, covering all aspects of the market and providing up-to-date data on current trends.
The report covers comprehensive data on emerging trends, market drivers, growth opportunities, and restraints that can change the market dynamics of the industry. It provides an in-depth analysis of the market segments which include products, applications, and competitor analysis. The report also includes a detailed study of key companies to provide insights into business strategies adopted by various players in order to sustain competition in this highly competitive environment.
With our Quantum Computing market research reports, we offer a comprehensive overview of this sector, covering sales analysis, the impact of domestic and global market players, value chain optimization, trade regulations, recent developments, opportunities analysis, strategic market growth analysis, product launches, regional marketplace analysis, and technological innovations.
Top Companies in the Global Quantum Computing Market: The Quantum Computing market was dominated by International Business Machines (US), D-Wave Systems (Canada), Microsoft (US), Amazon (US), Rigetti Computing (US), Google (US), Intel (US), Honeywell International (US), Quantum Circuits (US), and QC Ware (US).
Recent Developments
In January 2020, IBM partnered with Daimler AG, the parent company of Mercedes-Benz, to enhance the capacity and increase the charging speed of batteries of electric vehicles. These companies used a quantum computer to model the dipole moment of three lithium-containing molecules that paves the way for the development of the next-generation lithium sulfur (Li-S) batteries that will be more powerful, long-lasting, and cost-effective than lithium-ion batteries. The Quantum Computing market is expected to grow from USD 472 million in 2021 to USD 1,765 million by 2026, at a CAGR of 30.2%.
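As a quick sanity check on the forecast just quoted, the stated growth rate follows from the standard compound annual growth rate formula (the five-year horizon from 2021 to 2026 is assumed from the dates given in the report):

```latex
\mathrm{CAGR} = \left(\frac{V_{2026}}{V_{2021}}\right)^{1/5} - 1
             = \left(\frac{1765}{472}\right)^{1/5} - 1 \approx 0.302 = 30.2\%
```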
In November 2019, IBM partnered with the Unitary Fund to provide grants and priority access to certain IBM Q systems. Similar to the quantum computing mission of IBM, the Unitary Fund aims to create a quantum technology industry that benefits most of the people.
For a comprehensive understanding of market dynamics, the global Quantum Computing market is analyzed across key geographies, namely the United States, China, Europe, Japan, South-east Asia, India and others. Each of these regions is analyzed on the basis of market findings across major countries in these regions for a macro-level understanding of the market.
Key Takeaways from Quantum Computing Report
Evaluate the supply-demand gaps, import-export statistics and regulatory landscape for more than 20 top countries globally for the Quantum Computing market.
Browse the report description and TOC: https://www.marketinsightsreports.com/reports/05202915819/global-quantum-computing-market-analysis-by-solution-type-hardware-software-full-stack-application-optimization-simulation-sampling-machine-learning-end-user-by-region-by-country-2020-edition-market-insight-competition-and-forecast-2020-2025?mode=54
-Key Strategic Developments: The study also includes the key strategic developments of the market, comprising R&D, new product launch, M&A, agreements, collaborations, partnerships, joint ventures, and regional growth of the leading competitors operating in the market on a global and regional scale.
-Key Market Features: The report evaluates key market features, including revenue, price, capacity, capacity utilization rate, gross, production, production rate, consumption, import/export, supply/demand, cost, market share, CAGR, and gross margin. In addition, the study offers a comprehensive study of the key market dynamics and their latest trends, along with pertinent market segments and sub-segments.
-Analytical Tools: The Global Quantum Computing Market report includes the accurately studied and assessed data of the key industry players and their scope in the market by means of a number of analytical tools. Analytical tools such as Porter's five forces analysis, SWOT analysis, feasibility study, and investment return analysis have been used to analyze the growth of the key players operating in the market.
Customization of the Report: This report can be customized as per your needs for additional data up to 3 companies or countries or 40 analyst hours.
MarketInsightsReports provides syndicated market research on industry verticals including Healthcare, Information and Communication Technology (ICT), Technology and Media, Chemicals, Materials, Energy, Heavy Industry, etc. MarketInsightsReports provides global and regional market intelligence coverage, a 360-degree market view which includes statistical forecasts, competitive landscape, detailed segmentation, key trends, and strategic recommendations.
How we have factored the effect of Covid-19 in our report:
All the reports that we list have been tracking the impact of COVID-19. Both the upstream and downstream of the entire supply chain have been accounted for while doing this. Also, where possible, we will provide an additional COVID-19 update supplement/report in Q3; please check with the sales team.
IrfanTamboli (Head of Sales) Market Insights Reports
Phone: +1 704 266 3234 | +91-750-707-8687
[emailprotected] | [emailprotected]
This Press Release has been written with the intention of providing accurate market information which will enable our readers to make informed strategic investment decisions. If you notice any problem with this content, please feel free to reach us on [emailprotected]
Posted in Quantum Computing
Comments Off on Quantum Computing Market 2021 with Top Countries Data Analysis by Industry Trends, Size, Share and Company Overview – Digital Journal
IBM’s Quantum Computing Compromise: a Road to Scale? – IEEE Spectrum
Posted: August 6, 2021 at 10:34 pm
Looking to such specialized nervous systems as a model for artificial intelligence may prove just as valuable, if not more so, than studying the human brain. Consider the brains of those ants in your pantry. Each has some 250,000 neurons. Larger insects have closer to 1 million. In my research at Sandia National Laboratories in Albuquerque, I study the brains of one of these larger insects, the dragonfly. I and my colleagues at Sandia, a national-security laboratory, hope to take advantage of these insects' specializations to design computing systems optimized for tasks like intercepting an incoming missile or following an odor plume. By harnessing the speed, simplicity, and efficiency of the dragonfly nervous system, we aim to design computers that perform these functions faster and at a fraction of the power that conventional systems consume.
Looking to a dragonfly as a harbinger of future computer systems may seem counterintuitive. The developments in artificial intelligence and machine learning that make news are typically algorithms that mimic human intelligence or even surpass people's abilities. Neural networks can already perform as well as, if not better than, people at some specific tasks, such as detecting cancer in medical scans. And the potential of these neural networks stretches far beyond visual processing. The computer program AlphaZero, trained by self-play, is the best Go player in the world. Its sibling AI, AlphaStar, ranks among the best Starcraft II players.
Such feats, however, come at a cost. Developing these sophisticated systems requires massive amounts of processing power, generally available only to select institutions with the fastest supercomputers and the resources to support them. And the energy cost is off-putting. Recent estimates suggest that the carbon emissions resulting from developing and training a natural-language processing algorithm are greater than those produced by four cars over their lifetimes.
It takes the dragonfly only about 50 milliseconds to begin to respond to a prey's maneuver. If we assume 10 ms for cells in the eye to detect and transmit information about the prey, and another 5 ms for muscles to start producing force, this leaves only 35 ms for the neural circuitry to make its calculations. Given that it typically takes a single neuron at least 10 ms to integrate inputs, the underlying neural network can be at most three layers deep.
But does an artificial neural network really need to be large and complex to be useful? I believe it doesn't. To reap the benefits of neural-inspired computers in the near term, we must strike a balance between simplicity and sophistication.
Which brings me back to the dragonfly, an animal with a brain that may provide precisely the right balance for certain applications.
If you have ever encountered a dragonfly, you already know how fast these beautiful creatures can zoom, and you've seen their incredible agility in the air. Maybe less obvious from casual observation is their excellent hunting ability: Dragonflies successfully capture up to 95 percent of the prey they pursue, eating hundreds of mosquitoes in a day.
The physical prowess of the dragonfly has certainly not gone unnoticed. For decades, U.S. agencies have experimented with using dragonfly-inspired designs for surveillance drones. Now it is time to turn our attention to the brain that controls this tiny hunting machine.
While dragonflies may not be able to play strategic games like Go, a dragonfly does demonstrate a form of strategy in the way it aims ahead of its prey's current location to intercept its dinner. This takes calculations performed extremely fast: it typically takes a dragonfly just 50 milliseconds to start turning in response to a prey's maneuver. It does this while tracking the angle between its head and its body, so that it knows which wings to flap faster to turn ahead of the prey. And it also tracks its own movements, because as the dragonfly turns, the prey will also appear to move.
The model dragonfly reorients in response to the prey's turning. The smaller black circle is the dragonfly's head, held at its initial position. The solid black line indicates the direction of the dragonfly's flight; the dotted blue lines are the plane of the model dragonfly's eye. The red star is the prey's position relative to the dragonfly, with the dotted red line indicating the dragonfly's line of sight.
So the dragonfly's brain is performing a remarkable feat, given that the time needed for a single neuron to add up all its inputs (called its membrane time constant) exceeds 10 milliseconds. If you factor in time for the eye to process visual information and for the muscles to produce the force needed to move, there's really only time for three, maybe four, layers of neurons, in sequence, to add up their inputs and pass on information.
Could I build a neural network that works like the dragonfly interception system? I also wondered about uses for such a neural-inspired interception system. Being at Sandia, I immediately considered defense applications, such as missile defense, imagining missiles of the future with onboard systems designed to rapidly calculate interception trajectories without affecting a missile's weight or power consumption. But there are civilian applications as well.
For example, the algorithms that control self-driving cars might be made more efficient, no longer requiring a trunkful of computing equipment. If a dragonfly-inspired system can perform the calculations to plot an interception trajectory, perhaps autonomous drones could use it to avoid collisions. And if a computer could be made the same size as a dragonfly brain (about 6 cubic millimeters), perhaps insect repellent and mosquito netting will one day become a thing of the past, replaced by tiny insect-zapping drones!
To begin to answer these questions, I created a simple neural network to stand in for the dragonfly's nervous system and used it to calculate the turns that a dragonfly makes to capture prey. My three-layer neural network exists as a software simulation. Initially, I worked in Matlab simply because that was the coding environment I was already using. I have since ported the model to Python.
Because dragonflies have to see their prey to capture it, I started by simulating a simplified version of the dragonfly's eyes, capturing the minimum detail required for tracking prey. Although dragonflies have two eyes, it's generally accepted that they do not use stereoscopic depth perception to estimate distance to their prey. In my model, I did not model both eyes. Nor did I try to match the resolution of a dragonfly eye. Instead, the first layer of the neural network includes 441 neurons that represent input from the eyes, each describing a specific region of the visual field; these regions are tiled to form a 21-by-21-neuron array that covers the dragonfly's field of view. As the dragonfly turns, the location of the prey's image in the dragonfly's field of view changes. The dragonfly calculates turns required to align the prey's image with one (or a few, if the prey is large enough) of these "eye" neurons. A second set of 441 neurons, also in the first layer of the network, tells the dragonfly which eye neurons should be aligned with the prey's image, that is, where the prey should be within its field of view.
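To make the structure of that input layer concrete, here is a minimal Python sketch, not the author's Matlab/Python code; the field-of-view span, the one-hot activity pattern, and the helper names are assumptions introduced purely for illustration. It maps a prey's bearing onto one cell of a 21-by-21 "eye neuron" grid:

```python
import numpy as np

GRID = 21        # 21 x 21 = 441 "eye" neurons, as described in the article
FOV_DEG = 120.0  # assumed span of the modelled field of view per axis

def prey_to_eye_neuron(azimuth_deg, elevation_deg):
    """Return the (row, col) of the eye neuron whose visual region contains the prey.

    Angles are measured relative to the dragonfly's line of flight; anything
    outside the assumed field of view returns None.
    """
    half = FOV_DEG / 2.0
    if abs(azimuth_deg) > half or abs(elevation_deg) > half:
        return None  # prey is outside the modelled field of view
    # Map [-half, +half] degrees onto grid indices 0..GRID-1.
    col = int((azimuth_deg + half) / FOV_DEG * GRID)
    row = int((elevation_deg + half) / FOV_DEG * GRID)
    return min(row, GRID - 1), min(col, GRID - 1)

def eye_activity(azimuth_deg, elevation_deg):
    """One-hot 21x21 activity pattern standing in for the first network layer."""
    activity = np.zeros((GRID, GRID))
    cell = prey_to_eye_neuron(azimuth_deg, elevation_deg)
    if cell is not None:
        activity[cell] = 1.0
    return activity

# Example: prey slightly up and to the right of the flight direction.
print(prey_to_eye_neuron(10.0, 5.0))
```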
The model dragonfly engages its prey.
Processing happens between the first and third layers of my artificial neural network: these are the calculations that take input describing the movement of an object across the field of vision and turn it into instructions about which direction the dragonfly needs to turn. In this second layer, I used an array of 194,481 (21^4) neurons, likely much larger than the number of neurons used by a dragonfly for this task. I precalculated the weights of the connections between all the neurons in the network. While these weights could be learned with enough time, there is an advantage to "learning" through evolution and preprogrammed neural network architectures. Once it comes out of its nymph stage as a winged adult (technically referred to as a teneral), the dragonfly does not have a parent to feed it or show it how to hunt. The dragonfly is in a vulnerable state and getting used to a new body; it would be disadvantageous to have to figure out a hunting strategy at the same time. I set the weights of the network to allow the model dragonfly to calculate the correct turns to intercept its prey from incoming visual information. What turns are those? Well, if a dragonfly wants to catch a mosquito that's crossing its path, it can't just aim at the mosquito. To borrow from what hockey player Wayne Gretzky once said about pucks, the dragonfly has to aim for where the mosquito is going to be. You might think that following Gretzky's advice would require a complex algorithm, but in fact the strategy is quite simple: all the dragonfly needs to do is to maintain a constant angle between its line of sight with its lunch and a fixed reference direction.
Readers who have any experience piloting boats will understand why that is. They know to get worried when the angle between the line of sight to another boat and a reference direction (for example due north) remains constant, because they are on a collision course. Mariners have long avoided steering such a course, known as parallel navigation, to avoid collisions.
Translated to dragonflies, which want to collide with their prey, the prescription is simple: keep the line of sight to your prey constant relative to some external reference. However, this task is not necessarily trivial for a dragonfly as it swoops and turns, collecting its meals. The dragonfly does not have an internal gyroscope (that we know of) that will maintain a constant orientation and provide a reference regardless of how the dragonfly turns. Nor does it have a magnetic compass that will always point north. In my simplified simulation of dragonfly hunting, the dragonfly turns to align the prey's image with a specific location on its eye, but it needs to calculate what that location should be.
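As a rough sketch of the constant-bearing idea just described, the following Python fragment is a deliberately simplified 2-D version; it is not the author's model, and the steering gain and world-frame reference are assumptions. It measures the line-of-sight angle against a fixed external reference and commands a turn only when that angle starts to drift:

```python
import math

def line_of_sight_angle(dragonfly_xy, prey_xy):
    """Angle (radians) of the line of sight to the prey, measured against a
    fixed external reference direction (here, the world x-axis)."""
    dx = prey_xy[0] - dragonfly_xy[0]
    dy = prey_xy[1] - dragonfly_xy[1]
    return math.atan2(dy, dx)

def turn_command(prev_los, new_los, gain=1.0):
    """Steering correction that cancels any drift in the line-of-sight angle.

    If the angle is not drifting (drift ~ 0), the pursuer is already on a
    parallel-navigation course and will intercept the prey without turning.
    """
    drift = math.atan2(math.sin(new_los - prev_los), math.cos(new_los - prev_los))
    return gain * drift  # radians of turn to apply this time step
```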
The third and final layer of my simulated neural network is the motor-command layer. The outputs of the neurons in this layer are high-level instructions for the dragonfly's muscles, telling the dragonfly in which direction to turn. The dragonfly also uses the output of this layer to predict the effect of its own maneuvers on the location of the prey's image in its field of view and updates that projected location accordingly. This updating allows the dragonfly to hold the line of sight to its prey steady, relative to the external world, as it approaches.
It is possible that biological dragonflies have evolved additional tools to help with the calculations needed for this prediction. For example, dragonflies have specialized sensors that measure body rotations during flight as well as head rotations relative to the body; if these sensors are fast enough, the dragonfly could calculate the effect of its movements on the prey's image directly from the sensor outputs or use one method to cross-check the other. I did not consider this possibility in my simulation.
To test this three-layer neural network, I simulated a dragonfly and its prey, moving at the same speed through three-dimensional space. As they do so my modeled neural-network brain "sees" the prey, calculates where to point to keep the image of the prey at a constant angle, and sends the appropriate instructions to the muscles. I was able to show that this simple model of a dragonfly's brain can indeed successfully intercept other bugs, even prey traveling along curved or semi-random trajectories. The simulated dragonfly does not quite achieve the success rate of the biological dragonfly, but it also does not have all the advantages (for example, impressive flying speed) for which dragonflies are known.
More work is needed to determine whether this neural network is really incorporating all the secrets of the dragonfly's brain. Researchers at the Howard Hughes Medical Institute's Janelia Research Campus, in Virginia, have developed tiny backpacks for dragonflies that can measure electrical signals from a dragonfly's nervous system while it is in flight and transmit these data for analysis. The backpacks are small enough not to distract the dragonfly from the hunt. Similarly, neuroscientists can also record signals from individual neurons in the dragonfly's brain while the insect is held motionless but made to think it's moving by presenting it with the appropriate visual cues, creating a dragonfly-scale virtual reality.
Data from these systems allows neuroscientists to validate dragonfly-brain models by comparing their activity with activity patterns of biological neurons in an active dragonfly. While we cannot yet directly measure individual connections between neurons in the dragonfly brain, I and my collaborators will be able to infer whether the dragonfly's nervous system is making calculations similar to those predicted by my artificial neural network. That will help determine whether connections in the dragonfly brain resemble my precalculated weights in the neural network. We will inevitably find ways in which our model differs from the actual dragonfly brain. Perhaps these differences will provide clues to the shortcuts that the dragonfly brain takes to speed up its calculations.
This backpack that captures signals from electrodes inserted in a dragonfly's brain was created by Anthony Leonardo, a group leader at Janelia Research Campus. Photo: Anthony Leonardo/Janelia Research Campus/HHMI
Dragonflies could also teach us how to implement "attention" on a computer. You likely know what it feels like when your brain is at full attention, completely in the zone, focused on one task to the point that other distractions seem to fade away. A dragonfly can likewise focus its attention. Its nervous system turns up the volume on responses to particular, presumably selected, targets, even when other potential prey are visible in the same field of view. It makes sense that once a dragonfly has decided to pursue a particular prey, it should change targets only if it has failed to capture its first choice. (In other words, using parallel navigation to catch a meal is not useful if you are easily distracted.)
Even if we end up discovering that the dragonfly mechanisms for directing attention are less sophisticated than those people use to focus in the middle of a crowded coffee shop, it's possible that a simpler but lower-power mechanism will prove advantageous for next-generation algorithms and computer systems by offering efficient ways to discard irrelevant inputs.
The advantages of studying the dragonfly brain do not end with new algorithms; they also can affect systems design. Dragonfly eyes are fast, operating at the equivalent of 200 frames per second: that's several times the speed of human vision. But their spatial resolution is relatively poor, perhaps just a hundredth of that of the human eye. Understanding how the dragonfly hunts so effectively, despite its limited sensing abilities, can suggest ways of designing more efficient systems. Returning to the missile-defense problem, the dragonfly example suggests that antimissile systems with fast optical sensing could require less spatial resolution to hit a target.
The dragonfly isn't the only insect that could inform neural-inspired computer design today. Monarch butterflies migrate incredibly long distances, using some innate instinct to begin their journeys at the appropriate time of year and to head in the right direction. We know that monarchs rely on the position of the sun, but navigating by the sun requires keeping track of the time of day. If you are a butterfly heading south, you would want the sun on your left in the morning but on your right in the afternoon. So, to set its course, the butterfly brain must therefore read its own circadian rhythm and combine that information with what it is observing.
Other insects, like the Sahara desert ant, must forage for relatively long distances. Once a source of sustenance is found, this ant does not simply retrace its steps back to the nest, likely a circuitous path. Instead it calculates a direct route back. Because the location of an ant's food source changes from day to day, it must be able to remember the path it took on its foraging journey, combining visual information with some internal measure of distance traveled, and then calculate its return route from those memories.
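A rough Python sketch of the path-integration idea attributed to the desert ant: accumulate each outbound step as a vector, and the home vector is simply the running sum reversed. This is a simplification of what the article describes (real ants are thought to combine a celestial compass with some internal measure of distance), and the function and variable names are illustrative only:

```python
import math

def integrate_path(steps):
    """steps: iterable of (heading_radians, distance) pairs recorded while foraging.
    Returns the heading and distance of the straight-line route back to the nest."""
    x = y = 0.0
    for heading, distance in steps:
        x += distance * math.cos(heading)
        y += distance * math.sin(heading)
    # The homeward vector is the accumulated displacement, reversed.
    home_heading = math.atan2(-y, -x)
    home_distance = math.hypot(x, y)
    return home_heading, home_distance

# Example: a wandering outbound path still yields one direct route home.
outbound = [(0.0, 3.0), (math.pi / 2, 4.0), (math.pi, 1.0)]
print(integrate_path(outbound))
```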
While nobody knows what neural circuits in the desert ant perform this task, researchers at the Janelia Research Campus have identified neural circuits that allow the fruit fly to self-orient using visual landmarks. The desert ant and monarch butterfly likely use similar mechanisms. Such neural circuits might one day prove useful in, say, low-power drones.
And what if the efficiency of insect-inspired computation is such that millions of instances of these specialized components can be run in parallel to support more powerful data processing or machine learning? Could the next AlphaZero incorporate millions of antlike foraging architectures to refine its game playing? Perhaps insects will inspire a new generation of computers that look very different from what we have today. A small army of dragonfly-interception-like algorithms could be used to control moving pieces of an amusement park ride, ensuring that individual cars do not collide (much like pilots steering their boats) even in the midst of a complicated but thrilling dance.
No one knows what the next generation of computers will look like, whether they will be part-cyborg companions or centralized resources much like Isaac Asimov's Multivac. Likewise, no one can tell what the best path to developing these platforms will entail. While researchers developed early neural networks drawing inspiration from the human brain, today's artificial neural networks often rely on decidedly unbrainlike calculations. Studying the calculations of individual neurons in biological neural circuits (currently only directly possible in nonhuman systems) may have more to teach us. Insects, apparently simple but often astonishing in what they can do, have much to contribute to the development of next-generation computers, especially as neuroscience research continues to drive toward a deeper understanding of how biological neural circuits work.
So next time you see an insect doing something clever, imagine the impact on your everyday life if you could have the brilliant efficiency of a small army of tiny dragonfly, butterfly, or ant brains at your disposal. Maybe computers of the future will give new meaning to the term "hive mind," with swarms of highly specialized but extremely efficient minuscule processors, able to be reconfigured and deployed depending on the task at hand. With the advances being made in neuroscience today, this seeming fantasy may be closer to reality than you think.
This article appears in the August 2021 print issue as "Lessons From a Dragonfly's Brain."
Follow this link:
IBM's Quantum Computing Compromise: a Road to Scale? - IEEE Spectrum
Posted in Quantum Computing
Comments Off on IBM’s Quantum Computing Compromise: a Road to Scale? – IEEE Spectrum
Google says it has created a time crystal in a quantum computer, and it’s weirder than you can imagine – ZDNet
Posted: at 10:34 pm
In a new research paper, Google scientists claim to have used a quantum processor for a useful scientific application: to observe a genuine time crystal.
If 'time crystals' sound pretty sci-fi, that's because they are. Time crystals are no less than a new "phase of matter", as researchers put it, which has been theorized for some years now as a new state that could potentially join the ranks of solids, liquids, gases, crystals and so on. The paper remains in pre-print and still requires peer review.
Time crystals are also hard to find. But Google's scientists now rather excitingly say that their results establish a "scalable approach" to study time crystals on current quantum processors.
Understanding why time crystals are interesting requires a little bit of background in physics, particularly knowledge of the second law of thermodynamics, which states that systems naturally tend to settle in a state known as "maximum entropy".
To take an example: if you pour some milk into a coffee cup, the milk will eventually dissolve throughout the coffee, instead of sitting on the top, enabling the overall system to come to an equilibrium. This is because there are many more ways for the milk to randomly spread throughout the coffee than there are for it to sit, in a more orderly fashion, at the top of the cup.
This irresistible drive towards thermal equilibrium, as described in the second law of thermodynamics, is reflective of the fact that all things tend to move towards less useful, random states. As time goes on, systems inevitably degenerate into chaos and disorder that is, entropy.
Time crystals, on the other hand, fail to settle in thermal equilibrium. Instead of slowly degenerating towards randomness, they get stuck in two high-energy configurations that they switch between, and this back-and-forth process can go on forever.
To explain this better, Curt von Keyserlingk, lecturer at the school of physics and astronomy at the University of Birmingham, who did not participate in Google's latest experiment, pulls out some slides from an introductory talk to prospective undergraduate students. "They usually pretend to understand, so it might be useful," von Keyserlingk warns ZDNet.
It starts with a thought experiment: take a box in a closed system that is isolated from the rest of the universe, load it with a couple of dozen coins, and shake it a million times. As the coins flip, tumble and bounce off each other, they randomly move positions and become increasingly chaotic. Upon opening the box, the expectation is that you will be faced with roughly half the coins on their heads side, and half on their tails.
It doesn't matter if the experiment started with more coins on their tails or more coins on their heads: the system forgets what the initial configuration was, and it becomes increasingly random and chaotic as it is shaken.
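The coin-box thought experiment is easy to reproduce numerically. The short Python sketch below (with arbitrarily chosen parameters) shows that however biased the starting configuration, repeated random "shakes" drive the box toward the half-heads, half-tails equilibrium and erase any memory of how it began:

```python
import random

def shake_box(n_coins=24, n_shakes=100_000, start_heads=24, flips_per_shake=3):
    """Start with `start_heads` coins showing heads, then randomly flip a few
    coins per shake. Returns the fraction of heads at the end."""
    coins = [1] * start_heads + [0] * (n_coins - start_heads)
    for _ in range(n_shakes):
        for i in random.sample(range(n_coins), flips_per_shake):
            coins[i] ^= 1  # flip this coin
    return sum(coins) / n_coins

# Whether we start all-heads or all-tails, the result hovers around 0.5.
print(shake_box(start_heads=24), shake_box(start_heads=0))
```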
This closed system, when it is translated into the quantum domain, is the perfect setting to try and find time crystals, and the only one known to date. "The only stable time crystals that we've envisioned in closed systems are quantum mechanical," says von Keyserlingk.
Enter Google's quantum processor, Sycamore, which is well known for having achieved quantum supremacy and is now looking for some kind of useful application for quantum computing.
A quantum processor, by definition, is a perfect tool to replicate a quantum mechanical system. In this scenario, Google's team represented the coins in the box with qubits spinning upwards and downwards in a closed system; and instead of shaking the box, they applied a set of specific quantum operations that can change the state of the qubits, which they repeated many times.
This is where time crystals defy all expectations. Looking at the system after a certain number of operations, or shakes, reveals a configuration of qubits that is not random, but instead looks rather similar to the original set up.
"The first ingredient that makes up a time crystal is that it remembers what it was doing initially. It doesn't forget," says von Keyserlingk. "The coins-in-a-box system forgets, but a time crystal system doesn't."
It doesn't stop here. Shake the system an even number of times, and you'll get a similar configuration to the original one; but shake it an odd number of times, and you'll get another set-up, in which tails have been flipped to heads and vice versa.
And no matter how many operations are carried out on the system, it will always flip-flop, going regularly back-and-forth between those two states.
Scientists call this a break in the symmetry of time, which is why time crystals are so named. This is because the operation carried out to stimulate the system is always the same, and yet the response only comes every other shake.
"In the Google experiment, they do a set of operations on this chain of spins, then they do exactly the same thing again, and again. They do the same thing at the hundredth step that they do at the millionth step, if they go that far," says von Keyserlingk.
"So they subject the system to a set of conditions that have symmetry, and yet the system responds in a manner that breaks that symmetry. It's the same every two periods instead of every period. That's what makes it literally a time crystal."
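To see concretely what "responding every other period" means, here is a toy Python illustration. It is emphatically not a time crystal: it models a single, non-interacting spin driven by a near-perfect flip, with the pulse error chosen arbitrarily. The drive applied at each step is identical, yet the sign of the spin's orientation repeats only every two steps; the slow drift caused by the imperfect pulse is exactly the kind of error that the many-body interactions and disorder in Google's experiment are needed to suppress.

```python
import math

def driven_spin(n_steps=20, pulse_error=0.03):
    """Expectation value of the spin's z-component after each imperfect flip.

    Each step rotates the spin by pi * (1 - pulse_error) about the x-axis,
    so <Z> after n steps is cos(n * pi * (1 - pulse_error)).
    """
    return [math.cos(n * math.pi * (1 - pulse_error)) for n in range(n_steps)]

for n, z in enumerate(driven_spin()):
    print(f"step {n:2d}  <Z> = {z:+.2f}")  # sign alternates: a period-two response
```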
The behavior of time crystals, from a scientific perspective, is fascinating: contrary to every other known system, they don't tend towards disorder and chaos. Unlike the coins in the box, which get all muddled up and settle at roughly half heads and half tails, they buck the entropy law by getting stuck in a special, time-crystal state.
In other words, they defy the second law of thermodynamics, which essentially defines the direction that all natural events take. Ponder that for a moment.
Such special systems are not easy to observe. Time crystals have been a topic of interest since 2012, when Nobel Prize-winning MIT professor Frank Wilczek started thinking about them; and the theory has been refuted, debated and contradicted many times since then.
Several attempts have been made to create and observe time crystals to date, with varying degrees of success. Only last month, a team from Delft University of Technology in the Netherlands published a pre-print showing that they had built a time crystal in a diamond processor, although a smaller system than the one claimed by Google.
The search giant's researchers used a chip with 20 qubits to serve as the time crystal: many more, according to von Keyserlingk, than have been achieved until now, and more than could be simulated with a classical computer.
Using a laptop, it is fairly easy to simulate around 10 qubits, explains von Keyserlingk. Add more than that, and the limits of current hardware are soon reached: each extra qubit doubles the amount of memory required.
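That exponential cost is easy to quantify: storing a full n-qubit state vector takes 2^n complex amplitudes, so each additional qubit doubles the memory needed. A quick Python estimate follows (assuming 16 bytes per double-precision complex amplitude; simulating the dynamics and spectra these experiments probe costs far more than merely storing one state, which is why even modest qubit counts are demanding):

```python
def state_vector_bytes(n_qubits, bytes_per_amplitude=16):
    """Memory to store one n-qubit state vector: 2**n complex amplitudes."""
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (10, 20, 30, 40):
    print(f"{n} qubits: {state_vector_bytes(n):,} bytes")
```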
The scientist stops short of stating that this new experiment is a show of quantum supremacy. "They're not quite far enough for me to be able to say it's impossible to do with a classical computer, because there might be a clever way of putting it on a classical computer that I haven't thought of," says von Keyserlingk.
"But I think this is by far the most convincing experimental demonstration of a time crystal to date."
The scope and control of Google's experiment means that it is possible to look at time crystals for longer, do detailed sets of measurements, vary the size of the system, and so on. In other words, it is a useful demonstration that could genuinely advance science and as such, it could be key in showing the central role that quantum simulators will play in enabling discoveries in physics.
There are, of course, some caveats. Like all quantum computers, Google's processor still suffers from decoherence, which can cause a decay in the qubits' quantum states, and means that time crystals' oscillations inevitably die out as the environment interferes with the system.
The pre-print, however, argues that as the processor becomes more effectively isolated, this issue could be mitigated.
One thing is certain: time crystals won't be sitting in our living rooms any time soon, because scientists are yet to find a definitive useful application for them. It is unlikely, therefore, that Google's experiment was about exploring the business value of time crystals; rather, it shows what could potentially be another early application of quantum computing, and yet another demonstration of the company's technological prowess in a hotly contested new area of development.
Posted in Quantum Computing
Comments Off on Google says it has created a time crystal in a quantum computer, and it’s weirder than you can imagine – ZDNet
From theory to reality: Google claims to have created physics-defying ‘time crystal’ inside its quantum computer – Silicon Canals
Posted: at 10:34 pm
Image credits: Google Quantum AI
As the quantum computing race heats up, many companies across countries are spending billions on different qubit technologies to stabilise and commercialise the technology. While it is too early to declare a winner in quantum computing, Google's quantum computing lab may have created something truly remarkable.
In the latest development, researchers at Google, in collaboration with physicists at Princeton, Stanford, and other universities, have created the world's first time crystal inside a quantum computer.
The time crystal developed by Google could be a major scientific accomplishment for fundamental physics and quantum physics. Dreamt up by the Nobel Prize-winning physicist Frank Wilczek in 2012, the notion of time crystals is now moving from theory to reality.
In a recently published study, "Observation of Time-Crystalline Eigenstate Order on a Quantum Processor", the researchers describe the time crystal as a new phase of matter that appears to defy the second law of thermodynamics.
Well, a time crystal sounds like a complicated component of a time machine, but it is not. So, what exactly are time crystals? According to the researchers, a time crystal is a new phase of matter that alternates between two shapes, never losing any energy in the process.
To put it simply, regular crystals are arrangements of molecules or atoms that form a regular, repeated pattern in space. A time crystal, on the other hand, is an arrangement of molecules or atoms that forms a regular, repeated pattern in time. Meaning, they'll sit in one pattern for a while, then flip to another, and repeat back and forth.
Explaining time crystals in layman's terms to Silicon Canals, Loïc Henriet, head of Applications and Quantum Software at Pasqal, says: "Some phases of matter are known to spontaneously break symmetries. A crystal breaks spatial translation: one finds atoms only at well-defined positions. Magnets break discrete spin symmetry: the magnetisation points in a well-defined direction. However, no known physical system was known to break one of the simplest symmetries: translation in time. Google's DTC result is the most convincing experimental evidence of the existence of non-equilibrium states of matter that break time-translation symmetry."
Further, time crystals can withstand energy processes without increasing entropy, transforming endlessly within an isolated system without expending any fuel or energy.
"Our work employs a time-reversal protocol that discriminates external decoherence from intrinsic thermalisation, and leverages quantum typicality to circumvent the exponential cost of densely sampling the eigenspectrum," say the researchers. "In addition, we locate the phase transition out of the DTC with experimental finite-size analysis. These results establish a scalable approach to study non-equilibrium phases of matter on current quantum processors."
For the demonstration, the researchers used a chip with 20 qubits to serve as the time crystal. It's worth mentioning that the researchers performed the experiments on Google's Sycamore device, which solved a task in 200 seconds that would take a conventional computer 10,000 years.
According to the researchers, their experiment offers preliminary evidence that their system could create time crystals. This discovery could have profound implications for the world of quantum computing if it's proven.
Henriet shares: "This result is most interesting from a fundamental physics standpoint, as an identification of a novel quantum phase of matter. In itself, it will not directly impact our day-to-day life, but it illustrates the richness of many-body quantum physics out of equilibrium. It also proves that quantum processors are now powerful enough to discover new interesting regimes for quantum matter with disruptive properties."
"The consequence is amazing: you evade the second law of thermodynamics," says Roderich Moessner, director of the Max Planck Institute for the Physics of Complex Systems in Dresden, Germany, and a co-author on the Google paper.
"This is just this completely new and exciting space that we're working in now," says Vedika Khemani, a condensed matter physicist now at Stanford who co-discovered the novel phase while she was a graduate student and co-authored the new paper with the Google team.
In 2012, Frank Wilczek came up with the idea of time crystals while teaching a class about ordinary (spatial) crystals.
"If you think about crystals in space, it's very natural also to think about the classification of crystalline behaviour in time," he told Quanta.
Google's quantum computer has certainly achieved what many thought was impossible. Having said that, the experiment is at a preliminary stage and requires a lot of work. Moreover, the pre-print version of the research awaits validation from the scientific community and has to be reviewed by peers as well.
"There are good reasons to think that none of those experiments completely succeeded, and a quantum computer like [Google's] would be particularly well placed to do much better than those earlier experiments," University of Oxford physicist John Chalker, who wasn't involved in the research, told Quanta.
Posted in Quantum Computing
Comments Off on From theory to reality: Google claims to have created physics-defying ‘time crystal’ inside its quantum computer – Silicon Canals
Will there be enough quantum engineers in APAC? – Tech Wire Asia
Posted: at 10:34 pm
The National University of Singapore and AWS are collaborating to boost the development of quantum communication and computing technologies. (Photo by ROSLAN RAHMAN / AFP)
With quantum computing gaining traction in the Asia Pacific, quantum engineers are now being highly sought after by companies looking to leverage the technology. From Japan launching its most powerful quantum computer last month to China developing its quantum computers, quantum engineers are a key ingredient in the quantum computing workforce.
Compared to other analytical tools, quantum computing has the potential to solve computational problems that are beyond the reach of normal computers. Harnessing the laws of quantum mechanics, developing quantum algorithms, and designing useful quantum applications require new skills and approaches.
The quantum computing market is expected to grow to US$1.76 billion by 2026, with early adoption in the banking and finance sector expected to fuel the growth of the market globally. Quantum-Computing-as-a-Service (QCaaS) is now also being offered by some tech giants to companies looking to experiment with the technology.
As such, most use cases for quantum computing are still limited but growing globally. To ensure the development of the technology keeps going, big tech vendors are working with universities to develop next-generation quantum engineers with the hope of having sufficient talent available once the technology becomes mainstream.
Japan's most powerful quantum computer, built with IBM, is used specifically for research and development, while China's own quantum computer can solve certain problems faster than some of the world's most powerful supercomputers.
In Southeast Asia, the skills shortage gap is still a big concern. While the region has one of the fastest tech adoptions in the world, the skills shortage is still hindering most companies from going all out in their digital transformation.
An Amazon Web Services (AWS) report released earlier this year stated that between 666 million and 819 million workers in the Asia Pacific will use digital skills by 2025, up from just 149 million today, with the average employee requiring seven new digital skills to meet the growing demands in the industry.
Despite that, quantum computing is gaining traction in the region. Higher learning institutions in Malaysia, Singapore, Vietnam, and Indonesia are offering more courses on the subject and are hoping to develop more quantum engineers in the near future.
The National University of Singapore and AWS are collaborating to boost the development of quantum communication and computing technologies, as well as explore potential applications of quantum capabilities.
As part of the Quantum Engineering Program (QEP), AWS will support QEP in the development of quantum computing research and projects and connect to the National Quantum-Safe Network for quantum communications. Both areas include the identification of use cases and the development of applications to support the future commercialization of Singapore-designed quantum computing and communication technologies.
(Photo by Roslan RAHMAN / AFP)
QEP has supported eight major research projects to further the development of quantum technologies. They include exploring more powerful hardware and software solutions for quantum computers for commercial tasks like optimizing delivery routes for goods, simulating chemicals to help design drugs, or making manufacturing more efficient.
According to Professor Chen Tsuhan, NUS Deputy President (Research & Technology), Singapore's journey to becoming a knowledge-based economy requires the right mix of world-class talent, cutting-edge infrastructure, and a well-established knowledge transfer ecosystem.
A cornerstone of this vision is the QEP hosted at NUS, which brings together expertise in quantum science and engineering and aims to translate radical innovations into commercialisable solutions. This collaboration between QEP and AWS is a crucial enabler for the nation's full digital transformation and opens the door to a quantum-ready future.
Amazon Braket, a fully managed quantum computing service, provides access to three types of quantum hardware, including quantum annealers and gate-based systems built on superconducting qubits and on trapped ions, as well as tools to run hybrid quantum and classical algorithms.
Its cross-platform developer tools provide a consistent experience, reduce the need for multiple development environments, and make it easy to explore which quantum computing technology is the best fit for an application.
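For readers curious what working with Braket looks like in practice, the following is a minimal sketch in the style of the Braket Python SDK's introductory examples. It assumes the amazon-braket-sdk package is installed; running against managed quantum hardware rather than the bundled local simulator would require an AWS account and a device ARN.

```python
from braket.circuits import Circuit
from braket.devices import LocalSimulator

# Build a two-qubit Bell-pair circuit: Hadamard on qubit 0, then CNOT 0 -> 1.
bell = Circuit().h(0).cnot(0, 1)

# Run it on the local simulator that ships with the SDK (no AWS account needed).
device = LocalSimulator()
result = device.run(bell, shots=1000).result()
print(result.measurement_counts)  # expect roughly half '00' and half '11'
```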
With NUS looking to develop more use cases and skilled professionals in quantum engineering and other tech-related fields, Singapore can become a hub for quantum computing in the region in the years to come.
Read more:
Will there be enough quantum engineers in APAC? - Tech Wire Asia
Posted in Quantum Computing
Comments Off on Will there be enough quantum engineers in APAC? – Tech Wire Asia
AI, quantum computing and other technologies poised to transform healthcare – Healthcare Finance News
Posted: at 10:34 pm
Photo: Al David Sacks/Getty Images
The COVID-19 pandemic has created numerous challenges in healthcare, but challenges can sometimes breed innovation. Technological innovation in particular is poised to change the way care is delivered, driving efficiency in the process. Efficiency will be key as hospitals and health systems look to recover from the initial, devastating wave of the pandemic.
Ryan Hodgin, chief technology officer for IBM Global Healthcare, and Kate Huey, partner at IBM Healthcare, will speak about some of these technological innovations in their digital HIMSS21 session, "Innovation Driven Resiliency: Redefining What's Possible."
The technology in question can encompass telehealth, artificial intelligence, automation, blockchain, chatbots, apps and other elements that have become mainstays of healthcare during the course of the pandemic.
In a way, science fiction is becoming science fact: technologies that were once in the experimental phase are now coming to life and driving innovation, particularly quantum computing. Quantum computing has the potential to transform healthcare by the sheer force of its impressive computational power.
One of the big factors accelerating technological innovation is the healthcare workforce, which has been placed under enormous stress over the past 18 months, with many doctors and clinicians reporting burnout or feelings of being overwhelmed. These technologies promise to reduce the burden being felt by providers.
Importantly, they also promise to more actively engage healthcare consumers, who increasingly expect healthcare to be as user friendly and experience driven as their favorite apps or online shopping portals.
Hodgin and Huey will speak more on the topic when their digital session debuts on Tuesday, August 10, from 11:45 a.m. to 12:15 p.m.
Posted in Quantum Computing
Comments Off on AI, quantum computing and other technologies poised to transform healthcare – Healthcare Finance News
T-Hub, HCL to collaborate on Quantum Computing and Deep Tech. – The Hindu
Posted: at 10:34 pm
Startup ecosystem enabler T-Hub and HCL Technologies have announced a collaboration to explore emerging technologies like Quantum Computing and DeepTech.
As part of the collaboration, T-Hub will connect HCL's Open Innovation Program, eSTiP, with select startups. "This partnership will enable HCL to leverage T-Hub's innovation expertise and ecosystem of start-ups, corporates and investors to accelerate its open innovation initiatives," T-Hub said in a release.
Additionally, HCL will look to curate solutions of the startups for its clients and for focused programme statements, while gaining access to T-Hub's events and demo days.
T-Hub CEO Ravi Narayan said: "With this partnership, we are focusing on aiding HCL in its vision of strengthening the approach of creating value for its customers and partners through some disruptive startups, while also providing our startups with growth opportunities."
"Our partnership with T-Hub cements our ecosystem innovation journey with additional investments in Quantum Computing experiments as the technology continues to evolve," said Kalyan Kumar, Chief Technology Officer and Head-Ecosystems of HCL Technologies.
"As Quantum Computing continues to mature and become commercially viable, we hope our continued engagement will bring insights into relevant startups, academia, business collaborators and other innovation ecosystem players," he added.
Read the original post:
T-Hub, HCL to collaborate on Quantum Computing and Deep Tech. - The Hindu
Posted in Quantum Computing
Comments Off on T-Hub, HCL to collaborate on Quantum Computing and Deep Tech. – The Hindu
Why it’s time to wake up to the quantum threat – Finextra
Posted: at 10:33 pm
Quantum computing is proving to be enormously exciting for financial institutions. Already, Goldman Sachs and Deutsche Börse are exploring quantum algorithms to calculate risk model simulations 1,000 times faster than currently possible, while BBVA is looking to quantum to optimise investment portfolio management.
But a more sinister aspect to the technology also lurks just around the corner. Because of their computing power, quantum machines will be able to smash through the mathematical algorithms underpinning all modern encryption - posing an unparalleled cybersecurity risk.
It would take a traditional computer years to break the public-key encryption relied on today by just about every financial services company, but a fully-scalable quantum computer could achieve the same in a matter of hours.
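For context (the article does not spell this out), the asymmetry comes from Shor's algorithm: factoring the large integers behind RSA takes the best known classical method, the general number field sieve, sub-exponential time, whereas a large, error-corrected quantum computer running Shor's algorithm needs only polynomial time, roughly:

```latex
T_{\text{GNFS}}(N) \approx \exp\!\left( \left(\tfrac{64}{9}\right)^{1/3} (\ln N)^{1/3} (\ln\ln N)^{2/3} \right),
\qquad
T_{\text{Shor}}(N) = O\!\left((\log N)^{3}\right)
```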
According to roadmaps laid out by major players in the field, we will have a quantum computer capable of doing this within the next decade.
Mapping the vulnerabilities
Banks and financial institutions use a range of cryptographic algorithms to ensure the security of transactions, including symmetric key cryptography (e.g. 3DES) and public key cryptography. Although public key cryptography is most exposed to the quantum threat, some types of symmetric key cryptography are also vulnerable to attack.
Core to these operations are hardware security modules (HSMs). These form a key part of the physical infrastructure that stores and generates secure keys using cryptographic asymmetric algorithms to authenticate and validate transaction information.
A chain is only as strong as its weakest link, so unless up-to-date, quantum-secure HSMs are in place, there's a risk of quantum attackers exploiting a single vulnerability to expose all data within the payments ecosystem.
What complicates the issue is that quantum decryption can be applied retrospectively.
Bad actors could begin collecting encrypted data from institutions today, with the intent to harvest now, decrypt later. Financial services companies could unknowingly be victim to an attack today, and only suffer the consequences in the future when quantum computers become available.
Thankfully, some institutions are already paying attention, with early movers like Scotiabank, JP Morgan and Visa all taking the threat seriously.
Beginning the fight back
The world began to take note of the quantum threat when, in 2016, the US National Security Agency issued an official warning to industry. Shortly thereafter, the US National Institute of Standards and Technology (NIST) launched a post-quantum cryptography standardisation project to lay out the path to a quantum-secure future.
NIST is running the process as a competition. The project is now in its final stages, with seven finalist algorithms left after 80 submissions from six continents. The final algorithms will be chosen in 2021, with draft standards to be published thereafter.
It's anticipated that the US government will require contractors to incorporate the new NIST standards in order to conduct business with its agencies. As critical infrastructure, financial institutions are also likely to find that quantum-secure cryptography soon becomes a technical necessity.
The path to quantum security
The migration to new cryptography standards will be a massive undertaking - one of the biggest cybersecurity shifts in decades.
The transition will be complicated for banks, too. Each institution will be starting from its own unique position, with its own legacy systems and infrastructure, and each will be vulnerable to the quantum threat in a different way.
Financial institutions can save time in the long run by taking steps to plan their own transition before NIST's new standards are even announced.
The first step is to conduct an audit, pinpointing each and every place where encryption is being used within the organisation. This will help to identify weak spots, find areas in need of rationalisation, and so on.
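As a concrete starting point for such an audit, a script along the following lines can inventory the public-key algorithms and key sizes presented by an organisation's TLS endpoints, which is exactly the quantum-vulnerable material that will eventually need replacing. This is a hedged sketch: the hostnames are placeholders, the cryptography package is assumed to be installed, and a real audit would also cover internal key stores, HSMs, code-signing keys and data at rest.

```python
import socket
import ssl

from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import ec, rsa

ENDPOINTS = ["internal-banking.example.com", "payments.example.com"]  # placeholders

def inspect_endpoint(host, port=443):
    """Report the public-key algorithm and size of a TLS endpoint's certificate."""
    context = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=5) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            der_cert = tls.getpeercert(binary_form=True)
    cert = x509.load_der_x509_certificate(der_cert)
    key = cert.public_key()
    if isinstance(key, rsa.RSAPublicKey):
        return f"{host}: RSA-{key.key_size} (quantum-vulnerable)"
    if isinstance(key, ec.EllipticCurvePublicKey):
        return f"{host}: ECC {key.curve.name} (quantum-vulnerable)"
    return f"{host}: {type(key).__name__} (review manually)"

if __name__ == "__main__":
    for host in ENDPOINTS:
        print(inspect_endpoint(host))
```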
NIST agrees that companies should start preparing for the transition today: 'It is critical to begin planning for the replacement of hardware, software, and services that use public-key algorithms now, so the information is protected from future attacks'.
Looking ahead
Institutions have invested huge amounts of time and effort building customer trust in digital banking, and cryptography was the main mathematical tool that allowed this to happen.
Now that quantum computers threaten to break it, it's time for the sector to fight back.
The security of all sensitive data, past and present, relies on it.
View original post here:
Why it's time to wake up to the quantum threat - Finextra
Posted in Quantum Computing
Comments Off on Why it’s time to wake up to the quantum threat – Finextra
Google announces that it may have created a "time crystal" that breaks physics – Texasnewstoday.com
Posted: at 10:33 pm
Researchers in Google's quantum computing division have just published a study on the preprint server ArXiv claiming to have used the company's Sycamore quantum computer to create a time crystal that seems to go against physics, and it is honestly hard to say yet how big a deal this will turn out to be.
As Quanta Magazine explains, time crystals are both stable and ever-changing: their definable states repeat at predictable intervals without melting into a completely random state.
Without getting bogged down in the up-spins and down-spins of qubits (the quantum particles that can represent both 1s and 0s and are the basis of quantum computing), what Google claims to have done is essentially the equivalent of a checkers board, with all the red pieces on one side and all the black pieces on the other, figuratively hitting the table in such a way that the two sides completely switch places without consuming any energy.
The second law of thermodynamics says this simply can't happen, but the time crystal doesn't seem to care about entropy, and now Google says it has seen one actually behave this way. Not only that: it says the process that created it is scalable, and its impact could be enormous.
What the Google researchers have done will continue to be scrutinized, as Google's results have not been peer-reviewed and still need to be replicated.
That said, if what Google's quantum computer has done can be recreated, time crystals aren't just real, they could actually be put to use. The potential impact of such technology on computer memory alone is difficult to grasp, let alone on computer processing itself.
After all, it is very difficult to say what will come from a system that defies entropy: nature as we know it does not work that way, and assumptions about entropy are built into every system we have created or observed so far. Assuming these results hold up, it is really hard to predict what could be done with such a thing, because nothing like it has ever been seen before, but it is incredibly exciting nonetheless.
Go here to read the rest:
Google announces that it may have created a "time crystal" that breaks physics - Texasnewstoday.com
Posted in Quantum Computing
Comments Off on Google announces that it may have created a "time crystal" that breaks physics – Texasnewstoday.com