The Prometheus League
Breaking News and Updates
- Abolition Of Work
- Ai
- Alt-right
- Alternative Medicine
- Antifa
- Artificial General Intelligence
- Artificial Intelligence
- Artificial Super Intelligence
- Ascension
- Astronomy
- Atheism
- Atheist
- Atlas Shrugged
- Automation
- Ayn Rand
- Bahamas
- Bankruptcy
- Basic Income Guarantee
- Big Tech
- Bitcoin
- Black Lives Matter
- Blackjack
- Boca Chica Texas
- Brexit
- Caribbean
- Casino
- Casino Affiliate
- Cbd Oil
- Censorship
- Cf
- Chess Engines
- Childfree
- Cloning
- Cloud Computing
- Conscious Evolution
- Corona Virus
- Cosmic Heaven
- Covid-19
- Cryonics
- Cryptocurrency
- Cyberpunk
- Darwinism
- Democrat
- Designer Babies
- DNA
- Donald Trump
- Eczema
- Elon Musk
- Entheogens
- Ethical Egoism
- Eugenic Concepts
- Eugenics
- Euthanasia
- Evolution
- Extropian
- Extropianism
- Extropy
- Fake News
- Federalism
- Federalist
- Fifth Amendment
- Financial Independence
- First Amendment
- Fiscal Freedom
- Food Supplements
- Fourth Amendment
- Free Speech
- Freedom
- Freedom of Speech
- Futurism
- Futurist
- Gambling
- Gene Medicine
- Genetic Engineering
- Genome
- Germ Warfare
- Golden Rule
- Government Oppression
- Hedonism
- High Seas
- History
- Hubble Telescope
- Human Genetic Engineering
- Human Genetics
- Human Immortality
- Human Longevity
- Illuminati
- Immortality
- Immortality Medicine
- Intentional Communities
- Jacinda Ardern
- Jitsi
- Jordan Peterson
- Las Vegas
- Liberal
- Libertarian
- Libertarianism
- Liberty
- Life Extension
- Macau
- Marie Byrd Land
- Mars
- Mars Colonization
- Mars Colony
- Memetics
- Micronations
- Mind Uploading
- Minerva Reefs
- Modern Satanism
- Moon Colonization
- Nanotech
- National Vanguard
- NATO
- Neo-eugenics
- Neurohacking
- Neurotechnology
- New Utopia
- New Zealand
- Nihilism
- Nootropics
- NSA
- Oceania
- Offshore
- Olympics
- Online Casino
- Online Gambling
- Pantheism
- Personal Empowerment
- Poker
- Political Correctness
- Politically Incorrect
- Polygamy
- Populism
- Post Human
- Post Humanism
- Posthuman
- Posthumanism
- Private Islands
- Progress
- Proud Boys
- Psoriasis
- Psychedelics
- Putin
- Quantum Computing
- Quantum Physics
- Rationalism
- Republican
- Resource Based Economy
- Robotics
- Rockall
- Ron Paul
- Roulette
- Russia
- Sealand
- Seasteading
- Second Amendment
- Seychelles
- Singularitarianism
- Singularity
- Socio-economic Collapse
- Space Exploration
- Space Station
- Space Travel
- Spacex
- Sports Betting
- Sportsbook
- Superintelligence
- Survivalism
- Talmud
- Technology
- Teilhard De Chardin
- Terraforming Mars
- The Singularity
- Tms
- Tor Browser
- Trance
- Transhuman
- Transhuman News
- Transhumanism
- Transhumanist
- Transtopian
- Transtopianism
- Ukraine
- Uncategorized
- Vaping
- Victimless Crimes
- Virtual Reality
- Wage Slavery
- War On Drugs
- Waveland
- Ww3
- Yahoo
- Zeitgeist Movement
- Prometheism
- Forbidden Fruit
- The Evolutionary Perspective
Category Archives: Quantum Physics
Coldest Temperature Ever Recorded | What Is Absolute Zero? – Popular Mechanics
Posted: October 7, 2021 at 3:50 pm
Researchers from four universities in Germany have created the coldest temperature ever recorded in a lab: 38 trillionths of a degree warmer than absolute zero, to be exact, according to their new work, recently published in the journal Physical Review Letters.
The bone-chilling temperature only persisted for a few seconds at the University of Bremen's Center for Applied Space Technology and Microgravity, but the breakthrough could have longstanding ramifications for our understanding of quantum mechanics.
That's because the closer we get to absolute zero (the lowest possible temperature that we could ever theoretically reach, as outlined by the laws of thermodynamics), the more peculiarly particles, and therefore substances, act. Liquid helium, for instance, becomes a "superfluid" at significantly low temperatures, meaning that it flows without any resistance from friction. Nitrogen freezes at -210 degrees Celsius. At cool enough temperatures, some particles even take on wave-like characteristics.
Absolute zero is equal to -273.15 degrees Celsius, or -459.67 degrees Fahrenheit, but most commonly, it's measured as 0 kelvin. This is the point at which "the fundamental particles of nature have minimal vibrational motion," according to ScienceDaily. However, it's impossible for scientists to create absolute zero conditions in the lab.
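Those scale conversions are easy to sanity-check in code. A minimal sketch, assuming nothing beyond the standard kelvin-Celsius-Fahrenheit relations (the 38-picokelvin figure is the one reported above):

```python
# Standard temperature-scale conversions; values below check the article's numbers.

def kelvin_to_celsius(k: float) -> float:
    return k - 273.15

def kelvin_to_fahrenheit(k: float) -> float:
    return k * 9.0 / 5.0 - 459.67

print(kelvin_to_celsius(0.0))      # -273.15 (absolute zero in Celsius)
print(kelvin_to_fahrenheit(0.0))   # -459.67 (absolute zero in Fahrenheit)

record = 38e-12  # 38 picokelvin: 38 trillionths of a kelvin above absolute zero
print(kelvin_to_celsius(record))   # ~ -273.15; the offset is far below the
                                   # resolution of any conventional thermometer
```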
In this case, the researchers were studying wave properties of atoms when they came up with a process that could lower a system's temperature by slowing particles to virtually a total standstill. For several seconds, the particles held completely still, and the temperature lowered to an astonishing 38 picokelvins, or 38 trillionths of a degree above absolute zero. This temperature is so low that it's not even detectable with a regular thermometer of any kind. Instead, the temperature is based on the lack of kinetic movement of the particles.
The mechanism at play here is "a time-domain matter-wave lens system," according to the team's research paper. A matter wave is just what it sounds like: matter that is behaving like a wave. This is part of quantum physics, where everything we previously thought we knew gets a little wobbly upon close examination. In this case, scientists used a magnetic lens to shape a quantum gas, and used that to make a matter wave focus and behave in a particular way. A regular gas is made of a loose arrangement of discrete particles, but a quantum gas is no such predictable material. In this case, the quantum gas is a perplexing state of matter called a Bose-Einstein condensate.
The lens is "tuned" using careful excitation. Think of the lenses on a pair of glasses, where the bend is designed to focus closer or further away depending on the patient's eyes. For this experiment, the scientists tuned the focus to literally infinity. Within the subset of quantum physics known as optics, this means the quantum gas confines the passing particles until they pass one at a time and at an astonishingly slow speed.
"By combining an excitation of a Bose-Einstein condensate (BEC) with a magnetic lens, we form a time-domain matter-wave lens system," the researchers write. "The focus is tuned by the strength of the lensing potential. By placing the focus at infinity, we lower the total internal kinetic energy of a BEC to 38pK."
The researchers, from the University of Bremen, Leibniz University Hannover, the Humboldt University of Berlin, and the Johannes Gutenberg University Mainz, say they envision future researchers making the particles go even slower, with a top potential "weightlessness" period of up to 17 seconds.
Scientists are using quantum computing to help them discover signs of life on other planets – ZDNet
Posted: at 3:50 pm
Scientists will use quantum computing tools to eventually help them detect molecules in outer space that could be precursors to life.
Quantum computers are assisting researchers in scouting the universe in search of life outside of our planet -- and although it's far from certain they'll find actual aliens, the outcomes of the experiment could be almost as exciting.
Zapata Computing, which provides quantum software services, has announced a new partnership with the UK's University of Hull, which will see scientists use quantum computing tools to eventually help them detect molecules in outer space that could be precursors to life.
During the eight-week program, quantum resources will be combined with classical computing tools to resolve complex calculations with better accuracy, with the end goal of finding out whether quantum computing could provide a useful boost to the work of astrophysicists, despite the technology's current limitations.
Detecting life in space is as tricky a task as it sounds. It all comes down to finding evidence of molecules that have the potential to create and sustain life -- and because scientists don't have the means to go out and observe the molecules for themselves, they have to rely on alternative methods.
Typically, astrophysicists pay attention to light, which can be analyzed through telescopes. This is because light -- for example, infrared radiation generated by nearby stars -- often interacts with molecules in outer space. And when it does, the particles vibrate, rotate, and absorb some of the light, leaving a specific signature on the spectral data that can be picked up by scientists back on Earth.
Therefore, for researchers, all that is left to do is detect those signatures and trace back to which molecules they correspond.
The problem? MIT researchers have previously established that over 14,000 molecules could indicate signs of life in exoplanets' atmospheres. In other words, there is still a long way to go before astrophysicists have drawn up a database of all the different ways that those molecules might interact with light -- of all the signatures that they should be looking for when pointing their telescopes to other planets.
That's the challenge that the University of Hull has set for itself: the institution's Centre for Astrophysics is effectively hoping to generate a database of detectable biological signatures.
For over two decades, explains David Benoit, senior lecturer in molecular physics and astrochemistry at the University of Hull, researchers have been using classical means to try and predict those signatures. Still, the method is rapidly running out of steam.
The calculations carried out by the researchers at the center in Hull involve describing exactly how electrons interact with each other within a molecule of interest -- think hydrogen, oxygen, nitrogen and so on. "On classical computers, we can describe the interactions, but the problem is this is a factorial algorithm, meaning that the more electrons you have, the faster your problem is going to grow," Benoit tells ZDNet.
"We can do it with two hydrogen atoms, for example, but by the time you have something much bigger, like CO2, you're starting to lose your nerve a little bit because you're using a supercomputer, and even they don't have enough memory or computing power to do that exactly."
Simulating these interactions with classical means, therefore, ultimately comes at the cost of accuracy. But as Benoit says, you don't want to be the one claiming to have detected life on an exo-planet when it was actually something else.
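Benoit's "factorial algorithm" point can be illustrated with a rough count of electronic configurations. A minimal sketch; the electron and orbital counts are illustrative assumptions for the example, not figures from the Hull group's actual calculations:

```python
# Counting the ways n electrons can occupy m spin-orbitals: this combinatorial
# blow-up is the growth Benoit describes for exact classical treatments.
from math import comb

for molecule, electrons, spin_orbitals in [
    ("H2 (two hydrogen atoms)", 2, 8),    # small and tractable
    ("CO2", 22, 60),                      # already astronomical
]:
    n_configs = comb(spin_orbitals, electrons)
    print(f"{molecule}: ~{n_configs:.2e} configurations")
```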
Unlike classical computers, however, quantum systems are built on the principles of quantum mechanics -- those that govern the behavior of particles when they are taken at their smallest scale: the same principles as those that underlie the behavior of electrons and atoms in a molecule.
This prompted Benoit to approach Zapata with a "crazy idea": to use quantum computers to solve the quantum problem of life in space.
"The system is quantum, so instead of taking a classical computer that has to simulate all of the quantum things, you can take a quantum thing and measure it instead to try and extract the quantum data we want," explains Benoit.
Quantum computers, by nature, could therefore allow for accurate calculations of the patterns that define the behavior of complex quantum systems like molecules without calling for the huge compute power that a classical simulation would require.
The data that is extracted from the quantum calculation about the behavior of electrons can then be combined with classical methods to simulate the signature of molecules of interest in space when they come into contact with light.
It remains true that the quantum computers that are currently available to carry out this type of calculation are limited: most systems don't break the 100-qubit count, which is not enough to model very complex molecules.
Benoit explains that this has not put off the center's researchers. "We are going to take something small and extrapolate the quantum behavior from that small system to the real one," says Benoit. "We can already use the data we get from a few qubits, because we know the data is exact. Then, we can extrapolate."
That is not to say that the time has come to get rid of the center's supercomputers, continues Benoit. The program is only starting, and over the course of the next eight weeks, the researchers will be finding out whether it is possible at all to extract those exact physics on a small scale, thanks to a quantum computer, in order to assist large-scale calculations.
"It's trying to see how far we can push quantum computing," says Benoit, "and see if it really works, if it's really as good as we think it is."
If the project succeeds, it could constitute an early use case for quantum computers -- one that could demonstrate the usefulness of the technology despite its current technical limitations. That in itself is a pretty good achievement; the next milestone could be the discovery of our exo-planet neighbors.
A novel way to heat and cool things – The Economist
Posted: at 3:50 pm
Oct 7th 2021
REFRIGERATORS AND air-conditioners are old and clunky technology, and represent a field ripe for disruption. They consume a lot of electricity. And they generally rely on chemicals called hydrofluorocarbons which, if they leak into the atmosphere, have a potent greenhouse-warming effect. Buildings' central-heating systems, meanwhile, are often powered by methane in the form of natural gas, which releases carbon dioxide, another greenhouse gas, when it is burned, and also has a tendency to leak from the pipes that deliver it, which is unfortunate, because methane, too, is a greenhouse gas, and one much more potent than CO2.
One potential way of getting around all this might be to exploit what is known as the thermoelectric effect, a means of carrying heat from place to place as an electric current. Thermoelectric circuits can be used either to cool things down, or to heat them up. And a firm called Phononic, based in Durham, North Carolina, has developed a chip which does just that.
The thermoelectric effect was discovered in 1834 by Jean Charles Peltier, a French physicist. It happens in an electrical circuit that includes two materials of different conductivity. A flow of electrons from the more conductive to the less conductive causes cooling. A flow in the other direction causes heating.
The reason for this is that electrons are able to vibrate more freely when pushed into a conductive material. They thereby transfer energy to their surroundings, warming them. When shunted into a less conductive one, electrons' vibrations are constrained, and they absorb energy from their surroundings, cooling those surroundings down. An array of thermoelectric circuits built with all the high-conductivity materials facing in one direction and all the low-conductivity ones in the other can thus move heat in either direction, by switching the polarity of the current. For reasons buried in the mathematics of quantum physics, the heat thus flowing does so in discrete packages, called phonons. Hence the name of the firm.
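The polarity-switching idea can be put in rough numbers. A minimal sketch of the textbook Peltier relation, using an assumed coefficient typical of a bismuth-telluride junction rather than anything from Phononic's chips:

```python
# Peltier effect, roughly: heat pumped across a junction is Q = Pi * I, where
# Pi = S * T (Seebeck coefficient times absolute temperature). Reversing the
# current's polarity reverses the direction of the heat flow.
S = 200e-6   # V/K, assumed Seebeck coefficient for a Bi2Te3-style junction
T = 300.0    # K, room temperature
Pi = S * T   # Peltier coefficient, in volts

for current in (1.0, -1.0):          # amperes, opposite polarities
    q = Pi * current                 # watts moved across the junction
    mode = "cools" if q > 0 else "heats"
    print(f"I = {current:+.0f} A: {mode} one face at {abs(q):.3f} W")
```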
The thermoelectric effect works best when the conductors involved are actually semiconductors, with bismuth and tin being common choices. Fancy cameras contain simple cooling chips which use these, as do some scientific instruments. But Phononic's boss, Tony Atti, thinks that is small beer. Using the good offices of Fabrinet, a chipmaker in Thailand, he has started making more sophisticated versions at high volume, using the set of tools and techniques normally employed to etch information-processing circuits onto wafers made of silicon. In this case, though, the wafers are made of bismuth.
The results are, admittedly, still a long way from something that could heat or cool a building. But they are already finding lucrative employment in applications where space is at a premium. At the moment, the fastest-growing market is cooling the infrared lasers used to fire information-encoding photons through fibre-optic cables, for the long-distance transmission of data. They are also being used, though, in the 5G mobile-phone base stations now starting to blanket street corners, to keep the batteries of electric vehicles at optimal operating temperatures, and as components of the optical-frequency radar-like systems known as LIDAR, that help guide autonomous vehicles.
The crucial question from Mr Atti's point of view is whether semiconductor-based thermoelectronics can break out of these niches and become more mainstream, in the way that semiconductor-based electronics and lighting have done. In particular, he would like to incorporate heat-pumping chips into buildings, to provide them with integral thermoregulation.
In their current form, thermoelectric chips are unlikely to replace conventional air conditioning and central heating because they cannot move heat over the distances required to pump it in and out of a building in bulk. But they could nonetheless be used as regulators. Instead of turning a big air-conditioning system on or off, to lower or raise the temperature by the small amounts required to maintain comfort, with all the cost that entails, thermoelectric chips might tweak matters by moving heat around locally.
Phononic has already run trials of such local-temperature-tweaking chips in Singapore, in partnership with Temasek, that country's state-run investment fund. In 2019 SP Group, Singapore's utility company, installed eight of the firm's heat pumps, which comprise an array of chips pointed down at people, pumping heat out of the air above them, on the boardwalk on Clarke Quay in the city. Phononic claims the devices lowered the temperature in their vicinity by up to 10°C and, as a bonus, consequently reduced humidity by 15%. If that can be scaled up, it would certainly be a cool result.
This article appeared in the Science & technology section of the print edition under the headline "Cool thinking"
Quantum computing will break today’s encryption standards – here’s what to do about it – Verizon Communications
Posted: at 3:50 pm
"When you come to the fork in the road, take it." – Yogi Berra
For cryptologists, Yogi Berra's words have perhaps never rung more true. As a future with quantum computing approaches, our internet and stored secrets are at risk. The tried-and-true encryption mechanisms that we use every day, like Transport Layer Security (TLS) and Virtual Private Networks (VPN), could be cracked and exposed by a hacker equipped with a large enough quantum computer using Shor's algorithm, a powerful algorithm with exponential speed over classical algorithms. The result? The security algorithms we use today that would take roughly 10 billion years to decrypt could take as little as 10 seconds. To prevent this, it's imperative that we augment our security protocols, and we have two options to choose from: one using physics as its foundation, or one using math -- our figurative fork in the road.
To understand how to solve the impending security threats in a quantum era, we need to first understand the fundamentals of our current encryption mechanism. The most commonly used in nearly all internet activities, TLS, is implemented anytime someone performs an online activity involving sensitive information, like logging into a banking app, completing a sale on an online retailer website, or simply checking email. It works by combining the data with a 32-byte key of random 1s and 0s in a complicated and specific way so that the data is completely unrecognizable to anyone except for the two end-to-end parties sending and receiving the data. This process is called public key encryption, and currently it leverages a few popular algorithms for key exchange, e.g., Elliptic-curve Diffie-Hellman (ECDH) or RSA (each named after cryptologists), each of which is vulnerable to quantum computers. The data exchange has two steps: the key exchange and the encryption itself. The encryption of the data with a secure key will still be safe, but the delivery of the key to unlock that information (key distribution) will not be secure in the future quantum era.
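For the curious, here is what that vulnerable key-exchange step looks like in practice. A minimal sketch using the X25519 curve from Python's widely used `cryptography` package; this is a generic illustration of an ECDH-style handshake, not Verizon's implementation:

```python
# A classical Diffie-Hellman-style exchange of the kind Shor's algorithm threatens.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import x25519
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each side generates a key pair and exchanges only the public halves.
alice_private = x25519.X25519PrivateKey.generate()
bob_private = x25519.X25519PrivateKey.generate()

# Both sides derive the same shared secret from their own private key and the
# other side's public key; an eavesdropper sees only the public keys.
alice_shared = alice_private.exchange(bob_private.public_key())
bob_shared = bob_private.exchange(alice_private.public_key())
assert alice_shared == bob_shared

# The raw secret is stretched into the kind of 32-byte session key described above.
session_key = HKDF(algorithm=hashes.SHA256(), length=32,
                   salt=None, info=b"illustrative session key").derive(alice_shared)
print(len(session_key))  # 32
```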
To be ready for quantum computers, we need to devise a new method of key distribution, a way to safely deliver the key from one end of the connection to the other.
Imagine a scenario wherein you and a childhood friend want to share secrets, but can only do so once you each have the same secret passcode in front of you (and there are no phones). One friend has to come up with a unique passcode, write it down on a piece of paper (while maintaining a copy for themselves), and then walk it down the block so the other has the same passcode. Once you and your friend have the shared key, you can exchange secrets (encrypted data) that even a quantum computer cannot read.
While walking down the block, though, your friend could be vulnerable to the school bully accosting him or her and stealing the passcode, and we can't let this happen. What if your friend lives across town, and not just down the block? Or, even more difficult, in a different country? (And where is that secret decoder ring we got from a box of sugar-coated-sugar cereal we ate as kids?)
In a world where global information transactions are happening nonstop, we need a safe way of delivering keys no matter the distance. Quantum physics can provide a way to securely deliver shared keys quicker and in larger volume, and, most importantly, immune to being intercepted. Using fiber optic cables (like the ones used by telecommunications companies), special Quantum Key Distribution (QKD) equipment can send tiny particles (or light waves) called photons to each party in the exchange of data. The sequence of the photons encapsulates the identity of the key, a random sequence of 1s and 0s that only the intended recipients can receive to construct the key.
Quantum Key Distribution also has a sort of built-in anti-hacker bonus. Because of the no-cloning theorem (which essentially states that, by their very nature, photons cannot be cloned), QKD also renders the identity of the key untouchable by any hacker. If an attacker tried to grab the photons and alter them, it would automatically be detected, and the affected key material would be discarded.
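The basis-matching idea behind QKD can be simulated classically in a few lines. A toy sketch in the style of BB84 (the canonical QKD protocol, assumed here as the example; the article does not name a specific scheme), with no real photons and the eavesdropper-detection step omitted:

```python
# Toy BB84-style sifting: the sender encodes random bits in random bases, the
# receiver measures in random bases, and only matching-basis bits are kept.
import secrets

n = 32
alice_bits = [secrets.randbelow(2) for _ in range(n)]
alice_bases = [secrets.randbelow(2) for _ in range(n)]  # 0 = rectilinear, 1 = diagonal
bob_bases = [secrets.randbelow(2) for _ in range(n)]

# A wrong-basis measurement would yield a random bit, so those rounds are discarded.
sifted = [bit for bit, a, b in zip(alice_bits, alice_bases, bob_bases) if a == b]
print(f"kept {len(sifted)} of {n} raw bits as shared key material:", sifted)
```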
The other way we could choose to solve the security threats posed by quantum computers is to harness the power of algorithms. Although it's true the RSA and ECDH algorithms are vulnerable to Shor's algorithm on a suitable quantum computer, the National Institute of Standards and Technology (NIST) is working to develop replacement algorithms that will be safe from quantum computers as part of its post-quantum cryptography (PQC) efforts. Some are already in the process of being vetted, like ones called McEliece, Saber, CRYSTALS-Kyber, and NTRU.
Each of these algorithms has its own strong and weak points that the NIST is working through. For instance, McEliece is one of the most trusted by virtue of its longstanding resistance to attack, but it is also handicapped by its excessively long public keys that may make it impractical for small devices or web browsing. The other algorithms, especially Saber, run very well on practically any device, but, because they are relatively new, the confidence level in them from cryptographers is still relatively low.
With such a dynamic landscape of ongoing efforts, there is promise that a viable solution will emerge in time to keep our data safe.
The jury is still out. We at Verizon and most of the world rely heavily on e-commerce to sell our products and encryption to communicate via email, messaging, and cellular voice calls. All of these need secure encryption technologies in the coming quantum era. But whether we choose pre-shared keys (implemented by the awesome photon) or algorithms, further leveraging mathematics, our communications software will need updating. And while the post-quantum cryptography effort is relatively new, it is not clear which algorithms will withstand scrutiny from the cryptographic community. In the meantime, we continue to peer down each fork in the road to seek the best option to take.
New Fundamental Limit of Trapping and Exploiting Light at the Nanoscale – SciTechDaily
Posted: at 3:50 pm
Metasurface of split-ring resonators, partially overlaid with 3D colourmaps showing the simulated electric-field distribution. High-momentum magnetoplasmons lead to the break-down of polaritons (blue spheres with photon energies in red). Credit: Urban Senica, ETH Zurich
Physicists from the University of Southampton and ETH Zürich have reached a new threshold of light-matter coupling at the nanoscale.
The international research, published recently in Nature Photonics, combined theoretical and experimental findings to establish a fundamental limitation of our ability to confine and exploit light.
The collaboration focused on photonic nano-antennas fabricated in ever reducing sizes on the top of a two-dimensional electron gas. The setup is commonly used in laboratories all over the world to explore the effect of intense electromagnetic coupling, taking advantage of the antennas' ability to trap and focus light close to electrons.
Professor Simone De Liberato, Director of the Quantum Theory and Technology group at the University of Southampton, says: "The fabrication of photonic resonators able to focus light in extremely small volumes is proving a key technology which is presently enabling advances in fields as different as material science, optoelectronics, chemistry, quantum technologies, and many others."
"In particular, the focussed light can be made to interact extremely strongly with matter, making electromagnetism non-perturbative. Light can then be used to modify the properties of the materials it interacts with, thus becoming a powerful tool for material science. Light can be effectively woven into novel materials."
Scientists discovered that light could no longer be confined in the system below a critical dimension, of the order of 250 nm in the sample under study, when the experiment started exciting propagating plasmons. This caused waves of electrons to move away from the resonator and spill the energy of the photon.
Experiments performed in the group of Professors Jérôme Faist and Giacomo Scalari at ETH Zürich had obtained results that could not be interpreted with state-of-the-art understanding of light-matter coupling. The physicists approached Southampton's School of Physics and Astronomy, where researchers led theoretical analysis and built a novel theory able to quantitatively reproduce the results.
Professor De Liberato believes the newfound limits could yet be exceeded by future experiments, unlocking dramatic technological advances that hinge on ultra-confined electromagnetic fields.
Read "Exploring the Quantitative Limits of Light-Matter Coupling at the Nanoscale" for more on this research.
Reference: "Polaritonic nonlocality in light-matter interaction" by Shima Rajabali, Erika Cortese, Mattias Beck, Simone De Liberato, Jérôme Faist and Giacomo Scalari, 9 August 2021, Nature Photonics. DOI: 10.1038/s41566-021-00854-3
The best way to win a Nobel is to get nominated by another laureate – The Economist
Posted: at 3:50 pm
Oct 9th 2021
THE NOBEL prizes, whose winners are announced this month (see Science), may be the world's most coveted awards. As soon as a new crop of laureates is named, critics start comparing the victors' achievements with those of previous winners, reigniting debates over past snubs.
A full account of why, say, Stephen Hawking was passed over will have to wait until 2068: the Nobel Foundation's rules prevent disclosure about the selection process for 50 years. But once this statute of limitations ends, the foundation reveals who offered nominations, and whom they endorsed. Its data start in 1901 and end in 1953 for medicine; 1966 for physics, chemistry and literature; and 1967 for peace. (The economics prize was first awarded in 1969.)
Nomination lists do not explain omissions like Leo Tolstoy (who got 19 nominations) or Mahatma Gandhi (who got 12). But they do show that in 1901-66, Nobel voters handed out awards more in the style of a private members' club than of a survey of expert opinion. Whereas candidates with lots of nominations often fell short, those with the right backers (like Albert Einstein or other laureates) fared better.
The bar to a Nobel nomination is low. For the peace prize, public officials, jurists and the like submit names to a committee, chosen by Norways parliament, that picks the winner. For the others, Swedish academies solicit names from thousands of people, mostly professors, and hold a vote for the laureate. On average, 55 nominations per year were filed for each prize in 1901-66.
Historically, voters paid little heed to consensus among nominators. In literature and medicine, the candidate with the most nominations won just 11% and 12% of the time; in peace and chemistry, the rates were 23% and 26%. Only in physics, at 42%, did nomination leaders have a big advantage. In 1956 Ramón Menéndez Pidal, a linguist and historian, got 60% of nominations for the literature prize, but still lost.
However, voters did make one group of nominators happy: current and future laureates. Candidates put forward by past victors went on to win at some point in the future 40% more often than did those whose nominators never won a Nobel. People whose nominators became laureates later on also won unusually often. This implies that being accomplished enough to merit future Nobel consideration was sufficient to gain extra influence over voters.
In theory, this imbalance could simply reflect laureates nominating stronger candidates. However, at least one Nobel winner seems to have boosted his nominees' chances, rather than merely naming superstars who would have won anyway.
According to the Nobel Foundation's online archive, all 11 of Einstein's nominees won a prize. Some were already famous, like Max Planck; others, like Walther Bothe, were lesser-known. In two cases, his support seems to have been decisive.
In 1940 Einstein supported Otto Stern, a physicist who had already had 60 nominations. Stern won the next time the prize was given. Similarly, Wolfgang Pauli, whose exclusion principle is central to quantum mechanics, had received 20 nominations before Einstein backed him in 1945. He got his prize that same year.
This article appeared in the Graphic detail section of the print edition under the headline "Noblesse oblige"
Black Holes Might Conceal a Huge Wall of Fire. But We May Never See Them – Interesting Engineering
Posted: at 3:50 pm
Alice and Bob are two of the most famous explorers you've probably never heard of. If there is a quantum experiment being discussed, Alice and Bob are usually involved, and they've been through a lot together. But in the last 50 years, classical physics and quantum mechanics have come into a direct conflict at the bleeding edge of the most extreme objects in the universe, black holes, and things have not turned out great for Alice.
See, Alice is a sub-atomic particle, and she's been everywhere from hanging out with Schrodinger's Cat to performing immensely complex computations in a quantum computer. But, if a recent theory about an especially thorny physics paradox is correct, Alice just might end her intrepid travels for good by falling past the event horizon of a black hole, only to be immediately incinerated by a massive wall of intense energy that runs all along the entire event horizon, forever beyond our ability to ever see it.
This black hole firewall, as it's become known, was immediately dismissed as ludicrous, and even insulting, when it was initially proposed in 2012, but nearly a decade later, scientists are still struggling to refute it, and the controversy could have profound implications for physics as we know it.
Before we can wrangle with the mysterious interior of a black hole, we should start by describing what we know about black holes.
Black holes were first predicted by a humble English rector, John Michell, in 1783, who used Newtonian mechanics to posit the existence of "Dark Stars" whose gravity was stronger than a particle of light's capacity to escape it. However, the concept of black holes we are more familiar with arose from Albert Einstein and his theory of relativity in 1915.
Karl Schwarzschild, a German physicist and astronomer, read Einstein's 1915 paper on general relativity and within a few months produced the first exact solution to Einstein's general gravitational equations, which impressed even Einstein himself. "I had not expected that one could formulate the exact solution of the problem in such a simple way," he wrote to Schwarzschild in 1916.
What Schwarzschild is perhaps best known for, however, is applying the math of Einstein's relativity and deriving the possible existence of black holes based on the escape velocity of light (much as Michell had done with Newtonian mechanics). Schwarzschild himself didn't believe that black holes actually existed, but his work provided the mathematical basis on which our modern understanding of black holes was built.
The key feature of the black holes he described was an event horizon, a boundary located a predictable distance from the center of the black hole's mass which represented the gravitational threshold where the escape velocity from the black hole exceeds the speed of light. On the outside of the event horizon, escape was possible, but once you passed that boundary, relativity meant you could never leave, since nothing can travel faster than light.
There have been some major developments in our understanding of black holes since Schwarzschild, but these basic features have stayed more or less the same since he first proposed them.
Stepping away from the macroscale for a moment, we now need to dive below the level of the atom and discuss subatomic particles.
Subatomic matter does not behave in the same way as matter at the macroscale level. Instead, at the quantum level, the universe is governed by a strange world of probabilities and physics-defying features like quantum entanglement.
This feature of quantum entanglement, where two subatomic particles interact with one another and in the process become inextricably linked so that they behave as if they were a single object, seems to pay no mind to relativity, happily transmitting information between two entangled particles instantaneously over distances so vast that this information can be said to be traveling faster, sometimes exponentially faster, than light.
Einstein and other noted physicists in the first half of the 20th century were so bothered by some of the peculiarities of quantum mechanics, particularly quantum entanglement, that they went to great lengths to try to refute its results, but its math has held up sound and some of the fundamental laws have proved to be as unassailable as Relativity. Quantum entanglement isn't just predictable, it's become the bedrock of actual working technology like quantum computing.
Quantum mechanics isn't constructed using the same kind of math as classical physics, though. Classical physics relies on predictable mathematical techniques like calculus, while quantum mechanics is built largely on probabilities, the math of the card game, and the craps table.
The probabilities that form the basis of quantum mechanics, however, rely on an important principle that cannot be violated: the preservation of information.
If you roll a six-sided die, you have an equal one-in-six chance of rolling any of its values, but the probability that you will roll a result is 1, which is the sum of adding up all the individual probabilities for all possible outcomes (in the case of the die, rolling a 1, 2, 3, 4, 5, or 6 all have one-sixth probability, so add all six one-sixths together and you get six-sixths, which is equal to 1). This summing up of probabilities in quantum mechanics is known as the principle of unitarity.
This predictive quality of probability relies on an even more fundamental rule, though, which is that knowing the current quantum state of a particle is predictive of its future state and also allows you to wind the particle back to its previous state.
Theoretically, if you had perfect knowledge of how a die was rolled, as well as the result, you could move back in time to identify which side was facing up when it was in your hand.
In order for this to work, though, that information about a previous quantum state must be preserved somehow in the universe. If it were to suddenly disappear, it would be like taking one of the die faces off the die and leaving nothing in its place.
When that die is rolled again, its five remaining sides still have a one-in-six probability each, but now those sides add up to five-sixths rather than 1. So destroying information, like removing one of those die faces, breaks the quantum probabilities of that die roll.
This sort of transgression in quantum mechanics can't be allowed, since the information being destroyed directly leads to us not even being able to tell how many die faces we started out with originally and, thus, we couldn't actually know the true probabilities for anything.
Quantum mechanics as we know it would no longer work if quantum information is somehow destroyed.
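The die analogy above is simple enough to put in code. A minimal sketch of the bookkeeping, nothing more:

```python
# Unitarity, in die form: a fair die's outcome probabilities sum to 1. Deleting
# a face without renormalizing leaves a "distribution" summing to 5/6 -- the
# kind of information loss quantum mechanics forbids.
from fractions import Fraction

fair_die = {face: Fraction(1, 6) for face in range(1, 7)}
print(sum(fair_die.values()))      # 1: a valid probability distribution

damaged_die = dict(fair_die)
del damaged_die[6]                 # "destroy" one outcome outright
print(sum(damaged_die.values()))   # 5/6: the probabilities no longer add up
```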
What's more, there is also a principle in quantum mechanics known as monogamous quantum entanglement. Essentially, a particle can only be maximally entangled with one other particle, to the exclusion of all others, and this is key to how information in a quantum system is preserved.
There is a lot more to quantum mechanics than just these principles, but these are the essential ones to understanding how a black hole's event horizon could really be a gigantic, invisible shell of blazing hot energy.
When Stephen Hawking did his most important work on black holes in the 1970s, he wasn't setting out to lay the foundation for a black hole firewall that annihilates anything unfortunate enough to fall into it, but that may be what he did when he proposed the existence of Hawking radiation in 1974.
In even the emptiest of space, there is a roiling boil of quantum activity. It is thought that, spontaneously, virtual quantum particle and anti-particle pairs entangled together are constantly materializing and annihilating each other, drawing energy from the universe to create themselves and returning that same energy when they destroy each other.
Hawking realized, though, that if a pair of virtual particles materialize along the edge of a black hole's event horizon, one particle could fall into the black hole while its entangled partner on the outside is able to break free of the black hole and escape, producing what is now known as Hawking radiation.
The problem is that, according to the first law of thermodynamics, energy in a closed system must be conserved. If two virtual particles draw from the energy of the universe to materialize but don't immediately annihilate each other, then energy has been drawn from the universe without crediting it back. The only way something like this can happen is that the infalling particle must have negative energy in equal absolute value to the positive energy of the escaping particle.
But black holes, while immensely massive and energetic, aren't infinite: they have a defined mass, and any infalling, negative-energy particle subtracts an infinitesimally small amount of that black hole's mass when it enters. If the black hole doesn't accrete any additional material to add more mass, these tiny subtractions due to Hawking radiation start to add up, and as more mass gets evaporated away, the evaporation of the black hole accelerates.
Eventually, enough Hawking radiation is emitted that the largest black holes shrink to nothing and simply wink out of existence.
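To put a rough number on "eventually": the standard semiclassical estimate for the evaporation time, t = 5120πG²M³/(ħc⁴), is not quoted in this article, but it shows just how absurd the timescales are. A quick sketch:

```python
# Hawking evaporation time for a black hole of mass M, using the standard
# semiclassical formula t = 5120 * pi * G^2 * M^3 / (hbar * c^4).
import math

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
hbar = 1.055e-34       # reduced Planck constant, J s
c = 2.998e8            # speed of light, m/s
SOLAR_MASS = 1.989e30  # kg

def evaporation_time(mass_kg: float) -> float:
    """Evaporation time in seconds, ignoring accretion and infalling radiation."""
    return 5120 * math.pi * G**2 * mass_kg**3 / (hbar * c**4)

years = evaporation_time(SOLAR_MASS) / 3.156e7
print(f"~{years:.1e} years")   # roughly 2e67 years for a solar-mass black hole
```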
The challenge presented by Hawking radiation is that even if spacetime becomes infinitely warped at a black hole's singularity, it is held that whatever quantum information enters a black hole is still somehow preserved and therefore, theoretically, retrievable.
If nothing else, all that information hangs out at the black hole's infinite singularity and can at least still factor into any quantum probabilities so everything continues to add up to 1.
Critically, Hawking said that this radiation, even as it is still entangled with its infalling anti-particle, contains no encoded information about the black hole or its contents.
This means that all of the information that falls into a black hole never leaves it and would presumably evaporate into nothing, along with the black hole, due to Hawking radiation. This would take all of that information out of the overall quantum equation and the probabilities would suddenly stop adding up correctly.
Other physicists, like John Preskill of the California Institute of Technology, have argued that Hawking radiation actually becomes entangled with the area immediately outside the event horizon where the quantum information from infalling particles must be encoded. So long as the infalling particle and the outside particle do not share this information between them, quantum information need not be destroyed.
This was a tangled knot to begin with, but in 2012, a group of University of California, Santa Barbara, physicists proposed a solution to the information paradox that only seemed to make everything more contentious.
When attempting to wrestle with the information paradox in 2012, Ahmed Almheiri, Donald Marolf, Joseph Polchinski, and James Sully (collectively known as AMPS) published a paper in the Journal of High Energy Physics arguing that along the edge of a black hole's event horizon was a swirling wall of energy so intense that it completely incinerated anything that touched it.
This was the result, AMPS argued, of the entanglement responsible for Hawking radiation being effectively severed by the event horizon, releasing an enormous amount of energy in the process. And since Hawking radiation is a constant process all along the edge of the event horizon, this energy is also being released constantly all across the event horizon.
What makes this theory so controversial is that this would violate another pillar of modern physics: the principle of equivalence. According to General Relativity, gravitational and inertial forces have a similar nature and are often indistinguishable. So, you would not be able to tell the difference between being in a stationary elevator in a gravitational field and an accelerating elevator in free space. This means that, if an observer were to pass the event horizon of a black hole, they should not notice anything amiss, at least not immediately.
The tidal force of the singularity's incredible gravity would eventually tear the observer apart into a very long string of atoms, but depending on the size of the black hole, an observer could continue to float down toward the black hole's singularity for anywhere from a few microseconds to possibly a few decades before this spaghettification occurs.
If the black hole firewall theory is correct though, the infalling observer would not even make it past the event horizon, since the outside particle becomes Hawking radiation when its entangled counterpart falls into the black hole. In order for the quantum information inside the black hole to be preserved, the new Hawking radiation must become entangled with the area outside the event horizon.
Quantum mechanics forbids this kind of dual-entanglement. Either Hawking radiation does not entangle with the region along the event horizon, meaning that quantum information is lost for good, or its entanglement with the infalling particle must be severed at the event horizon, meaning equivalence breaks down, which inexorably gives rise to the black hole firewall.
This did not go over well with physicists, since undoing the equivalence principle would pull the entire foundation of spacetime out from under Einstein's relativity, which simply couldn't be possible given how regularly relativity has been validated through experimentation. If equivalence didn't hold, then all of those experiments had to have been a 90-plus-year series of flukes that happened to confirm a false idea by pure chance.
This wasn't lost on AMPS, who pointed out that if everyone wanted to keep equivalence, then they had no choice but to sacrifice the preservation of information or completely rewrite what we knew about quantum field theory.
Steve Giddings, a quantum physicist at the University of California, Santa Barbara, said the paper produced a crisis in the foundations of physics that may need a revolution to resolve.
When Raphael Bousso, a string theorist at the University of California, Berkeley, first read the AMPS paper, he thought the theory preposterous and believed it would be quickly shot down. "A firewall simply can't appear in empty space, any more than a brick wall can suddenly appear in an empty field and smack you in the face," he said.
But as the years dragged on, no one has really been able to offer a satisfying rebuttal to put the controversy to rest.Bousso told a gathering of black hole experts who'd come to CERN in 2013 to discuss the black hole firewall that the theory, "shakes the foundations of what most of us believed about black holes... It essentially pits quantum mechanics against general relativity, without giving us any clues as to which direction to go next."
The controversy has produced some interesting counter theories though. Giddings proposed in 2013 that if Hawking radiation were to make it some short distance from the event horizon before its entanglement with the infalling particle is broken, the release of energy would be muted enough to preserve Einstein's equivalence principle.This has its own cost, though, as it would still require rewriting some of the rules of quantum mechanics.
Preskill, meanwhile, famously bet Hawking in 1997 that information was not lost in a black hole, and soon after a theory put forward by Harvard University's Juan Maldacena argued that "holograms" could encode 3D information in a 2D space where gravity had no influence, allowing information to find its way out of the black hole after all.
This argument proved persuasive enough for Hawking, who conceded to Preskill that information could be saved after all. With this history, Preskill makes an odd champion for the idea that information loss is actually the least offensive solution to the black hole firewall, but that was the argument he put forward in the 2013 conference. Quantum mechanics might need a page-one rewrite if information is lost, he said, but it wasn't out of the question. "Look in the mirror and ask yourself: Would I bet my life on unitarity?" he asked attendees.
Another possible solution to the black hole firewall problem was proposed by Maldacena and Stanford University's Leonard Susskind in 2013: wormholes.
In Maldacena and Susskind's proposal, quantum entanglement and Einstein-Rosen bridges are both intimately connected and could be two ways of describing the same phenomenon. If wormholes from inside the black hole were able to connect the infalling particles to their outside partners, then a form of entanglement could be maintained that did not require breaking entanglement at the event horizon, thus sidestepping the need for a firewall.
For all their inventiveness though, no one seems to be totally satisfied with the answers, even if they are enjoying the excitement of the debate itself.
"This is probably the most exciting thing that's happened to me since I entered physics," Bousso said. "It's certainly the nicest paradox that's come my way, and I'm excited to be working on it."
Top 10 Quantum Computing Workshop and Conferences to Attend in 2021 – Analytics Insight
Posted: at 3:50 pm
As you know, quantum computing is a type of computation that harnesses the collective properties of quantum states, such as superposition, interference, and entanglement, to perform calculations. To discuss the future of quantum computing there are some workshops and conferences taking place in 2021 that every person should attend.
Here are the top ten quantum computing workshops and conferences:
IEEE Quantum Week, the IEEE International Conference on Quantum Computing and Engineering (QCE), is bridging the gap between the science of quantum computing and the development of an industry surrounding it. IEEE Quantum Week is a multidisciplinary quantum computing and engineering venue that gives attendees the unique opportunity to discuss challenges and opportunities with quantum researchers, scientists, engineers, entrepreneurs, and more.
The International Conference on Quantum Communication ICQOM 2021 will take place at the Jussieu campus in Paris, France from the 18th to the 22nd of October 2021. The scope of the conference is focused on Quantum Communications, including theoretical and experimental activities related to Quantum Cryptography and Quantum Networks in a broad sense.
Quantum Techniques in Machine Learning (QTML) is an annual international conference focusing on the interdisciplinary field of quantum technology and machine learning. The goal of the conference is to gather leading academic researchers and industry players to interact through a series of scientific talks focused on the interplay between machine learning and quantum physics.
The 23rd Annual SQuInT Workshop is co-organized by the Center for Quantum Information and Control (CQuIC) at the University of New Mexico (UNM) and the Oregon Center for Optical Molecular and Quantum Science (OMQ) at the University of Oregon (UO). The last date of registration is October 11, 2021.
Keysight World 2021 will be held as a virtual conference. As part of a track focusing on Driving the Digital Transformation, there will be a session titled Pushing the Envelope on Quantum Computing that will include panel sessions with authorities from Rigetti, Google, IQC, and Keysight.
The Quantum Startup Foundry at the University of Maryland will be holding an Investment Summit for quantum startups to showcase their companies to potential investors on October 12-13, 2021. The focus of the event is to link the most promising early- and growth-stage companies with investors and to inform key stakeholders about the unique aspects of investing in quantum.
The Inside Quantum Technology (IQT) Fall Conference will be held as a hybrid conference, both in-person and online, in New York City. The conference will be a gathering of business leaders, product developers, marketing strategists, and investors anywhere in the world focused on quantum technology.
The annual Chicago Quantum Summit engages scientific and government leaders, the industries that will scale and drive the applications of emerging quantum research, and the trainees that will lead this future. Focusing on fostering a domestic and international community, experts discuss the future of quantum information science and technology research, the companies in the quantum ecosystem, and strategies to educate and build tomorrow's quantum workforce.
The Quantum Computing Summit Silicon Valley, organized by Informa Tech, will occur on November 3-4, 2021. It will run alongside the AI Summit and has been designed to provide business, technical, research, academic, and innovation insight, with application-based quantum case studies showcasing how quantum is delivering real business value, driving process efficiency, and optimizing costs.
The Optical Society (OSA) will hold its Quantum Information and Measurement VI as a virtual conference. The conference topics will cover the latest in theoretical developments and experimental implementations of quantum information technology, including the advanced engineering needed to realize such technologies, in addition to the conference's traditional focus on quantum optics.
RIT professor and team discover new method to measure motion of superfluids – RochesterFirst
Posted: at 3:50 pm
HENRIETTA, N.Y. (WROC) -- According to the Rochester Institute of Technology, Mishkat Bhattacharya, an associate professor at RIT's School of Physics and Astronomy and Future Photon Initiative, proposed a new method for detecting superfluid motion in an article published in Physical Review Letters.
Bhattacharya's theoretical team on the paper consisted of RIT postdoctoral researchers Pardeep Kumar and Tushar Biswas, and alumnus Kristian Feliz '21 (physics). The international collaborators consisted of professors Rina Kanamoto from Meiji University, Ming-Shien Chang from the Academia Sinica, and Anand Jha from the Indian Institute of Technology. Bhattacharya's work was supported by a CAREER Award from the National Science Foundation.
The laser is shined through the superfluid in a minimally destructive manner, and the system can then read how the superfluid and light react, so the subatomic movements can be observed and studied.
This new research represents the first time that scientists will be able to get a closer look at how this seemingly physics-defying liquid moves. As scientists come to understand this wonder-liquid, they can start to harness it for incredibly efficient power generation.
Bhattacharya's measuring method can also be used in quantum information processing.
So, clearly, there's a lot going on there. Let's break it down.
A superfluid is a gas or a liquid that can move without viscosity, or internal friction.
"This means that the particles don't jostle each other," Bhattacharya said. "They're not elbowing each other, or colliding."
Water, for example, has a very low viscosity. It's easy to imagine how quickly and smoothly water flows, compared to a highly viscous fluid, like maple syrup.
It's difficult to imagine, but a superfluid has zero internal friction.
This means that it slows down at an incredibly slow rate, so once the gas or liquid is set in motion, it's nearly impossible to stop. It also means that this movement of the particles doesn't lose energy the way motion with friction does.
Slamming the brakes on your car introduces a lot of friction, and everyone knows that there is a lot of sound and heat given off. That is the release of energy when friction is applied. Superfluids don't have this.
This unusual trait can be harnessed practically if an electrical current is applied to it.
If you can get something to flow, its like current going around in a circle, Bhattacharya said.
That means that unlike normal electrical circuits that get incredibly hot when they are used to capacity, these atomtronic circuits with superfluid dont.
"We don't really understand the physics of this," Bhattacharya said.
Part and parcel with this lack of understanding is that the only known superfluids, like liquid helium, reach that state only when supercooled. Needless to say, our cell phones would be massive and unusable if they needed a supercooler to run.
Bhattacharya says that if someone can discover a superfluid that works at room temperature, they would not only win a Nobel Prize but also revolutionize technology as we know it.
He says tests in Germany have shown that this technology (admittedly with the supercooled superfluid) can power entire towns in an economically feasible way.
"You'd have to ask a computer scientist," Bhattacharya said when asked how much more powerful our phones would get. "But it would be reasonable to say one hundred or one thousand times more powerful."
The challenge in creating a room-temperature superfluid is that, as Bhattacharya alluded to, physicists don't quite understand how superfluids really work beyond the visually observable macro effects, like watching one loop endlessly in a closed circuit, or the creep effect of liquid helium.
But to begin to understand how superfluids work, Bhattacharya and his team decided they needed a way to measure their subatomic movement using quantum physics.
"If you think about the particles of which the fluid is made as little balls, it is impossible to explain what it is without realizing that it also acts as a wave," he said.
Since an electrically charged superfluid circuit acts more like a wave than a particle, because of its lack of friction and its electrical charges, it becomes a quantum object. That means even the incredibly weak pressure force of a light wave can destroy the object's state, making it impossible to observe directly.
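To get a feel for how gentle that pressure force of light is (a textbook estimate, with the 1-micron wavelength chosen only as a representative example, not taken from the RIT work): a single photon carries momentum h divided by its wavelength, so each scattered photon kicks the fluid by roughly

\[ p = \frac{h}{\lambda} \approx \frac{6.63 \times 10^{-34}\ \mathrm{J\,s}}{1 \times 10^{-6}\ \mathrm{m}} \approx 6.6 \times 10^{-28}\ \mathrm{kg\,m/s} \]

Vanishingly small by everyday standards, yet enough to scramble a fragile quantum state, which is why the measurement has to be engineered so carefully.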
Bhattacharya and his team worked their way around this problem by calibrating their laser light source to a different wavelength than the superfluid they are observing.
This minimally destructive method allows them to observe the incredibly small effect the laser has, and by studying that wiggle, they can begin to determine how the superfluid moves.
Once they understand how it moves, they can begin to figure out how to engineer a superfluid that stays in that state at room temperature.
Interestingly, this measuring method also has an application in quantum information processing.
Scientists have begun to encode information on a particular kind of quantum particle that wiggles in a particular way. That can not only store vast amounts of information, but also move it at the speed of light.
Bhattacharya says that fiber optics does this in some manner, but the optics are too impure to move at that speed, and even the most advanced fiber optics need signal boosters and repeaters at fairly regular intervals.
While quantum light processing may need those as well, the information would still be moving at the speed of light.
But his measuring technology can begin to measure the quantum wiggle more precisely, allowing researchers to figure out how to store the information longer.
"It started at 1 millionth of a second," Bhattacharya said. "It's now up to 60 seconds of storage. That's seven orders of magnitude greater."
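That arithmetic is easy to check from the two quoted numbers:

\[ \frac{60\ \mathrm{s}}{1 \times 10^{-6}\ \mathrm{s}} = 6 \times 10^{7} \]

A factor of sixty million, which sits between seven and eight orders of magnitude of improvement in storage time.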
With Bhattacharya and his team on it, we may have those 1,000-times-more-powerful cell phones in no time.
Originally posted here:
RIT professor and team discover new method to measure motion of superfluids - RochesterFirst
Posted in Quantum Physics
Comments Off on RIT professor and team discover new method to measure motion of superfluids – RochesterFirst
Two Indian American Women Researchers Win 2022 New Horizons ‘Oscars of Science’ Prize in Physics – India West
Posted: at 3:50 pm
Indian Americans Vedika Khemani, assistant professor of physics at Stanford University, and California Institute of Technology astronomy professor Mansi Kasliwal have each been named recipients of the New Horizons in Physics prize from the Breakthrough Prize Foundation.
The prize is nicknamed the "Oscars of Science." The Breakthrough Prize Foundation's sponsors include Sergey Brin, co-founder of Google; Facebook founder Mark Zuckerberg and his wife Priscilla Chan; Russian-Israeli entrepreneurs and venture capitalists Yuri and Julia Milner; and Anne Wojcicki, CEO of the personal genomics company 23andMe.
Two Indian researchers from the University of Cambridge in England also won this year's prize. Sir Shankar Balasubramanian, in the Department of Chemistry at the University of Cambridge, was honored with the Life Sciences prize for developing next-generation sequencing technologies, which allowed for immediate identification and characterization of the Covid-19 virus, rapid development of vaccines, and real-time monitoring of new genetic variants.
"Before their inventions, re-sequencing a full human genome could take many months and cost millions of dollars; today, it can be done within a day at the cost of around $600. This resulted in a revolution in biology, enabling the revelation of unsuspected genetic diversity with major implications from cell and microbiome biology to ecology, forensics and personalized medicine," noted the Breakthrough Prize Foundation in a press statement.
Balasubramanian was knighted in 2017.
Suchitra Sebastian, a condensed matter physicist at the Cavendish Laboratory, University of Cambridge, received a New Horizons Prize in Physics for her work with "high precision electronic and magnetic measurements that have profoundly changed our understanding of high temperature superconductors and unconventional insulators," according to a press release.
According to a press release issued by Stanford, time crystals, like all crystals, are structurally arranged in a repeating pattern. But, while standard crystals like diamonds or salt have an arrangement that repeats in space, time crystals repeat across time forever. Importantly, they do so without any input of energy, like a clock that runs forever without batteries.
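One standard way physicists state that property more precisely (a textbook formulation, not language from the prize citation): drive a many-body system periodically with period T, so the Hamiltonian repeats every T. In a discrete time crystal, observables refuse to follow that period and instead lock onto a multiple of it, typically doubling it:

\[ H(t+T) = H(t), \qquad \langle O(t+2T) \rangle = \langle O(t) \rangle \neq \langle O(t+T) \rangle \]

The system's response spontaneously breaks the discrete time-translation symmetry of the drive, the temporal analogue of how an ordinary crystal breaks the spatial symmetry of empty space.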
Khemani's work offered a theoretical formulation for the first time crystals, as well as a blueprint for their experimental creation. But she emphasized that time crystals are only one of the exciting potential outcomes of out-of-equilibrium quantum physics, which is still a nascent field, noted Stanford.
The researcher described her work as creating a checklist of what actually makes a time crystal a time crystal, and the measurements needed to experimentally establish its existence, both under ideal and realistic conditions.
Khemani sees great promise in these types of quantum experiments for physics. "While many of these efforts are broadly motivated by the quest to build quantum computers (which may only be achievable in the distant future, if at all), these devices are also, and immediately, useful when viewed as experimental platforms for probing new non-equilibrium regimes in many-body physics," she said in a press release issued by Stanford.
"None of the world is in equilibrium; just look out your window, right? We're starting to see into these vastly larger spaces of how quantum systems evolve through experiments," said Khemani, who is on the faculty in the School of Humanities and Sciences and a member of Q-Farm, Stanford's broad interdisciplinary initiative in quantum science and engineering.
"I'm very excited to see what kinds of new physics these new regimes will bring. Time crystals are one example of something new we could get, but I think it's just the beginning," she said.
In 2017, Kasliwal and her fellow researchers Gregg Hallinan, Alessandra Corsi, and Raffaella Margutti, helped make history with their observations of the first-ever cosmic event to be witnessed in both gravitational waves and electromagnetic, or light, waves.
The event, called GW170817, began when two dense stellar remnants, called neutron stars, spiraled together and collided, creating a storm of ripples in space and time, or gravitational waves, that traveled outward in all directions. Some of those waves ultimately reached Earth, where the Laser Interferometer Gravitational-wave Observatory detected their signatures, according to a CalTech press release.
Just seconds after the gravitational waves were produced, the neutron star collision resulted in an explosion of matter, as well as light spanning the electromagnetic spectrum, ranging from high-energy gamma rays to low-energy radio waves.
Kasliwal's team was one of the first to observe the collision in visible and infrared light, using the Global Relay of Observatories Watching Transients Happen project, a worldwide network of telescopes that specializes in catching short-lived energetic events such as this. The GROWTH team put together a picture of a cocoon breaking out to explain the rich multi-wavelength dataset.
"Pursuing astrophysics to unlock mysteries of our universe is truly a dream job for mea passion converted into a profession in a dynamic field where the book is actively being written. Discovering where and how the elements in our periodic table are synthesized is exhilarating, said Kasliwal.
Here is the original post:
Posted in Quantum Physics
Comments Off on Two Indian American Women Researchers Win 2022 New Horizons ‘Oscars of Science’ Prize in Physics – India West