Proof That a Complex Quantum Network Is Truly Quantum – Physics

May 11, 2023 • Physics 16, s71

Researchers prove the fully nonclassical nature of a three-party quantum network, a requirement for developing secure quantum communication technologies.

In 1964, John Stewart Bell predicted that correlations between measurements made by two parties on a pair of entangled particles could confirm the fundamental nonclassical nature of the quantum world. In the past few years, researchers have performed various tests of Bell's predictions that were rigorous enough to rule out classical explanations. Now researchers in China and Spain have done the same for a more complex system: a quantum network in which three parties make measurements on pairs of entangled particles generated by two sources [1]. The researchers say that their stringent confirmation of quantum phenomena is encouraging for the development of future secure quantum communication networks.

To ensure a rigorous test of nonclassicality, and thereby prove that classical assumptions of local realism are invalid, the experiment must be carefully designed. If the parties making the measurements can communicate classically during the experiment, or if the two devices creating the entangled particles can influence one another, seemingly quantum behaviors can have classical explanations. In a quantum communication network these loopholes could allow eavesdroppers to listen in.

In their experiment, the researchers close these loopholes by placing each element of their network about 100 m apart. They also determine the measurement settings of the three parties using different quantum random number generators to make sure that the measurements are truly independent. These precautions allow the researchers to demonstrate that the network satisfies a condition known as full network nonlocality, which certifies that neither of the sources of entangled particles can be described by classical physics.

Marric Stephens

Marric Stephens is a Corresponding Editor for Physics Magazine based in Bristol, UK.

Xue-Mei Gu, Liang Huang, Alejandro Pozas-Kerstjens, Yang-Fan Jiang, Dian Wu, Bing Bai, Qi-Chao Sun, Ming-Cheng Chen, Jun Zhang, Sixia Yu, Qiang Zhang, Chao-Yang Lu, and Jian-Wei Pan

Phys. Rev. Lett. 130, 190201 (2023)

Published May 11, 2023



Experiment contradicts Einstein and reveals spooky quantum action with superconducting qubits 30 meters apart – EL PAÍS USA

Quside/ICFO's ultra-fast and ultra-pure quantum random number generator used in the experiment. Credit: Quside/ICFO

Physicist James Trefil once said that quantum mechanics is a place where the human brain will simply never feel comfortable. This discomfort happens because nature, at a microscopic scale, obeys laws at odds with our perception of macroscopic reality. These laws include superposition (a particle can simultaneously be in different states, like Erwin Schrödinger's live and dead cat) and quantum entanglement at a distance. Albert Einstein described the latter as "spooky action at a distance," a principle allowing particles separated by distance to respond instantaneously and behave as a single system. A spectacular experiment that defies the speed of light was recently published in Nature by an international team of scientists led by the Swiss Federal Institute of Technology (ETH) in Zurich, collaborating with Spain's Institute of Photonic Sciences (ICFO) and Quside, a quantum computing company. The study used ultra-fast quantum random number generators to demonstrate, for the first time, "spooky action at a distance" between superconducting quantum bits.

This experiment's results contradict Einstein, who once considered quantum entanglement impossible. The physicist believed in the principle of locality, which states that an object is influenced directly only by its immediate surroundings. But advances in quantum physics have shown that two entangled particles can share a single unified state, even if they are 30 meters apart, as in the Zurich experiment.

Einstein could not accept that an action in one place could have an instantaneous effect elsewhere. But in 1964, John Bell devised a way to test whether quantum entanglement is real. Subsequent experiments with this property by John Clauser, Alain Aspect and Anton Zeilinger earned them a Nobel Prize in 2022.

A major achievement of the study published in Nature is that it experimentally demonstrated, in Bell tests performed on pairs of spatially separated, entangled quantum systems, that quantum physics does not follow the principle of local causality, and it did so with no so-called loopholes. The absence of loopholes means everything happens exactly as quantum physics predicts, with no hidden communication between the particles.

A similar experiment was conducted a year ago by Spanish physicist Adán Cabello of the University of Seville (Spain) with ytterbium and barium ions (Science Advances). But the Nature study raised the complexity level by using two superconducting qubits entangled at temperatures close to absolute zero (-273.15°C or -459.67°F) and 30 meters apart.

Simultaneous measurements of the two qubits showed synchronized responses consistent with spooky action, or entanglement at a distance. To demonstrate the absence of loopholes (that the coordination of states did not come from signals sent between qubits), the scientists made random measurements lasting 17 nanoseconds, which is the time it takes light to travel five meters. A full measurement required another 62 nanoseconds, the time for light to travel 21 meters. Because the systems were 30 meters apart, communication between the two was impossible.
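
The figures in that argument follow from simple light-travel-time arithmetic. As a back-of-the-envelope check (not a calculation quoted in the article), with c ≈ 3×10^8 m/s:

```latex
t = \frac{d}{c}, \qquad
t_{5\,\mathrm{m}} \approx 17\ \mathrm{ns}, \qquad
t_{30\,\mathrm{m}} \approx 100\ \mathrm{ns}.
```

The full measurement (17 ns + 62 ns = 79 ns) therefore finishes well before a light-speed signal could cross the 30 meters separating the qubits, so no such signal could have coordinated the outcomes.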

The new study is significant because it has practical applications beyond the theoretical proof. Morgan W. Mitchell, a professor at the Catalan Institution for Research and Advanced Studies (ICREA) and a co-author of the study, said, "With ordinary computing, your home device communicates constantly through the internet with a server. But to do something equivalent with quantum computers, we need to communicate them somehow, but not using classical bits. We have to use quantum bits, and entanglement is the most efficient way to do this."

Mitchell said, "This study shows that experiments like this can be done with the same superconductors used by Google and IBM. Other experiments used systems with a single pair of particles, but ours created entanglement between many electrons at both sites. And we achieved this for the first time without loopholes."

According to Mitchell, their experiment made progress toward distributed quantum computing with multiple computers at multiple sites. "It's a long-term goal that we're not going to achieve immediately. But this experiment demonstrated its feasibility."

Carlos Abellán, an expert in photonics and Quside's co-founder and CEO, said the experiment created a spectacular and unique technology that synchronized two particles with unprecedented speed. This required generating quantum random numbers and extracting them at extraordinarily fast speeds (17 nanoseconds) to eliminate any possibility of communication between the qubits. "We had to engineer new ways of generating and extracting the random numbers before the information reached the other side. We needed to double the speed of earlier systems," said Abellán. "Instead of using one device for calculations, we connected eight devices in parallel, and then synchronized and combined the signals. This gave us 16 random number generators with double the speed. If we had taken 19 nanoseconds instead of 17, the experiment would have been invalidated."
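
The engineering idea Abellán describes, several entropy sources running in parallel with their outputs combined so that a fresh setting bit is ready within the time budget, can be sketched in a few lines. This is purely illustrative: the device count, the XOR combination and every function name below are assumptions, not Quside's actual design.

```python
import secrets

N_DEVICES = 8  # illustrative; the article mentions eight devices in parallel

def read_raw_bit(device_id: int) -> int:
    """Stand-in for reading one raw bit from hardware entropy source `device_id`."""
    return secrets.randbits(1)

def setting_bit() -> int:
    # XOR-combining independent streams stays unbiased as long as at least one
    # source is good, and lets the devices produce their bits in parallel.
    bit = 0
    for device in range(N_DEVICES):
        bit ^= read_raw_bit(device)
    return bit

if __name__ == "__main__":
    print("measurement-setting bit:", setting_bit())
```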

The experiment proved that quantum information can be transmitted between separate superconducting circuits housed in cryogenic systems. In other words, it works with currently available quantum computing systems. But why two separate systems can behave as one is still unexplained. "It's a question for the philosophers, and a very difficult one at that. You can ask 10 different physicists and you're going to get 10 different answers. It's a mystery for new generations to solve. But these experiments prove that it really exists," said Mitchell.



Stephen Hawking and I created his final theory of the cosmos: here's what it reveals about the origins of time and life – Phys.org


The late physicist Stephen Hawking first asked me to work with him to develop "a new quantum theory of the Big Bang" in 1998. What started out as a doctoral project evolved over some 20 years into an intense collaboration that ended only with his passing on March 14, 2018.

The enigma at the center of our research throughout this period was how the Big Bang could have created conditions so perfectly hospitable to life. Our answer is being published in a new book, "On the Origin of Time: Stephen Hawking's Final Theory."

Questions about the ultimate origin of the cosmos, or universe, take physics out of its comfort zone. Yet this was exactly where Hawking liked to venture. The hope of cracking the riddle of cosmic design drove much of Hawking's research in cosmology. "To boldly go where Star Trek fears to tread" was his motto, and also his screen saver.

Our shared scientific quest meant that we inevitably grew close. Being around him, one could not fail to be influenced by his determination and optimism that we could tackle mystifying questions. He made me feel as if we were writing our own creation story, which, in a sense, we did.

In the old days, it was thought that the apparent design of the cosmos meant there had to be a designer: a God. Today, scientists instead point to the laws of physics. These laws have a number of striking life-engendering properties. Take the amount of matter and energy in the universe, the delicate ratios of the forces, or the number of spatial dimensions.

Physicists have discovered that if you tweak these properties ever so slightly, it renders the universe lifeless. It almost feels as if the universe is a fix, even a big one.

But where do the laws of physics come from? From Albert Einstein to Hawking in his earlier work, most 20th-century physicists regarded the mathematical relationships that underlie the physical laws as eternal truths. In this view, the apparent design of the cosmos is a matter of mathematical necessity. The universe is the way it is because nature had no choice.

Around the turn of the 21st century, a different explanation emerged. Perhaps we live in a multiverse, an enormous space that spawns a patchwork of universes, each with its own kind of Big Bang and physics. It would make sense, statistically, for a few of these universes to be life-friendly.

However, such multiverse musings soon got caught in a spiral of paradoxes and yielded no verifiable predictions.

Can we do better? Yes, Hawking and I found out, but only by relinquishing the idea, inherent in multiverse cosmology, that our physical theories can take a God's-eye view, as if standing outside the entire cosmos.

It is an obvious and seemingly tautological point: cosmological theory must account for the fact that we exist within the universe. "We are not angels who view the universe from the outside," Hawking told me. "Our theories are never decoupled from us."

We set out to rethink cosmology from an observer's perspective. This required adopting the strange rules of quantum mechanics, which governs the microworld of particles and atoms.

According to quantum mechanics, particles can be in several possible locations at the same time, a property called superposition. It is only when a particle is observed that it (randomly) picks a definite position. Quantum mechanics also involves random jumps and fluctuations, such as particles popping out of empty space and disappearing again.

In a quantum universe, therefore, a tangible past and future emerge out of a haze of possibilities by means of a continual process of observing. Such quantum observations don't need to be carried out by humans. The environment or even a single particle can "observe".

Countless such quantum acts of observation constantly transform what might be into what does happen, thereby drawing the universe more firmly into existence. And once something has been observed, all other possibilities become irrelevant.

We discovered that when looking back at the earliest stages of the universe through a quantum lens, there's a deeper level of evolution in which even the laws of physics change and evolve, in sync with the universe that is taking shape. What's more, this meta-evolution has a Darwinian flavor.

Variation enters because random quantum jumps cause frequent excursions from what's most probable. Selection enters because some of these excursions can be amplified and frozen, thanks to quantum observation. The interplay between these two competing forces, variation and selection, in the primeval universe produced a branching tree of physical laws.

The upshot is a profound revision of the fundamentals of cosmology. Cosmologists usually start by assuming laws and initial conditions that existed at the moment of the Big Bang, then consider how today's universe evolved from them. But we suggest that these laws are themselves the result of evolution.

Dimensions, forces, and particle species transmute and diversify in the furnace of the hot Big Bang, somewhat analogous to how biological species would emerge billions of years later, and acquire their effective form over time.

Moreover, the randomness involved means that the outcome of this evolution, the specific set of physical laws that makes our universe what it is, can only be understood in retrospect.

In some sense, the early universe was a superposition of an enormous number of possible worlds. But we are looking at the universe today at a time when humans, galaxies and planets exist. That means we see the history that led to our evolution.

We observe parameters with "lucky values". But we are wrong to assume they were somehow designed or always like that.

The crux of our hypothesis is that, reasoning backward in time, evolution towards more simplicity and less structure continues all the way. Ultimately, even time and, with it, the physical laws fade away.

This view is especially borne out by the holographic form of our theory. The "holographic principle" in physics predicts that just as a hologram appears to have three dimensions when it is in fact encoded in only two dimensions, the evolution of the entire universe is similarly encoded on an abstract, timeless surface.

Hawking and I view time and causality as "emergent qualities", having no prior existence but arising from the interactions between countless quantum particles. It's a bit like how temperature emerges from many atoms moving collectively, even though no single atom has temperature.

One ventures back in time by zooming out and taking a fuzzier look at the hologram. Eventually, however, one loses all information encoded in the hologram. This would be the origin of timethe Big Bang.

For almost a century, we have studied the origin of the universe against the stable background of immutable laws of nature. But our theory reads the universe's history from within and as one that includes, in its earliest stages, the genealogy of the physical laws. It isn't the laws as such but their capacity to transmute that has the final word.

Future cosmological observations may find evidence of this. For instance, precision observations of gravitational waves, ripples in the fabric of spacetime, may reveal signatures of some of the early branches of the universe. If spotted, Hawking's cosmological finale may well prove to be his greatest scientific legacy.


Leading mathematician wants to solve the riddle of a million quantum particles: "It can only be done with a blackboard … – EurekAlert

Image: Professor Søren Fournais of the University of Copenhagen. Credit: Jim Hyer/University of Copenhagen

"Imagine one of those spectacular opening ceremonies at the Olympics, where a huge crowd suddenly gathers as a unit and synchronizes its movements like a flock of starlings. In a few very special and strange cases, the same occurs in the world of atoms. In them, a million atoms that have all entered the same quantum state behave completely synchronously," explains University of Copenhagen math professor Sren Fournais.

Fournais is referring to the mysterious quantum phenomenon known as Bose-Einstein condensates. These can occur if certain kinds of atoms are successfully cooled to temperatures near absolute zero. Here, anywhere from 100,000 to several million atoms entangle, with all of them transitioning into the same quantum state at the same moment, a point at which the substance is neither solid, liquid, gas nor plasma. It is often described as being in a fifth state of its own.

While Bose and Einstein predicted the existence of such a phenomenon in the 1920s, it wasn't until 1995 that a Bose-Einstein condensate was produced in the lab, a Nobel Prize-winning achievement. And even though researchers worldwide are busy exploring the new quantum state, much remains a mystery.

"Our general understanding of these extreme physical systems is incomplete. We lack the mathematical tools to analyse and understand condensates and thereby the ability to possibly put them to good use. The mathematical challenge is that you have a tremendous number of particles interacting with each other, and that the correlations between these particles are crucial for their behavior. The equations were written down a long time ago, but one cannot just solve them. In the years ahead, well focus on understanding these solutions," says University of Copenhagen mathematics professor Sren Fournais.

Fournais is one of the world's leading researchers in quantum mechanical equations and has already spent ten years of his career wrestling with the wondrous nature of Bose-Einstein condensates. He is now dedicating the next five years to getting closer to the phenomenon's complex mathematical solutions. To do so, the European Research Council has awarded him DKK 15 million through one of its highly prestigious ERC Advanced Grants.

Quantum mechanics takes place in the micro-world of atoms, within which the quantum effects are so minuscule that we do not experience them in our daily lives. The truly fascinating aspect of a Bose-Einstein condensate is that because it is made up of huge masses of atoms, it is nearly large enough to be observed with the naked eye. This makes it ideal for teaching us about quantum mechanics and for conducting experiments on a scale large enough to actually see them.

Researchers around the world are working to exploit the quantum properties of Bose-Einstein condensates in various ways. In 2022, Dutch scientists built an atomic laser based on a Bose-Einstein condensate. Danish professor Lene Hau of Harvard University has demonstrated that she can stop light using a Bose-Einstein condensate. Work is also underway around the world to base a quantum computer on these icy atoms. However, there are still plenty of bumps in the road, and the condensates are still only used at the basic research level.

According to Søren Fournais, the missing answers are found in the equations, or at least the theoretical answers:

"The beautiful and fascinating thing about mathematical physics is that we can write down the laws of nature in relatively simple equations on a piece of paper. Those equations contain an incredible amount of information in fact, all the answers. But how can we extract the information that tells us what we want to know about these wild physical systems? It's a huge challenge, but it's possible," says Fournais.

As with the phase transition that occurs when water freezes into ice, Bose-Einstein condensation, in which atoms transition to a quantum state, is a phase transition. It is this physical transformation that Sren Fournais dreams of finding the mathematical solution to.

"My big dream is to mathematically prove the phase transition that the Bose-Einstein condensation is. But demonstrating phase transition is notorious for being extremely difficult, because you go from particles moving randomly about to them sticking. A symmetry is broken. It is very difficult to see why the particles do so and exactly when it will happen. Consequently, up until now, there is only one physical system where we have succeeded in saying something mathematical about this phase transition," says Fournais, who adds:

"I think that its unrealistic to solve this task in five years but hope that the project will get us closer than we are today. And then we have many intermediate objectives that we hope to achieve along the way."

Over the past 5-10 years, Professor Fournais has been behind some of the most significant global breakthroughs having to do with the mathematical understanding of Bose-Einstein condensates. Among other things, he proved the formula for the ground-state energy of a Bose-Einstein condensate, a question that had remained unanswered since about 1960.
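
The result being referred to is the Lee-Huang-Yang expansion for the ground-state energy of the dilute Bose gas, which Fournais established rigorously together with Jan Philip Solovej. In units with ħ = 2m = 1, the energy per unit volume at density ρ and scattering length a reads (a summary of the standard statement, not text from the press release)

```latex
e(\rho) = 4\pi a \rho^{2}
\left(1 + \frac{128}{15\sqrt{\pi}}\,\sqrt{\rho a^{3}} + o\!\left(\sqrt{\rho a^{3}}\right)\right)
\quad \text{as } \rho a^{3} \to 0 .
```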

So, how does one of Europe's leading mathematicians approach such a task? According to Søren Fournais, it doesn't happen in front of a computer:

"A computer can make a numerical calculation for 10 or 20 particles, but not a million. So, the computer is not a useful tool for us. Instead, it's a matter of a lot of coffee, good ideas and hard work at the blackboard with chalk," says the researcher, who concludes:

"A typical week begins with a few good new ideas from the previous weekend that you are eager to pursue. And then you work Monday, Tuesday and Wednesday until it seems to be progressing. Except that on Thursday, you realize that it isnt. But you get wiser. And then, the next round of ideas for the following week is perhaps a bit more tested. And then at some point, everything falls into place."



Quantum mechanics and the return of free will | Tim Andersen – IAI

The common definition of free will often has problems when relating to desire and power to choose. An alternative definition that ties free will to different outcomes for life despite one's past is supported by the probabilistic nature of quantum physics. This definition is compatible with the Many Worlds Interpretation of quantum physics, which refutes the conclusion that randomness does not imply free will, writes Tim Andersen.

Free will is one of those things where people tend to be very attached to its being true or false, and yet most people implicitly treat it as true. Consider that we hold people accountable for their actions, as if they decided to carry out those actions of their own free will. We likewise reward people for their successes and discoveries. If Albert Einstein didn't really make his discoveries, but it was instead inevitable that his brain would do so, does he really deserve his Nobel Prize?


Some argue that we should accept that free will is a myth and change our society accordingly. Our justice system (especially in the United States) is heavily invested in the free will hypothesis. We punish people for crimes. We do not treat them like broken machines that need to be fixed. Other nations, like Norway, however, take exactly this approach.

Many physicists believe that free will is incompatible with modern physics.

The argument goes like this:

(1) Classical (non-quantum) mechanics is deterministic. Given any initial conditions of a classical system, the entire future and past state of the system can be determined. There is no free will in determinism.

(2) Quantum mechanics allows for randomness in the outcomes of experiments, but we have no control over those outcomes. There is no free will in randomness.

(3) Human will is a product of the brain which is a physical object. All physical objects are subject to physics and the sum total of physics is contained in classical and quantum mechanics (technically, classical is an approximation of quantum).

Ergo, humans have no free will. Our brains are simply carrying out a program that, while appearing to be making free choices, is in fact just a very complex algorithm.

___

As Schopenhauer said, "Man can will what he wants but cannot will what he wills."

___

The logic seems sound and in any philosophical discourse we need to look at the logic and decide (whether freely or not). There are quite a few ways to counter this argument. The first is to object to #3. This is the approach many religions take. Human will is not necessarily reducible to physical causation. Therefore, it is beyond physical law. The brain simply interacts with the will and carries out its commands.

Another is to question the reductionist assumption of the conclusion, i.e., that everything is reducible to the properties of its constituent parts, no matter how complex. If the individual parts are deterministic, so must the whole be. Science has not proven that yet. Perhaps if we could model a human brain in a computer in its entirety, we might know better.

Another approach is to question what the scientist means by free will. Most scientists aren't philosophers and don't necessarily define their philosophical terms as clearly as their scientific ones. The common definition of free will is that it is the freedom to choose, the ability to decide to choose or do otherwise, to be the source of one's actions. Philosophers largely tie free will to the concept of moral responsibility. Am I morally responsible for my actions?


To put it precisely, an agent S is morally accountable for performing an action φ =df. S deserves praise if φ goes beyond what can be reasonably expected of S, and S deserves blame if φ is morally wrong. The key then is whether an agent has the ability or power to do otherwise.

Now, what does it mean to have the ability to choose or do otherwise? It can't simply mean to have the power, because one must have both the power and the desire. But what if one does not have the power to change what one desires? Then you are stuck with no free will, or only a pseudo-free will in which you can change your actions but not your desires.

As Schopenhauer said, "Man can will what he wants but cannot will what he wills." Consider: if I have a choice to practice my cello or lift weights, I choose to practice my cello. Now, I seem to have had the power to choose to lift weights, but I did not have the desire to do so. Did I have the power to desire differently?

From the argument of physics, the brain's desires are either fixed results of classical laws or random results of quantum effects. A random quantum fluctuation creates a voltage bias in one of my neurons, which cascades to other neurons, and suddenly I want to lift weights. According to the physicist, I did not choose. It just appeared as if I did. And certainly if I had chosen differently I would have done differently, and yet in reality quantum physics chose for me by rolling a cosmic die.

___

You have to see free will as having the power to have different outcomes for your life despite your past.

___

This kind of free will definition, which is the one most people think of and the one that most scientists seem to assume, has a lot of problems. It's hard to even understand what we really mean by freedom, because it gets all muddled with desire.

Without a good definition, it is impossible to argue that something exists or not. Another definition of free will avoids this problem and throws a monkey wrench into the scientist on the street's knee-jerk attitude that free will is impossible in a quantum world.

This alternative is called the categorical analysis and is stated as follows: An agent S has the ability to choose or do otherwise than φ at time t if and only if it was possible, holding fixed everything up to t, that S choose or do otherwise than φ at t.

What this means is that we have to take into account the state of the agent up until the time the choice is made and given that state ask if there is a possible world where the agent makes a choice other than the one he or she made. That, then, is what freedom of choice is.
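
Stated symbolically (a paraphrase of the definition above, with φ the action and w ranging over possible worlds):

```latex
\mathrm{CanDoOtherwise}(S,\varphi,t) \iff
\exists\, w :\;
\mathrm{history}_{w}(-\infty,t] = \mathrm{history}_{\mathrm{actual}}(-\infty,t]
\;\wedge\;
S \text{ does otherwise than } \varphi \text{ at } t \text{ in } w .
```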


Oxford physicist David Deutsch favours this definition of free will because it is compatible with his Many Worlds Interpretation (MWI) of quantum physics. But even if you don't accept MWI, what it says is that there are probable states that have the same past up until a point t, and then a choice is made and a non-deterministic path is followed. It doesn't matter if those paths are all real worlds, as Deutsch believes. What matters is that they have different futures, and all interpretations of quantum physics as-is support this idea.

If that is true, and this is the most important point, then you can say that freedom of choice exists because the agent made different choices in different probable realities. Thus, the agent had the power to choose and exercised it.

This definition of free will is interesting from a physics perspective because it is not true in a classical, deterministic world in which all pasts have the same future, but it is true in a quantum world where all pasts do not share the same future. Thus, it refutes the conclusion from #2 above that randomness does not imply free will. It only does so if you define free will in the way that people commonly understand it, which is, frankly, not a defensible definition.

Rather, you have to see free will as having the power to have different outcomes for your life despite your past. Whether you can affect those outcomes by changing your actions or desires is a meaningless statement.

___

Freedom of choice exists because the agent made different choices in different probable realities.

___

Thus if I made the choice to practice in 60% of quantum futures and lift weights in 40%, then that proves I had the power to do otherwise. If I practiced in 100% of futures, then I did not have that power. Whether science can prove this is an open question, but it does not require any modification to quantum theory. Indeed, some modifications attempt to remove this possibility, incorrectly I believe.

While it may seem that this is sleight of hand in changing definitions, it is in reality making the definition of free will precise by saying that it is exactly the power to do otherwise. This is evidenced by quantum physics, i.e., because more than one outcome of a choice can occur from a single state of the universe, an agent does have the power to do otherwise which is what free will is.


How we could discover quantum gravity without rebuilding space-time – New Scientist


MODERN physics has two stories to tell about our universe. The first says it is fundamentally made of space-time: a continuous, stretchy fabric that has ballooned since the dawn of time. The other says it is fundamentally made of indivisible things that can't decide where they are, or even when.

Both stories are compelling, describing what we observe with incredible accuracy. The big difference, though, is the scale at which they apply. Albert Einstein's theory of general relativity, which describes gravity, space and time, rules over very massive objects and cosmic distances. Quantum physics, meanwhile, governs tiny, sprightly atoms and subatomic particles.

Ultimately, both stories can't be true. Nowhere is this more apparent than at the big bang, where everything in the universe was compacted into an infinitesimally small point. Here, you need a single theory that encompasses gravity and the quantum realm. "Why we're here is the big question," says Toby Wiseman, a theorist at Imperial College London. "It seems that quantum gravity is the only answer."

Alas, it is an answer we are yet to find, despite many decades of searching. Quantum gravity means a reconciliation of the continuous and the indivisible, the predictable and the random. There are many ideas, but none can totally incorporate everything. "We're still no better off at understanding the beginning of space and time," says Wiseman.

Most physicists attempting this begin with quantum physics, the workhorse of which is quantum field theory. This describes three of the four forces of nature (electromagnetism, the strong nuclear force and the weak nuclear force) by quantising them as force-carrying elementary particles.


Time Twisted in Quantum Physics: How the Future Might Influence … – SciTechDaily

The 2022 physics Nobel prize was awarded for experimental work demonstrating fundamental breaks in our understanding of the quantum world, leading to discussions around local realism and how it could be refuted. Many theorists believe these experiments challenge either locality (the notion that distant objects require a physical mediator to interact) or realism (the idea that there's an objective state of reality). However, a growing number of experts suggest an alternative approach, retrocausality, which posits that present actions can affect past events, thus preserving both locality and realism.

The 2022 Nobel Prize in physics highlighted the challenges quantum experiments pose to local realism. However, a growing body of experts propose retrocausality as a solution, suggesting that present actions can influence past events, thus preserving both locality and realism. This concept offers a novel approach to understanding causation and correlations in quantum mechanics, and despite some critics and confusion with superdeterminism, it is increasingly seen as a viable explanation for recent groundbreaking experiments, potentially safeguarding the core principles of special relativity.

In 2022, the physics Nobel prize was awarded for experimental work showing that the quantum world must break some of our fundamental intuitions about how the universe works.

Many look at those experiments and conclude that they challenge locality, the intuition that distant objects need a physical mediator to interact. And indeed, a mysterious connection between distant particles would be one way to explain these experimental results.

Others instead think the experiments challenge realism, the intuition that there's an objective state of affairs underlying our experience. After all, the experiments are only difficult to explain if our measurements are thought to correspond to something real. Either way, many physicists agree about what's been called the "death by experiment" of local realism.

But what if both of these intuitions can be saved, at the expense of a third? A growing group of experts think that we should instead abandon the assumption that present actions can't affect past events. Called retrocausality, this option claims to rescue both locality and realism.

What is causation anyway? Let's start with the line everyone knows: correlation is not causation. Some correlations are causation, but not all. What's the difference?

Consider two examples. (1) There's a correlation between a barometer needle and the weather; that's why we learn about the weather by looking at the barometer. But no one thinks that the barometer needle is causing the weather. (2) Drinking strong coffee is correlated with a raised heart rate. Here it seems right to say that the first is causing the second.

The difference is that if we wiggle the barometer needle, we won't change the weather. The weather and the barometer needle are both controlled by a third thing, the atmospheric pressure; that's why they are correlated. When we control the needle ourselves, we break the link to the air pressure, and the correlation goes away.

But if we intervene to change someone's coffee consumption, we'll usually change their heart rate, too. Causal correlations are those that still hold when we wiggle one of the variables.

These days, the science of looking for these robust correlations is called causal discovery. It's a big name for a simple idea: finding out what else changes when we wiggle things around us.
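
A toy simulation makes the "wiggle test" concrete, using the article's two examples. Every number here (the pressure threshold, the heart-rate figures, the noise) is made up purely for illustration.

```python
import random

def weather_and_needle(forced_needle=None):
    """Atmospheric pressure is a common cause of both the weather and the needle."""
    pressure = random.gauss(1013, 10)
    weather = "rain" if pressure < 1010 else "fair"
    needle = pressure if forced_needle is None else forced_needle  # the "wiggle"
    return weather, needle

def heart_rate(cups_of_coffee):
    """Coffee directly raises heart rate (plus some noise)."""
    return 60 + 8 * cups_of_coffee + random.gauss(0, 3)

N = 10_000

# Forcing the needle low does NOT change how often it rains: the correlation
# with the weather disappears under intervention, so it was not causal.
rain_frac = sum(weather_and_needle(forced_needle=900)[0] == "rain" for _ in range(N)) / N
print(f"rain fraction with needle forced low: {rain_frac:.2f}  (just the base rate)")

# Forcing coffee consumption DOES shift heart rate: the correlation survives
# the wiggle, which is what marks it as causal.
for cups in (0, 3):
    mean_hr = sum(heart_rate(cups) for _ in range(N)) / N
    print(f"mean heart rate at {cups} cups: {mean_hr:.1f}")
```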

In ordinary life, we usually take for granted that the effects of a wiggle are going to show up later than the wiggle itself. This is such a natural assumption that we don't notice that we're making it.

But nothing in the scientific method requires this to happen, and it is easily abandoned in fantasy fiction. Similarly, in some religions we pray that our loved ones are among the survivors of yesterday's shipwreck, say. We're imagining that something we do now can affect something in the past. That's retrocausality.

The quantum threat to locality (that distant objects need a physical mediator to interact) stems from an argument by the Northern Ireland physicist John Bell in the 1960s. Bell considered experiments in which two hypothetical physicists, Alice and Bob, each receive particles from a common source. Each chooses one of several measurement settings, and then records a measurement outcome. Repeated many times, the experiment generates a list of results.

Bell realized that quantum mechanics predicts that there will be strange correlations (now confirmed) in this data. They seemed to imply that Alice's choice of setting has a subtle nonlocal influence on Bob's outcome, and vice versa, even though Alice and Bob might be light years apart. Bell's argument is said to pose a threat to Albert Einstein's theory of special relativity, which is an essential part of modern physics.

But that's because Bell assumed that quantum particles don't know what measurements they are going to encounter in the future. Retrocausal models propose that Alice's and Bob's measurement choices affect the particles back at the source. This can explain the strange correlations without breaking special relativity.
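
For readers who want the textbook form of the constraint (the standard CHSH statement, not something spelled out in this article): for two particles in the singlet state, quantum mechanics predicts the outcome correlation

```latex
E(a,b) = -\cos(a-b)
```

for measurement angles a and b, while any local model obeys the CHSH bound

```latex
S = E(a,b) - E(a,b') + E(a',b) + E(a',b'), \qquad |S| \le 2 .
```

Choosing a = 0, a' = π/2, b = π/4, b' = 3π/4 gives the quantum value |S| = 2√2 ≈ 2.83. Retrocausal models aim to reproduce this violation while keeping all influences local in the ordinary sense.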

In recent work, we've proposed a simple mechanism for the strange correlations: it involves a familiar statistical phenomenon called Berkson's bias.

There's now a thriving group of scholars who work on quantum retrocausality. But it's still invisible to some experts in the wider field. It gets confused for a different view called superdeterminism.

Superdeterminism agrees with retrocausality that measurement choices and the underlying properties of the particles are somehow correlated.

But superdeterminism treats it like the correlation between the weather and the barometer needle. It assumes there's some mysterious third thing, a "superdeterminer," that controls and correlates both our choices and the particles, the way atmospheric pressure controls both the weather and the barometer.

So superdeterminism denies that measurement choices are things we are free to wiggle at will; they are predetermined. Free wiggles would break the correlation, just as in the barometer case. Critics object that superdeterminism thus undercuts core assumptions necessary to undertake scientific experiments. They also say that it means denying free will, because something is controlling both the measurement choices and the particles.

These objections don't apply to retrocausality. Retrocausalists do scientific causal discovery in the usual free, wiggly way. We say it is folk who dismiss retrocausality who are forgetting the scientific method, if they refuse to follow the evidence where it leads.

What is the evidence for retrocausality? Critics ask for experimental evidence, but that's the easy bit: the relevant experiments just won a Nobel Prize. The tricky part is showing that retrocausality gives the best explanation of these results.

We've mentioned the potential to remove the threat to Einstein's special relativity. That's a pretty big hint, in our view, and it's surprising it has taken so long to explore it. The confusion with superdeterminism seems mainly to blame.

In addition, we and others have argued that retrocausality makes better sense of the fact that the microworld of particles doesn't care about the difference between past and future.

We don't mean that it is all plain sailing. The biggest worry about retrocausation is the possibility of sending signals to the past, opening the door to the paradoxes of time travel. But to make a paradox, the effect in the past has to be measured. If our young grandmother can't read our advice to avoid marrying grandpa, meaning we wouldn't come to exist, there's no paradox. And in the quantum case, it's well known that we can never measure everything at once.

Still, there's work to do in devising concrete retrocausal models that enforce this restriction that you can't measure everything at once. So we'll close with a cautious conclusion. At this stage, it's retrocausality that has the wind in its sails, hull down towards the biggest prize of all: saving locality and realism from death by experiment.


This article was first published in The Conversation.


ETH Zurich Researchers Strengthen Quantum Mechanics with … – HPCwire

May 12, 2023 A group of researchers led by Andreas Wallraff, Professor of Solid State Physics at ETH Zurich, has performed a loophole-free Bell test to disprove the concept of local causality formulated by Albert Einstein in response to quantum mechanics.

By showing that quantum mechanical objects that are far apart can be much more strongly correlated with each other than is possible in conventional systems, the researchers have provided further confirmation for quantum mechanics. What's special about this experiment is that the researchers were able for the first time to perform it using superconducting circuits, which are considered to be promising candidates for building powerful quantum computers.

An Old Dispute

A Bell test is based on an experimental setup that was initially devised as a thought experiment by British physicist John Bell in the 1960s. Bell wanted to settle a question that the greats of physics had already argued about in the 1930s: Are the predictions of quantum mechanics, which run completely counter to everyday intuition, correct, or do the conventional concepts of causality also apply in the atomic microcosm, as Albert Einstein believed?

To answer this question, Bell proposed to perform a random measurement on two entangled particles at the same time and check it against Bell's inequality. If Einstein's concept of local causality is true, these experiments will always satisfy Bell's inequality. By contrast, quantum mechanics predicts that they will violate it.

The Last Doubts Dispelled

In the early 1970s, John Francis Clauser and Stuart Freedman carried out the first practical Bell test. In their experiments, the two researchers were able to prove that Bell's inequality is indeed violated. But they had to make certain assumptions in their experiments to be able to conduct them in the first place. So, theoretically, it might still have been the case that Einstein was correct to be skeptical of quantum mechanics.

Over time, however, more and more of these loopholes could be closed. Finally in 2015, various groups succeeded in conducting the first truly loophole-free Bell tests.

Promising Applications

Wallraffs group can now confirm these results with a novel experiment. The work by the ETH researchers published in the scientific journal Nature demonstrates that research on this topic has not concluded despite the initial confirmation seven years ago.

A number of factors contribute to this outcome. The experiment conducted by the ETH researchers establishes that, despite their larger size compared to microscopic quantum objects, superconducting circuits still abide by the principles of quantum mechanics. These electronic circuits, which are several hundred micrometers in size and made from superconducting materials, function at microwave frequencies and are known as macroscopic quantum objects.

In addition, Bell tests also have a practical significance. "Modified Bell tests can be used in cryptography, for example, to demonstrate that information is actually transmitted in encrypted form," explained Simon Storz, a doctoral student in Wallraff's group. "With our approach, we can prove much more efficiently than is possible in other experimental setups that Bell's inequality is violated. That makes it particularly interesting for practical applications."

The Search for a Compromise

To carry out their research, the team required an advanced testing facility. A critical aspect of a loophole-free Bell test is ensuring that no information exchange occurs between the two entangled circuits before the completion of the quantum measurements. As information can only travel as fast as the speed of light, the measurement process must be faster than the time taken for a light particle to travel from one circuit to another.

When designing the experiment, striking the right balance is crucial. Increasing the distance between the two superconducting circuits allows for more time to conduct the measurement, but it also complicates the experimental setup. This is due to the need for the entire experiment to be carried out in a vacuum near absolute zero.

The ETH researchers determined that the minimum distance needed for a successful loophole-free Bell test is approximately 33 meters. A light particle takes about 110 nanoseconds to travel this distance in a vacuum, which is slightly longer than the time it took for the researchers to complete the experiment.
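
That distance-to-time conversion is just the light-travel-time relation (simple arithmetic, consistent with the figures quoted above):

```latex
t = \frac{d}{c} = \frac{33\ \mathrm{m}}{2.998\times 10^{8}\ \mathrm{m/s}} \approx 110\ \mathrm{ns} .
```

As long as entangling the circuits, choosing the settings and recording the outcomes all finish in less than roughly 110 ns, no light-speed signal between the two sites can account for the correlations.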

Thirty-meter Vacuum

Wallraff's team has built an impressive facility in the underground passageways of the ETH campus. At each of its two ends is a cryostat containing a superconducting circuit. These two cooling apparatuses are connected by a 30-meter-long tube with interiors cooled to a temperature just above absolute zero (-273.15°C).

Before the start of each measurement, a microwave photon is transmitted from one of the two superconducting circuits to the other so that the two circuits become entangled. Random number generators then decide which measurements are made on the two circuits as part of the Bell test. Next, the measurement results on both sides are compared.

Large-scale Entanglement

After evaluating more than one million measurements, the researchers have shown with very high statistical certainty that Bell's inequality is violated in this experimental setup. In other words, they have confirmed that quantum mechanics also allows for non-local correlations in macroscopic electrical circuits and, consequently, that superconducting circuits can be entangled over a large distance. This opens up interesting possible applications in the field of distributed quantum computing and quantum cryptography.
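
The bookkeeping behind such a claim can be sketched in a few lines: random settings each round, ±1 outcomes, and an estimate of the CHSH quantity S from the accumulated counts. In the sketch below the outcomes are simulated from ideal singlet-state statistics rather than read from the superconducting circuits, so it illustrates the analysis, not the experiment itself.

```python
import math
import random

# Two settings per side; these angles give the maximal quantum violation.
ALICE_SETTINGS = [0.0, math.pi / 2]
BOB_SETTINGS = [math.pi / 4, 3 * math.pi / 4]

def simulated_outcomes(theta_a, theta_b):
    """Sample correlated ±1 outcomes with P(equal) = (1 - cos(theta_a - theta_b)) / 2."""
    a = random.choice((+1, -1))
    p_equal = (1 - math.cos(theta_a - theta_b)) / 2
    b = a if random.random() < p_equal else -a
    return a, b

sums = {(i, j): 0 for i in range(2) for j in range(2)}
counts = {(i, j): 0 for i in range(2) for j in range(2)}

for _ in range(1_000_000):                            # "more than one million measurements"
    i, j = random.randrange(2), random.randrange(2)   # RNG-chosen settings each round
    a, b = simulated_outcomes(ALICE_SETTINGS[i], BOB_SETTINGS[j])
    sums[(i, j)] += a * b
    counts[(i, j)] += 1

E = {key: sums[key] / counts[key] for key in sums}
S = E[(0, 0)] - E[(0, 1)] + E[(1, 0)] + E[(1, 1)]
print(f"estimated CHSH value S = {S:.3f}  (|S| > 2 violates Bell's inequality)")
```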

"Building the facility and carrying out the test was a challenge," Wallraff said. "We were able to finance the project over a period of six years with funding from an ERC Advanced Grant. Just cooling the entire experimental setup to a temperature close to absolute zero takes considerable effort."

He explained, "There are 1.3 tons of copper and 14,000 screws in our machine, as well as a great deal of physics knowledge and engineering know-how." Wallraff further believes that, in principle, it is possible to construct facilities capable of overcoming even greater distances using the same approach. Such technology could potentially be employed to connect superconducting quantum computers across vast distances.

Source: Felix Würsten, ETH Zürich


Theoretical Physicists Discover Why Optical Cavities Slow Down … – SciTechDaily

Resonant vibrational strong coupling can inhibit chemical reactions. Strong resonant coupling between cavity and vibrational modes can selectively inhibit a chemical reaction, i.e., prevent the appearance of products that would form outside the cavity environment. Credit: E. Ronca / C. Schäfer

Scientists have discovered why chemical reactions are slowed down in mirrored cavities, where molecules interact with light. The team used Quantum-Electrodynamical Density-Functional Theory to find that the conditions inside the optical cavity affected the energy that makes atoms vibrate around the molecules single bonds, which are critical to the reaction.

Chemical processes are all around us. From novel materials to more effective medicines or plastic products, chemical reactions play a key role in the design of the things we use every day. Scientists constantly search for better ways to control these reactions, for example, to develop new materials. Now an international research team led by the MPSD has found an explanation for why chemical reactions are slowed down inside mirrored cavities, where molecules are forced to interact with light. Their work, now published in the journal Nature Communications, is a key step in understanding this experimentally observed process.

Chemical reactions occur on the scale of atomic vibrations, one million times smaller than the thickness of a human hair. These tiny movements are difficult to control. Established methods include the control of temperature or providing surfaces and complexes in solution made from rare materials. They tackle the problem on a larger scale and cannot target specific parts of the molecule. Ideally, researchers would like to provide only a small amount of energy to some atoms at the right time, just like a billiard player wants to nudge just one ball on the table.

In recent years, it became clear that molecules undergo fundamental changes when they are placed in optical cavities with opposing mirrors. Inside those confines, the system is forced to interact with virtual light, or photons. Crucially, this interaction changes the rate of chemical reactions, an effect that was observed in experiments but whose underlying mechanism remained a mystery.

Now a team of theoretical physicists from Germany, Sweden, Italy, and the USA has come up with a possible explanation that qualitatively agrees with the experimental results. The team involved researchers from the Max Planck Institute for the Structure and Dynamics of Matter (MPSD) in Hamburg, Germany, Chalmers University of Technology in Sweden, the Center for Computational Quantum Physics at the Flatiron Institute, Harvard University (both in the U.S.A.), and the Istituto per i Processi Chimico Fisici at the CNR (National Research Council) in Italy.

Using an advanced theoretical method, called Quantum-Electrodynamical Density-Functional Theory (QEDFT), the authors have unveiled the microscopic mechanism which reduces the chemical reaction rate, for the specific case of the deprotection reaction of 1-phenyl-2-trimethylsilylacetylene. Their findings are in agreement with the observations by the group of Thomas Ebbesen in Strasbourg.

The team discovered that the conditions inside the optical cavity affect the energy which makes the atoms vibrate around the molecule's single bonds, which are critical for the chemical reaction. Outside the cavity, that energy is usually deposited in a single bond during the reaction, which can ultimately break the bond, a key step in a chemical reaction. "However, we find that the cavity introduces a new pathway, so that the energy is less likely to be funneled only into a single bond," says lead author Christian Schäfer. "This is the key process which inhibits the chemical reaction, because the probability to break a specific bond is diminished."
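
A common starting point in QEDFT-style treatments is a length-gauge, single-mode light-matter Hamiltonian of the form below. It is included only as orientation for what "strong coupling to a cavity mode" means formally; it is not reproduced from the paper itself.

```latex
\hat H = \hat H_{\mathrm{mol}}
+ \tfrac{1}{2}\,\hat p^{2}
+ \tfrac{1}{2}\,\omega_c^{2}
\left(\hat q + \frac{\boldsymbol{\lambda}\cdot\hat{\boldsymbol{\mu}}}{\omega_c}\right)^{2},
```

where ω_c is the cavity-mode frequency, λ the light-matter coupling strength, μ̂ the molecular dipole operator, and (q̂, p̂) the displacement coordinates of the photon mode. When ω_c is tuned into resonance with a molecular vibration, the coupling term hybridizes the cavity mode with that vibration, which is the kind of mechanism that can redistribute energy away from any single bond.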

Manipulating materials through the use of cavities (so-called polaritonic chemistry) is a powerful tool with many potential applications, according to the paper's author Enrico Ronca, who works at CNR: "For instance, it was observed that coupling to specific vibrational excitations can inhibit, steer, and even catalyze a chemical process at room temperature. Our theoretical work enhances the understanding of the underlying microscopic mechanisms for the specific case of a reaction inhibited by the field."

While the authors point out that important aspects remain to be understood and further experimental validation is required, they also highlight the special role of this new direction. "This work puts the controversial field of polaritonic chemistry onto a different level," adds Angel Rubio, the Director of the MPSD's Theory Department. "It provides fundamental insights into the microscopic mechanisms that enable the control of chemical reactions. We expect the present findings to be applicable to a larger set of relevant reactions (including click chemical reactions linked to this year's Nobel Prize in chemistry) under strong light-matter coupling conditions."

Reference: "Shining light on the microscopic resonant mechanism responsible for cavity-mediated chemical reactivity" by Christian Schäfer, Johannes Flick, Enrico Ronca, Prineha Narang and Angel Rubio, 19 December 2022, Nature Communications. DOI: 10.1038/s41467-022-35363-6


Stanley Deser, Whose Ideas on Gravity Help Explain the Universe … – The New York Times

Stanley Deser, a theoretical physicist who helped illuminate the details of gravity and how it shapes the space-time fabric of the universe, died on April 21 in Pasadena, Calif. He was 92.

His death, at a hospital, was confirmed by his daughter, Abigail Deser.

Physicists have long dreamed of devising a theory of everything: a set of equations that neatly and completely describe how the universe works. By the middle of the 20th century, they had come up with two theories that serve as the pillars of modern physics: quantum mechanics and general relativity.

Quantum mechanics describes how, in the subatomic realm, everything is broken up in discrete chunks, or quanta, such as the individual particles of light called photons. Albert Einstein's theory of general relativity had elegantly captured how mass and gravity bend the fabric of space-time.

However, these two pillars did not fit together. General relativity does not contain any notion of quanta; a quantum theory of gravity is an ambition that remains unfinished today.

"The problem we face is how to unify these two into a seamless theory of everything," said Michael Duff, an emeritus professor of physics at Imperial College London in England. "Stanley was amongst the first to tackle this problem."

In 1959, Dr. Deser, along with two other physicists, Richard Arnowitt and Charles Misner, published what is now known as the ADM formalism (named after the initials of their surnames), which rejiggered the equations of general relativity in a form that laid a foundation for work toward a quantum theory of gravity.

"It's a bridge toward quantum," said Edward Witten, a physicist at the Institute for Advanced Study in Princeton, N.J. So far, however, no one has been able to take it to the next step and come up with a unified theory that includes quantum gravity.

The ADM formalism offered an additional benefit: It made the equations of general relativity amenable to computer simulations, enabling scientists to probe phenomena like the space-bending pull of black holes and the universe-shaking explosions when stars collide.

The rejiggered equations split four-dimensional space-time into slices of three-dimensional space, an innovation that allowed computers to handle the complex data and, as Frans Pretorius, a professor of physics at Princeton University, put it, evolve these slices in time to find the full solution.
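
In modern notation, the ADM split writes the space-time interval in terms of data living on those three-dimensional slices (a standard textbook form, not one quoted in the obituary):

```latex
ds^{2} = -N^{2}\,dt^{2} + g_{ij}\left(dx^{i} + N^{i}\,dt\right)\left(dx^{j} + N^{j}\,dt\right),
```

where g_ij is the metric of a spatial slice, the lapse N fixes the separation in time between neighboring slices, and the shift N^i says how the spatial coordinates are carried from one slice to the next. Evolving g_ij and its conjugate momentum slice by slice is exactly what numerical-relativity codes do.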

Dr. Deser is perhaps best known for his work in the 1970s as one of the pioneers of supergravity, which expanded an idea known as supersymmetry to include gravity.

From quantum mechanics, physicists already knew that fundamental particles fell into one of two groups. Familiar constituents of matter like electrons and quarks fall into the group known as fermions, while those that carry fundamental forces, like photons, the particles of light that convey the force of electromagnetism, are known as bosons.

Supersymmetry hypothesizes an as-yet-undiscovered boson partner for every fermion, and a fermion partner for each boson.

Dr. Deser worked with Bruno Zumino, one of the originators of supersymmetry, to add gravity to the theory, creating the theory of supergravity. Supergravity includes gravitons, the gravitational equivalent of photons, and adds a supersymmetric partner, the gravitino.

Experiments using particle accelerators have yet to turn up evidence of any of these partner particles, but the theories have not been disproved, and because of their mathematical elegance, they remain attractive to physicists.

Supergravity is also a key aspect of superstring theories, which attempt to provide a complete explanation of how the universe works, overcoming shortfalls of quantum gravity theories.

"Stanley was one of the most influential researchers on questions related to gravity over his extremely long and distinguished career," said Dr. Witten, who has been at the forefront of devising superstring theories.

Stanley Deser was born in Rovno, Poland, a city now known as Rivne and part of Ukraine, on March 19, 1931. As Jews, his parents, Norman, a chemist, and Miriam, fled Poland's repressive, antisemitic regime in 1935 for Palestine. But prospects for finding work there were dim, and a few months later they moved to Paris.

In 1940, with World War II engulfing Europe, the family narrowly escaped France after Germany invaded.

"They finally realized the danger and decided to leave everything," Dr. Deser wrote of his parents in his autobiography, Forks in the Road. "I rushed with my father to empty our safe. That evening, my mother sewed the coins into a belt of towels, a much-practiced maneuver of refugees, while the rest of us packed a few belongings."

The family fled to Portugal and 11 months later obtained visas to emigrate to the United States. They eventually settled in New York City, where Norman and Miriam ran a chemical supplies business.

By age 12 Stanley had been promoted to 10th grade, and he graduated from high school at 14. He earned a bachelor's degree in physics from Brooklyn College in 1949 at 18, then went to Harvard, where he studied under Julian Schwinger, a Nobel Prize laureate. He completed his doctorate in 1953.

After postdoctoral fellowships at the Institute for Advanced Study and the Niels Bohr Institute in Copenhagen, Dr. Deser joined the faculty of Brandeis University in 1958.

The following three years, working on the ADM formalism, provided "the best run of luck that one could possibly hope for," he wrote in his autobiography.

In an interview last year for Caltech's Heritage Project, Dr. Deser recalled that he, Dr. Arnowitt and Dr. Misner completed much of the work during summers in Denmark, in a kindergarten classroom. "The nice thing about this kindergarten, it has blackboards," he said. "Denmark is very good that way."

"Since the blackboards were mounted low for children, we would crawl and write equations," Dr. Deser said. "And the papers just poured out."

Dr. Misner, an emeritus professor of physics at the University of Maryland, said there were parallels between the ADM recasting of general relativity and the quantum field theory of electromagnetism that other physicists were working on, and they were able to apply that experience to general relativity.

The work on supergravity occurred during a stay at the CERN particle laboratory in Geneva where Dr. Zumino worked. "In a period of just three weeks, to our amazement, we had a consistent theory," Dr. Deser recalled.

He and Dr. Zumino published a paper about supergravity in June 1976. However, another group of physicists, Daniel Freedman, Sergio Ferrara and Peter van Nieuwenhuizen, beat them to the punch, describing supergravity in a paper that had been completed about a month before Dr. Deser and Dr. Zumino submitted theirs.

As a result, Dr. Deser said, sometimes the work that he and Dr. Zumino did was overlooked. In 2019, a Breakthrough Prize in Fundamental Physics accompanied by $3 million was awarded to the other team.

"He was understandably upset," Dr. Duff, the British physicist, said. "I think they could have erred on the side of generosity and included Stanley as the fourth recipient." (Dr. Zumino died in 2014.)

Dr. Schwarz and Dr. Witten, who were members of the committee that awarded the prize, declined to discuss the particulars of the decision, but Dr. Schwarz said, "It was a purely scientific decision."

Dr. Deser worked at Brandeis until he retired in 2005. He then moved to Pasadena to be close to his daughter and obtained an unpaid position as a senior research associate at Caltech.

In addition to Abigail, he is survived by two other daughters, Toni Deser and Clara Deser, and four grandchildren.

His wife of 64 years, Elsbeth Deser, died in 2020. A daughter, Eva, died in 1968.

While Dr. Deser was an expert on gravity and general relativity, he was not infallible.

In the Caltech interview, he recalled a paper in which he suggested that gravity could solve some troubling infinities that were showing up in the quantum field theory of electrodynamics.

Other noteworthy physicists had similar thoughts but did not publish them. Dr. Deser did.

"It was garbage," he said. During a talk at a conference, Richard Feynman, the Nobel Prize-winning physicist who devised much of quantum electrodynamics, "without much difficulty shot me to pieces, which I deserved," he said.

He added, "Everybody's entitled to a few strikes."

Read more here:

Stanley Deser, Whose Ideas on Gravity Help Explain the Universe ... - The New York Times

UMD Quantum Physicist Elected to National Academy of Sciences – Maryland Today

A groundbreaking quantum physics researcher who has long been affiliated with the Joint Quantum Institute at the University of Maryland was elected a member of the National Academy of Sciences last week.

Paul Julienne, an emeritus fellow at JQI and an adjunct professor of physics at UMD, joined 142 other U.S. and international members recognized in 2023 for their exceptional, ongoing achievements in original research. He's one of 22 current UMD faculty members in the National Academy of Sciences, and 67 named to various esteemed honorary academies.

Julienne helped establish the research field of ultracold matter, which investigates atoms and molecules near absolute zero. His theoretical research includes developing models that describe how cold trapped molecules and atoms can be precisely controlled using magnetic fields or lasers. This research topic has revealed details of atomic states and chemical reactions of ultracold molecules.

"I am both gratified and humbled by this honor, which is only possible because of the many excellent colleagues and students with whom I have worked over the years," Julienne said. "I owe them a debt of gratitude, for it is by working together that science advances."

Julienne joined JQI in 2007, soon after its founding as a joint research institute combining the scientific strengths of the University of Maryland with the National Institute of Standards and Technology (NIST). The university and the federal agency have since broadened and deepened their collaboration with other quantum centers and institutes at UMD, helping lay the foundation for the university to become one of the most vibrant loci of quantum research in the world.

UMD's hundreds of researchers, partnerships with government agencies and labs, and collaboration with a wide range of firms in the quantum space inspired university President Darryll J. Pines to refer to the scientific and tech ferment centered at UMD as the "Capital of Quantum."

"Paul Julienne's election to the National Academy of Sciences highlights his remarkable achievements in the field of ultracold matter and underscores the significance of his contributions to science and the quantum revolution," Pines said. "We are honored to have such a distinguished researcher and educator as part of our institution."

Julienne earned a B.S. in chemistry from Wofford College in 1965 and his Ph.D. in chemical physics from the University of North Carolina at Chapel Hill in 1969. He worked as a postdoctoral researcher at the National Bureau of Standards and as a staff researcher at the Naval Research Laboratory before beginning a career of nearly 40 years at NIST, first as a research scientist and then as a NIST fellow, retiring in 2013. Among his other awards and accomplishments, he received the 2015 William F. Meggers Award of the Optical Society of America and the 2004 Davisson-Germer Prize of the American Physical Society and is a fellow of the division of Atomic, Molecular, and Optical Physics of the American Physical Society.

"We are proud to see Dr. Julienne honored with one of the highest professional distinctions accorded to a scientist," said Amitabh Varshney, dean of UMD's College of Computer, Mathematical, and Natural Sciences. "Our college extends its congratulations to him for this well-deserved recognition."

See the article here:

UMD Quantum Physicist Elected to National Academy of Sciences - Maryland Today

Is Quantum Computing a Threat To Current Encryption Methods? – Spiceworks News and Insights

Encryption is the backbone of cybersecurity, keeping data and systems secure. Quantum computing threatens to make today's encryption obsolete. Developing quantum-secure encryption is one of the main challenges facing the cybersecurity sector today, highlights Michael Redding, chief technology officer at Quantropi.

Explaining how quantum computers work is challenging. It involves presenting complicated scientific concepts like superposition, which allows groups of qubits to create multidimensional computational spaces. For those who do not have a background in quantum physics, quantum computing can seem more like science fiction than computer science.

Explaining what quantum computers do, however, is much easier. In essence, they leverage the behavior of subatomic particles to increase computation speed exponentially. When Google announced in October 2019 that it had achieved quantum supremacy, it was celebrating the fact that it had used quantum computing to solve a complex mathematical problem in 3 minutes and 20 seconds. How long would a conventional computer have taken to solve the same problem? According to Google, it would have taken at least 10,000 years.

How will the world use this mind-blowingly fast processing power? Experts predict it will transform a number of industries, from pharmaceuticals to finance to supply chain management. However, the quantum computing use case that has been making the most headlines in recent months is cybersecurity.

Encryption is the backbone of cybersecurity. It is the tool that keeps critical data under lock and key. Without it, security and privacy would be impossible to achieve.

Hackers have a number of avenues for gaining unauthorized access to encrypted information. One popular method involves social engineering attacks that seek to trick someone into revealing the password that provides access to data. Rather than cracking the code, hackers using social engineering attacks simply steal the key.

Data breaches provide another option for obtaining passwords. Reports of breaches regularly make the news, and each breach has the potential to put passwords into the hands of bad actors seeking to obtain access to encrypted data.

Brute force attacks represent a different approach to cracking encryption. Rather than trying to obtain the password from a user or stolen data, these attacks use computers to cycle through possible passwords until the correct one is found. Essentially, brute force attacks figure out passwords through trial and error, leveraging computers to do the work quickly and systematically.

Current encryption methods are considered effective in thwarting brute force attacks, as the most advanced encryption systems work with passwords or keys that are long and complicated or highly random. With today's computers, deciphering the key through trial and error can take millions of years.
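
To put "millions of years" into perspective, here is a minimal back-of-the-envelope sketch in Python. The guess rate of 10^12 keys per second is an assumed figure for illustration (roughly a large classical cluster), not a number from the article, and the key lengths are common symmetric-key sizes.

```python
# A back-of-the-envelope estimate of brute-force key search times.
# The guess rate is an assumption for illustration, not a measured benchmark.
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def brute_force_years(key_bits: int, guesses_per_second: float) -> float:
    """Expected years to find a key by exhausting half of the keyspace."""
    keyspace = 2 ** key_bits
    return (keyspace / 2) / guesses_per_second / SECONDS_PER_YEAR

for bits in (56, 128, 256):
    years = brute_force_years(bits, guesses_per_second=1e12)
    print(f"{bits:3d}-bit key at 1e12 guesses/s: ~{years:.2e} years")

# A 56-bit (legacy DES) key falls in hours at this rate, while a 128-bit key
# already takes on the order of 10^18 years -- so "millions of years" is,
# if anything, an understatement for modern key lengths.
```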

However, quantum computing changes the timeline for cracking today's encryption. By exponentially increasing processing speed, quantum computers could break the most advanced keys commonly used today in minutes.

When will bad actors have access to a quantum computer capable of threatening today's encryption? Based on Shor's Algorithm, a quantum computer would need millions of qubits, with a quantum circuit depth measured in the billions and essentially perfect calculation fidelity.
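
For readers curious what Shor's Algorithm actually buys an attacker, the sketch below shows the classical skeleton of the method on a toy number: factoring a modulus N reduces to finding the order r of a random base a modulo N, and the quantum computer's only job is to find that order efficiently. Here the order is found by brute force, which is feasible only because N is tiny; the function names and the example moduli are illustrative assumptions, not taken from the article.

```python
from math import gcd

def find_order(a: int, n: int) -> int:
    """Brute-force the multiplicative order r of a mod n: the one step a
    quantum computer running Shor's Algorithm would perform efficiently."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def factor_via_order(n: int, a: int):
    """Classical post-processing of Shor's Algorithm: turn the order r into a
    pair of nontrivial factors of n (when the choice of a is lucky)."""
    if gcd(a, n) != 1:
        return gcd(a, n), n // gcd(a, n)   # lucky: a already shares a factor with n
    r = find_order(a, n)
    if r % 2 == 1 or pow(a, r // 2, n) == n - 1:
        return None                        # unlucky base a: try another one
    half = pow(a, r // 2)
    return gcd(half - 1, n), gcd(half + 1, n)

print(factor_via_order(15, 7))   # (3, 5)
print(factor_via_order(21, 2))   # (7, 3)
```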

Based on today's quantum computing capability, that would put Y2Q (the moment when quantum computers can break today's encryption) into the 2040s, if ever. However, breakthroughs achieved in 2023 by researchers in China and Germany, using a hybrid classical-plus-quantum attack vector aided by AI and machine learning, have drastically reduced the quantum capabilities required to break asymmetric encryption as compared to Shor's Algorithm.

Combining these new AI and machine learning hybrid attack vectors with the rapid advancement of quantum computing capabilities begins to crystallize a pathway to Y2Q in the next one to five years. It is no longer a question of how to break asymmetric encryption with today's generation of technology; the approach has been published. Now it is only a matter of optimization and continued incremental technology improvements.

To address Y2Q's impact on security, developers are focusing on two main approaches to quantum security: post-quantum cryptography and quantum key distribution. Post-quantum cryptography (PQC) leverages complex mathematical algorithms to provide security that is resistant to quantum attacks, while quantum key distribution involves exploiting the properties of quantum mechanics to bolster security.

PQC provides an efficient means of updating security systems because it is math-based, which allows it to be implemented through computer code and deployed in end devices with a simple software update. However, PQC's security relies on complex, hard mathematical calculations and, in some cases, large key sizes, both of which come with considerable performance costs.

Organizations that seek to quantum-proof their systems with PQC must be aware that considerable infrastructure updates may be necessary. Because PQC encryption schemes are typically more complex than those currently in use, they require more resources for encrypting and decrypting, including more time, storage space, memory, and network bandwidth.
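
As a rough illustration of those infrastructure costs, the snippet below compares approximate key and payload sizes for a few classical and post-quantum schemes. The byte counts are rounded figures from public algorithm specifications and are assumptions for illustration; they are not numbers quoted in this article.

```python
# Approximate public-key and payload sizes, in bytes, for a few classical and
# post-quantum schemes (rounded figures from public specs; illustrative only).
sizes = {
    # name                                   (public key, ciphertext or signature)
    "X25519 key exchange (classical)":       (32,    32),
    "RSA-2048 encryption (classical)":       (256,   256),
    "ML-KEM-768 / Kyber (post-quantum)":     (1184, 1088),
    "ECDSA P-256 signature (classical)":     (64,    64),
    "ML-DSA-65 / Dilithium3 (post-quantum)": (1952, 3293),
}

for name, (public_key, payload) in sizes.items():
    print(f"{name:40s} public key: {public_key:5d} B   payload: {payload:5d} B")

# Multiplying every handshake and signature by a factor of 10-50 in size is
# negligible for one browser session, but not for systems pushing millions of
# transactions per second.
```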

For the average user relying on PQC for booting machines or encrypting data related to web browsing, the additional processing burden might not be noticeable. However, organizations simultaneously transmitting and receiving thousands or millions of digital transactions per second must consider the impact this will have on their performance. Failure to do so can create dangerous latency in devices that rely on high efficiency, such as the systems that manage computer-aided driving software in autonomous vehicles.

PQC also poses challenges for updating internet of things (IoT) devices to quantum-secure encryption. Smart doorbells and other intelligent appliances will become vulnerable if their encryption systems are not updated, yet they typically do not have the processing power to support PQC effectively.

Quantum key distribution (QKD) is another option for quantum-resistant encryption. This approach relies on the laws of quantum physics rather than mathematics to generate and transmit encryption keys between two parties. The natural laws involved in this process also provide warnings to users when QKD transmissions are disturbed or intercepted by bad actors.

Theoretically, QKD provides security that is effective against quantum computing attacks and can withstand attacks for an indefinite amount of time. Practically, however, making it a reality would require overcoming a number of significant technical challenges. QKD uses photon emitters and receivers to create quantum entanglement between two devices. However, the current state of this technology is largely experimental, with few commercial deployments and significant limitations on bandwidth, distance, complexity, and cost that continue to be explored and improved upon.

Developing quantum-secure encryption is just the first step toward preparing for Y2Q. In order to be truly quantum secure, organizations must assess where they are vulnerable, determine how to integrate new security systems in those areas, deploy those systems, and test them. It is a process that could take years, and the clock is ticking.

Those who understand the stakes involved are already taking steps. For example, the US government issued a National Security Memorandum in May 2022 that warns of the significant risks that quantum computing poses to the economic and national security of the United States. The memorandum calls for a timely transition to quantum-resistant cryptography.

Rob Joyce, NSA cybersecurity director and deputy national manager for national security systems, highlighted the need to push forward in achieving quantum-resistant systems in his comments on the memorandum. He stated: "Implementing approved quantum-resistant cryptographic solutions across all of our systems will not happen overnight, but it's critical that we chart a path to get there considering the potential threat of quantum computing."

In the end, public and private organizations need to prepare for Y2Q immediately to protect their data, connected devices, systems, and communications. The time is now.

Go here to see the original:

Is Quantum Computing a Threat To Current Encryption Methods? - Spiceworks News and Insights

With new experimental method, researchers probe spin structure in 2D materials for first time – Phys.org

For two decades, physicists have tried to directly manipulate the spin of electrons in 2D materials like graphene. Doing so could spark key advances in the burgeoning world of 2D electronics, a field where super-fast, small and flexible electronic devices carry out computations based on quantum mechanics.

Standing in the way is the fact that the typical way in which scientists measure the spin of electrons, an essential behavior that gives everything in the physical universe its structure, usually doesn't work in 2D materials. This makes it incredibly difficult to fully understand the materials and propel forward technological advances based on them. But a team of scientists led by Brown University researchers believes it now has a way around this longstanding challenge. They describe their solution in a new study published in Nature Physics.

In the study, the team, which also includes scientists from the Center for Integrated Nanotechnologies at Sandia National Laboratories and the University of Innsbruck, describes what they believe to be the first measurement showing direct interaction between electrons spinning in a 2D material and photons coming from microwave radiation.

Called a coupling, the absorption of microwave photons by electrons establishes a novel experimental technique for directly studying the properties of how electrons spin in these 2D quantum materials, one that could serve as a foundation for developing computational and communicational technologies based on those materials, according to the researchers.

"Spin structure is the most important part of a quantum phenomenon, but we've never really had a direct probe for it in these 2D materials," said Jia Li, an assistant professor of physics at Brown and senior author of the research. "That challenge has prevented us from theoretically studying spin in these fascinating materials for the last two decades. We can now use this method to study a lot of different systems that we could not study before."

The researchers made the measurements on a relatively new 2D material called "magic-angle" twisted bilayer graphene. This graphene-based material is created when two sheets of ultrathin layers of carbon are stacked and twisted to just the right angle, converting the new double-layered structure into a superconductor that allows electricity to flow without resistance or energy waste. Just discovered in 2018, the researchers focused on the material because of the potential and mystery surrounding it.

"A lot of the major questions that were posed in 2018 have still yet to be answered," said Erin Morissette, a graduate student in Li's lab at Brown who led the work.

Physicists usually use nuclear magnetic resonance or NMR to measure the spin of electrons. They do this by exciting the nuclear magnetic properties in a sample material using microwave radiation and then reading the different signatures this radiation causes to measure spin.

The challenge with 2D materials is that the magnetic signature of electrons in response to the microwave excitation is too small to detect. The research team decided to improvise. Instead of directly detecting the magnetization of the electrons, they measured subtle changes in electronic resistance caused by the changes in magnetization from the radiation, using a device fabricated at the Institute for Molecular and Nanoscale Innovation at Brown.

These small variations in the flow of the electronic currents allowed the researchers to use the device to detect that the electrons were absorbing the photons from the microwave radiation.

The researchers were able to observe novel information from the experiments. The team noticed, for instance, that interactions between the photons and electrons made electrons in certain sections of the system behave as they would in an anti-ferromagnetic system, meaning the magnetism of some atoms was canceled out by a set of magnetic atoms that are aligned in a reverse direction.

The new method for studying spin in 2D materials and the current findings won't be applicable to technology today, but the research team sees potential applications the method could lead to in the future. They plan to continue to apply their method to twisted bilayer graphene but also expand it to other 2D materials.

"It's a really diverse toolset that we can use to access an important part of the electronic order in these strongly correlated systems and in general to understand how electrons can behave in 2D materials," Morissette said.

More information: Andrew Mounce, Dirac revivals drive a resonance response in twisted bilayer graphene, Nature Physics (2023). DOI: 10.1038/s41567-023-02060-0. http://www.nature.com/articles/s41567-023-02060-0

Journal information: Nature Physics

Originally posted here:

With new experimental method, researchers probe spin structure in 2D materials for first time - Phys.org

Physicists tracked electron recollision in real-time – Tech Explorist

The measurement of the fastest dynamical processes in nature typically relies on observing the nonlinear response of a system to precisely timed interactions with external stimuli. This usually requires two (or more) controlled events, with time-resolved information gained by controllably varying the interpulse delays.

Using a breakthrough technique created by MPIK physicists and used to verify quantum-dynamics theory by collaborators at MPI-PKS, the movement of an electron in a strong infrared laser field is tracked in real-time. The experimental method connects the free-electron mobility caused by the subsequent near-infrared pulse to the absorption spectrum of the ionizing extreme ultraviolet pulse.

Although the electron is a quantum object, the classical description of its motion is appropriate for our experimental technique.

Strong-field physics fundamentally depends on high-harmonic generation, which converts optical or near-infrared (NIR) light into the extreme ultraviolet (XUV) regime. In the well-known three-step concept, the driving light field (1) ionizes the electron by tunnel ionization, (2) accelerates it away and back to the ionic core, where the electron (3) recollides and emits XUV light if it recombines.

In this study, physicists replaced the first step with an XUV single-photon ionization, which has a twofold advantage: First, one can choose the ionization time relative to the NIR phase. Second, the NIR laser can be tuned to low intensities where tunnel ionization is practically impossible. This allows us to study strong-field-driven electron recollision in a low-intensity limiting case.

Attosecond transient absorption spectroscopy, previously established for bound electrons by a team led by Christian Ott, is the method used here, along with reconstruction of the time-dependent dipole moment. It links the time-dependent dipole moment with the classical motion (trajectories) of the ionized electrons, in this case by extending the approach to free electrons.

Ph.D. student Tobias Heldt said, "Our new method, applied to helium as a model system, links the absorption spectrum of the ionizing light to the electron trajectories. This allows us to study ultrafast dynamics with a single spectroscopic measurement, without scanning a time delay to compose the dynamics frame by frame."

The results of the measurements indicate that, depending on the experimental settings, circular polarisation of the light wave can increase the likelihood of bringing the electron back to the ion. Despite seeming counterintuitive, theorists had anticipated this result.

This interpretation of recolliding periodic orbits is also justified by classical simulations. Whenever the electron (re-)collides with the helium atom, it leads to a characteristic modification and increase of the time-dependent atomic dipole, which an attosecond absorption-spectroscopy experiment can pick up.
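
The classical-trajectory picture invoked here can be reproduced with a few lines of code. The sketch below integrates the textbook "simple man" model: an electron released at rest at the ion in a cosine field either drifts away or is driven back to its starting point, depending on the release phase. The field strength, frequency, and release phases are illustrative assumptions in atomic units, not the parameters of the MPIK experiment.

```python
import numpy as np

# "Simple man" model of recollision: an electron appears at rest at the ion
# (x = 0) at phase phi of a linearly polarized field E(t) = E0*cos(omega*t)
# and then moves purely classically. Atomic units; E0 and OMEGA are
# illustrative assumptions (a modest near-infrared field).
E0, OMEGA = 0.05, 0.057   # field amplitude and ~800 nm laser frequency, a.u.

def returns_to_ion(release_phase_rad: float, n_cycles: int = 4, n_steps: int = 20000) -> bool:
    """Integrate x'' = -E0*cos(omega*t) from release (x = v = 0) and report
    whether the trajectory ever crosses back through the ion at x = 0."""
    t = release_phase_rad / OMEGA
    dt = n_cycles * (2 * np.pi / OMEGA) / n_steps
    x, v = 0.0, 0.0
    for _ in range(n_steps):
        v += -E0 * np.cos(OMEGA * t) * dt
        x += v * dt
        t += dt
        if x > 0.0:          # released electrons first move toward negative x here
            return True
    return False

for phase_deg in (-20, 5, 30, 60):
    back = returns_to_ion(np.deg2rad(phase_deg))
    print(f"release phase {phase_deg:+4d} deg relative to the field crest -> recollision: {back}")
# Electrons released just after a field crest are steered back to the ion,
# while those released just before the crest drift away and never return.
```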

Group leader Christian Ott is optimistic about the future potential of this new approach: "In general, our technique allows us to explore laser-driven electron motion in a new lower-intensity regime, and it could further be applied on various systems, e.g., for studying the laser-driven electron dynamics within larger atoms or molecules."

Read the original post:

Physicists tracked electron recollision in real-time - Tech Explorist

Researchers discover superconductive images are actually 3D and disorder-driven fractals – Phys.org

Meeting the world's energy demands is reaching a critical point. Powering the technological age has caused issues globally. It is increasingly important to create superconductors that can operate at ambient pressure and temperature. This would go a long way toward solving the energy crisis.

Advancements with superconductivity hinge on advances in quantum materials. When electrons inside of quantum materials undergo a phase transition, the electrons can form intricate patterns, such as fractals. A fractal is a never-ending pattern. When zooming in on a fractal, the image looks the same. Commonly seen fractals can be a tree or frost on a windowpane in winter. Fractals can form in two dimensions, like the frost on a window, or in three-dimensional space like the limbs of a tree.

Dr. Erica Carlson, a 150th Anniversary Professor of Physics and Astronomy at Purdue University, led a team that developed theoretical techniques for characterizing the fractal shapes that these electrons make, in order to uncover the underlying physics driving the patterns.

Carlson, a theoretical physicist, has evaluated high resolution images of the locations of electrons in the superconductor Bi2-xPbzSr2-yLayCuO6+x (BSCO), and determined that these images are indeed fractal and discovered that they extend into the full three-dimensional space occupied by the material, like a tree filling space.

What were once thought to be random dispersions within the fractal images are in fact meaningful and, surprisingly, arise not from an underlying quantum phase transition as expected, but from a disorder-driven phase transition.

Carlson led a collaborative team of researchers across multiple institutions and published their findings, titled "Critical nematic correlations throughout the superconducting doping range in Bi2-xPbzSr2-yLayCuO6+x," in Nature Communications.

The team includes Purdue scientists and partner institutions. From Purdue, the team includes Carlson, Dr. Forrest Simmons, a recent Ph.D. student, and former Ph.D. students Dr. Shuo Liu and Dr. Benjamin Phillabaum. The Purdue team completed their work within the Purdue Quantum Science and Engineering Institute (PQSEI). The team from partner institutions includes Dr. Jennifer Hoffman, Dr. Can-Li Song, and Dr. Elizabeth Main of Harvard University, Dr. Karin Dahmen of the University of Illinois Urbana-Champaign, and Dr. Eric Hudson of Pennsylvania State University.

"The observation of fractal patterns of orientational ('nematic') domains, cleverly extracted by Carlson and collaborators from STM images of the surfaces of crystals of a cuprate high temperature superconductor, is interesting and aesthetically appealing on its own, but also of considerable fundamental importance in coming to grips with the essential physics of these materials," says Dr. Steven Kivelson, the Prabhu Goel Family Professor at Stanford University and a theoretical physicist specializing in novel electronic states in quantum materials. "Some form of nematic order, typically thought to be an avatar of a more primitive charge-density-wave order, has been conjectured to play an important role in the theory of the cuprates, but the evidence in favor of this proposition has previously been ambiguous at best. Two important inferences follow from Carlson et al.'s analysis: 1) The fact that the nematic domains appear fractal implies that the correlation length, the distance over which the nematic order maintains coherence, is larger than the field of view of the experiment, which means that it is very large compared to other microscopic scales. 2) The fact that patterns that characterize the order are the same as those obtained from studies of the three dimensional random-field Ising model, one of the paradigmatic models of classical statistical mechanics, suggests that the extent of the nematic order is determined by extrinsic quantities and that intrinsically (i.e. in the absence of crystalline imperfections) it would exhibit still longer range correlations not just along the surface, but extending deep into the bulk of the crystal."

High resolution images of these fractals are painstakingly taken in Hoffman's lab at Harvard University and Hudson's lab, now at Penn State, using scanning tunneling microscopes (STM) to measure electrons at the surface of the BSCO, a cuprate superconductor. The microscope scans atom by atom across the top surface of the BSCO, and what they found was stripe orientations that went in two different directions instead of the same direction. The result is a jagged image that forms interesting patterns of electronic stripe orientations.

"The electronic patterns are complex, with holes inside of holes, and edges that resemble ornate filigree," explains Carlson. "Using techniques from fractal mathematics, we characterize these shapes using fractal numbers. In addition, we use statistics methods from phase transitions to characterize things like how many clusters are of a certain size, and how likely the sites are to be in the same cluster."

Once the Carlson group analyzed these patterns, they found a surprising result. These patterns do not form only on the surface like flat layer fractal behavior, but they fill space in three dimensions. Simulations for this discovery were carried out at Purdue University using Purdue's supercomputers at Rosen Center for Advanced Computing. Samples at five different doping levels were measured by Harvard and Penn State, and the result was similar among all five samples.
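
For readers unfamiliar with the "fractal numbers" mentioned above, the box-counting dimension is the standard such measure: count how many boxes of side s are needed to cover the pattern and read off the slope of log(count) versus log(s). The sketch below is a generic illustration on a synthetic Sierpinski carpet, whose exact dimension is log 8 / log 3, about 1.89; it is an assumption-laden demonstration of the general technique, not the analysis pipeline used in the paper.

```python
import numpy as np

def box_counting_dimension(mask: np.ndarray, box_sizes) -> float:
    """Estimate the fractal (box-counting) dimension of a 2D boolean pattern:
    count occupied boxes at several box sizes and fit the log-log slope."""
    counts = []
    for s in box_sizes:
        h, w = (mask.shape[0] // s) * s, (mask.shape[1] // s) * s  # trim so boxes tile evenly
        tiles = mask[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(np.count_nonzero(tiles.any(axis=(1, 3))))
    slope, _ = np.polyfit(np.log(box_sizes), np.log(counts), 1)
    return -slope

def sierpinski_carpet(level: int) -> np.ndarray:
    """Synthetic test pattern with a known dimension of log(8)/log(3) ~= 1.89."""
    m = np.ones((1, 1), dtype=bool)
    for _ in range(level):
        hole = np.zeros_like(m)
        m = np.block([[m, m, m], [m, hole, m], [m, m, m]])
    return m

carpet = sierpinski_carpet(5)                      # 243 x 243 pixel pattern
dim = box_counting_dimension(carpet, box_sizes=(3, 9, 27, 81))
print(f"estimated box-counting dimension: {dim:.2f}  (exact: ~1.89)")
```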

The unique collaboration between Illinois (Dahmen) and Purdue (Carlson) brought cluster techniques from disordered statistical mechanics into the field of quantum materials like superconductors. Carlson's group adapted the technique to apply to quantum materials, extending the theory of second order phase transitions to electronic fractals in quantum materials.

"This brings us one step closer to understanding how cuprate superconductors work," explains Carlson. "Members of this family of superconductors are currently the highest temperature superconductors that happen at ambient pressure. If we could get superconductors that work at ambient pressure and temperature, we could go a long way toward solving the energy crisis because the wires we currently use to run electronics are metals rather than superconductors. Unlike metals, superconductors carry current perfectly with no loss of energy. On the other hand, all the wires we use in outdoor power lines use metals, which lose energy the whole time they are carrying current. Superconductors are also of interest because they can be used to generate very high magnetic fields, and for magnetic levitation. They are currently used (with massive cooling devices!) in MRIs in hospitals and levitating trains."

Next steps for the Carlson group are to apply the Carlson-Dahmen cluster techniques to other quantum materials.

"Using these cluster techniques, we have also identified electronic fractals in other quantum materials, including vanadium dioxide (VO2) and neodymium nickelates (NdNiO3). We suspect that this behavior might actually be quite ubiquitous in quantum materials," says Carlson.

This type of discovery leads quantum scientists closer to solving the riddles of superconductivity.

"The general field of quantum materials aims to bring to the forefront the quantum properties of materials, to a place where we can control them and use them for technology," Carlson explains. "Each time a new type of quantum material is discovered or created, we gain new capabilities, as dramatic as painters discovering a new color to paint with."

More information: Can-Li Song et al, Critical nematic correlations throughout the superconducting doping range in Bi2-xPbzSr2-yLayCuO6+x, Nature Communications (2023). DOI: 10.1038/s41467-023-38249-3

Journal information: Nature Communications

Continue reading here:

Researchers discover superconductive images are actually 3D and disorder-driven fractals - Phys.org

Collaboration builds fantastical stories from nuggets of truth – Symmetry magazine

Science fiction asks the question "What if?" and then attempts to answer that question with a story grounded in science fact. That's why Comma Press, for its latest anthology, paired science fiction writers with CERN scientists, to create untrue stories with a bit of truth in them.

Creating the anthology, titled Collision, started with a call to CERN researchers and alumni of the European physics research center, asking them to describe concepts they thought would inspire good creative writing. Next, authors picked the ideas that called to them most. Each author then met with the scientist who proposed their chosen idea and discussed the science in detail. The anthology consists of a collection of the resulting stories, each with an afterword by the consulting scientist.

Symmetry interviewed three different writer-scientist pairs to learn what it was like to participate in this unusual collaboration: Television writer, television producer and screenwriter Steven Moffat, famous for his work on the tv series Doctor Who and Sherlock, worked with Peter Dong, a physics teacher at the Illinois Math and Science Academy who works with students to analyze data from CERN. Poet, playwright and essayist lisa luxx worked with physicist Carole Weydert, now leader of a unit of an insurance regulatory authority in Luxembourg. And UK-based Short Story Fellow of the Arts Foundation Adam Marek worked with Andrea Giammanco, who continues to do research at CERN as a physicist with the National Fund for Scientific Research in Belgium.

Although the assignment was the same for every pair, they all approached their stories differently.

When Peter Dong first got the call for inspiring physics concepts, he knew exactly what to submit: a strange but real-life theory proposed by physicists Holger Bech Nielsen and Masao Ninomiya.

In the late 2000s, the theorists posited that the universe might for some reason prefer to keep certain fundamental particles, say, the Higgs boson or particles of dark matter, a mystery. So much so, they wrote, that the universe was actively conspiring against their discovery.

The search for the Higgs certainly wasn't easy. Only weeks after being turned on for the first time, the Large Hadron Collider at CERN experienced a critical failure, immediately stopping the two most advanced experiments aimed at finding the Higgs. Years earlier and an ocean away, the United States Congress suddenly scrapped plans to build a similar collider in Texas, despite construction having already begun.

These two events, Nielsen and Ninomiya argued, could indicate that discovering the new particle was so improbable that the universe wouldnt allow it to happen.

Dong first read about the theory as a graduate student at the US Department of Energy's Fermi National Accelerator Laboratory in 2008. "While the probability is absurdly small, and it's a wacky, out-there idea, it's still not impossible," he says. "When we come up with these wacky theories, I feel like more people should know about them. They're just so much fun."

In their paper, Nielsen and Ninomiya proposed an experiment that, while it would not prove their hypothesis, could at least put it to the test: Shuffle a deck of a million cards in which a single card is marked "Shut down the LHC." If that card were randomly pulled from the deck, they would take that as a sign the universe wanted physicists to back off.

The card experiment was never run, and in the end, scientists were able to repair and restart the LHC. In 2012, physicists on the CMS and ATLAS experiments announced the discovery of the Higgs.

Dong had previously tried out using Nielsen and Ninomiyas idea as the basis for a story while auditing a creative writing class, so when he got the CERN email, he was ready with a pitch.

Writer Steven Moffat says Dong's idea stood out to him on the list. "I zeroed in on that prompt, one, because I understood it, or at least I thought I understood part of it," he says. "And two, because I could see a story in it. I just love the idea that a bunch of very serious-minded scientists wondered if the universe might be actively trying to stop them."

Moffat and Dong worked together to make sure the science in the story held up. "Steven was taking great pains to ask, 'Does this make sense? Would that be right?'" Dong says.

The result was "a mad, paranoid fantasy," Moffat says. "But it's decorated in the right terminology and derives from something that really happened."

Not all the stories in Collision fall neatly into the sci-fi genre. For her entry, poet lisa luxx explored her lived experiences of adoption and migration through the lens of quantum physics. "At a certain point, I had to disentangle [pun not intended] myself from trying to achieve a particular genre," luxx says.

The writer chose a prompt related to supersymmetry, which posits that every particle has a yet unobserved partner with some shared properties. Like Moffat, luxx says she was attracted to the idea she chose because it made some amount of sense to her. "The physicist had used quite a poetic quote in her explanation of this theory," luxx says. "That immediately drew me in."

Physicist Carole Weydert, who submitted the idea, may have had an artistic way of explaining supersymmetry because she had previously explored another complex physics theory in her own art. "It is this idea that you could have a kind of symmetry between everything contained in spacetime and spacetime itself," she says.

Weydert says she "[tries] to squeeze in time to paint," sometimes using ripped and cut pages from old textbooks in her work. "I try to express this quest for simplification in theoretical physics, that from one very, very basic theory, the whole complexity of the world emerges."

It was not easy to translate the mathematical and theoretical aspects of supersymmetry into a fictionalized story, luxx says. "The most challenging part was me grasping exactly the nuance of where physicists are at with understanding supersymmetry," she says. "I was learning the theory while writing it."

But the theory eventually clicked into place as a metaphor.

The poet says she wanted to ground the story in a common language to show how entwined physics is with everyday life. "Physics are just parts of us. We can't understand ourselves and can't understand society without at least somewhat understanding these theories."

The final piece strayed from Weydert's initial prompt, but Weydert says she is nevertheless pleased with the outcome. "I think what is important in these stories is not the science, as such, but connecting it to something that conveys an emotion."

For his contribution to the anthology, short story writer Adam Marek moved away from traditional prose. He instead wrote his piece in the form of an interview for real-life BBC Radio program Desert Island Discs. The 45-minute show tells the story of a famous persons life through music tracks they have chosen.

"I've always loved it as a format for telling the story of someone's life," Marek says. "And I'd thought it could make a terrific framework for writing a short story."

The prompt Marek chose came from physicist Andrea Giammanco. As a postdoc, Giammanco had spent time researching the dark sector, a collection of hypothetical particles that have thus far gone unobserved. "I was in love with the idea of the dark sector popping up in unconventional ways," Giammanco says.

Marek had worked with other scientists on stories for other Comma Press anthologies, but he says this time was unique. "It was different," he says. "And I think that was because of Andrea's particular interests and approach."

Marek and Andrea spoke many times throughout the project, on video calls and email, Marek says. It helped that they shared a love of science fiction. "Andrea seemed as fascinated by the writing process as I was with the science. We had lots of questions for each other."

Giammanco says they threw so many ideas at each other that "we could have easily written five completely different stories."

In one of these brainstorming sessions, Giammanco says he mentioned offhand that the dark sector may be hiding in plain sight, and it may even be revealed in a reanalysis of old data. This piqued Adam's attention, he says, and it ultimately became a key part of the plot.

Throughout the process Giammanco ensured that everything in the story was at least possible. "Adam wanted the idea to work from the narrative point of view," he says. "But for me, it was paramount to make it work from the scientific point of view."

But staying within the realm of the possible didn't restrict Marek and Giammanco to the realm of the particularly plausible. Marek says that flexibility helped him find a creative way for the story to end.

"At the peak of being very stressed out about the story and not knowing how to finish it," he says, "Andrea just happened to send me an email about ghosts."

Giammanco had sent Marek an article that quoted physicist and pop culture figure Brian Cox asserting that the LHC had unintentionally disproved the existence of incorporeal beings by failing to detect any evidence of them. But Giammanco had laid out an alternative argument: If ghosts did exist, they would just need to be a part of the dark sector, which the LHC has not been able to reach. "It just fixed it all," Marek says. "It was really, really fortuitous for the story."

Whether or not any of the ideas in the stories collected in Collision turns out to be true, the project highlights what both writers and scientists have in common: the ongoing quest to imagine What if?

Originally posted here:

Collaboration builds fantastical stories from nuggets of truth - Symmetry magazine

Is there any evidence that the "aether" exists? – Big Think

All throughout the Universe, different types of signals propagate. Some of them, like sound waves, require a medium to travel through. Others, like light or gravitational waves, are perfectly content to traverse the vacuum of space, seemingly defying the need for a medium altogether. Irrespective of how they do it, all of these signals can be detected from the effects they have on all the matter and energy that they interact with: both along their journey through space all the way up until their eventual arrival at their final destination.

But is it truly possible for waves to travel through the vacuum of space itself, without any need for a medium to propagate through at all? For some of us, this is a very counterintuitive notion, as the idea of things existing within and moving through some form of empty nothingness just doesn't make any sense. But plenty of things in physics don't make intuitive sense, as it isn't up to humans to tell nature what does and doesn't make sense. Instead, all we can do is ask the Universe questions about itself through experiment, observation, and measurement, and follow nature's answers to the best conclusions we can draw. Although there's no way to disprove the aether's (or anything else that's unobservable) existence, we can certainly look at the evidence and allow it to take us wherever it will.

Whether through a medium, like mechanical waves, or in a vacuum, like electromagnetic and gravitational waves, every ripple that propagates has a propagation speed. In no case is the propagation speed infinite, and in theory, the speed at which gravitational ripples propagate should be the same as the maximum speed in the Universe: the speed of light.

Back in the earliest days of science, before Newton and going back hundreds or even thousands of years, we only had large-scale, macroscopic phenomena to investigate. The waves we observed came in many different varieties.

In the case of all of these waves, matter is involved. That matter provides a medium for these waves to travel through, and as the medium either compresses and rarefies in the direction of propagation (a longitudinal wave) or oscillates perpendicular to the direction of propagation (a transverse wave), the signal is transported from one location to another.

This diagram, dating back to Thomas Young's work in the early 1800s, is one of the oldest pictures that demonstrate both constructive and destructive interference as arising from wave sources originating at two points: A and B. This is a physically identical setup to a double slit experiment, even though it applies just as well to water waves propagated through a tank.

As we began to investigate waves more carefully, a third type began to emerge. In addition to longitudinal and transverse waves, a type of wave where each of the particles involved underwent motion in a circular path, a surface wave, was discovered. The rippling characteristics of water, which were previously thought to be either longitudinal or transverse waves exclusively, were shown to also contain this surface wave component.

All three of these types of waves are examples of mechanical waves, which is where some type of energy is transported from one location to another through a material, matter-based medium. A wave that travels through a spring, a slinky, water, the Earth, a string, or even the air, all require an impetus for creating some initial displacement from equilibrium, and then the wave carries that energy through a medium toward its destination.

A series of particles moving along circular paths can appear to create a macroscopic illusion of waves. Similarly, individual water molecules that move in a particular pattern can produce macroscopic water waves, individual photons make the phenomenon we perceive as light waves, and the gravitational waves we see are likely made out of individual quantum particles that compose them: gravitons.

It makes sense, then, that as we discovered new types of waves, we'd assume they had similar properties to the classes of waves we already knew about. Even before Newton, the aether was the name given to the void of space, where the planets and other celestial objects resided. Tycho Brahe's famous 1588 work, De Mundi Aetherei Recentioribus Phaenomenis, literally translates as On Recent Phenomena in the Aethereal World.

The aether, it was assumed, was the medium inherent to space that all objects, from comets to planets to starlight itself, traveled through. Whether light was a wave or a corpuscle, though, was a point of contention for many centuries. Newton claimed it was a corpuscle, while Christiaan Huygens, his contemporary, claimed it was a wave. The issue wasn't decided until the 19th century, when experiments with light unambiguously revealed its wave-like nature. (With modern quantum physics, we now know it behaves like a particle also, but its wave-like nature cannot be denied.)

The results of an experiment, showcased using laser light around a spherical object, with the actual optical data. Note the extraordinary validation of Fresnel's theory's prediction: that a bright, central spot would appear in the shadow cast by the sphere, verifying the absurd prediction of the wave theory of light. Logic alone would not have gotten us here.

This was further borne out as we began to understand the nature of electricity and magnetism. Experiments that accelerated charged particles not only showed that they were affected by magnetic fields, but that when you bent a charged particle with a magnetic field, it radiated light. Theoretical developments showed that light itself was an electromagnetic wave that propagated at a finite, large, but calculable velocity, today known as c, the speed of light in a vacuum.

If light was an electromagnetic wave, and all waves required a medium to travel through, and all the heavenly bodies traveled through the medium of space, then surely that medium itself, the aether, was the medium that light traveled through. The biggest question remaining, then, was to determine what properties the aether itself possessed.

In Descartes' vision of gravity, there was an aether permeating space, and only the displacement of matter through it could explain gravitation. This, unfortunately, did not lead to an accurate formulation of gravity that matched with observations.

One of the most important points about what the aether couldn't be was figured out by Maxwell himself, who was the first to derive the electromagnetic nature of light waves. In an 1874 letter to Lewis Campbell, he wrote:

It may also be worth knowing that the aether cannot be molecular. If it were, it would be a gas, and a pint of it would have the same properties as regards heat, etc., as a pint of air, except that it would not be so heavy.

In other words, whatever the aether was (or, more accurately, whatever it was that electromagnetic waves propagated through) could not have many of the traditional properties that other, matter-based media possessed. It could not be composed of individual particles. It could not contain heat. It could not be a conduit for the transfer of energy through it. In fact, just about the only thing left that the aether was allowed to do was serve as a background medium for things that were known to travel but didn't otherwise seem to require a medium, like light, to actually travel through.

If you split light into two perpendicular components and bring them back together, they will produce an interference pattern. If there's a medium that light is traveling through, the interference pattern should depend on how your apparatus is oriented relative to that motion.

All of this led to the most important experiment for detecting the aether: the Michelson-Morley experiment. If aether really were a medium for light to travel through, then the Earth should be passing through the aether as it rotated on its axis and revolved around the Sun. Even though we only revolve at a speed of around 30 km/s, that's a substantial fraction (about 0.01%) of the speed of light.

With a sensitive enough interferometer, if light were a wave traveling through this medium, we should detect a shift in light's interference pattern dependent on the angle the interferometer made with our direction of motion. Michelson alone tried to measure this effect in 1881, but his results were inconclusive. Six years later, with Morley, he reached sensitivities capable of detecting a shift just 1/40th the magnitude of the expected signal. Their experiment, however, yielded a null result; there was no evidence for the aether at all.
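
The size of the expected signal is easy to estimate. Rotating the apparatus by 90 degrees swaps the roles of the two arms, so the predicted shift is roughly 2L(v/c)^2/lambda fringes; the arm length and wavelength below are the commonly quoted parameters of the 1887 setup, used here only as an illustrative assumption.

```python
# Rough estimate of the fringe shift the 1887 Michelson-Morley apparatus was
# expected to see. Arm length and wavelength are the commonly quoted values
# for that setup, used here only as an illustrative assumption.
L_ARM    = 11.0      # effective arm length in meters (folded by multiple reflections)
WAVELEN  = 5.5e-7    # wavelength of the light used, in meters
V_OVER_C = 1e-4      # Earth's orbital speed (~30 km/s) divided by the speed of light

# Rotating the interferometer by 90 degrees swaps the two arms, so the
# expected shift is about 2 * L / lambda * (v/c)^2 fringes.
expected_shift = 2 * L_ARM / WAVELEN * V_OVER_C**2
print(f"expected shift: ~{expected_shift:.2f} fringes")   # ~0.4

# Michelson and Morley could resolve shifts of roughly a hundredth of a fringe,
# yet saw nothing remotely close to 0.4: the famous null result.
```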

The Michelson interferometer (top) showed a negligible shift in light patterns (bottom, solid) as compared with what was expected if Galilean relativity were true (bottom, dotted). The speed of light was the same no matter which direction the interferometer was oriented, including with, perpendicular to, or against the Earth's motion through space.

Aether enthusiasts contorted themselves in knots attempting to explain this null result.

All of the proposed workarounds, despite their arbitrary constants and parameters, were seriously considered right up until Einstein's relativity came along. Once the realization came about that the laws of physics should be, and in fact were, the same for all observers in all frames of reference, the idea of an absolute frame of reference, which the aether absolutely was, was no longer necessary or tenable.

If you allow light to come from outside your environment to inside, you can gain information about the relative velocities and accelerations of the two reference frames. The fact that the laws of physics, the speed of light, and every other observable are independent of your reference frame is strong evidence against the need for an aether.

What all of this means is that the laws of physics don't require the existence of an aether; they work just fine without one. Today, with our modern understanding of not just Special Relativity but also General Relativity, which incorporates gravitation, we recognize that both electromagnetic waves and gravitational waves don't require any sort of medium to travel through at all. The vacuum of space, devoid of any material entity, is enough all on its own.

This doesn't mean, however, that we've disproven the existence of the aether. All we've proven, and indeed all we're capable of proving, is that if there is an aether, it has no properties that are detectable by any experiment we're capable of performing. It doesn't affect the motion of light or gravitational waves through it, not under any physical circumstances, which is equivalent to stating that everything we observe is consistent with its non-existence.

Visualization of a quantum field theory calculation showing virtual particles in the quantum vacuum. (Specifically, for the strong interactions.) Even in empty space, this vacuum energy is non-zero, and what appears to be the ground state in one region of curved space will look different from the perspective of an observer where the spatial curvature differs. As long as quantum fields are present, this vacuum energy (or a cosmological constant) must be present, too.

If something has no observable, measurable effects on our Universe in any way, shape or form, even in principle, we consider that thing to be physically non-existent. But the fact that there's nothing pointing to the existence of the aether doesn't mean we fully understand what empty space, or the quantum vacuum, actually is. In fact, there is a whole slew of unanswered, open questions about exactly that topic plaguing the field today.

Why does empty space still have a non-zero amount of energy (dark energy, or a cosmological constant) intrinsic to it? If space is discrete at some level, does that imply a preferred frame of reference, where that discrete size is maximized under the rules of relativity? Can light or gravitational waves exist without space to travel through, and does that mean there is some type of propagation medium, after all?

As Carl Sagan famously said, "Absence of evidence is not evidence of absence." We have no proof that the aether exists, but we can never prove the negative: that no aether exists. All we can demonstrate, and have demonstrated, is that if the aether exists, it has no properties that affect the matter and radiation that we actually do observe, and so the burden isn't on those looking to disprove its existence: the burden of proof is on those who favor the aether, to provide evidence that it truly is real.

See original here:

Is there any evidence that the "aether" exists? - Big Think

20 Fastest-Growing Cities in the Southeast – Yahoo Finance

In this piece, we will take a look at the 20 fastest-growing cities in the Southeast. For more cities, head on over to 5 Fastest-Growing Cities in the Southeast.

The Southeastern region of the United States, as defined by the U.S. Geological Survey, is made up of eleven states in the continental United States, along with the Virgin Islands and Puerto Rico. It is one of the oldest inhabited regions in America, with the first signs of human presence dating back tens of thousands of years. The Southeastern states also played a crucial part in the Civil War in the 19th century, with most of them attempting to break away from the Union.

Since then, the Southeast has changed significantly and is now a global leader in industrial production and manufacturing. In fact, one of the most famous facilities in the world is located in the Southeastern state of Florida, the only place in the world from which humans have launched rockets to the Moon. Courtesy of its proximity to the equator, Florida makes a near-ideal launch site: rockets can capitalize on the Earth's rotation around its axis and fly along the equator to gain an additional 'oomph' to their speed during launch. The launch site is the National Aeronautics and Space Administration's (NASA) Kennedy Space Center, and it still plays a crucial part in the American space program as the only launch site in the country capable of supporting crewed missions. The KSC is also NASA's primary rocket launch site in the country, with other facilities belonging to the Space Force.

However, the space center is not the only crucial NASA facility in the region. Another highly important site in the Southeast is the Stennis Space Center. Stennis is located in Hancock County, Mississippi, and it is responsible for producing some of the most breathtaking visuals in America. This is because Stennis is the only site in America that can withstand the immense forces generated by test-firing a large rocket engine. Like KSC, Stennis is also a historic site, since it was built to test the massive F-1 engines for the Saturn V rocket.


Yet, as if sending people to the Moon and test-firing some of the most powerful rocket engines in human history were not enough, the Southeast also leads America in other high-technology areas. For instance, the second largest research park in the world, Cummings Research Park, is in the Southeastern state of Alabama. Cummings hosts more than three hundred companies, including high-end space and aerospace firms such as Lockheed Martin, Dynetics, and Teledyne. The Southeast is also home to Florida State University, which hosts America's only national high magnetic field laboratory, the National High Magnetic Field Laboratory (MagLab). The facility runs programs ranging from spectroscopy and magnetic resonance to quantum physics research and high magnetic field experiments.

Not only is the second largest research park in the world located in the Southeast, but the biggest one, the Research Triangle Park in North Carolina, which covers seven thousand acres, is also in the region. The RTP, like Cummings, hosts more than three hundred companies, including some of the largest and most advanced firms on the planet, among them the pharmaceutical giant GSK and the networking equipment manufacturer Cisco. Other firms in the area include agriculture technology companies, medical device firms, and biotechnology companies.

The region is also home to some of the biggest companies in the world, including The Coca-Cola Company (NYSE:KO), United Parcel Service, Inc. (NYSE:UPS), Walmart Inc. (NYSE:WMT), and Delta Air Lines, Inc. (NYSE:DAL). These are just a handful of the firms in the area, and most of them are based in Atlanta, one of the Southeast's largest and most prosperous cities. Atlanta is the capital of Georgia and has one of the biggest metropolitan gross domestic products (GDPs) in the U.S. and the world. More than a thousand multinational firms operate in the area, and the city's economy is dominated by the information technology and broadcasting industries. Another prosperous Southeastern city is Miami, the second largest city in Florida by population. Miami has some of the busiest seaports and airports in America, and its location also makes it a hub of the U.S. cruise ship industry.

With these details in mind, let's take a look at some of the fastest-growing cities in the Southeast.


Our Methodology

To compile our list of the fastest-growing cities in the Southeast, we first narrowed down the 30 largest cities in the region in terms of population. Then, their net inbound move rates, which are the percentage of inbound movers over the total of inbound and outbound movers, were determined. Finally, the corresponding metropolitan areas were ranked accordingly, out of which the top twenty fastest-growing cities in the Southeast were chosen and are listed below.
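As a rough illustration of that ranking step, the sketch below (in Python) computes a net inbound move rate for a handful of cities and sorts them by it. The city names and mover counts are hypothetical placeholders, not the data behind this list.

    # Illustration of the ranking described in the methodology above. The
    # city names and mover counts are hypothetical placeholders.
    cities = {
        "City A": {"inbound": 5230, "outbound": 4770},
        "City B": {"inbound": 6120, "outbound": 5380},
        "City C": {"inbound": 4910, "outbound": 5090},
    }

    def net_inbound_move_rate(inbound: int, outbound: int) -> float:
        """Percentage of inbound movers out of all inbound and outbound movers."""
        return 100.0 * inbound / (inbound + outbound)

    ranked = sorted(
        cities.items(),
        key=lambda item: net_inbound_move_rate(item[1]["inbound"], item[1]["outbound"]),
        reverse=True,
    )

    for rank, (city, moves) in enumerate(ranked, start=1):
        rate = net_inbound_move_rate(moves["inbound"], moves["outbound"])
        print(f"{rank}. {city}: {rate:.1f}%")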

20. Winston-Salem, North Carolina

Net Inbound Move Rate: 52.3%

Winston-Salem is a city in North Carolina with a quarter of a million people living in its boundaries. Its metropolitan area has five counties, and the city houses several large food and clothing companies. Additionally, the city has a strong presence in the healthcare segment.

19. Saint Petersburg, Florida

Net Inbound Move Rate: 52.4%

Saint Petersburg is a city in Florida and one of the largest in the state in terms of population. The city also embodies Florida's nickname, the Sunshine State, since it gets sunshine on most days of the year. Saint Petersburg has a diversified economy, with healthcare, insurance, and retail companies being some of the largest employers in the area.

18. Tampa, Florida

Net Inbound Move Rate: 52.4%

Tampa is another Florida city. It houses nearly four hundred thousand people and was set up in 1823. Tampa has the biggest port in the state in terms of tonnage handled. It also has a large U.S. Air Force presence and hosts some of the Pentagon's most crucial installations.

17. Augusta, Georgia

Net Inbound Move Rate: 52.7%

Augusta is one of the smallest cities on our list, since it houses a little over two hundred thousand people. Augusta is in Georgia and is one of the oldest cities on our list, having been established in the 18th century. It houses facilities for several important companies such as T-Mobile, Kellogg's, Kimberly-Clark, and Nutrien.

16. Durham, North Carolina

Net Inbound Move Rate: 53%

Durham is one of the largest cities in North Carolina. It is known for having one of the best universities in the world, Duke University. Additionally, International Business Machines (IBM) has a large presence in the city, as do large healthcare and investment firms.

15. Washington, D.C.

Net Inbound Move Rate: 53.4%

Washington D.C. is the capital of the United States. It houses the Capitol, the White House, and the Supreme Court. Additionally, the city is also home to several international financing institutions and has some of the highest costs of living in the U.S.

14. Jacksonville, Florida

Net Inbound Move Rate: 53.5%

Jacksonville is the largest city in Florida in terms of population. Almost one million people live within the city's boundaries, and Jacksonville is also part of one of Florida's largest metropolitan areas. It houses several Fortune 500 companies and is a growing financial center.

13. Orlando, Florida

Net Inbound Move Rate: 53.6%

Orlando houses more than three hundred thousand people and is a tourism hub in Florida. It also has a large technology park and hosts a number of important aerospace firms.

12. Baton Rouge, Louisiana

Net Inbound Move Rate: 53.8%

Baton Rouge is the capital of Louisiana. The city has some of the largest petroleum facilities in the world, as well as a strong presence of the chemical industry.

11. Atlanta, Georgia

Net Inbound Move Rate: 53.8%

Atlanta is one of the most prosperous cities in the Southeast. It is home to one of the largest concentrations of Fortune 500 firms in the region.

10. Greensboro, North Carolina

Net Inbound Move Rate: 54%

Greensboro is a city in North Carolina. It is home to a wide variety of manufacturing and industrial companies.

9. Lexington, Kentucky

Net Inbound Move Rate: 54.2%

Lexington is the second largest city in Kentucky. It houses several important Fortune 500 companies operating in technology and other industries.

8. Raleigh, North Carolina

Net Inbound Move Rate: 54.3%

Raleigh is the tenth largest city in the Southeast. The education system is one of the city's largest employers.

7. Chesapeake, Virginia

Net Inbound Move Rate: 54.7%

Chesapeake is the second largest city in Virginia. Most of its residents are employed in the public sector and healthcare.

6. Norfolk, Virginia

Net Inbound Move Rate: 54.7%

Norfolk is another Virginian city and one of the smaller ones on our list, since it houses fewer than half a million people.

Click to continue reading and see 5 Fastest-Growing Cities in the Southeast.


Disclosure: None. 20 Fastest-Growing Cities in the Southeast was originally published on Insider Monkey.

Go here to read the rest:

20 Fastest-Growing Cities in the Southeast - Yahoo Finance

Team creates heaviest Schrödinger's cat yet – Futurity: Research News


You are free to share this article under the Attribution 4.0 International license.

Researchers have created the heaviest Schrödinger cat to date by putting a crystal in a superposition of two oscillation states.

Even if you are not a quantum physicist, you will most likely have heard of Schrödinger's famous cat. Erwin Schrödinger came up with the feline that can be alive and dead at the same time in a thought experiment in 1935. The obvious contradiction (after all, in everyday life we only ever see cats that are either alive or dead) has prompted scientists to try to realize analogous situations in the laboratory. So far, they have managed to do so using, for instance, atoms or molecules in quantum mechanical superposition states of being in two places at the same time.

At ETH Zurich, a team of researchers led by Yiwen Chu, a professor at the Laboratory for Solid State Physics, reports its findings in the journal Science. The work could lead to more robust quantum bits and shed light on the mystery of why quantum superpositions are not observed in the macroscopic world.

In Schrödinger's original thought experiment, a cat is locked up inside a metal box together with a radioactive substance, a Geiger counter, and a flask of poison. In a certain time frame (an hour, say), an atom in the substance may or may not decay through a quantum mechanical process with a certain probability, and the decay products might cause the Geiger counter to go off and trigger a mechanism that smashes the flask containing the poison, which would eventually kill the cat.

Since an outside observer cannot know whether an atom has actually decayed, they also don't know whether the cat is alive or dead; according to quantum mechanics, which governs the decay of the atom, it should be in an alive/dead superposition state.

"Of course, in the lab we can't realize such an experiment with an actual cat weighing several kilograms," says Chu. Instead, she and her coworkers managed to create a so-called cat state using an oscillating crystal, which represents the cat, with a superconducting circuit representing the original atom. That circuit is essentially a quantum bit, or qubit, that can take on the logical states 0 or 1, or a superposition of both states, 0+1.

The link between the qubit and the crystal cat is not a Geiger counter and poison, but rather a layer of piezoelectric material that creates an electric field when the crystal changes shape while oscillating. That electric field can be coupled to the electric field of the qubit, and hence the superposition state of the qubit can be transferred to the crystal.

As a result, the crystal can now oscillate in two directions at the same time (up/down and down/up, for instance). Those two directions represent the alive and dead states of the cat. "By putting the two oscillation states of the crystal in a superposition, we have effectively created a Schrödinger cat weighing 16 micrograms," explains Chu. That is roughly the mass of a fine grain of sand and nowhere near that of a cat, but still several billion times heavier than an atom or molecule, making it the fattest quantum cat to date.

In order for the oscillation states to be true cat states, it is important that they be macroscopically distinguishable. This means that the separation of the up and down states should be larger than any thermal or quantum fluctuations of the positions of the atoms inside the crystal. Chu and her colleagues checked this by measuring the spatial separation of the two states using the superconducting qubit. Even though the measured separation was only a billionth of a billionth of a meter (smaller than an atom, in fact), it was large enough to clearly distinguish the states.
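As a rough numerical check of that comparison, the sketch below (in Python) estimates the crystal's zero-point position fluctuation and compares it with the quoted separation. The 16-microgram mass and the billionth-of-a-billionth-of-a-meter separation are the figures given above; the gigahertz-scale mechanical frequency is an illustrative assumption, not a number from the study.

    import math

    # Rough check of macroscopic distinguishability: is the separation between
    # the two oscillation states larger than the crystal's zero-point motion?
    hbar = 1.054571817e-34   # reduced Planck constant, in J*s
    mass = 16e-9             # 16 micrograms, in kilograms (figure from the article)
    frequency = 5e9          # assumed mechanical frequency in Hz (not from the study)
    separation = 1e-18       # ~ a billionth of a billionth of a meter (from the article)

    omega = 2 * math.pi * frequency
    x_zpf = math.sqrt(hbar / (2 * mass * omega))   # zero-point position fluctuation

    print(f"Zero-point fluctuation: {x_zpf:.2e} m")
    print(f"State separation:       {separation:.2e} m")
    print(f"Separation / fluctuation ~ {separation / x_zpf:.1f}")

With these inputs the separation comes out a few times larger than the zero-point fluctuation, which is in line with the states being distinguishable.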

In the future, Chu would like to push the mass limits of her crystal cats even further. "This is interesting because it will allow us to better understand the reason behind the disappearance of quantum effects in the macroscopic world of real cats," she says. Beyond this rather academic interest, there are also potential applications in quantum technologies.

For instance, quantum information stored in qubits could be made more robust by using cat states made up of a huge number of atoms in a crystal rather than relying on single atoms or ions, as is currently done. Also, the extreme sensitivity of massive objects in superposition states to external noise could be exploited for precise measurements of tiny disturbances such as gravitational waves or for detecting dark matter.

Source: ETH Zurich

Go here to read the rest:

Team creates heaviest Schrödinger's cat yet - Futurity: Research News

Physicists discover ‘stacked pancakes of liquid magnetism’ – Phys.org


Physicists have discovered "stacked pancakes of liquid magnetism" that may account for the strange electronic behavior of some layered helical magnets.

The materials in the study are magnetic at cold temperatures and become nonmagnetic as they thaw. Experimental physicist Makariy Tanatar of Ames National Laboratory at Iowa State University noticed perplexing electronic behavior in layered helimagnetic crystals and brought the mystery to the attention of Rice theoretical physicist Andriy Nevidomskyy, who worked with Tanatar and former Rice graduate student Matthew Butcher to create a computational model that simulated the quantum states of atoms and electrons in the layered materials.

Magnetic materials undergo a "thawing" transition as they warm up and become nonmagnetic. The researchers ran thousands of Monte Carlo computer simulations of this transition in helimagnets and observed how the magnetic dipoles of atoms inside the material arranged themselves during the thaw. Their results were published in a recent study in Physical Review Letters.

At a submicroscopic level, the materials under study are composed of thousands of 2D crystals stacked one atop another like pages in a notebook. In each crystal sheet, atoms are arrayed in lattices, and the physicists modeled quantum interactions both within and between sheets.
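The model in the study involves frustrated, helically ordered antiferromagnets, but the basic idea of "melting" a stack of weakly coupled 2D layers can be illustrated with a much simpler toy model: stacked Ising sheets with strong in-plane coupling and weak inter-layer coupling, updated with the standard Metropolis Monte Carlo algorithm. Everything in the Python sketch below, the ferromagnetic couplings, lattice size, and temperature, is an arbitrary illustrative choice, not the Hamiltonian the researchers actually simulated.

    import numpy as np

    # Minimal Metropolis sketch of "thawing" in a layered magnet: stacked 2D
    # Ising sheets with strong in-plane coupling and weak coupling between
    # sheets. Couplings, sizes, and temperature are arbitrary toy values.
    rng = np.random.default_rng(0)
    L, layers = 16, 8             # in-plane size and number of stacked sheets
    J_in, J_out = 1.0, 0.1        # in-plane vs. inter-layer exchange
    T = 2.0                       # temperature in units of J_in / k_B
    spins = rng.choice([-1, 1], size=(layers, L, L))

    def local_field(s, z, x, y):
        """Sum of neighboring spins, weighted by the anisotropic couplings."""
        in_plane = (s[z, (x + 1) % L, y] + s[z, (x - 1) % L, y]
                    + s[z, x, (y + 1) % L] + s[z, x, (y - 1) % L])
        out_of_plane = s[(z + 1) % layers, x, y] + s[(z - 1) % layers, x, y]
        return J_in * in_plane + J_out * out_of_plane

    for _ in range(200_000):      # single-spin-flip Metropolis updates
        z, x, y = rng.integers(layers), rng.integers(L), rng.integers(L)
        dE = 2 * spins[z, x, y] * local_field(spins, z, x, y)
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            spins[z, x, y] *= -1

    # Nearest-neighbor correlations within a sheet vs. between sheets.
    in_plane_corr = np.mean(spins * np.roll(spins, 1, axis=1))
    inter_layer_corr = np.mean(spins * np.roll(spins, 1, axis=0))
    print(f"in-plane correlation:    {in_plane_corr:+.3f}")
    print(f"inter-layer correlation: {inter_layer_corr:+.3f}")

Comparing nearest-neighbor correlations within a sheet to correlations between sheets gives a crude handle on the kind of direction-dependent behavior the researchers describe.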

"We're used to thinking that if you take a solid, like a block of ice, and you heat it up, eventually it will become a liquid, and at a higher temperature, it will evaporate and become a gas," said Nevidomskyy, an associate professor of physics and astronomy and member of the Rice Quantum Initiative. "A similar analogy can be made with magnetic materials, except that nothing evaporates in a true sense of the word."

"The crystal is still intact," he said. "But if you look at the arrangement of the little magnetic dipoleswhich are like compass needlesthey start out in a correlated arrangement, meaning that if you know which way one of them is pointing, you can determine which way any of them points, regardless how far away it is in the lattice. That is the magnetic statethe solid in our analogy. As you heat up, the dipoles eventually will become completely independent, or random, with respect to one another. That's known as a paramagnet, and it is analogous to a gas."

Nevidomskyy said physicists typically think of materials either having magnetic order or lacking it.

"A better analogy from the classical viewpoint would be a block of dry ice," he said. "It kind of forgets about the liquid phase and goes straight from ice into gas. That's what magnetic transitions are usually like in the textbooks. We are taught that you start with something correlated, let's say a ferromagnet, and at some point the order parameter disappears, and you end up with a paramagnet."

Tanatar, a research scientist at Ames' Superconductivity and Magnetism Low-Temperature Laboratory, had found signs that the transition from magnetic order to disorder in helical magnets was marked by a transitory phase in which electronic properties, like resistance, differed by direction. For instance, they might differ if they were measured horizontally, from side to side, as opposed to vertically from top to bottom. This directional behavior, which physicists call anisotropy, is a hallmark of many quantum materials like high-temperature superconductors.

"These layered materials don't look the same in the vertical and horizontal directions," said Nevidomskyy. "That's the anisotropy. Makariy's intuition was that anisotropy was affecting how magnetism melts in the material, and our modeling demonstrated that to be true and showed why it happens."

The model showed that the material passes through an intermediate phase as it transitions from magnetic order to disorder. In that phase, dipole interactions are much stronger within sheets than between them. Moreover, the correlations between the dipoles resembled those of a liquid, rather than a solid. The result is "flattened puddles of magnetic liquids that are stacked up like pancakes," Nevidomskyy said. In each puddle-like pancake, dipoles point roughly in the same direction, but that sense of direction varies between neighboring pancakes.

"It's a bunch of atoms all with their dipoles pointing in the same direction," Nevidomskyy said. "But then, if you go up one layer, all of them are pointing in a different random direction."

The atomic arrangement in the material "frustrates" the dipoles and keeps them from aligning in a uniform direction throughout the material. Instead, the dipoles in the layers shift, rotating slightly in response to changes in neighboring pancakes.

"Frustrations make it difficult for the arrows, these magnetic dipoles, to decide where they want to point, at one angle or another," Nevidomskyy said. "And to relieve that frustration, they tend to rotate and shift in each layer."

Tanatar said, "The idea is that you have two competing magnetic phases. They are fighting each other, and as a result you have a transition temperature for these phases that is lower than it would be without competition. And in this competition scenario, the phenomena that lead to magnetic order are different from the phenomena when you don't have this competition."

Tanatar and Nevidomskyy said that while there's no immediate application for the discovery, it may nevertheless offer hints about the still-unexplained physics of other anisotropic materials like high-temperature superconductors.

Despite the name, high-temperature superconductivity occurs at very cold temperatures. One theory suggests that materials may become superconductors when they are cooled in the vicinity of a quantum critical point, a temperature sufficient to suppress long-range magnetic order and give rise to effects brought about by strong quantum fluctuations. For example, several magnetic "parent" materials have been shown to harbor superconductivity close to a quantum critical point where magnetism disappears.

"Once you suppress the main effect, the long-range magnetic ordering, you may give way to weaker effects like superconductivity," Tanatar said. "This is one of the leading theories of unconventional superconductivity. In our study, we show that you can do the same thing in a different way, with frustration or competing interactions."

More information: Matthew W. Butcher et al, Anisotropic Melting of Frustrated Ising Antiferromagnets, Physical Review Letters (2023). DOI: 10.1103/PhysRevLett.130.166701

Journal information: Physical Review Letters

See the article here:

Physicists discover 'stacked pancakes of liquid magnetism' - Phys.org