
Category Archives: Quantum Physics

Mid-Atlantic Quantum Alliance Expands Impact and Reach with Addition of 10 New Partners – PR Web

Posted: February 25, 2021 at 1:36 am

"Our region is already a world leader in quantum science and technology, and the MQA is working to expand its impact in the design, building and commercialization of quantum technologies, and to create a skilled, diverse quantum workforce. - University of Maryland President Darryll J. Pines

COLLEGE PARK, Md. (PRWEB) February 19, 2021

The Mid-Atlantic Quantum Alliance, a rapidly growing hub of quantum technology research, development, innovation and education organized and facilitated by the University of Maryland, has added 10 new members over the past year for a total of 24 university, government and industry partners. Together these MQA members are building a vibrant and diverse ecosystem designed to foster U.S. and regional leadership in the coming quantum technology revolution.

The new members of the Mid-Atlantic Quantum Alliance (MQA) https://mqa.umd.edu/ are: the National Institute of Standards & Technology (NIST), IBM, Protiviti, Quantopo, Quaxys, Bowie State University, Georgetown University, Pittsburgh Quantum Institute, University of Delaware, and Virginia Tech.

The additions expand the power, diversity, and geographical coverage of an already strong consortium of quantum scientists and engineers in academia, national laboratories, and industry that was formally launched on January 30, 2020, as the Maryland Quantum Alliance. The MQA was recently renamed the Mid-Atlantic Quantum Alliance to reflect its larger, more inclusive scope.

"We are very pleased to welcome new partners to the Mid-Atlantic Quantum Alliance," said University of Maryland President Darryll J. Pines. "Our region is already a world leader in quantum science and technology, and the MQA is working to expand its impact in the design, building and commercialization of quantum technologies, and to create a skilled, diverse quantum workforce. This work is essential to power the coming quantum revolution in computing, communication, sensing, materials and many other areas."

For the past year, Mid-Atlantic Quantum Alliance workgroups have been creating ways for MQA members to more easily collaborate; share resources, facilities, equipment, expertise and data; team up to pursue opportunities; and educate the public about the promise of the second quantum revolution. The expanding alliance has built a powerful forum for its members to engage and work together, not only on quantum science and technology R&D, but also on quantum training, education and global thought leadership. The MQA recently expanded the number and size of its workgroups in order to launch several new initiatives in its second year. These will showcase its technical leadership on the global stage, support quantum commercialization and entrepreneurship, and expand the quantum talent pipeline.

"MQA members' wealth of relevant expertise and Marylands concentration of world-leading quantum institutes with cutting-edge facilities and research, made this the ideal place to launch our new quantum technologies company, said Alan Salari, founder and CEO of Quaxys MQAs newest member. We are developing the new generation of hardware systems, especially at microwave frequencies,used for control and measurement of quantum bits, the basic unit of information in quantum computing. Quaxys is committed to solving the most challenging technical hurdles to quantum technologies. And we are excited to collaborate with experts from the University of Maryland and other MQA members in our work to bring the best of such technology to the market in the shortest time."

The Mid-Atlantic Quantum Alliance is collaboratively working to empower the region and nation to lead the unfolding second quantum revolution, which is expected to bring transformative advances in computing, communication, sensing, materials and many other areas. Alliance members have formally agreed to work together to inclusively advance the regional quantum ecosystem in a host of ways, including:

raising public awareness of quantum opportunities and potential; driving new quantum science discovery; developing pioneering quantum technologies; supporting quantum entrepreneurship and startup companies; and training a diverse, world-class quantum workforce.

"We are excited to be a part of the Maryland Quantum Alliance (MQA)," said Bowie State University Professor Chaobin Liu. "We expect that MQA will create opportunities for BSU students to leverage the world-class quantum expertise, educational resources and career opportunities in the region and to fully participate in the second quantum revolution," said Liu, whose research and teaching focus on probability theory, mathematical statistics, mathematical physics and quantum computation. Bowie State University, Maryland's first historically black public university, supports the region's workforce and economy by engaging in strategic partnerships, research, and public service to benefit local, state, national, and global communities.

Current specific goals of the MQA include: accelerating strong quantum innovation by and among alliance members, and across the Mid-Atlantic region; promoting interdisciplinary, applied & translational quantum tech research and commercialization efforts & outcomes; making relevant quantum expertise & tech easier to find & access; sharing resources and identifying regional research infrastructure needs & opportunities; building a quantum workforce by facilitating curriculum sharing & access to unique equipment/labs/expertise and creating unique shared experiential learning programs; elevating diversity & inclusion as a core part of alliance efforts; connecting and amplifying public & K-12 education campaigns; and building international partnerships.

"Building and expanding diverse collaborations across different types of organizations are the foundation for a vibrant quantum economy within the region, which is the prime purpose of the MQA," said John Sawyer, MQA Interim Executive Director and Director of Strategic Research Initiatives at the University of Maryland. "Our members work together to align basic and applied quantum science with real-world needs and requirements, enable more rapid discovery of creative solutions, and equitably create the necessary infrastructure and workforce to scale up quantum technologies."

University of Maryland Quantum Leadership

The University of Maryland is the facilitating institution for, and a leader of, the Mid-Atlantic Quantum Alliance. Long recognized as a national and world leader in quantum science and technology, UMD hosts five collaborative research centers focused on different aspects of quantum science and technology: The Joint Quantum Institute (JQI) and the Joint Center for Quantum Information and Computer Science (QuICS) are collaborations with the National Institute of Standards and Technology. The Quantum Technology Center (QTC), a collaboration between the Clark School of Engineering, College of Computer, Mathematical, and Natural Sciences, and the Army Research Laboratory, brings together UMD engineers and physicists to work on translating quantum physics into transformational new technologies. The Condensed Matter Theory Center has made pioneering contributions to topological approaches to quantum computing, and the Quantum Materials Center explores superconductors and novel quantum materials to enable new technology devices.

https://umdrightnow.umd.edu/news/mid-atlantic-quantum-alliance-expands-impact-and-reach-addition-10-new-partners


Everything you need to know about quantum physics (almost …

Posted: February 22, 2021 at 2:19 pm

What is quantum physics?

Quantum physics is a branch of physics also known as quantum mechanics or quantum theory.

Mechanics is that part of physics concerned with stuff that moves, from cannonballs to tennis balls, cars, rockets, and planets. Quantum mechanics is that part of physics which describes the motions of objects at molecular, atomic, and sub-atomic levels, such as photons and electrons.

Although quantum mechanics is an extraordinarily successful scientific theory, on which much of our modern, tech-obsessed lifestyles depend, it is also completely mad.


The theory quite obviously works, but it appears to leave us chasing ghosts and phantoms, particles that are waves and waves that are particles, cats that are at once both alive and dead, lots of seemingly spooky goings-on, and a desperate desire to lie down quietly in a darkened room.

If you've ever wondered what it is about quantum theory that makes it so baffling to many, here's a brief summary of quantum physics in simple terms.

We now know that all matter is composed of atoms. Each atom is in turn made up of electrons orbiting a nucleus consisting of protons and neutrons. Atoms are discrete. They are localised: here or there.

But towards the end of the 19th Century, atoms were really rather controversial. In fact, it was a determination to refute the existence of atoms that led the German physicist Max Planck to study the properties and behaviour of so-called black-body radiation.

What he found, in an act of desperation in late 1900, turned him into a committed atomist, but it took a few more years for the real significance of his discovery to sink in.

Planck had concluded that radiation is absorbed and emitted as though it is composed of discrete bits, which he called quanta. In 1905, Albert Einstein went further. He speculated that the quanta are real: radiation itself comes in discrete lumps of light-energy. Today we call these lumps photons.

Einstein's hypothesis posed a bit of a problem. There was an already well-established body of evidence in favour of a wave theory of light. The key observation is called the double slit experiment.

Push light through a narrow aperture or slit and it will squeeze through, bend around at the edges and spread out beyond. It diffracts.

Cut two slits side-by-side and we get interference. Waves diffracted by the two slits produce an alternating pattern of light and dark bands called interference fringes. This kind of behaviour is not limited to light such wave interference is easily demonstrated using water waves.

But waves are inherently delocalised: they are here and there. Einstein's hypothesis didn't overturn all the evidence for the delocalised wave-like properties of light. What he was suggesting is that a complete description somehow needs to take account of its localised, particle-like properties, too.

So, light acts like both a wave and a particle.

In 1923, French physicist Louis de Broglie made a bold suggestion. If light waves can also be particles, could particles like electrons also be waves? This was just an idea, but he was able to use it to develop a direct mathematical relationship between an electron's wave-like property (wavelength) and a particle-like property (momentum).

But this was not a fully-fledged wave-particle theory of matter. That challenge fell to Erwin Schrödinger, whose formulation, first published early in 1926 and called wave mechanics, is still taught to science students today.

Schrödinger's theory is really the classical theory of waves in which we introduce some quantum conditions using de Broglie's relation. The result is Schrödinger's wave equation, in which the motion of a particle such as an electron is calculated from its wave function.

Right from the very beginning, physicists were scratching their heads about Schrödinger's wave function.

In classical mechanics, there are no real issues with the way we interpret the concepts represented in the theory, such as energy and momentum (which are called physical observables) and their relation to the properties of the objects that possess them.


Want to calculate the classical momentum of an object flying through the air at a fixed speed? Easy. Measure the object's mass and its speed and multiply these together. Job done.

But what if you want to know the momentum of an electron moving freely in a vacuum? In quantum mechanics we calculate this by performing a specific mathematical operation on the electron's wave function.

Such operations are mathematical recipes, which we can think of as keys which unlock the wave function (depicted in this animation as a box), releasing the observable before closing again.

The operation is a key that unlocks the wave function

We calculate the momentum by opening the box using the momentum key. A different observable will require a different key.
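
To make the "key" metaphor a little more concrete, here is a minimal sketch (Python with the sympy library; not part of the original article) that applies the momentum operation -i*hbar*d/dx to a simple plane wave and recovers the momentum hbar*k. The plane wave and the symbols are illustrative assumptions.

```python
import sympy as sp

# Sketch: apply the momentum operator -i*hbar*d/dx to a plane wave exp(i*k*x).
# The test wave function and symbols are illustrative, not taken from the article.
x, k, hbar = sp.symbols("x k hbar", real=True, positive=True)
psi = sp.exp(sp.I * k * x)                 # a plane-wave wave function
p_psi = -sp.I * hbar * sp.diff(psi, x)     # "unlock" the wave function with the momentum key
print(sp.simplify(p_psi / psi))            # prints hbar*k, the momentum observable
```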

So, if electrons behave like waves, can they be diffracted? If we push a beam of electrons through two slits side-by-side, will we see interference fringes on a distant screen? What if we limit the intensity of the beam so that, on average, only one electron passes through the slits at a time? What then?

What we see is at first quite comforting. Each electron passing through the slits registers as a single spot on the screen, telling us that an electron struck here. This is perfectly consistent with the notion of electrons as particles, as it seems they pass one by one through one or other of the slits and hit the screen in a seemingly random pattern.

Interference patterns appearing in a double slit experiment

But wait. The pattern isn't random. As more and more electrons pass through the slits we cross a threshold. We begin to see individual dots group together, overlap and merge. Eventually we get a two-slit interference pattern of alternating bright and dark fringes.

Alternatively, we conclude that the wave nature of the electron is an intrinsic behaviour. Each individual electron behaves as a wave, described by a wave function, passing through both slits simultaneously and interfering with itself before striking the screen.

So, how are we supposed to know precisely where the next electron will appear?

Schrödinger had wanted to interpret the wave function literally, as the theoretical representation of a matter wave. But to make sense of one-electron interference we must reach for an alternative interpretation suggested later in 1926 by Max Born.

Born reasoned that in quantum mechanics the wave function-squared is a measure of the probability of finding its associated electron in a certain spot.

The alternating peaks and troughs of the electron wave translate into a pattern of quantum probabilities: in this location (which will become a bright fringe) there's a higher probability of finding the next electron, and in this other location (which will become a dark fringe) there's a very low or zero probability of finding the next electron.
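
As a rough illustration of how Born's rule turns a wave pattern into individual strikes, the sketch below (Python with numpy; not from the article) samples single electron hits from an idealized two-slit probability density. The wavelength, slit separation and screen distance are arbitrary assumed numbers.

```python
import numpy as np

# Idealized two-slit probability density on a distant screen.
# Geometry values below are assumed for illustration, not taken from the article.
wavelength = 50e-9        # de Broglie wavelength (m)
slit_sep = 1e-6           # slit separation (m)
screen_dist = 1.0         # slit-to-screen distance (m)

x = np.linspace(-0.2, 0.2, 2000)                       # positions on the screen (m)
phase = 2 * np.pi * slit_sep * x / (wavelength * screen_dist)
density = np.cos(phase / 2) ** 2                       # Born rule: bright and dark fringes
density /= density.sum()                               # normalize into a probability vector

rng = np.random.default_rng(0)
hits = rng.choice(x, size=5000, p=density)             # each electron lands at a single spot
counts, _ = np.histogram(hits, bins=80)
print(counts)   # individually random hits, but the fringe pattern emerges in the totals
```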


Before an electron strikes the screen, it has a probability of being found here, there and most anywhere where the square of the wave function is bigger than zero. This probability of many states existing at the same time is known as quantum superposition.

Does this mean that an individual electron can be in more than one place at a time? No, not really. It is true to say that it has a probability of being found in more than one place at a time. And, if we want to interpret the wave function as a real physical thing, there is a sense in which this is delocalised or distributed.

But if by "individual electron" we're referring to an electron as a particle, then there is a sense in which this doesn't exist as such until the wave function interacts with the screen, at which point it collapses and the electron appears here, in only one place.

One more thing. That there's a 50 percent probability that a tossed coin will land heads simply means that it has two sides and we have no way of knowing (or easily predicting) which way up it will land. This is a classical probability born of ignorance.

We can be confident that the coin continues to have two sides, heads and tails, as it spins through the air, but we're ignorant of the exact details of its motion, so we can't predict with certainty which side will land face up. In theory, we could, if we knew exactly how hard you flipped it, at exactly what angle, and at exactly what height you would catch it.

Quantum probability is thought to be very different. When we toss a quantum coin we might actually be quite knowledgeable about most of the details of its motion, but we can't assume that heads and tails exist before the coin has landed, and we look.

So, it doesn't matter exactly how much information you have about the coin toss, you will never be able to say with any certainty what the result will be, because it's not pre-determined like in a classical system.

Einstein deplored this seeming element of pure chance in quantum mechanics. He famously declared that "God does not play dice."

And then, in 1927, the debates began. What is the wave function and how should it be interpreted? What is quantum mechanics telling us about the nature of physical reality? And just what is this thing called reality, anyway?


Quantum mechanics – Wikipedia

Posted: at 2:19 pm

Branch of physics describing nature on an atomic scale

Quantum mechanics is a fundamental theory in physics that provides a description of the physical properties of nature at the scale of atoms and subatomic particles.[2]:1.1 It is the foundation of all quantum physics including quantum chemistry, quantum field theory, quantum technology, and quantum information science.

Classical physics, the description of physics that existed before the theory of relativity and quantum mechanics, describes many aspects of nature at an ordinary (macroscopic) scale, while quantum mechanics explains the aspects of nature at small (atomic and subatomic) scales, for which classical mechanics is insufficient. Most theories in classical physics can be derived from quantum mechanics as an approximation valid at large (macroscopic) scale.[3]

Quantum mechanics differs from classical physics in that energy, momentum, angular momentum, and other quantities of a bound system are restricted to discrete values (quantization), objects have characteristics of both particles and waves (wave-particle duality), and there are limits to how accurately the value of a physical quantity can be predicted prior to its measurement, given a complete set of initial conditions (the uncertainty principle).

Quantum mechanics arose gradually from theories to explain observations which could not be reconciled with classical physics, such as Max Planck's solution in 1900 to the black-body radiation problem, and the correspondence between energy and frequency in Albert Einstein's 1905 paper which explained the photoelectric effect. These early attempts to understand microscopic phenomena, now known as the "old quantum theory", led to the full development of quantum mechanics in the mid-1920s by Niels Bohr, Erwin Schrödinger, Werner Heisenberg, Max Born and others. The modern theory is formulated in various specially developed mathematical formalisms. In one of them, a mathematical entity called the wave function provides information, in the form of probability amplitudes, about what measurements of a particle's energy, momentum, and other physical properties may yield.

Quantum mechanics allows the calculation of probabilities for how physical systems can behave. It is typically applied to microscopic systems: molecules, atoms and sub-atomic particles. Predictions of quantum mechanics have been verified experimentally to an extremely high degree of accuracy.[note 1] A basic mathematical feature of quantum mechanics is that a probability is found by taking the square of the absolute value of a complex number, known as a probability amplitude. This is known as the Born rule, named after physicist Max Born. For example, a quantum particle like an electron can be described by a wave function, which associates to each point in space a probability amplitude. Applying the Born rule to these amplitudes gives a probability density function for the position that the electron will be found to have when an experiment is performed to measure it. The Schrdinger equation relates the collection of probability amplitudes that pertain to one moment of time to the collection of probability amplitudes that pertain to another.

One consequence of the mathematical rules of quantum mechanics is a tradeoff in predictability between different measurable quantities. The most famous form of this uncertainty principle says that no matter how a quantum particle is prepared or how carefully experiments upon it are arranged, it is impossible to have a precise prediction for a measurement of its position and also for a measurement of its momentum.

Another consequence of the mathematical rules of quantum mechanics is the phenomenon of quantum interference, which is often illustrated with the double-slit experiment. In the basic version of this experiment, a coherent light source, such as a laser beam, illuminates a plate pierced by two parallel slits, and the light passing through the slits is observed on a screen behind the plate.[4]:102-111[2]:1.1-1.8 The wave nature of light causes the light waves passing through the two slits to interfere, producing bright and dark bands on the screen, a result that would not be expected if light consisted of classical particles.[4] However, the light is always found to be absorbed at the screen at discrete points, as individual particles rather than waves; the interference pattern appears via the varying density of these particle hits on the screen. Furthermore, versions of the experiment that include detectors at the slits find that each detected photon passes through one slit (as would a classical particle), and not through both slits (as would a wave).[4]:109[5][6] However, such experiments demonstrate that particles do not form the interference pattern if one detects which slit they pass through. Other atomic-scale entities, such as electrons, are found to exhibit the same behavior when fired towards a double slit.[2] This behavior is known as wave-particle duality.

Another counter-intuitive phenomenon predicted by quantum mechanics is quantum tunnelling: a particle that goes up against a potential barrier can cross it, even if its kinetic energy is smaller than the maximum of the potential.[7] In classical mechanics this particle would be trapped. Quantum tunnelling has several important consequences, enabling radioactive decay, nuclear fusion in stars, and applications such as scanning tunnelling microscopy and the tunnel diode.[8]

When quantum systems interact, the result can be the creation of quantum entanglement: their properties become so intertwined that a description of the whole solely in terms of the individual parts is no longer possible. Erwin Schrödinger called entanglement "...the characteristic trait of quantum mechanics, the one that enforces its entire departure from classical lines of thought".[9] Quantum entanglement enables the counter-intuitive properties of quantum pseudo-telepathy, and can be a valuable resource in communication protocols, such as quantum key distribution and superdense coding.[10] Contrary to popular misconception, entanglement does not allow sending signals faster than light, as demonstrated by the no-communication theorem.[10]

Another possibility opened by entanglement is testing for "hidden variables", hypothetical properties more fundamental than the quantities addressed in quantum theory itself, knowledge of which would allow more exact predictions than quantum theory can provide. A collection of results, most significantly Bell's theorem, have demonstrated that broad classes of such hidden-variable theories are in fact incompatible with quantum physics. According to Bell's theorem, if nature actually operates in accord with any theory of local hidden variables, then the results of a Bell test will be constrained in a particular, quantifiable way. Many Bell tests have been performed, using entangled particles, and they have shown results incompatible with the constraints imposed by local hidden variables.[11][12]

It is not possible to present these concepts in more than a superficial way without introducing the actual mathematics involved; understanding quantum mechanics requires not only manipulating complex numbers, but also linear algebra, differential equations, group theory, and other more advanced subjects.[note 2] Accordingly, this article will present a mathematical formulation of quantum mechanics and survey its application to some useful and oft-studied examples.

In the mathematically rigorous formulation of quantum mechanics developed by Paul Dirac,[15] David Hilbert,[16] John von Neumann,[17] and Hermann Weyl,[18] the state of a quantum mechanical system is a vector $\psi$ belonging to a (separable) Hilbert space $\mathcal{H}$. This vector is postulated to be normalized under the Hilbert space inner product, that is, it obeys $\langle \psi, \psi \rangle = 1$, and it is well-defined up to a complex number of modulus 1 (the global phase), that is, $\psi$ and $e^{i\alpha}\psi$ represent the same physical system. In other words, the possible states are points in the projective space of a Hilbert space, usually called the complex projective space. The exact nature of this Hilbert space is dependent on the system; for example, for describing position and momentum the Hilbert space is the space of complex square-integrable functions $L^2(\mathbb{C})$, while the Hilbert space for the spin of a single proton is simply the space of two-dimensional complex vectors $\mathbb{C}^2$ with the usual inner product.

Physical quantities of interest (position, momentum, energy, spin) are represented by observables, which are Hermitian (more precisely, self-adjoint) linear operators acting on the Hilbert space. A quantum state can be an eigenvector of an observable, in which case it is called an eigenstate, and the associated eigenvalue corresponds to the value of the observable in that eigenstate. More generally, a quantum state will be a linear combination of the eigenstates, known as a quantum superposition. When an observable is measured, the result will be one of its eigenvalues with probability given by the Born rule: in the simplest case the eigenvalue $\lambda$ is non-degenerate and the probability is given by $|\langle \vec{\lambda}, \psi \rangle|^2$, where $\vec{\lambda}$ is its associated eigenvector. More generally, the eigenvalue is degenerate and the probability is given by $\langle \psi, P_\lambda \psi \rangle$, where $P_\lambda$ is the projector onto its associated eigenspace.

After the measurement, if result $\lambda$ was obtained, the quantum state is postulated to collapse to $\vec{\lambda}$, in the non-degenerate case, or to $P_\lambda \psi / \sqrt{\langle \psi, P_\lambda \psi \rangle}$, in the general case. The probabilistic nature of quantum mechanics thus stems from the act of measurement. This is one of the most difficult aspects of quantum systems to understand. It was the central topic in the famous Bohr-Einstein debates, in which the two scientists attempted to clarify these fundamental principles by way of thought experiments. In the decades after the formulation of quantum mechanics, the question of what constitutes a "measurement" has been extensively studied. Newer interpretations of quantum mechanics have been formulated that do away with the concept of "wave function collapse" (see, for example, the many-worlds interpretation). The basic idea is that when a quantum system interacts with a measuring apparatus, their respective wave functions become entangled, so that the original quantum system ceases to exist as an independent entity. For details, see the article on measurement in quantum mechanics.[19]
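
A small numerical sketch of these postulates (Python with numpy; the observable and state are hypothetical choices, not from the article): the Born-rule probabilities come from an eigendecomposition of a Hermitian observable, and the post-measurement state is the renormalized projection onto the observed eigenvector.

```python
import numpy as np

# Illustrative measurement of a Hermitian observable A on a normalized state psi.
A = np.array([[1.0, 1.0],
              [1.0, -1.0]])                      # a Hermitian observable (sigma_x + sigma_z)
psi = np.array([1.0, 0.0], dtype=complex)        # normalized state

evals, evecs = np.linalg.eigh(A)                 # eigenvalues, orthonormal eigenvectors
probs = np.abs(evecs.conj().T @ psi) ** 2        # Born rule: |<lambda|psi>|^2 per outcome
print(dict(zip(np.round(evals, 3), np.round(probs, 3))))   # outcome -> probability

# Post-measurement ("collapsed") state if the first eigenvalue is obtained:
v = evecs[:, 0]
post = (v * (v.conj() @ psi)) / np.abs(v.conj() @ psi)     # projection, renormalized
print(np.round(post, 3))
```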

The time evolution of a quantum state is described by the Schrödinger equation:

$i\hbar \frac{d}{dt}\psi(t) = H\psi(t)$

Here $H$ denotes the Hamiltonian, the observable corresponding to the total energy of the system. The constant $i\hbar$ is introduced so that the Hamiltonian is reduced to the classical Hamiltonian in cases where the quantum system can be approximated by a classical system; the ability to make such an approximation in certain limits is called the correspondence principle.

The solution of this differential equation is given by

$\psi(t) = e^{-iHt/\hbar}\,\psi(0)$

The operator $U(t) = e^{-iHt/\hbar}$ is known as the time-evolution operator, and has the crucial property that it is unitary. This time evolution is deterministic in the sense that given an initial quantum state $\psi(0)$ it makes a definite prediction of what the quantum state $\psi(t)$ will be at any later time.[20]
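
As a minimal sketch of this (Python with numpy/scipy, with $\hbar$ set to 1 and an arbitrary two-level Hamiltonian chosen purely for illustration): build $U(t)$ as a matrix exponential, check that it is unitary, and evolve a state.

```python
import numpy as np
from scipy.linalg import expm

# Sketch (illustrative, hbar = 1): unitary time evolution psi(t) = U(t) psi(0).
hbar = 1.0
H = np.array([[0.0, 1.0],
              [1.0, 0.0]])                  # a Hermitian Hamiltonian (assumed numbers)
psi0 = np.array([1.0, 0.0], dtype=complex)  # initial state

t = 0.7
U = expm(-1j * H * t / hbar)                # time-evolution operator
psi_t = U @ psi0

print(np.allclose(U.conj().T @ U, np.eye(2)))   # True: U is unitary
print(np.round(np.abs(psi_t) ** 2, 3))          # outcome probabilities at time t (sum to 1)
```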

Some wave functions produce probability distributions that are independent of time, such as eigenstates of the Hamiltonian. Many systems that are treated dynamically in classical mechanics are described by such "static" wave functions. For example, a single electron in an unexcited atom is pictured classically as a particle moving in a circular trajectory around the atomic nucleus, whereas in quantum mechanics, it is described by a static wave function surrounding the nucleus. For example, the electron wave function for an unexcited hydrogen atom is a spherically symmetric function known as an s orbital (Fig. 1).

Analytic solutions of the Schrödinger equation are known for very few relatively simple model Hamiltonians including the quantum harmonic oscillator, the particle in a box, the dihydrogen cation, and the hydrogen atom. Even the helium atom, which contains just two electrons, has defied all attempts at a fully analytic treatment.

However, there are techniques for finding approximate solutions. One method, called perturbation theory, uses the analytic result for a simple quantum mechanical model to create a result for a related but more complicated model by (for example) the addition of a weak potential energy. Another method is called "semi-classical equation of motion", which applies to systems for which quantum mechanics produces only small deviations from classical behavior. These deviations can then be computed based on the classical motion. This approach is particularly important in the field of quantum chaos.

One consequence of the basic quantum formalism is the uncertainty principle. In its most familiar form, this states that no preparation of a quantum particle can imply simultaneously precise predictions both for a measurement of its position and for a measurement of its momentum.[21][22] Both position and momentum are observables, meaning that they are represented by Hermitian operators. The position operator $\hat{X}$ and momentum operator $\hat{P}$ do not commute, but rather satisfy the canonical commutation relation:

$[\hat{X}, \hat{P}] = i\hbar$

Given a quantum state, the Born rule lets us compute expectation values for both $X$ and $P$, and moreover for powers of them. Defining the uncertainty for an observable by a standard deviation, we have

$\sigma_X = \sqrt{\langle X^2 \rangle - \langle X \rangle^2}$

and likewise for the momentum:

$\sigma_P = \sqrt{\langle P^2 \rangle - \langle P \rangle^2}$

The uncertainty principle states that

$\sigma_X \sigma_P \geq \frac{\hbar}{2}$

Either standard deviation can in principle be made arbitrarily small, but not both simultaneously.[23] This inequality generalizes to arbitrary pairs of self-adjoint operators $A$ and $B$. The commutator of these two operators is

$[A, B] = AB - BA$

and this provides the lower bound on the product of standard deviations:

$\sigma_A \sigma_B \geq \tfrac{1}{2}\left|\langle [A, B] \rangle\right|$
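
As a numerical check of this general bound (an illustrative sketch, not from the article), the snippet below verifies the inequality for two Pauli spin observables on a randomly chosen state; $\hbar$ does not appear because the operators are dimensionless.

```python
import numpy as np

# Sketch: check sigma_A * sigma_B >= |<[A, B]>| / 2 for two Pauli observables.
A = np.array([[0, 1], [1, 0]], dtype=complex)    # sigma_x
B = np.array([[0, -1j], [1j, 0]])                # sigma_y

rng = np.random.default_rng(1)
psi = rng.normal(size=2) + 1j * rng.normal(size=2)
psi /= np.linalg.norm(psi)                       # a random normalized state

def expect(op):
    return (psi.conj() @ op @ psi).real

sigma_A = np.sqrt(expect(A @ A) - expect(A) ** 2)
sigma_B = np.sqrt(expect(B @ B) - expect(B) ** 2)
bound = 0.5 * abs(psi.conj() @ (A @ B - B @ A) @ psi)

print(sigma_A * sigma_B >= bound - 1e-12)        # True: the bound holds
```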

Another consequence of the canonical commutation relation is that the position and momentum operators are Fourier transforms of each other, so that a description of an object according to its momentum is the Fourier transform of its description according to its position. The fact that dependence in momentum is the Fourier transform of the dependence in position means that the momentum operator is equivalent (up to an $i/\hbar$ factor) to taking the derivative according to the position, since in Fourier analysis differentiation corresponds to multiplication in the dual space. This is why in quantum equations in position space, the momentum $p_i$ is replaced by $-i\hbar \frac{\partial}{\partial x}$, and in particular in the non-relativistic Schrödinger equation in position space the momentum-squared term is replaced with a Laplacian times $-\hbar^2$.[21]
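
The sketch below (Python with numpy, $\hbar = 1$, and an assumed Gaussian test function) illustrates this duality on a grid: applying $-i\,d/dx$ in position space gives the same result as multiplying by $k$ in the Fourier (momentum) representation.

```python
import numpy as np

# Sketch (illustrative, hbar = 1): momentum acts as -i d/dx in position space,
# which is multiplication by k in the Fourier (momentum) representation.
N, L = 1024, 40.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
dx = x[1] - x[0]
k = 2 * np.pi * np.fft.fftfreq(N, d=dx)              # momentum grid

psi = np.exp(-x**2) * np.exp(2j * x)                 # an assumed localized test wave function
p_psi_position = -1j * (-2 * x + 2j) * psi           # -i d(psi)/dx, computed analytically
p_psi_fourier = np.fft.ifft(k * np.fft.fft(psi))     # multiply by k in momentum space

print(np.allclose(p_psi_position, p_psi_fourier))    # True: the two representations agree
```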

When two different quantum systems are considered together, the Hilbert space of the combined system is the tensor product of the Hilbert spaces of the two components. For example, let A and B be two quantum systems, with Hilbert spaces $\mathcal{H}_A$ and $\mathcal{H}_B$, respectively. The Hilbert space of the composite system is then

$\mathcal{H}_{AB} = \mathcal{H}_A \otimes \mathcal{H}_B$

If the state for the first system is the vector $\psi_A$ and the state for the second system is $\psi_B$, then the state of the composite system is

$\psi_A \otimes \psi_B$

Not all states in the joint Hilbert space $\mathcal{H}_{AB}$ can be written in this form, however, because the superposition principle implies that linear combinations of these "separable" or "product states" are also valid. For example, if $\psi_A$ and $\phi_A$ are both possible states for system $A$, and likewise $\psi_B$ and $\phi_B$ are both possible states for system $B$, then

$\tfrac{1}{\sqrt{2}}\left(\psi_A \otimes \psi_B + \phi_A \otimes \phi_B\right)$

is a valid joint state that is not separable. States that are not separable are called entangled.[24][25]

If the state for a composite system is entangled, it is impossible to describe either component system A or system B by a state vector. One can instead define reduced density matrices that describe the statistics that can be obtained by making measurements on either component system alone. This necessarily causes a loss of information, though: knowing the reduced density matrices of the individual systems is not enough to reconstruct the state of the composite system.[24][25] Just as density matrices specify the state of a subsystem of a larger system, analogously, positive operator-valued measures (POVMs) describe the effect on a subsystem of a measurement performed on a larger system. POVMs are extensively used in quantum information theory.[24][26]
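
A compact sketch of these ideas (Python with numpy; the Bell state is a standard textbook example, not taken from this article): construct an entangled two-qubit state, confirm it is not a product state, and compute the reduced density matrix of one qubit by a partial trace.

```python
import numpy as np

# Sketch: the Bell state (|00> + |11>)/sqrt(2), its Schmidt rank, and a reduced density matrix.
zero = np.array([1.0, 0.0])
one = np.array([0.0, 1.0])

psi = (np.kron(zero, zero) + np.kron(one, one)) / np.sqrt(2)   # entangled joint state

# Schmidt rank = rank of the 2x2 coefficient matrix; rank 1 would mean a product state.
coeffs = psi.reshape(2, 2)
print(np.linalg.matrix_rank(coeffs))        # 2 -> entangled

# Reduced density matrix of subsystem A: trace out B from |psi><psi|.
rho = np.outer(psi, psi.conj()).reshape(2, 2, 2, 2)   # indices (a, b, a', b')
rho_A = np.trace(rho, axis1=1, axis2=3)               # partial trace over B
print(np.round(rho_A, 3))                   # maximally mixed: [[0.5, 0], [0, 0.5]]
```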

As described above, entanglement is a key feature of models of measurement processes in which an apparatus becomes entangled with the system being measured. Systems interacting with the environment in which they reside generally become entangled with that environment, a phenomenon known as quantum decoherence. This can explain why, in practice, quantum effects are difficult to observe in systems larger than microscopic.[27]

There are many mathematically equivalent formulations of quantum mechanics. One of the oldest and most common is the "transformation theory" proposed by Paul Dirac, which unifies and generalizes the two earliest formulations of quantum mechanics, matrix mechanics (invented by Werner Heisenberg) and wave mechanics (invented by Erwin Schrödinger).[28] An alternative formulation of quantum mechanics is Feynman's path integral formulation, in which a quantum-mechanical amplitude is considered as a sum over all possible classical and non-classical paths between the initial and final states. This is the quantum-mechanical counterpart of the action principle in classical mechanics.

The Hamiltonian $H$ is known as the generator of time evolution, since it defines a unitary time-evolution operator $U(t) = e^{-iHt/\hbar}$ for each value of $t$. From this relation between $U(t)$ and $H$, it follows that any observable $A$ that commutes with $H$ will be conserved: its expectation value will not change over time. This statement generalizes, as mathematically, any Hermitian operator $A$ can generate a family of unitary operators parameterized by a variable $t$. Under the evolution generated by $A$, any observable $B$ that commutes with $A$ will be conserved. Moreover, if $B$ is conserved by evolution under $A$, then $A$ is conserved under the evolution generated by $B$. This implies a quantum version of the result proven by Emmy Noether in classical (Lagrangian) mechanics: for every differentiable symmetry of a Hamiltonian, there exists a corresponding conservation law.
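
A minimal numerical illustration (Python with numpy/scipy, $\hbar = 1$, arbitrary diagonal matrices chosen so that $[A, H] = 0$): the expectation value of an observable that commutes with the Hamiltonian does not change under time evolution.

```python
import numpy as np
from scipy.linalg import expm

# Sketch: a conserved observable. H and A are diagonal (assumed values), so they commute.
H = np.diag([0.0, 1.0, 3.0])                 # Hamiltonian
A = np.diag([2.0, -1.0, 5.0])                # observable commuting with H

rng = np.random.default_rng(2)
psi0 = rng.normal(size=3) + 1j * rng.normal(size=3)
psi0 /= np.linalg.norm(psi0)                 # a random normalized initial state

for t in (0.0, 0.5, 2.0):
    psi_t = expm(-1j * H * t) @ psi0
    print(round((psi_t.conj() @ A @ psi_t).real, 6))   # same expectation value at every t
```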

The simplest example of a quantum system with a position degree of freedom is a free particle in a single spatial dimension. A free particle is one which is not subject to external influences, so that its Hamiltonian consists only of its kinetic energy:

$H = \frac{1}{2m}\hat{P}^2 = -\frac{\hbar^2}{2m}\frac{\partial^2}{\partial x^2}$

The general solution of the Schrödinger equation is given by

$\psi(x, t) = \frac{1}{\sqrt{2\pi}} \int \hat{\psi}(k, 0)\, e^{i\left(kx - \frac{\hbar k^2}{2m} t\right)} \, dk$

which is a superposition of all possible plane waves $e^{i(kx - \frac{\hbar k^2}{2m} t)}$, which are eigenstates of the momentum operator with momentum $p = \hbar k$. The coefficients of the superposition are $\hat{\psi}(k, 0)$, which is the Fourier transform of the initial quantum state $\psi(x, 0)$.

It is not possible for the solution to be a single momentum eigenstate, or a single position eigenstate, as these are not normalizable quantum states.[note 3] Instead, we can consider a Gaussian wave packet:

$\psi(x, 0) = \frac{1}{\sqrt[4]{\pi a}}\, e^{-\frac{x^2}{2a}}$

which has Fourier transform, and therefore momentum distribution

$\hat{\psi}(k, 0) = \sqrt[4]{\frac{a}{\pi}}\, e^{-\frac{a k^2}{2}}$

We see that as we make $a$ smaller the spread in position gets smaller, but the spread in momentum gets larger. Conversely, by making $a$ larger we make the spread in momentum smaller, but the spread in position gets larger. This illustrates the uncertainty principle.

As we let the Gaussian wave packet evolve in time, we see that its center moves through space at a constant velocity (like a classical particle with no forces acting on it). However, the wave packet will also spread out as time progresses, which means that the position becomes more and more uncertain. The uncertainty in momentum, however, stays constant.[29]
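
The sketch below (Python with numpy, $\hbar = m = 1$, assumed packet parameters) propagates a Gaussian packet with the exact free-particle phase factor in momentum space and prints the position and momentum spreads at a few times: the former grows while the latter stays constant.

```python
import numpy as np

# Sketch: free-particle spreading of a Gaussian wave packet (hbar = m = 1, assumed numbers).
N, L = 2048, 200.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
dx = x[1] - x[0]
k = 2 * np.pi * np.fft.fftfreq(N, d=dx)

a = 1.0
psi0 = (np.pi * a) ** -0.25 * np.exp(-x**2 / (2 * a)) * np.exp(5j * x)   # packet moving right

def spreads(psi):
    prob_x = np.abs(psi) ** 2 * dx
    prob_x /= prob_x.sum()
    sx = np.sqrt(np.sum(x**2 * prob_x) - np.sum(x * prob_x) ** 2)
    prob_k = np.abs(np.fft.fft(psi)) ** 2
    prob_k /= prob_k.sum()
    sk = np.sqrt(np.sum(k**2 * prob_k) - np.sum(k * prob_k) ** 2)
    return sx, sk

for t in (0.0, 5.0, 10.0):
    psi_t = np.fft.ifft(np.exp(-1j * k**2 * t / 2) * np.fft.fft(psi0))   # exact free evolution
    print(t, [round(v, 3) for v in spreads(psi_t)])   # position spread grows, momentum spread constant
```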

The particle in a one-dimensional potential energy box is the most mathematically simple example where restraints lead to the quantization of energy levels. The box is defined as having zero potential energy everywhere inside a certain region, and therefore infinite potential energy everywhere outside that region.[21]:77-78 For the one-dimensional case in the $x$ direction, the time-independent Schrödinger equation may be written

$-\frac{\hbar^2}{2m}\frac{d^2\psi}{dx^2} = E\psi$

With the differential operator defined by

$\hat{p}_x = -i\hbar\frac{d}{dx}$

the previous equation is evocative of the classic kinetic energy analogue,

$\frac{1}{2m}\hat{p}_x^{\,2}\,\psi = E\,\psi$

with state $\psi$ in this case having energy $E$ coincident with the kinetic energy of the particle.

The general solutions of the Schrödinger equation for the particle in a box are

$\psi(x) = A e^{ikx} + B e^{-ikx}, \qquad E = \frac{\hbar^2 k^2}{2m}$

or, from Euler's formula,

$\psi(x) = C \sin(kx) + D \cos(kx)$

The infinite potential walls of the box determine the values of $C$, $D$, and $k$ at $x = 0$ and $x = L$ where $\psi$ must be zero. Thus, at $x = 0$,

$\psi(0) = 0 = C\sin(0) + D\cos(0) = D$

and $D = 0$. At $x = L$,

$\psi(L) = 0 = C\sin(kL)$

in which $C$ cannot be zero as this would conflict with the postulate that $\psi$ has norm 1. Therefore, since $\sin(kL) = 0$, $kL$ must be an integer multiple of $\pi$,

$k = \frac{n\pi}{L}, \qquad n = 1, 2, 3, \ldots$

This constraint on $k$ implies a constraint on the energy levels, yielding

$E_n = \frac{\hbar^2 \pi^2 n^2}{2mL^2} = \frac{n^2 h^2}{8mL^2}.$
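
For a feel for the scale of these levels, here is a small sketch (Python; the 1 nm box width is an assumed illustrative value, not from the article) evaluating $E_n = n^2 h^2 / (8mL^2)$ for an electron.

```python
# Sketch: the first few particle-in-a-box energy levels for an electron in a 1 nm box.
h = 6.62607015e-34        # Planck constant (J s)
m_e = 9.1093837015e-31    # electron mass (kg)
L = 1e-9                  # assumed box width (m)
eV = 1.602176634e-19      # joules per electronvolt

for n in range(1, 4):
    E_n = n**2 * h**2 / (8 * m_e * L**2)
    print(n, round(E_n / eV, 3), "eV")   # roughly 0.376, 1.505, 3.386 eV
```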

A finite potential well is the generalization of the infinite potential well problem to potential wells having finite depth. The finite potential well problem is mathematically more complicated than the infinite particle-in-a-box problem as the wave function is not pinned to zero at the walls of the well. Instead, the wave function must satisfy more complicated mathematical boundary conditions as it is nonzero in regions outside the well. Another related problem is that of the rectangular potential barrier, which furnishes a model for the quantum tunneling effect that plays an important role in the performance of modern technologies such as flash memory and scanning tunneling microscopy.

As in the classical case, the potential for the quantum harmonic oscillator is given by

$V(x) = \frac{1}{2} m\omega^2 x^2$

This problem can either be treated by directly solving the Schrödinger equation, which is not trivial, or by using the more elegant "ladder method" first proposed by Paul Dirac. The eigenstates are given by

$\psi_n(x) = \sqrt{\frac{1}{2^n\, n!}} \left(\frac{m\omega}{\pi\hbar}\right)^{1/4} e^{-\frac{m\omega x^2}{2\hbar}} H_n\!\left(\sqrt{\frac{m\omega}{\hbar}}\, x\right), \qquad n = 0, 1, 2, \ldots$

where $H_n$ are the Hermite polynomials

$H_n(z) = (-1)^n e^{z^2} \frac{d^n}{dz^n}\left(e^{-z^2}\right)$

and the corresponding energy levels are

$E_n = \hbar\omega\left(n + \tfrac{1}{2}\right).$

This is another example illustrating the discretization of energy for bound states.
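
As an independent check of this spectrum (an illustrative sketch, not from the article), the snippet below diagonalizes a finite-difference harmonic-oscillator Hamiltonian with $\hbar = m = \omega = 1$ and recovers energies close to $n + \tfrac{1}{2}$.

```python
import numpy as np

# Sketch (hbar = m = omega = 1): finite-difference harmonic oscillator spectrum.
N, L = 1500, 20.0
x = np.linspace(-L / 2, L / 2, N)
dx = x[1] - x[0]

# Kinetic term -(1/2) d^2/dx^2 by central differences, plus potential x^2 / 2 on the diagonal.
main = 1.0 / dx**2 + 0.5 * x**2
off = -0.5 / dx**2 * np.ones(N - 1)
H = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

evals = np.linalg.eigvalsh(H)[:4]
print(np.round(evals, 3))       # approximately [0.5, 1.5, 2.5, 3.5]
```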

The Mach-Zehnder interferometer (MZI) illustrates the concepts of superposition and interference with linear algebra in dimension 2, rather than differential equations. It can be seen as a simplified version of the double-slit experiment, but it is of interest in its own right, for example in the delayed-choice quantum eraser, the Elitzur-Vaidman bomb tester, and in studies of quantum entanglement.[30][31]

We can model a photon going through the interferometer by considering that at each point it can be in a superposition of only two paths: the "lower" path which starts from the left, goes straight through both beam splitters, and ends at the top, and the "upper" path which starts from the bottom, goes straight through both beam splitters, and ends at the right. The quantum state of the photon is therefore a vector $\psi \in \mathbb{C}^2$ that is a superposition of the "lower" path $\psi_l = \begin{pmatrix} 1 \\ 0 \end{pmatrix}$ and the "upper" path $\psi_u = \begin{pmatrix} 0 \\ 1 \end{pmatrix}$, that is, $\psi = \alpha \psi_l + \beta \psi_u$ for complex $\alpha, \beta$ such that $|\alpha|^2 + |\beta|^2 = 1$.

Both beam splitters are modelled as the unitary matrix $B = \frac{1}{\sqrt{2}} \begin{pmatrix} 1 & i \\ i & 1 \end{pmatrix}$, which means that when a photon meets the beam splitter it will either stay on the same path with a probability amplitude of $1/\sqrt{2}$, or be reflected to the other path with a probability amplitude of $i/\sqrt{2}$. The phase shifter on the upper arm is modelled as the unitary matrix $P = \begin{pmatrix} 1 & 0 \\ 0 & e^{i\Delta\Phi} \end{pmatrix}$, which means that if the photon is on the "upper" path it will gain a relative phase of $\Delta\Phi$, and it will stay unchanged if it is in the lower path.

A photon that enters the interferometer from the left will then end up in the state

$\psi = BPB\,\psi_l$

and the probabilities that it will be detected at the right or at the top are given respectively by

$p(u) = |\langle \psi_u, \psi \rangle|^2 = \cos^2\!\frac{\Delta\Phi}{2}, \qquad p(l) = |\langle \psi_l, \psi \rangle|^2 = \sin^2\!\frac{\Delta\Phi}{2}$

One can therefore use the Mach-Zehnder interferometer to estimate the phase shift by estimating these probabilities.

It is interesting to consider what would happen if the photon were definitely in either the "lower" or "upper" paths between the beam splitters. This can be accomplished by blocking one of the paths, or equivalently by removing the first beam splitter (and feeding the photon from the left or the bottom, as desired). In both cases there will be no interference between the paths anymore, and the probabilities are given by $p(u) = p(l) = 1/2$, independently of the phase $\Delta\Phi$. From this we can conclude that the photon does not take one path or another after the first beam splitter, but rather that it is in a genuine quantum superposition of the two paths.[32]
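
The whole argument fits in a few lines of linear algebra. The sketch below (Python with numpy; not from the article) applies the matrices $B$ and $P$ defined above to a photon entering from the left, and also shows the no-interference case obtained by dropping the first beam splitter.

```python
import numpy as np

# Sketch: Mach-Zehnder detection probabilities versus the phase shift.
B = np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)       # beam splitter
psi_l = np.array([1, 0], dtype=complex)             # photon entering on the "lower" path

for dphi in (0.0, np.pi / 2, np.pi):
    P = np.diag([1, np.exp(1j * dphi)])             # phase shifter on the upper arm
    out = B @ P @ B @ psi_l                         # both beam splitters present
    no_first_bs = B @ P @ psi_l                     # first beam splitter removed
    print(round(dphi, 3),
          np.round(np.abs(out) ** 2, 3),            # interference: depends on the phase
          np.round(np.abs(no_first_bs) ** 2, 3))    # no interference: always [0.5, 0.5]
```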

Quantum mechanics has had enormous success in explaining many of the features of our universe, with regards to small-scale and discrete quantities and interactions which cannot be explained by classical methods.[note 4] Quantum mechanics is often the only theory that can reveal the individual behaviors of the subatomic particles that make up all forms of matter (electrons, protons, neutrons, photons, and others). Solid-state physics and materials science are dependent upon quantum mechanics.

In many aspects modern technology operates at a scale where quantum effects are significant. Important applications of quantum theory include quantum chemistry, quantum optics, quantum computing, superconducting magnets, light-emitting diodes, the optical amplifier and the laser, the transistor and semiconductors such as the microprocessor, medical and research imaging such as magnetic resonance imaging and electron microscopy.[33] Explanations for many biological and physical phenomena are rooted in the nature of the chemical bond, most notably the macro-molecule DNA.

The rules of quantum mechanics assert that the state space of a system is a Hilbert space and that observables of the system are Hermitian operators acting on vectors in that space, although they do not tell us which Hilbert space or which operators. These can be chosen appropriately in order to obtain a quantitative description of a quantum system, a necessary step in making physical predictions. An important guide for making these choices is the correspondence principle, a heuristic which states that the predictions of quantum mechanics reduce to those of classical mechanics in the regime of large quantum numbers.[34] One can also start from an established classical model of a particular system, and then try to guess the underlying quantum model that would give rise to the classical model in the correspondence limit. This approach is known as quantization.

When quantum mechanics was originally formulated, it was applied to models whose correspondence limit was non-relativistic classical mechanics. For instance, the well-known model of the quantum harmonic oscillator uses an explicitly non-relativistic expression for the kinetic energy of the oscillator, and is thus a quantum version of the classical harmonic oscillator.

Complications arise with chaotic systems, which do not have good quantum numbers, and quantum chaos studies the relationship between classical and quantum descriptions in these systems.

Quantum decoherence is a mechanism through which quantum systems lose coherence, and thus become incapable of displaying many typically quantum effects: quantum superpositions become simply probabilistic mixtures, and quantum entanglement becomes simply classical correlations. Quantum coherence is not typically evident at macroscopic scales, except maybe at temperatures approaching absolute zero at which quantum behavior may manifest macroscopically.[note 5]

Many macroscopic properties of a classical system are a direct consequence of the quantum behavior of its parts. For example, the stability of bulk matter (consisting of atoms and molecules which would quickly collapse under electric forces alone), the rigidity of solids, and the mechanical, thermal, chemical, optical and magnetic properties of matter are all results of the interaction of electric charges under the rules of quantum mechanics.[35]

Early attempts to merge quantum mechanics with special relativity involved the replacement of the Schrödinger equation with a covariant equation such as the Klein-Gordon equation or the Dirac equation. While these theories were successful in explaining many experimental results, they had certain unsatisfactory qualities stemming from their neglect of the relativistic creation and annihilation of particles. A fully relativistic quantum theory required the development of quantum field theory, which applies quantization to a field (rather than a fixed set of particles). The first complete quantum field theory, quantum electrodynamics, provides a fully quantum description of the electromagnetic interaction. Quantum electrodynamics is, along with general relativity, one of the most accurate physical theories ever devised.[36][37]

The full apparatus of quantum field theory is often unnecessary for describing electrodynamic systems. A simpler approach, one that has been used since the inception of quantum mechanics, is to treat charged particles as quantum mechanical objects being acted on by a classical electromagnetic field. For example, the elementary quantum model of the hydrogen atom describes the electric field of the hydrogen atom using a classical $-e^2/(4\pi\varepsilon_0 r)$ Coulomb potential. This "semi-classical" approach fails if quantum fluctuations in the electromagnetic field play an important role, such as in the emission of photons by charged particles.

Quantum field theories for the strong nuclear force and the weak nuclear force have also been developed. The quantum field theory of the strong nuclear force is called quantum chromodynamics, and describes the interactions of subnuclear particles such as quarks and gluons. The weak nuclear force and the electromagnetic force were unified, in their quantized forms, into a single quantum field theory (known as electroweak theory), by the physicists Abdus Salam, Sheldon Glashow and Steven Weinberg.[38]

Even though the predictions of both quantum theory and general relativity have been supported by rigorous and repeated empirical evidence, their abstract formalisms contradict each other and they have proven extremely difficult to incorporate into one consistent, cohesive model. Gravity is negligible in many areas of particle physics, so that unification between general relativity and quantum mechanics is not an urgent issue in those particular applications. However, the lack of a correct theory of quantum gravity is an important issue in physical cosmology and the search by physicists for an elegant "Theory of Everything" (TOE). Consequently, resolving the inconsistencies between both theories has been a major goal of 20th- and 21st-century physics. This TOE would combine not only the models of subatomic physics, but also derive the four fundamental forces of nature from a single force or phenomenon.

One proposal for doing so is string theory, which posits that the point-like particles of particle physics are replaced by one-dimensional objects called strings. String theory describes how these strings propagate through space and interact with each other. On distance scales larger than the string scale, a string looks just like an ordinary particle, with its mass, charge, and other properties determined by the vibrational state of the string. In string theory, one of the many vibrational states of the string corresponds to the graviton, a quantum mechanical particle that carries gravitational force.[39][40]

Another popular theory is loop quantum gravity (LQG), which describes quantum properties of gravity and is thus a theory of quantum spacetime. LQG is an attempt to merge and adapt standard quantum mechanics and standard general relativity. This theory describes space as an extremely fine fabric "woven" of finite loops called spin networks. The evolution of a spin network over time is called a spin foam. The characteristic length scale of a spin foam is the Planck length, approximately $1.616 \times 10^{-35}$ m, and so lengths shorter than the Planck length are not physically meaningful in LQG.[41]

Since its inception, the many counter-intuitive aspects and results of quantum mechanics have provoked strong philosophical debates and many interpretations. The arguments centre on the probabilistic nature of quantum mechanics, the difficulties with wavefunction collapse and the related measurement problem, and quantum nonlocality. Perhaps the only consensus that exists about these issues is that there is no consensus. Richard Feynman once said, "I think I can safely say that nobody understands quantum mechanics."[42] According to Steven Weinberg, "There is now in my opinion no entirely satisfactory interpretation of quantum mechanics."[43]

The views of Niels Bohr, Werner Heisenberg and other physicists are often grouped together as the "Copenhagen interpretation".[44][45] According to these views, the probabilistic nature of quantum mechanics is not a temporary feature which will eventually be replaced by a deterministic theory, but is instead a final renunciation of the classical idea of "causality". Bohr in particular emphasized that any well-defined application of the quantum mechanical formalism must always make reference to the experimental arrangement, due to the complementary nature of evidence obtained under different experimental situations. Copenhagen-type interpretations remain popular in the 21st century.[46]

Albert Einstein, himself one of the founders of quantum theory, was troubled by its apparent failure to respect some cherished metaphysical principles, such as determinism and locality. Einstein's long-running exchanges with Bohr about the meaning and status of quantum mechanics are now known as the Bohr-Einstein debates. Einstein believed that underlying quantum mechanics must be a theory that explicitly forbids action at a distance. He argued that quantum mechanics was incomplete, a theory that was valid but not fundamental, analogous to how thermodynamics is valid, but the fundamental theory behind it is statistical mechanics. In 1935, Einstein and his collaborators Boris Podolsky and Nathan Rosen published an argument that the principle of locality implies the incompleteness of quantum mechanics, a thought experiment later termed the Einstein-Podolsky-Rosen paradox.[note 6] In 1964, John Bell showed that EPR's principle of locality, together with determinism, was actually incompatible with quantum mechanics: they implied constraints on the correlations produced by distant systems, now known as Bell inequalities, that can be violated by entangled particles.[51] Since then several experiments have been performed to obtain these correlations, with the result that they do in fact violate Bell inequalities, and thus falsify the conjunction of locality with determinism.[11][12]

Bohmian mechanics shows that it is possible to reformulate quantum mechanics to make it deterministic, at the price of making it explicitly nonlocal. It attributes not only a wave function to a physical system, but in addition a real position, that evolves deterministically under a nonlocal guiding equation. The evolution of a physical system is given at all times by the Schrödinger equation together with the guiding equation; there is never a collapse of the wave function. This solves the measurement problem.[52]

Everett's many-worlds interpretation, formulated in 1956, holds that all the possibilities described by quantum theory simultaneously occur in a multiverse composed of mostly independent parallel universes.[53] This is not accomplished by introducing a "new axiom" to quantum mechanics, but by removing the axiom of the collapse of the wave packet. All possible states of the measured system and the measuring apparatus, together with the observer, are present in a real physical quantum superposition. While the multiverse is deterministic, we perceive non-deterministic behavior governed by probabilities, because we don't observe the multiverse as a whole, but only one parallel universe at a time. Exactly how this is supposed to work has been the subject of much debate. Why should we assign probabilities at all to outcomes that are certain to occur in some worlds, and why should the probabilities be given by the Born rule?[54] Everett tried to answer both questions in the paper that introduced many-worlds; his derivation of the Born rule has been criticized as relying on unmotivated assumptions.[55] Since then several other derivations of the Born rule in the many-worlds framework have been proposed. There is no consensus on whether this has been successful.[56][57]

Relational quantum mechanics appeared in the late 1990s as a modern derivative of Copenhagen-type ideas,[58] and QBism was developed some years later.[59]

Quantum mechanics was developed in the early decades of the 20th century, driven by the need to explain phenomena that, in some cases, had been observed in earlier times. Scientific inquiry into the wave nature of light began in the 17th and 18th centuries, when scientists such as Robert Hooke, Christiaan Huygens and Leonhard Euler proposed a wave theory of light based on experimental observations.[60] In 1803 English polymath Thomas Young described the famous double-slit experiment.[61] This experiment played a major role in the general acceptance of the wave theory of light.

In 1838 Michael Faraday discovered cathode rays. These studies were followed by the 1859 statement of the black-body radiation problem by Gustav Kirchhoff, the 1877 suggestion by Ludwig Boltzmann that the energy states of a physical system can be discrete, and the 1900 quantum hypothesis of Max Planck.[62] Planck's hypothesis that energy is radiated and absorbed in discrete "quanta" (or energy packets) precisely matched the observed patterns of black-body radiation. The word quantum derives from the Latin, meaning "how great" or "how much".[63] According to Planck, quantities of energy could be thought of as divided into "elements" whose size (E) would be proportional to their frequency (ν):

E = hν

where h is Planck's constant. Planck cautiously insisted that this was only an aspect of the processes of absorption and emission of radiation and was not the physical reality of the radiation.[64] In fact, he considered his quantum hypothesis a mathematical trick to get the right answer rather than a sizable discovery.[65] However, in 1905 Albert Einstein interpreted Planck's quantum hypothesis realistically and used it to explain the photoelectric effect, in which shining light on certain materials can eject electrons from the material. Niels Bohr then developed Planck's ideas about radiation into a model of the hydrogen atom that successfully predicted the spectral lines of hydrogen.[66] Einstein further developed this idea to show that an electromagnetic wave such as light could also be described as a particle (later called the photon), with a discrete amount of energy that depends on its frequency.[67] In his paper "On the Quantum Theory of Radiation," Einstein expanded on the interaction between energy and matter to explain the absorption and emission of energy by atoms. Although overshadowed at the time by his general theory of relativity, this paper articulated the mechanism underlying the stimulated emission of radiation,[68] which became the basis of the laser.
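
As a quick illustration of the Planck relation E = hν introduced above (a sketch added here, not part of the original text), the energy of a single quantum of visible light follows directly from the formula:

```python
# Energy of one quantum (photon) of green light via Planck's relation E = h * nu.
h = 6.626e-34   # Planck's constant, joule-seconds
nu = 5.6e14     # approximate frequency of green light, hertz

E = h * nu
print(E)        # ~3.7e-19 joules per quantum
```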

This phase is known as the old quantum theory. Never complete or self-consistent, the old quantum theory was rather a set of heuristic corrections to classical mechanics.[69] The theory is now understood as a semi-classical approximation[70] to modern quantum mechanics.[71] Notable results from this period include, in addition to the work of Planck, Einstein and Bohr mentioned above, Einstein and Peter Debye's work on the specific heat of solids, Bohr and Hendrika Johanna van Leeuwen's proof that classical physics cannot account for diamagnetism, and Arnold Sommerfeld's extension of the Bohr model to include special-relativistic effects.

In the mid-1920s quantum mechanics was developed to become the standard formulation for atomic physics. In the summer of 1925, Bohr and Heisenberg published results that closed the old quantum theory. Heisenberg, Max Born, and Pascual Jordan pioneered matrix mechanics. The following year, Erwin Schrödinger suggested a partial differential equation for the wave functions of particles like electrons. And when effectively restricted to a finite region, this equation allowed only certain modes, corresponding to discrete quantum states whose properties turned out to be exactly the same as implied by matrix mechanics. Born introduced the probabilistic interpretation of Schrödinger's wave function in July 1926.[72] Thus, the entire field of quantum physics emerged, leading to its wider acceptance at the Fifth Solvay Conference in 1927.[73]

By 1930 quantum mechanics had been further unified and formalized by David Hilbert, Paul Dirac and John von Neumann[74] with greater emphasis on measurement, the statistical nature of our knowledge of reality, and philosophical speculation about the 'observer'. It has since permeated many disciplines, including quantum chemistry, quantum electronics, quantum optics, and quantum information science. It also provides a useful framework for many features of the modern periodic table of elements, and describes the behaviors of atoms during chemical bonding and the flow of electrons in computer semiconductors, and therefore plays a crucial role in many modern technologies. While quantum mechanics was constructed to describe the world of the very small, it is also needed to explain some macroscopic phenomena such as superconductors[75] and superfluids.[76]

Its speculative modern developments include string theory and other attempts to build a quantum theory of gravity.

Go here to read the rest:

Quantum mechanics - Wikipedia

Posted in Quantum Physics | Comments Off on Quantum mechanics – Wikipedia

Six Things Everyone Should Know About Quantum Physics

Posted: at 2:19 pm

Quantum physics is usually just intimidating from the get-go. It's kind of weird and can seem counter-intuitive, even for the physicists who deal with it every day. But it's not incomprehensible. If you're reading something about quantum physics, there are really six key concepts about it that you should keep in mind. Do that, and you'll find quantum physics a lot easier to understand.

Everything Is Made Of Waves; Also, Particles

Light as both a particle and a wave. (Image credit: Fabrizio Carbone/EPFL)

There's lots of places to start this sort of discussion, and this is as good as any: everything in the universe has both particle and wave nature, at the same time. There's a line in Greg Bear's fantasy duology (The Infinity Concerto and The Serpent Mage), where a character describing the basics of magic says "All is waves, with nothing waving, over no distance at all." I've always really liked that as a poetic description of quantum physics-- deep down, everything in the universe has wave nature.

Of course, everything in the universe also has particle nature. This seems completely crazy, but is an experimental fact, worked out by a surprisingly familiar process:

(there's also an animated version of this I did for TED-Ed).

Of course, describing real objects as both particles and waves is necessarily somewhat imprecise. Properly speaking, the objects described by quantum physics are neither particles nor waves, but a third category that shares some properties of waves (a characteristic frequency and wavelength, some spread over space) and some properties of particles (they're generally countable and can be localized to some degree). This leads to some lively debate within the physics education community about whether it's really appropriate to talk about light as a particle in intro physics courses; not because there's any controversy about whether light has some particle nature, but because calling photons "particles" rather than "excitations of a quantum field" might lead to some student misconceptions. I tend not to agree with this, because many of the same concerns could be raised about calling electrons "particles," but it makes for a reliable source of blog conversations.

This "door number three" nature of quantum objects is reflected in the sometimes confusing language physicists use to talk about quantum phenomena. The Higgs boson was discovered at the Large Hadron Collider as a particle, but you will also hear physicists talk about the "Higgs field" as a delocalized thing filling all of space. This happens because in some circumstances, such as collider experiments, it's more convenient to discuss excitations of the Higgs field in a way that emphasizes the particle-like characteristics, while in other circumstances, like general discussion of why certain particles have mass, it's more convenient to discuss the physics in terms of interactions with a universe-filling quantum field. It's just different language describing the same mathematical object.

Quantum Physics Is Discrete

These oscillations created an image of "frozen" light. (Credit: Princeton)

It's right there in the name-- the word "quantum" comes from the Latin for "how much" and reflects the fact that quantum models always involve something coming in discrete amounts. The energy contained in a quantum field comes in integer multiples of some fundamental energy. For light, this is associated with the frequency and wavelength of the light-- high-frequency, short-wavelength light has a large characteristic energy, while low-frequency, long-wavelength light has a small characteristic energy.

In both cases, though, the total energy contained in a particular light field is an integer multiple of that energy-- 1, 2, 14, 137 times-- never a weird fraction like one-and-a-half, π, or the square root of two. This property is also seen in the discrete energy levels of atoms, and the energy bands of solids-- certain values of energy are allowed, others are not. Atomic clocks work because of the discreteness of quantum physics, using the frequency of light associated with a transition between two allowed states in cesium to keep time at a level requiring the much-discussed "leap second" added last week.
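
A small sketch of this discreteness (added here for illustration; the cesium frequency is the standard value that defines the SI second): the energy in a light field of frequency ν can only take the values n·hν for whole numbers n.

```python
h = 6.626e-34           # Planck's constant, J*s
nu_cs = 9_192_631_770   # cesium hyperfine transition frequency in Hz,
                        # the frequency used to define the SI second

# Allowed total energies of a field at this frequency: integer multiples of h*nu.
allowed_energies = [n * h * nu_cs for n in range(5)]
print(allowed_energies)  # 0, 1, 2, 3, 4 quanta -- never 1.5 or pi quanta
```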

Ultra-precise spectroscopy can also be used to look for things like dark matter, and is part of the motivation for a low-energy fundamental physics institute.

This isn't always obvious-- even some things that are fundamentally quantum, like black-body radiation, appear to involve continuous distributions. But there's always a kind of granularity to the underlying reality if you dig into the mathematics, and that's a large part of what leads to the weirdness of the theory.

Quantum Physics Is Probabilistic

(Credit: Graham Barclay/Bloomberg News)

One of the most surprising and (historically, at least) controversial aspects of quantum physics is that it's impossible to predict with certainty the outcome of a single experiment on a quantum system. When physicists predict the outcome of some experiment, the prediction always takes the form of a probability for finding each of the particular possible outcomes, and comparisons between theory and experiment always involve inferring probability distributions from many repeated experiments.

The mathematical description of a quantum system typically takes the form of a "wavefunction," generally represented in equations by the Greek letter psi (ψ). There's a lot of debate about what, exactly, this wavefunction represents, breaking down into two main camps: those who think of the wavefunction as a real physical thing (the jargon term for these is "ontic" theories, leading some witty person to dub their proponents "psi-ontologists") and those who think of the wavefunction as merely an expression of our knowledge (or lack thereof) regarding the underlying state of a particular quantum object ("epistemic" theories).

In either class of foundational model, the probability of finding an outcome is not given directly by the wavefunction, but by the square of the wavefunction (loosely speaking, anyway; the wavefunction is a complex mathematical object (meaning it involves imaginary numbers like the square root of negative one), and the operation to get probability is slightly more involved, but "square of the wavefunction" is enough to get the basic idea). This is known as the "Born Rule" after German physicist Max Born who first suggested this (in a footnote to a paper in 1926), and strikes some people as an ugly ad hoc addition. There's an active effort in some parts of the quantum foundations community to find a way to derive the Born rule from a more fundamental principle; to date, none of these have been fully successful, but it generates a lot of interesting science.
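
Here is a minimal sketch of the Born rule as just described (the amplitudes are made up purely for illustration): outcome probabilities come from the squared magnitudes of complex amplitudes, normalised so they sum to one.

```python
import numpy as np

# Toy wavefunction: complex amplitudes for three possible measurement outcomes.
amplitudes = np.array([1 + 1j, 0.5j, -0.5], dtype=complex)
amplitudes /= np.linalg.norm(amplitudes)   # normalise the state

# Born rule (loosely, the "square of the wavefunction"): each outcome's
# probability is the squared magnitude of its amplitude.
probabilities = np.abs(amplitudes) ** 2
print(probabilities, probabilities.sum())  # the probabilities sum to 1.0
```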

This is also the aspect of the theory that leads to things like particles being in multiple states at the same time. All we can predict is probability, and prior to a measurement that determines a particular outcome, the system being measured is in an indeterminate state that mathematically maps to a superposition of all possibilities with different probabilities. Whether you consider this as the system really being in all of the states at once, or just being in one unknown state depends largely on your feelings about ontic versus epistemic models, though these are both subject to constraints from the next item on the list:

Quantum Physics Is Non-Local

A quantum teleportation experiment in action. (Credit: IQOQI/Vienna)

The last great contribution Einstein made to physics was not widely recognized as such, mostly because he was wrong. In a 1935 paper with his younger colleagues Boris Podolsky and Nathan Rosen (the "EPR paper"), Einstein provided a clear mathematical statement of something that had been bothering him for some time, an idea that we now call "entanglement."

The EPR paper argued that quantum physics allowed the existence of systems where measurements made at widely separated locations could be correlated in ways that suggested the outcome of one was determined by the other. They argued that this meant the measurement outcomes must be determined in advance, by some common factor, because the alternative would require transmitting the result of one measurement to the location of the other at speeds faster than the speed of light. Thus, quantum mechanics must be incomplete, a mere approximation to some deeper theory (a "local hidden variable" theory, one where the results of a particular measurement do not depend on anything farther away from the measurement location than a signal could travel at the speed of light ("local"), but are determined by some factor common to both systems in an entangled pair (the "hidden variable")).

This was regarded as an odd footnote for about thirty years, as there seemed to be no way to test it, but in the mid-1960's the Irish physicist John Bell worked out the consequences of the EPR paper in greater detail. Bell showed that you can find circumstances in which quantum mechanics predicts correlations between distant measurements that are stronger than any possible theory of the type preferred by E, P, and R. This was tested experimentally in the mid-1970's by John Clauser, and a series of experiments by Alain Aspect in the early 1980's is widely considered to have definitively shown that these entangled systems cannot possibly be explained by any local hidden variable theory.

The most common approach to understanding this result is to say that quantum mechanics is non-local: that the results of measurements made at a particular location can depend on the properties of distant objects in a way that can't be explained using signals moving at the speed of light. This does not, however, permit the sending of information at speeds exceeding the speed of light, though there have been any number of attempts to find a way to use quantum non-locality to do that. Refuting these has turned out to be a surprisingly productive enterprise-- check out David Kaiser's How the Hippies Saved Physics for more details. Quantum non-locality is also central to the problem of information in evaporating black holes, and the "firewall" controversy that has generated a lot of recent activity. There are even some radical ideas involving a mathematical connection between the entangled particles described in the EPR paper and wormholes.

Quantum Physics Is (Mostly) Very Small

Images of a hydrogen atom as seen through a quantum telescope. (Credit: Stodolna et al., Phys. Rev. Lett.)

Quantum physics has a reputation of being weird because its predictions are dramatically unlike our everyday experience (at least, for humans-- the conceit of my book is that it doesn't seem so weird to dogs). This happens because the effects involved get smaller as objects get larger-- if you want to see unambiguously quantum behavior, you basically want to see particles behaving like waves, and the wavelength decreases as the momentum increases. The wavelength of a macroscopic object like a dog walking across the room is so ridiculously tiny that if you expanded everything so that a single atom in the room were the size of the entire Solar System, the dog's wavelength would be about the size of a single atom within that solar system.
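
The scaling described here is the de Broglie relation λ = h/p; a rough sketch (with illustrative masses and speeds) shows why a dog has no observable wave behavior while an electron does.

```python
h = 6.626e-34  # Planck's constant, J*s

def de_broglie_wavelength(mass_kg, speed_m_per_s):
    # The wavelength shrinks as momentum (mass * speed) grows.
    return h / (mass_kg * speed_m_per_s)

print(de_broglie_wavelength(20.0, 1.0))      # a ~20 kg dog at walking pace: ~3e-35 m
print(de_broglie_wavelength(9.11e-31, 1e6))  # an electron at 10^6 m/s: ~7e-10 m
```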

This means that, for the most part, quantum phenomena are confined to the scale of atoms and fundamental particles, where the masses and velocities are small enough for the wavelengths to get big enough to observe directly. There's an active effort in a bunch of areas, though, to push the size of systems showing quantum effects up to larger sizes. I've blogged a bunch about experiments by Markus Arndt's group showing wave-like behavior in larger and larger molecules, and there are a bunch of groups in "cavity opto-mechanics" trying to use light to slow the motion of chunks of silicon down to the point where the discrete quantum nature of the motion would become clear. There are even some suggestions that it might be possible to do this with suspended mirrors having masses of several grams, which would be amazingly cool.

Quantum Physics Is Not Magic

Comic from "Surviving the World" by Dante Shepherd (http://survivingtheworld.net/Lesson1518.html). Used with permission.

The previous point leads very naturally into this one: as weird as it may seem, quantum physics is most emphatically not magic. The things it predicts are strange by the standards of everyday physics, but they are rigorously constrained by well-understood mathematical rules and principles.

So, if somebody comes up to you with a "quantum" idea that seems too good to be true-- free energy, mystical healing powers, impossible space drives-- it almost certainly is. That doesn't mean we can't use quantum physics to do amazing things-- you can find some really cool physics in mundane technology-- but those things stay well within the boundaries of the laws of thermodynamics and just basic common sense.

So there you have it: the core essentials of quantum physics. I've probably left a few things out, or made some statements that are insufficiently precise to please everyone, but this ought to at least serve as a useful starting point for further discussion.

Go here to see the original:

Six Things Everyone Should Know About Quantum Physics

Posted in Quantum Physics | Comments Off on Six Things Everyone Should Know About Quantum Physics

A new Approach Could Tease out the Connection Between Gravity and Quantum Mechanics – Universe Today

Posted: at 2:19 pm

In physics, there are two main ways to model the universe. The first is the classical way. Classical models such as Newton's laws of motion and Einstein's theory of relativity assume that the properties of an object such as its position and motion are absolute. There are practical limits to how accurately we can measure an object's path through space and time, but that's on us. Nature knows its motion with infinite precision. Quantum models such as atomic physics assume that objects are governed by interactions. These interactions are probabilistic and indefinite. Even if we constrain an interaction to limited outcomes, we can never know the motion of an object with infinite precision, because nature doesn't allow it.

These two theoretical worlds, the definite classical and indefinite quantum, each work extremely well. The classical for large, massive objects such as baseballs and planets, and the quantum for small, light objects such as atoms and molecules. But both of these approaches break down when we try to study massive but small things such as the interiors of black holes, or the observable universe in the earliest moments of the big bang. For that we need a theory that has all the properties of general relativity together with all the properties of quantum theory. This theory is sometimes referred to as quantum gravity, but right now we don't know how it would work.

It's difficult to study this theory because we don't have any experiments to test it directly. But a new study proposes an experiment that could give us a glimpse of how quantum gravity might work.

The key is to have an object that is quantum in nature, but massive enough that classical gravity has an effect. To do this the team proposes using a super-cooled state of matter known as a Bose-Einstein condensate. This occurs when certain groups of atoms are cooled so much that they effectively blur together in a single quantum state. If billions of atoms were cooled to a Bose-Einstein condensate, they would form a single quantum object with a mass roughly equal to that of a virus. Tiny, but massive enough for the effects of gravity to be studied.

The team proposes making such a condensate, then suspending it magnetically so that only gravity can interact with it. In their work, they show that if gravity works on a quantum level, then the shape of the condensate will shift slightly from its weightless Gaussian shape. If gravity only interacts on a classical level, then the condensate will remain Gaussian.

This approach could be done with our current technology. Unlike other proposed studies, this experiment would only rely on a basic property of quantum systems rather than more complex interactions such as entanglement. If the experiment can be performed, it could give us the first real look at the fundamental nature of quantum gravity.

Reference: Richard Howl, et al. Non-Gaussianity as a Signature of a Quantum Theory of Gravity. PRX Quantum 2.1 (2021): 010325.

Read more from the original source:

A new Approach Could Tease out the Connection Between Gravity and Quantum Mechanics - Universe Today

Posted in Quantum Physics | Comments Off on A new Approach Could Tease out the Connection Between Gravity and Quantum Mechanics – Universe Today

And So It Begins Quantum Physicists Create a New Universe With Its Own Rules – The Daily Galaxy –Great Discoveries Channel

Posted: at 2:19 pm

Albert Einstein was fond of saying that "Imagination is everything. It is the preview of life's coming attractions." What if our world, our universe, following Einstein's insight, is the result of a quantum-physics experiment performed by some ancient hyper-advanced alien civilization? A civilization that, as astrophysicist Paul Davies speculates, may exist beyond matter.

In The Eerie Silence: Renewing Our Search for Alien Intelligence, Davies writes: "Thinking about advanced alien life requires us to abandon all our presuppositions about the nature of life, mind, civilization, technology and community destiny. In short, it means thinking the unthinkable. Five hundred years ago the very concept of a device manipulating information, or software, would have been incomprehensible. Might there be a still higher level, as yet outside all human experience?"

Within an infinite space of current cosmology, suggests the University of Chicago theoretical physicist Dan Hooper, there are inevitably an infinite number of universes that are indistinguishable from our own. Yet some of the regions within the multiverse, says Hooper, are likely to be alien worlds with unknown forces and new forms of matter, along with more or fewer than three dimensions of space: worlds utterly unlike anything we can imagine.

Challenging 100-year Old Notions at the Quantum Level

As if on cue, reports Finland's Aalto University, a team of physicists used an IBM quantum computer to explore an overlooked area of physics, and have challenged 100-year-old cherished notions about information at the quantum level, creating new quantum equations that describe a universe with its own peculiar set of rules. For example, by looking in the mirror and reversing the direction of time you should see the same version of you as in the actual world. In their new paper they created a toy universe that behaves according to these new rules.

Beyond Fundamental Equations

The rules of quantum physics, which govern how very small things behave, use mathematical operators called Hermitian Hamiltonians. Hermitian operators have underpinned quantum physics for nearly 100 years, but recently theorists have realized that it is possible to extend its fundamental equations by making use of operators that are not Hermitian. The new equations describe a universe with its own peculiar set of rules: for example, by looking in the mirror and reversing the direction of time you should see the same version of you as in the actual world.

New Rules of Non-Hermitian Quantum Mechanics

The researchers made qubits, the part of the quantum computer that carries out calculations, behave according to the new rules of non-Hermitian quantum mechanics. They demonstrated experimentally a couple of exciting results which are forbidden by regular Hermitian quantum mechanics. The first discovery was that applying operations to the qubits did not conserve quantum information, a behavior so fundamental to standard quantum theory that it results in currently unsolved problems like Stephen Hawking's black hole information paradox.

Enter Entanglement

The second exciting result came when they experimented with two entangled qubits. Entanglement is a type of correlation that appears between qubits, as if they shared a magic connection that makes them behave in sync with each other. Einstein was famously very uncomfortable with this concept, referring to it as "spooky action at a distance". Under regular quantum physics, it is not possible to alter the degree of entanglement between two particles by tampering with one of the particles on its own. However, in non-Hermitian quantum mechanics, the researchers were able to alter the level of entanglement of the qubits by manipulating just one of them: a result that is expressly off-limits in regular quantum physics.

Einstein's Spooky Action Becomes Spookier

"The exciting thing about these results is that quantum computers are now developed enough to start using them for testing unconventional ideas that have been only mathematical so far," said lead researcher Sorin Paraoanu. "With the present work, Einstein's spooky action at a distance becomes even spookier. And, although we understand very well what is going on, it still gives you the shivers."

Source: Quantum simulation of parity-time symmetry breaking with a superconducting quantum processor. The work was performed under the Finnish Center of Excellence in Quantum Technology (QTF) of the Academy of Finland.

The Daily Galaxy, Max Goldberg, via Aalto University and Nature

Image credit: Shutterstock License

Read the original:

And So It Begins Quantum Physicists Create a New Universe With Its Own Rules - The Daily Galaxy --Great Discoveries Channel

Posted in Quantum Physics | Comments Off on And So It Begins Quantum Physicists Create a New Universe With Its Own Rules – The Daily Galaxy –Great Discoveries Channel

IBM adds 10 historically Black colleges and universities to quantum computing center – TechRepublic

Posted: at 2:19 pm

The IBM-HBCU Quantum Center is a research network and a hands-on learning program.

The IBM-HBCU Quantum Center announced on Monday that it is adding 10 historically Black colleges and universities to the center's 13 founding institutions. The center was launched last fall with the goal of advancing quantum information science and expanding science and technology opportunities to a broader group of students.

Kayla Lee, PhD, growth product manager for community partnerships at IBM Quantum and Qiskit, said she anticipates that new career paths such as quantum developer will become more defined as the field continues to evolve over the next few years.

"I hope that the IBM-HBCU Quantum Center accomplishes two things: inspires people to consider careers in quantum computing and provides additional support for students and faculty as they explore various research topics in quantum computing," she said. "I hope that our students participating in the center are more than equipped to thrive in this emerging industry."

The new schools joining the center are:

This multiyear investment connects researchers and students across a network of HBCUs. The program provides schools with access to IBM quantum computers via the cloud, educational support for students learning to use the Qiskit open source software development framework, and funding for undergraduate and graduate research.

SEE: Quantum computing: A cheat sheet (TechRepublic)

One of the initiative's goals is to create a more diverse quantum-ready workforce from students across multiple disciplines including physics, chemistry, computer science and business.

Researchers from the HBCUs are also on the center's board, including Howard University associate professor of physics Thomas Searles; Serena Eley, an assistant professor of physics at the Colorado School of Mines and head of the Eley Quantum Materials Group; and Anderson Sunda-Meya, an associate professor of physics at Xavier University of Louisiana.

Since opening last fall, the center has hosted a community hack-a-thon and contributed to a pre-print on arXiv that investigates the use of machine learning and quantum computing to better understand unknown quantum systems. arXiv is a free distribution service and an open-access archive for scholarly articles in the fields of physics, mathematics, computer science, quantitative biology, quantitative finance, statistics, electrical engineering and systems science, and economics.

IBM is measuring the impact of the center by tracking student engagement, talent and workforce development, and research capacity. The center also plans to look for ways to support professors and students in mapping out career plans that have a long-term impact on quantum computing.

SEE: To do in 2021: Get up to speed with quantum computing 101 (TechRepublic)

JPMorgan Chase also is building a pipeline of people with quantum computing experience. The banking company was one of the early customers for IBM's quantum computer and is planning a Quantum Computing Summer Associates program for 2021.

The quantum industry is supporting several initiatives to expand educational opportunities. The European Organization for Nuclear Research recently offered a series of free webinars about quantum computing. The course covers the basic concepts of the quantum circuit model, including qubits, gates, and measures, as well as quantum algorithms and protocols. Q-CTRL recently hired quantum physics professor Chris Ferrie as a quantum education adviser. Q-CTRL specializes in controls for quantum computing.

Read the original here:

IBM adds 10 historically Black colleges and universities to quantum computing center - TechRepublic

Posted in Quantum Physics | Comments Off on IBM adds 10 historically Black colleges and universities to quantum computing center – TechRepublic

Physicists Need to Be More Careful with How They Name Things – Scientific American

Posted: at 2:19 pm

In 2012, the quantum physicist John Preskill wrote, "We hope to hasten the day when well controlled quantum systems can perform tasks surpassing what can be done in the classical world." Less than a decade later, two quantum computing systems have met that mark: Google's Sycamore, and the University of Science and Technology of China's Jiuzhang. Both solved narrowly designed problems that are, so far as we know, impossible for classical computers to solve quickly. How quickly? How impossible? To solve a problem that took Jiuzhang 200 seconds, even the fastest supercomputers are estimated to take at least two billion years.

Describing what then may have seemed a far-off goal, Preskill gave it a name: quantum supremacy. In a blog post at the time, he explained: "I'm not completely happy with this term, and would be glad if readers could suggest something better."

We're not happy with it either, and we believe that the physics community should be more careful with its language, for both social and scientific reasons. Even in the abstruse realms of matter and energy, language matters because physics is done by people.

The word supremacy (having more power, authority or status than anyone else) is closely linked to white supremacy. This isn't supposition; it's fact. The Corpus of Contemporary American English finds "white supremacy" is 15 times more frequent than the next most commonly used two-word phrase, "judicial supremacy." Though English is the global lingua franca of science, it is notable that the USTC team avoided "quantum supremacy" because in Chinese, the character meaning supremacy also has uncomfortable, negative connotations. The problem is not confined merely to English.

White supremacist movements have grown around the globe in recent years, especially in the United States, partly as a racist backlash to the Black Lives Matter movement. As Preskill has recently acknowledged, the word "unavoidably evokes a repugnant political stance."

Quantum supremacy has also become a buzzword in popular media (for example, here and here). Its suggestion of domination may have contributed to unjustified hype, such as the idea that quantum computers will soon make classical computers obsolete. Tamer alternatives such as "quantum advantage," "quantum computational supremacy" and even "quantum ascendancy" have been proposed, but none have managed to supplant Preskill's original term. More jargony proposals like Noisy Intermediate Scale Quantum computing (NISQ) and tongue-in-cheek suggestions like "quantum non-uselessness" have similarly failed to displace supremacy.

Here, we propose an alternative we believe succinctly captures the scientific implications with less hype and, crucially, no association with racism: quantum primacy.

What's in a name? It's not just that quantum supremacy by any other name would smell sweeter. By making the case for quantum primacy we hope to illustrate some of the social and scientific issues at hand. In President Joe Biden's letter to his science adviser, the biologist Eric Lander, he asks: "How can we ensure that Americans of all backgrounds are drawn into both the creation and the rewards of science and technology?" One small change can be in the language we use. GitHub, for example, abandoned the odious master/slave terminology after pressure from activists.

Were physics, computer science and engineering more diverse, perhaps we would not still be having this discussion, which one of us wrote about four years ago. But in the U.S., when only 2 percent of bachelor's degrees in physics are awarded to Black students, when Latinos comprise less than 7 percent of engineers, and women account for a mere 12 percent of full professors in physics, this is a conversation that needs to happen. As things stand, quantum supremacy can come across as adding insult to injury.

The nature of quantum computing, and its broad interest to the public outside of industry laboratories and academia means that the debate around quantum supremacy was inevitably going to be included in the broader culture war.

In 2019, a short correspondence to Nature argued that the quantum computing community should adopt different terminology to avoid overtones of violence, neocolonialism and racism. Within days, the dispute was picked up by the conservative editorial pages of the Wall Street Journal, which attacked "quantum wokeness" and suggested that changing the term would be a slippery slope all the way down to cancelling Diana Ross' The Supremes.

The linguist Steven Pinker weighed in to argue that "the prissy banning of words by academics should be resisted. It dumbs down understanding of language: word meanings are conventions, not spells with magical powers, and all words have multiple senses, which are distinguished in context. Also, it makes academia a laughingstock, tars the innocent, and does nothing to combat actual racism & sexism."

It is true that supremacy is not a magic word, that its meaning comes from convention, not conjurers. But the context of quantum supremacy, which Pinker neglects, is that of a historically white, male-dominated discipline. Acknowledging this by seeking better language is a basic effort to be polite, not prissy.

Perhaps the most compelling argument raised in favor of quantum supremacy is that it could function to reclaim the word. Were quantum supremacy 15 times more common than white supremacy, the shoe would be on the other foot. Arguments for reclamation, however, must account for who is doing the reclaiming. If the charge to take back quantum supremacy were led by Black scientists and other underrepresented minorities in physics, that would be one thing. No survey exists, but anecdotal evidence suggests this is decidedly not the case.

To replace supremacy, we need to have a thoughtful conversation. Not any alternative will do, and there is genuinely tricky science at stake. Consider the implications of quantum advantage. An advantage might be a stepladder that makes it easier to reach a high shelf, or a small head start in a race. Some quantum algorithms are like this. Grover's search algorithm is only quadratically faster than its classical counterpart, so a quantum computer running Grover's algorithm might solve a problem that took classical computers 100 minutes in the square root of that time: 10 minutes. Not bad! That's definitely an advantage, especially as runtimes get longer, but it doesn't compare to some quantum speedups.

Perhaps the most famous quantum speedup comes from Shor's algorithm, which can find the factors of numbers (e.g. 5 and 3 are factors of 15) almost exponentially faster than the best classical algorithms. While classical computers are fine with small numbers, every digit takes a toll. For example, a classical computer might factor a 100-digit number in seconds, but a 1000-digit number would take billions of years. A quantum computer running Shor's algorithm could do it in an hour.
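
A hedged sketch of the two kinds of speedup described above, using made-up numbers purely for illustration: a Grover-style quadratic speedup takes roughly the square root of the classical time, while a Shor-style speedup changes how the cost grows with the length of the number. The scaling laws below are toy formulas, not real benchmark figures for any actual machine.

```python
import math

# Grover-style quadratic speedup, mirroring the 100-minute example in the text:
classical_minutes = 100.0
grover_minutes = math.sqrt(classical_minutes)
print(grover_minutes)  # 10.0

# Shor-style speedup (toy scaling only): how costs grow when a number gets
# ten times longer, from 100 digits to 1000 digits.
exponential_growth = math.exp(0.05 * 1000) / math.exp(0.05 * 100)  # classical-like
polynomial_growth = 1000 ** 3 / 100 ** 3                           # quantum-like

print(f"{exponential_growth:.1e}")  # ~3.5e19 -- seconds become billions of years
print(f"{polynomial_growth:.1e}")   # 1.0e3  -- stays manageable
```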

When quantum computers can effectively do things that are impossible for classical computers, they have something much more than an advantage. We believe primacy captures much of this meaning. Primacy means preeminent position or the condition of being first. Additionally, it shares a Latin root (primus, or first) with mathematical terms such as prime and primality.

While quantum computers may be first to solve a specific problem, that does not imply they will dominate; we hope quantum primacy helps avoid the insinuation that classical computers will be obsolete. This is especially important because quantum primacy is a moving target. Classical computers and classical algorithms can and do improve, so quantum computers will have to get bigger and better to stay ahead.

These kinds of linguistic hotfixes do not reach even a bare minimum for diversifying science; the most important work involves hiring and retention and actual material changes to the scientific community to make it less white and male. But if opposition to improving the language of science is any indication about broader obstacles to diversifying it, this is a conversation we must have.

Physicists may prefer vacuums for calculation, but science does not occur in one. It is situated in the broader social and political landscape, one which both shapes and is shaped by the decisions of researchers.

This is an opinion and analysis article.

See the article here:

Physicists Need to Be More Careful with How They Name Things - Scientific American

Posted in Quantum Physics | Comments Off on Physicists Need to Be More Careful with How They Name Things – Scientific American

Can the laws of physics disprove God? – The Conversation UK

Posted: at 2:19 pm

I still believed in God (I am now an atheist) when I heard the following question at a seminar, first posed by Einstein, and was stunned by its elegance and depth: If there is a God who created the entire universe and ALL of its laws of physics, does God follow God's own laws? Or can God supersede his own laws, such as travelling faster than the speed of light and thus being able to be in two different places at the same time? Could the answer help us prove whether or not God exists, or is this where scientific empiricism and religious faith intersect, with NO true answer? David Frost, 67, Los Angeles.

I was in lockdown when I received this question and was instantly intrigued. It's no wonder about the timing: tragic events, such as pandemics, often cause us to question the existence of God. If there is a merciful God, why is a catastrophe like this happening? So the idea that God might be bound by the laws of physics, which also govern chemistry and biology and thus the limits of medical science, was an interesting one to explore.

If God wasn't able to break the laws of physics, she arguably wouldn't be as powerful as you'd expect a supreme being to be. But if she could, why haven't we seen any evidence of the laws of physics ever being broken in the universe?

To tackle the question, let's break it down a bit. First, can God travel faster than light? Let's just take the question at face value. Light travels at an approximate speed of 3 x 10^5 kilometres every second, or 186,000 miles per second. We learn at school that nothing can travel faster than the speed of light, not even the USS Enterprise in Star Trek when its dilithium crystals are set to max.

But is it true? A few years ago, a group of physicists posited that particles called tachyons travelled above light speed. Fortunately, their existence as real particles is deemed highly unlikely. If they did exist, they would have an imaginary mass and the fabric of space and time would become distorted, leading to violations of causality (and possibly a headache for God).

It seems, so far, that no object has been observed that can travel faster than the speed of light. This in itself does not say anything at all about God. It merely reinforces the knowledge that light travels very fast indeed.

Things get a bit more interesting when you consider how far light has travelled since the beginning. Assuming a traditional big bang cosmology and a light speed of 3 x 10^5 km/s, we can calculate that light has travelled roughly 10^23 km in the 13.8 billion years of the universe's existence. Or rather, the observable universe's existence.
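
That figure is a straightforward multiplication; a quick sanity check, using round numbers, looks like this:

```python
# Distance light travels in the age of the universe, using round numbers.
c_km_per_s = 3.0e5          # speed of light, km/s
seconds_per_year = 3.156e7  # roughly 365.25 days
age_years = 13.8e9          # approximate age of the universe

distance_km = c_km_per_s * seconds_per_year * age_years
print(f"{distance_km:.1e}")  # ~1.3e23 km, i.e. roughly 10^23 km
```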

The universe is expanding at a rate of approximately 70 km/s per Mpc (1 Mpc = 1 megaparsec, roughly 3 x 10^19 km), so current estimates suggest that the distance to the edge of the universe is 46 billion light years. As time goes on, the volume of space increases, and light has to travel for longer to reach us.

There is a lot more universe out there than we can view, but the most distant object that we have seen is a galaxy, GN-z11, observed by the Hubble Space Telescope. This is approximately 10^23 km, or 13.4 billion light years, away, meaning that it has taken 13.4 billion years for light from the galaxy to reach us. But when the light set off, the galaxy was only about 3 billion light years away from our galaxy, the Milky Way.

We cannot observe or see across the entirety of the universe that has grown since the big bang because insufficient time has passed for light from the first fractions of a second to reach us. Some argue that we therefore cannot be sure whether the laws of physics could be broken in other cosmic regions perhaps they are just local, accidental laws. And that leads us on to something even bigger than the universe.

Many cosmologists believe that the universe may be part of a more extended cosmos, a multiverse, where many different universes co-exist but don't interact. The idea of the multiverse is backed by the theory of inflation: the idea that the universe expanded hugely before it was 10^-32 seconds old. Inflation is an important theory because it can explain why the universe has the shape and structure that we see around us.

But if inflation could happen once, why not many times? We know from experiments that quantum fluctuations can give rise to pairs of particles suddenly coming into existence, only to disappear moments later. And if such fluctuations can produce particles, why not entire atoms or universes? It's been suggested that, during the period of chaotic inflation, not everything was happening at the same rate: quantum fluctuations in the expansion could have produced bubbles that blew up to become universes in their own right.

But how does God fit into the multiverse? One headache for cosmologists has been the fact that our universe seems fine-tuned for life to exist. The fundamental particles created in the big bang had the correct properties to enable the formation of hydrogen and deuterium, the substances which produced the first stars.

The physical laws governing nuclear reactions in these stars then produced the stuff that life's made of: carbon, nitrogen and oxygen. So how come all the physical laws and parameters in the universe happen to have the values that allowed stars, planets and ultimately life to develop?

Some argue it's just a lucky coincidence. Others say we shouldn't be surprised to see biofriendly physical laws: they, after all, produced us, so what else would we see? Some theists, however, argue it points to the existence of a God creating favourable conditions.

But God isn't a valid scientific explanation. The theory of the multiverse, instead, solves the mystery because it allows different universes to have different physical laws. So it's not surprising that we should happen to see ourselves in one of the few universes that could support life. Of course, you can't disprove the idea that a God may have created the multiverse.

This is all very hypothetical, and one of the biggest criticisms of theories of the multiverse is that because there seem to have been no interactions between our universe and other universes, then the notion of the multiverse cannot be directly tested.

Now let's consider whether God can be in more than one place at the same time. Much of the science and technology we use in space science is based on the counter-intuitive theory of the tiny world of atoms and particles known as quantum mechanics.

The theory enables something called quantum entanglement: spookily connected particles. If two particles are entangled, manipulating one automatically manipulates its partner, even if they are very far apart and without the two interacting. There are better descriptions of entanglement than the one I give here, but this is simple enough that I can follow it.

Imagine a particle that decays into two sub-particles, A and B. The properties of the sub-particles must add up to the properties of the original particle; this is the principle of conservation. For example, all particles have a quantum property called spin: roughly, they move as if they were tiny compass needles. If the original particle has a spin of zero, one of the two sub-particles must have a positive spin and the other a negative spin, which means that each of A and B has a 50% chance of having a positive or a negative spin. (According to quantum mechanics, particles are by definition in a mix of different states until you actually measure them.)

The properties of A and B are not independent of each other; they are entangled, even if located in separate laboratories on separate planets. So suppose you measure the spin of A and find it to be positive. Imagine a friend measured the spin of B at exactly the same time that you measured A. In order for the principle of conservation to work, she must find the spin of B to be negative.

But, and this is where things become murky, like sub-particle A, B had a 50:50 chance of being positive, so its spin state became negative at the time that the spin state of A was measured as positive. In other words, information about spin state was transferred between the two sub-particles instantly. Such transfer of quantum information apparently happens faster than the speed of light. Given that Einstein himself described quantum entanglement as "spooky action at a distance", I think all of us can be forgiven for finding this a rather bizarre effect.
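
A minimal simulation of the spin-conservation story above (a sketch, not the full quantum formalism): when both experimenters measure along the same axis, each outcome on its own is a 50:50 coin flip, yet the pair is always perfectly anticorrelated.

```python
import random

def measure_entangled_pair():
    # A's spin comes out random (+1 or -1); conservation forces B to be opposite
    # when measured along the same axis. (This classical sketch reproduces only
    # the same-axis correlations; genuine Bell tests compare different axes.)
    spin_a = random.choice([+1, -1])
    spin_b = -spin_a
    return spin_a, spin_b

results = [measure_entangled_pair() for _ in range(10_000)]
fraction_a_positive = sum(1 for a, _ in results if a == +1) / len(results)
always_opposite = all(a == -b for a, b in results)

print(fraction_a_positive)  # ~0.5: each side alone looks random
print(always_opposite)      # True: the pair is perfectly anticorrelated
```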

So there is something faster than the speed of light after all: quantum information. This doesn't prove or disprove God, but it can help us think of God in physical terms: maybe as a shower of entangled particles, transferring quantum information back and forth, and so occupying many places at the same time? Even many universes at the same time?

I have this image of God keeping galaxy-sized plates spinning while juggling planet-sized balls, tossing bits of information from one teetering universe to another, to keep everything in motion. Fortunately, God can multitask, keeping the fabric of space and time in operation. All that is required is a little faith.

Has this essay come close to answering the questions posed? I suspect not: if you believe in God (as I do), then the idea of God being bound by the laws of physics is nonsense, because God can do everything, even travel faster than light. If you don't believe in God, then the question is equally nonsensical, because there isn't a God and nothing can travel faster than light. Perhaps the question is really one for agnostics, who don't know whether there is a God.

This is indeed where science and religion differ. Science requires proof, religious belief requires faith. Scientists don't try to prove or disprove God's existence because they know there isn't an experiment that can ever detect God. And if you believe in God, it doesn't matter what scientists discover about the universe: any cosmos can be thought of as being consistent with God.

Our views of God, physics or anything else ultimately depend on perspective. But let's end with a quotation from a truly authoritative source. No, it isn't the Bible. Nor is it a cosmology textbook. It's from Reaper Man by Terry Pratchett:

"Light thinks it travels faster than anything but it is wrong. No matter how fast light travels, it finds the darkness has always got there first, and is waiting for it."

See the original post:

Can the laws of physics disprove God? - The Conversation UK

Posted in Quantum Physics | Comments Off on Can the laws of physics disprove God? – The Conversation UK

A New Measurement of Quantum Space-Time Has Found Nothing Going On – ScienceAlert

Posted: at 2:19 pm

In the very smallest measured units of space and time in the Universe, not a lot is going on. In a new search for quantum fluctuations of space-time on Planck scales, physicists have found that everything is smooth.

This means that - for now at least - we still can't find a way to resolve general relativity with quantum mechanics.

It's one of the most vexing problems in our understanding of the Universe.

General relativity is the theory of gravitation that describes gravitational interactions in the large-scale physical Universe. It can be used to make predictions about the Universe; general relativity predicted gravitational waves, for instance, and some behaviours of black holes.

Space-time under relativity follows what we call the principle of locality - that is, objects are only directly influenced by their immediate surroundings in space and time.

In the quantum realm - atomic and subatomic scales - general relativity breaks down, and quantum mechanics takes over. Nothing in the quantum realm happens at a specific place or time until it is measured, and parts of a quantum system separated by space or time can still interact with each other, a phenomenon known as nonlocality.

Somehow, in spite of their differences, general relativity and quantum mechanics exist and interact. But so far, resolving the differences between the two has proven extremely difficult.

This is where the Holometer at Fermilab comes into play - a project headed by astronomer and physicist Craig Hogan from the University of Chicago. This is an instrument designed to detect quantum fluctuations of space-time at the smallest possible units - a Planck length, 10^-33 centimetres, and a Planck time, how long it takes light to travel a Planck length.
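
For scale, the Planck length and Planck time quoted here follow directly from the fundamental constants; a quick sketch using standard values:

```python
import math

hbar = 1.055e-34  # reduced Planck constant, J*s
G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8       # speed of light, m/s

planck_length_m = math.sqrt(hbar * G / c**3)
planck_time_s = planck_length_m / c

print(planck_length_m)  # ~1.6e-35 m, i.e. ~1.6e-33 cm
print(planck_time_s)    # ~5.4e-44 s
```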

It consists of two identical 40-metre (131-foot) interferometers that intersect at a beam splitter. A laser is fired at the splitter and sent down two arms to two mirrors, to be reflected back to the beam splitter to recombine. Any Planck-scale fluctuations will mean the beam that returns is different from the beam that was emitted.

A few years ago, the Holometer made a null detection of back-and-forth quantum jitters in space-time. This suggested that space-time itself, as we can currently measure it, is not quantised; that is, it cannot be broken down into discrete, indivisible units, or quanta.

Because the interferometer arms were straight, it could not detect other kinds of fluctuating motion, such as if the fluctuations were rotational. And this could matter a great deal.

"In general relativity, rotating matter drags space-time along with it. In the presence of a rotating mass, the local nonrotating frame, as measured by a gyroscope, rotates relative to the distant Universe, as measured by distant stars," Hogan wrote on the Fermilab website.

"It could well be that quantum space-time has a Planck-scale uncertainty of the local frame, which would lead to random rotational fluctuations or twists that we would not have detected in our first experiment, and much too small to detect in any normal gyroscope."

So, the team redesigned the instrument. They added additional mirrors so that they would be able to detect any rotational quantum motion. The result was an incredibly sensitive gyroscope that can detect Planck-scale rotational twists that change direction a million times per second.

In five observing runs between April 2017 and August 2019, the team collected 1,098 hours of dual interferometer time series data. In all that time, there was not a single jiggle. As far as we know, space-time is still a continuum.

But that doesn't mean the Holometer, as has been suggested by some scientists, is a waste of time. There's no other instrument like it in the world. The results it returns - null or not - will shape future efforts to probe the intersection of relativity and quantum mechanics at Planck scales.

"We may never understand how quantum space-time works without some measurement to guide theory," Hogan said. "The Holometer program is exploratory. Our experiment started with only rough theories to guide its design, and we still do not have a unique way to interpret our null results, since there is no rigorous theory of what we are looking for.

"Are the jitters just a bit smaller than we thought they might be, or do they have a symmetry that creates a pattern in space that we haven't measured? New technology will enable future experiments better than ours and possibly give us some clues to how space and time emerge from a deeper quantum system."

The research has been published on arXiv.

See more here:

A New Measurement of Quantum Space-Time Has Found Nothing Going On - ScienceAlert

Posted in Quantum Physics | Comments Off on A New Measurement of Quantum Space-Time Has Found Nothing Going On – ScienceAlert
