Category Archives: Quantum Physics

Lost in Reality: A Broad Perspective – Commentary Box Sports

Posted: July 12, 2021 at 7:54 am

Combining the entire history of physics with the philosophical musings it has inspired in a single book: Hans Plets proves it is possible.

You would expect a diverse book from someone who has studied physics, astronomy, philosophy, and business administration. But Hans Plets' success in capturing 2,500 years of developments in fewer than three hundred pages is still astonishing.

In Lost in Reality, Plets describes the entire history of physics, from the ancient Greeks to the present, with a great deal of philosophy woven into it. He tells how scientists over the ages have approached basic questions such as "What is everything made of?" and "Can we understand our world?"

The author briskly discusses all the breakthroughs in our thinking about nature. With the advent of quantum mechanics, the theory of relativity and the laws of symmetry, the material becomes increasingly demanding, but thanks to the clear summaries the explanation remains easy to follow.

The common thread in the book is that we as human beings play an ever smaller role in the cosmic whole. How should we deal with our insignificance? With this question, the two allies that drifted apart somewhat in recent centuries are reunited. Plets mentions, among other things, physicist Sean Carroll's "poetic naturalism" as a solution: it's okay to give reality your own touch, as long as it doesn't conflict with science.

Throughout the book, Plets certainly succeeds in providing the broad perspective he advocates in the introduction. The flip side of the coin is that current problems are only discussed at the end, so expert readers will not learn much that is new. But the book does give an overview, which is always helpful for anyone who feels lost.

A Quantum Critique of the Western Worldview – Fair Observer

Posted: July 7, 2021 at 3:11 pm

Most people recognize that they have something that can be called a worldview, even though few would attempt to define it. Anyone's worldview contains all the perceptions and shared ideas that allow us to assume we have a stable idea of how our physical environment supports us. It is an important part of everyone's culture. But a worldview is more comprehensive than what we call culture, which tends to be more of a group view than a worldview. Our worldview and culture always find a way of living together, but they remain distinct.

Most of the main features of our worldview escape our critical attention. Because everyone seems to agree, we simply assume they are real. But what if they are not? Peter Evans, in the title of an article in The Conversation, asks the question, "Is reality a game of quantum mirrors?" He also suggests that science has actually come to that conclusion.

In his book Helgoland, the Italian physicist Carlo Rovelli explains that our inherited assumptions about the world may have misled an entire civilization. In today's consumer culture, which focuses on objects and the value we attribute to them, this could have radical implications. Evans clarifies the terms of the debate: "Expecting objects to have their own independent existence (independent of us, and any other objects) is actually a deep-seated assumption we make about the world."

Evans agrees with Rovelli that the object-oriented worldview we have inherited from Europe's four-century-old scientific and industrial revolutions needs reassessment. He supports Rovelli's thesis that quantum theory, the physical theory that describes the universe at the smallest scales, almost certainly shows this worldview to be false. Instead, Rovelli argues we should adopt a relational worldview.

Today's Daily Devil's Dictionary definition:

Relational worldview:

A mindset which recognizes that things have no identity or measurable value without other things, as opposed to an object-oriented mindset that isolates all things to put a price tag on them

Individuals have a worldview, but so do civilizations. Most individuals' worldviews share a preponderant number of common features with their civilizational worldview. But every individual's worldview varies in some respects from society's worldview. This is especially true in the modern civilizations that value individualism and have come to dominate in the West.

The features that diverge from the dominant worldview give us a sense of our own identity, either as a unique individual or even as a member of a group. We tend to respect commonly shared assumptions and beliefs while at the same time needing to affirm the subtle features that make our worldview distinct. Most people nevertheless remain unaware of what distinguishes their civilization's worldview from other real and possible worldviews. In that sense, worldviews can become tyrannical.

Rovelli believes that the fundamental understanding scientists have acquired about the undeniable laws of quantum mechanics indicates that one of the key features of our inherited and shared worldview, our belief in the reality of objects, is mistaken. As Evans points out, "He claims the objects of quantum theory, such as a photon, electron, or other fundamental particle, are nothing more than the properties they exhibit when interacting with (in relation to) other objects." In other words, without the interaction, nothing can be said to exist.

The corollary of this is that there is no underlying individual substance that has the properties. This contradicts our current worldview that has imposed the idea that we live in a world of substances, each of which has properties. It turns out that properties are the only things that are real. They combine to create the illusion of substance.

Rovelli proposes moving away from the culture that divides the world into an indefinite number of substances to which we attribute properties and replacing it with the understanding that the form of anything we perceive is entirely the effect of a relationship. As Evans expresses it, according to the relational interpretation, the state of any system is always in relation to some other system. In this reading of reality, nothing can be understood or even exist outside of its context. It also means that contexts have context. These are two radically different ways of looking at the world.

There is an interesting parallel with the research of contemporary anthropologists who have focused on the comparative analysis of cultures. Just as Rovelli opposes an object-oriented worldview to a relation-oriented worldview, the anthropologists contrast relationship-oriented cultures and task-oriented cultures. US culture is usually cited as an extreme example of a task-oriented culture. More than in any other culture, and in stark contrast to most Asian, African and Latin American cultures, Americans view social interaction as a pretext for accomplishing something. They always have an object to achieve. Relationships are acknowledged primarily as contexts that permit transactions or the transfer of property rather than as the invisible but powerfully organized foundation of social reality.

Reviewing Rovelli's book for The Guardian, Manjit Kumar sums up Rovelli's thesis in these terms: "The world that we observe is continuously interacting; it is better understood as a web of interactions and relations rather than objects." Most cultures in human history have made similar assumptions about society itself, emphasizing the reality of relationships.

Today's Americanized worldview serves the useful purpose of justifying capitalistic notions of ownership and attribution of value, an industrial organization that treats people who produce and consume as objects, and the consumer society itself, made up of people purchasing objects. It came into being over the past four centuries in Western Europe and North America. It now dominates what education and the media have conditioned us to think about the world. Contemporary physics and cross-cultural anthropology stand as exceptions that embrace relationships as the foundation of everything. Philosophy, economics and political science clearly have not caught up.

The interesting parallel between the findings of quantum physics and cross-cultural anthropology reveals something about what would be implied if the paradigm shift Rovelli recommends were to occur. It would imply our civilization's taking on board the new understanding of physical reality and applying the lessons to social organization itself.

The task-oriented, fundamentally utilitarian culture that emerged in the English-speaking world during the industrial revolution became massively dominant across the globe in the 20th century. Its global victory can be attributed to the emergence of mass media and the expanding wave of soft power that accompanied the hard power of the US military and the almighty dollar. Today, all our institutions reflect an implicit political worldview that sees the nation-state as the only legally recognized source of social legitimacy.

These institutions, and their rules and practices, require thinking of the human environments as enclosed zones that contain a finite collection of substances (economic resources) endowed with properties, one of those properties being commercial value. It has inevitably produced the widely accepted, though often disparaged, truism: "Everything has a price."

Changing an entire population's worldview would be a monumental task, but it has occurred in the past. Rovelli is an optimist. His optimism stems from his belief, which is widely shared in the modern Western worldview, that science has a mission of communicating the truth about the world we live in and that, once that truth is unambiguously established, the logical thing to do would be to modify the civilization's worldview. But Rovelli's thesis contradicts the culture that embraces it, which means that resistance will be fierce. We now tend to think of truth itself as a quantifiable object, as when we say things like, "How much truth is anyone willing to accept?"

In the 16th and 17th centuries in Europe, a radical shift took place in the dominant worldview. It led to the scientific and industrial revolutions. We traditionally represent it as a shift from superstition to science. But it was also a shift from a relationship-based worldview to a mechanical, object-oriented one. Feudalism itself was a system of interacting relationships, as Thomas Piketty insisted in his book Capital and Ideology. Now science has led us to a point at which the scientific revolution may appear as a denial of the true understanding of science. In his book, Rovelli reveals that he himself looks beyond the various Western worldviews to Eastern philosophy, that of the second-century Indian philosopher Nagarjuna, for inspiration.

*[In the age of Oscar Wilde and Mark Twain, another American wit, the journalist Ambrose Bierce, produced a series of satirical definitions of commonly used terms, throwing light on their hidden meanings in real discourse. Bierce eventually collected and published them as a book, The Devil's Dictionary, in 1911. We have shamelessly appropriated his title in the interest of continuing his wholesome pedagogical effort to enlighten generations of readers of the news. Read more of The Daily Devil's Dictionary on Fair Observer.]

The views expressed in this article are the author's own and do not necessarily reflect Fair Observer's editorial policy.

Quantum Key Distribution: Is it as secure as claimed and what can it offer the enterprise? – The Register

Posted: at 3:11 pm

Feature Do the laws of physics trump mathematical complexity, or is Quantum Key Distribution (QKD) nothing more than 21st-century enterprise encryption snake oil? The number of QKD news headlines that have included unhackable, uncrackable or unbreakable could certainly lead you towards the former conclusion.

However, we at The Reg are unrelenting sceptics, for our sins, and take all such claims with a bulk-buy bag of Saxa. What this correspondent is not, however, is a physicist or a mathematician, let alone a quantum cryptography expert. Thankfully, I know several people who are, so I asked them the difficult questions. Here's how those conversations went.

I can tell you what QKD isn't, and that's quantum cryptography. Instead, as the name suggests, it's just the part that deals with the exchange of encryption keys.

As defined by the creators of the first quantum key distribution (QKD) protocol (Bennett and Brassard, 1984), it is a method to solve the problem of distributing secret keys between a distant Alice and Bob so that cryptography can work. The way QKD solves this problem is by using quantum communication. "It relies on the fact that any attempt of an adversary to wiretap the communication would, by the laws of quantum mechanics, inevitably introduce disturbances which can be detected."

Quantum security expert, mathematician and security researcher Dr Mark Carney explains there "are a few fundamental requirements for QKD to work between Alice (A) and Bob (B), these being a quantum key exchange protocol to guarantee the key exchange has a level of security, a quantum and classical channel between A and B, and the relevant hardware and control software for A and B to enact the protocol we started with."

If you are the diagrammatical type, there's a nifty if nerdy explanatory one here.

It's kind of a given that, in and of themselves, quantum key exchange protocols are very secure. As Dr Carney says, most are derived from either BB84 (said QKD protocol of Bennett and Brassard, 1984) or E91 (Ekert, 1991), and sometimes a mixture of the two.

"They've had a lot of scrutiny, but they are generally considered to be solid protocols," Dr Carney says, "and when you see people claiming that 'quantum key exchange is totally secure and unhackable' there are a few things that are meant: that the key length is good (at least 256 bits), the protocol can detect someone eavesdropping on the quantum channel and the entropy of the system gives unpredictable keys, and the use of quantum states to encode these means they are tamper-evident."

So, if the protocol is accepted as secure, where do the snake oil claims enter the equation? According to Dr Carney, it's in the implementation where things start to get very sticky.

"We all know that hardware, firmware, and software have bugs even the most well researched, well assessed, widely hacked pieces of tech such as the smartphone regularly has bug updates, security fixes, and emergency patches. Bug-free code is hard, and it shouldn't be considered that the control systems for QKD are any different," Carney insists.

In other words, it's all well and good having a perfected quantum protocol, but if someone can do memory analysis on A or B's systems, then your "super secure" key can get pwned. "It's monumentally naive in my view that the companies producing QKD tech don't take this head on," Dr Carney concludes. "Hiding behind 'magic quantum woo-woo security' is only going to go so far before people start realising."

Professor Rob Young, director of the Quantum Technology Centre at Lancaster University, agrees that there is a gap between an ideal QKD implementation and a real system, as putting the theory into practice isn't easy without making compromises.

"When you generate the states to send from the transmitter," he explains, "errors are made, and detecting them at the receiver efficiently is challenging. Security proofs typically rely on a long list of often unmet assumptions in the real world."

Then there are the hardware limitations, with most commercially implemented QKD systems using a discrete-state protocol sending single photons down low-loss fibres. "Photons can travel a surprising distance before being absorbed, but it means that the data exchange rate falls off exponentially with distance," Young says.

"Nodes in networks need to be trusted currently, as we can't practically relay or switch quantum channels without trusting the nodes. Solutions to these problems are in development, but they could be years away from commercial implementation."

This lack of quantum repeaters is a red flag, according to Duncan Jones, head of Quantum Cybersecurity at Cambridge Quantum, who warns that "trusted repeaters" are not the same thing. "In most cases this simply means a trusted box which reads the key material from one fibre cable and re-transmits it down another. This is not a quantum-safe approach and negates the security benefits of QKD."

Then there's the motorway junction conundrum. Over to Andersen Cheng, CEO at Post-Quantum, to explain. Cheng points to problems such as QKD only telling you that a person-in-the-middle attack has happened, with photons disturbed because of the interception, but not where that attack is taking place or how many attacks are happening.

"If someone is going to put a tap along your 150km high-grade clear fibre-optic cable, how are you going to locate and weed out those taps quickly?" Cheng asks.

What if an attacker locates your cable grid and cuts a cable off? Where is the contingency for redundancy to ensure no disruption? This is where the motorway junction conundrum comes in.

"QKD is like two junctions of a motorway," Cheng explains. "You know car accidents are happening because the road surface is being attacked, but you do not know how many accidents have happened or where or who the culprit is, so you cannot go and kick the offenders out and patch up the road surface."

This all comes to the fore when Cheng insists: "QKD connections can be blocked using a DDoS attack as simple as using a pneumatic drill in the vicinity of the cable."

Sally Epstein, head of Strategic Technology at Cambridge Consultants, throws a couple of pertinent questions into the "ask any QKD vendor" ring.

"1. Supply chain: There is a much greater potential for well-funded bad actors to get into the supply chain. How do they manage their supply chain security?

"2. Human fallibility: There are almost certainly exploitable weaknesses in the control software, optical sub-assemblies, electronic, firmware, etc. What penetration testing has the supplier conducted in terms of software and hardware?"

Professor Young thinks that QKD currently offers little return on investment for your average enterprise. "QKD can distribute keys with provable security metrics, but current systems are expensive, slow and difficult to implement," he says.

As has already been pointed out, security proofs are generally based on ideal cases without taking the actual physical implementation into account. This, Young says, "troubles the central premise of using QKD in the first place."

However, he doesn't think that the limitations are fundamental and sees an exciting future for the technology.

Because QKD technology is still maturing, and keys can only be sent across relatively short distances using dedicated fibre-optic cables, Jones argues that "only the biggest enterprises and telcos should be spending any money on researching this technology today."

Not least, he says, because the problems QKD solves are equally well addressed through different means. "Quantum-safe cryptography, coupled with verifiable quantum key generation, is an excellent approach to the same problem and works perfectly today," Jones concludes.

Professor Andrew Lord, head of Optical Network Research at BT, has a less pessimistic outlook.

"Our trial with NCC in Bristol illustrates a client with a need to transmit data which should remain secure for many years into the future," Lord told The Reg. "QKD is attractive here because it provides security against the 'tap now, decrypt later' risk, where data could be stored and decrypted when a quantum computer becomes available."

The UK's National Cyber Security Centre (NCSC) has gone on the record to state it does not endorse the use of QKD for any government or military application, and the National Security Agency (NSA) in the US has reached the same conclusion.

Jones of Cambridge Quantum says he completely agrees with the NCSC/NSA perspectives because the "first generation of quantum security technologies has failed to deliver tangible benefits for commercial or government applications."

Young goes further: "Both NCSC and NSA echo the views of all serious cryptographers with regards to QKD, and I am in complete agreement with them."

So what needs to change to make QKD solutions relevant to enterprises in the real world? Lord admits that the specialised hardware requirements of QKD do mean it won't be the best solution for all use cases, but foresees "photonic-chip based QKD ultimately bringing the price down to a point where it can be integrated into standard optical transmission equipment."

Dr Carney adds: "In closing, all this leaves us with the biggest misunderstanding about QKD vs classical key exchange; in classical key exchange the mathematics that makes Elliptic Curve Diffie-Hellman Ephemeral (ECDHE) or your favourite Post-Quantum Cryptography (PQC) key exchange secure is distinct and independent of the physical channel (the classical channel) that is being used for the protocol.

"On a QKD system, the mathematics is in some way intrinsically, and necessarily, linked to the actual physicality of the system. This situation is unavoidable, and we would do well to design for and around it."

French researchers on the verge of quantum computing milestone – RFI English

Posted: at 3:11 pm

Issued on: 05/07/2021 - 15:05

Researchers at the French Alternative Energies and Atomic Energy Commission (CEA) in Grenoble are confident of reaching a key milestone at the end of this year in their quest to build a quantum computer.

Maud Vinet and Silvano De Franceschi from the CEA, along with Tristan Meunier of CNRS, are leading a multidisciplinary team of around 50 scientists and engineers to build a silicon-based quantum machine, the first critical step of which would be to operate a network of two qubits in the coming months.

How quantum computing works

Qubits are the units of information in quantum computing. They are the quantum equivalent of bits. Unlike classical computing, where bits exist as either 0 or 1, in quantum systems they can exist in both values at the same time. This property is called superposition.

The other key quantum property is called entanglement. It refers to the almost instantaneous effect two qubits have on each other, even at a distance, after having been initially coupled. Entanglement and superposition give quantum computers their phenomenal calculating power.
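As a toy illustration of those two properties (our own sketch, unrelated to the CEA's software stack), a qubit can be written as a two-component vector of amplitudes, and an entangled pair as a four-component vector over the two-qubit basis:

```python
import numpy as np

# Single-qubit basis states |0> and |1> as amplitude vectors.
zero = np.array([1, 0], dtype=complex)
one = np.array([0, 1], dtype=complex)

# Superposition: an equal mix of |0> and |1>.
plus = (zero + one) / np.sqrt(2)
print(np.abs(plus) ** 2)   # -> [0.5 0.5]: either outcome with 50% probability

# Entanglement: the two-qubit Bell state (|00> + |11>) / sqrt(2).
bell = (np.kron(zero, zero) + np.kron(one, one)) / np.sqrt(2)
print(np.abs(bell) ** 2)   # -> 0.5 for |00> and |11>, 0 for |01> and |10>
# Measuring either qubit immediately fixes the outcome of the other,
# which is the long-distance correlation described above.
```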

But keeping qubits entangled is a big challenge. It is subject to interference from the environment. Any disturbances, whether thermal, electrical or mechanical, can cause errors, De Franceschi says.

One way to limit the errors caused by these factors is to operate the qubits in a deep freeze mode.

When qubits are cooled down to sufficiently low temperature, typically below a few degrees Kelvin, they are no longer susceptible to undesirable thermal excitations and their coherence can be preserved, Vinet says.

Though the system used to cool the qubits works on a similar principle to that of a household refrigerator, it is much bigger and far more complex.

The CEA has several cryostats that use helium to achieve temperatures between 15 millikelvin and 1 kelvin.

That corresponds to roughly 272 degrees Celsius below water's freezing point. Besides these cryostats, the CEA also boasts a cryogenic prober that can carry out automatic measurements on 300 mm silicon wafers below 2 kelvin, or minus 271 degrees Celsius. There are only two such machines in the world.

The French approach

There are four major approaches to fabricating qubits: photons, trapped ions, superconductors and semiconductors such as silicon.

Vinet and De Franceschi have adopted the last approach which involves the use of the magnetic moment of an electron in silicon to create the two different states of the qubit. They have chosen silicon even though it seems to be lagging behind the others in terms of the number of interacting qubits in a network.

The other three approaches seem to have made more progress. But we are sticking with silicon. That's because building workable quantum computers is not a short-term race. It doesn't matter where you stand today. What matters more is the growth potential for the future, De Franceschi told RFI.

According to Vinet, in order to build a practical quantum computer, scalability will be the key. In this regard, there's no better candidate than silicon, which is central to the semiconductor industry. With silicon we can fabricate millions or even billions of qubits that can be assembled in a relatively compact system. It's also convenient for control electronics.

Moreover, according to De Franceschi, when it comes to performance, the silicon qubits are on par with the other platforms in terms of fidelity and speed of operations. De Franceschi contends that some of the other approaches may be appropriate but they may not be equally suitable when it comes to effective, massive and easy manufacturing.

You need to consider how good you can scale up and handle the controlling of qubits once the processor size grows. There are other problems such as possible interference when you are manipulating qubits. The successful approach will be the one that copes the best with all these issues, he says.

Researchers at CEA have a unique advantage as both the physics and the engineering requirements necessary to build a quantum computer are available under the same roof.

While De Franceschi and his team are engaged in perfecting the fabrication and interactions between qubits, Vinet and his group are working in parallel to make qubits truly scalable and to build the other components of a quantum computer.

What we are trying to do here is build a full stack quantum computer. We are developing the quantum chip, control electronics, implementation of the quantum algorithm as well as an interface that translates the algorithm into electrical signals, Vinet says.

Quantum appeal

Quantum computers have elicited huge interest not just from research labs but also from IT giants, start-ups and governments. In January 2021, French President Emmanuel Macron announced a 1.8 billion euro "Quantum Plan" initiative to support research and development of quantum technologies.

The enormous appeal of quantum computing lies in its promise to easily outperform even the world's most powerful supercomputers on certain types of calculations.

They are expected to solve complex problems such as simulating proteins, calculating air flow over aircraft, and finding new materials, possibly including room-temperature superconductors, Vinet says, adding that researchers still don't know how powerful these machines will turn out to be.

Quantum computing to reduce operational costs for oil and gas companies – Oil Review Africa

Posted: June 28, 2021 at 9:54 pm

Quantum computing is a very specialised field requiring niche expertise, which is not readily available within oil and gas companies.

Although classical computers are capable of delivering efficiency gains, quantum computers and their optimisation algorithms could deliver these gains in a much shorter time.

Quantum computers are machines that use the properties of quantum physics to store data and perform computations. Theoretically, these machines can complete a task in seconds that would take classical computers thousands of years. The company (or government) that owns the first at-scale quantum computer will be powerful indeed.

According to GlobalData's latest report, Quantum Computing in Oil & Gas, full-fledged commercial quantum computers are not expected to be ready for approximately another 20 years. However, intermediate versions would be available within the next five to seven years, offering a quantum advantage over classical computers in optimisation applications across several sectors, including space warfare, logistics, drug discovery, and options trading.

Ravindra Puranik, oil and gas analyst at GlobalData, commented, "Oil majors ExxonMobil, Total, Shell and BP are among the few industry participants to venture into quantum computing. Although these companies intend to use the technology to solve diverse business problems, quantum chemistry is emerging as the common focus area of research in the initial phase."

Puranik further added, "These majors are seeking to develop advanced materials for carbon capture technologies. This could potentially lower the operational costs of carbon capture and storage (CCS) projects, enabling companies to deploy them on a wider scale to curb operational emissions."

According to Puranik, IBM is at the forefront in providing quantum computing tools to a host of industries, including oil and gas. The company has brought on board leading oil and gas and chemical companies, such as ExxonMobil, BP, Woodside, Mitsubishi Chemical and JSR, to facilitate the advancement of quantum computing via cross-domain research. Besides IBM, oil and gas companies have also collaborated with other quantum computing experts, including D-Wave, Microsoft, and Atos.

Is reality a game of quantum mirrors? A new theory suggests it might be – The Conversation AU

Posted: at 9:39 pm

Imagine you sit down and pick up your favourite book. You look at the image on the front cover, run your fingers across the smooth book sleeve, and smell that familiar book smell as you flick through the pages. To you, the book is made up of a range of sensory appearances.

But you also expect the book has its own independent existence behind those appearances. So when you put the book down on the coffee table and walk into the kitchen, or leave your house to go to work, you expect the book still looks, feels, and smells just as it did when you were holding it.

Expecting objects to have their own independent existence (independent of us, and any other objects) is actually a deep-seated assumption we make about the world. This assumption has its origin in the scientific revolution of the 17th century, and is part of what we call the mechanistic worldview. According to this view, the world is like a giant clockwork machine whose parts are governed by set laws of motion.

This view of the world is responsible for much of our scientific advancement since the 17th century. But as Italian physicist Carlo Rovelli argues in his new book Helgoland, quantum theory, the physical theory that describes the universe at the smallest scales, almost certainly shows this worldview to be false. Instead, Rovelli argues we should adopt a relational worldview.

During the scientific revolution, the English physics pioneer Isaac Newton and his German counterpart Gottfried Leibniz disagreed on the nature of space and time.

Newton claimed space and time acted like a container for the contents of the universe. That is, if we could remove the contents of the universe (all the planets, stars, and galaxies), we would be left with empty space and time. This is the absolute view of space and time.

Leibniz, on the other hand, claimed that space and time were nothing more than the sum total of distances and durations between all the objects and events of the world. If we removed the contents of the universe, we would remove space and time also. This is the relational view of space and time: they are only the spatial and temporal relations between objects and events. The relational view of space and time was a key inspiration for Einstein when he developed general relativity.

Rovelli makes use of this idea to understand quantum mechanics. He claims the objects of quantum theory, such as a photon, electron, or other fundamental particle, are nothing more than the properties they exhibit when interacting with (in relation to) other objects.

These properties of a quantum object are determined through experiment, and include things like the object's position, momentum, and energy. Together they make up an object's state.

According to Rovelli's relational interpretation, these properties are all there is to the object: there is no underlying individual substance that has the properties.

Consider the well-known quantum puzzle of Schrödinger's cat. We put a cat in a box with some lethal agent (like a vial of poison gas) triggered by a quantum process (like the decay of a radioactive atom), and we close the lid.

The quantum process is a chance event. There is no way to predict it, but we can describe it in a way that tells us the different chances of the atom decaying or not in some period of time. Because the decay will trigger the opening of the vial of poison gas and hence the death of the cat, the cats life or death is also a purely chance event.

According to orthodox quantum theory, the cat is neither dead nor alive until we open the box and observe the system. A puzzle remains concerning what it would be like for the cat, exactly, to be neither dead nor alive.

But according to the relational interpretation, the state of any system is always in relation to some other system. So the quantum process in the box might have an indefinite outcome in relation to us, but have a definite outcome for the cat.

So it is perfectly reasonable for the cat to be neither dead nor alive for us, and at the same time to be definitely dead or alive itself. One fact of the matter is real for us, and one fact of the matter is real for the cat. When we open the box, the state of the cat becomes definite for us, but the cat was never in an indefinite state for itself.

In the relational interpretation there is no global, God's-eye view of reality.

Rovelli argues that, since our world is ultimately quantum, we should heed these lessons. In particular, objects such as your favourite book may only have their properties in relation to other objects, including you.

Thankfully, that also includes all other objects, such as your coffee table. So when you do go to work, your favourite book continues to appear as it did when you were holding it. Even so, this is a dramatic rethinking of the nature of reality.

On this view, the world is an intricate web of interrelations, such that objects no longer have their own individual existence independent from other objects, like an endless game of quantum mirrors. Moreover, there may well be no independent metaphysical substance constituting our reality that underlies this web.

As Rovelli puts it:

We are nothing but images of images. Reality, including ourselves, is nothing but a thin and fragile veil, beyond which there is nothing.

Quantum Theory: A Scientific Revolution that Changed Physics Forever – Interesting Engineering

Posted: at 9:39 pm

To many, quantum physics, or quantum mechanics, may seem an obscure subject, with little application for everyday life, but its principles and laws form the basis for explanations of how matter and light work on the atomic and subatomic scale. If you want to understand how electrons move through a computer chip, how photons of light travel in a solar panel or amplify themselves in a laser, or even why the sun keeps burning, you will need to use quantum mechanics.

Quantum mechanics is the branch of physics relating to the elementary components of nature; it is the study of the interactions that take place between subatomic particles and forces. Quantum mechanics was developed because many of the equations of classical mechanics, which describe interactions at larger sizes and speeds, cease to be useful or predictive when trying to explain the forces of nature that work on the atomic scale.

Quantum mechanics, and the math that underlies it, is not based on a single theory, but on a series of theories inspired by new experimental results, theoretical insights, and mathematical methods which were elucidated beginning in the first half of the 20th century, and together create a theoretical system whose predictive power has made it one of the most successful scientific models created.

The story of quantum mechanics can be said to begin in 1859, a full 38 years before the discovery of the electron. Many physicists were concerned with a puzzling phenomenon: no matter what an object is made of, if it can survive being heated to a given temperature, the spectrum of light it emits is exactly the same as for any other substance.

In 1859, physicist Gustav Kirchhoff proposed a solution when he demonstrated that the energy emitted by a blackbody object depends on the temperature and the frequency of the emitted energy, i.e.

E = J(T, ν)

A blackbody is a perfect emitter - an idealized object that absorbs all the energy that falls on it (because it reflects no light, it would appear black to an observer). Kirchhoff challenged physicists to find the function J, which would allow the energy emitted by light to be described for all wavelengths.

In the years following, a number of physicists would work on this problem. One of these was Heinrich Rubens, who worked to measure the energy of black-body radiation. In 1900, Rubens visited fellow physicist Max Planck and explained his results to him. Within a few hours of Rubens leaving Planck's house, Planck had come up with an answer to Kirchhoff's function which fitted the experimental evidence.

Planck sought to use the equation to explain the distribution of colors emitted over the spectrum in the glow of red-hot and white-hot objects. However, when doing this, Planck realized the equation implied that only combinations of certain colors were emitted, and in integer multiples of a small constant (which became known as Planck's constant) times the frequency of the light.

This was unexpected because, at the time, light was believed to act as a wave, which meant that the values of color emitted should be a continuous spectrum. However, Planck realized that his solution gave different values at different wavelengths.
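The function Planck arrived at can be evaluated numerically in a few lines. The following sketch (our own illustration of the standard formula, not code from the period) computes the blackbody spectral radiance and shows the emission peak shifting to shorter wavelengths as the temperature rises, which is why a white-hot object glows differently from a red-hot one.

```python
import numpy as np

h = 6.626e-34   # Planck's constant, J*s
c = 2.998e8     # speed of light, m/s
k = 1.381e-23   # Boltzmann's constant, J/K

def planck_radiance(wavelength_m, temp_k):
    """Blackbody spectral radiance B(lambda, T) = 2hc^2/lambda^5 / (exp(hc/(lambda*k*T)) - 1)."""
    a = 2.0 * h * c**2 / wavelength_m**5
    return a / (np.exp(h * c / (wavelength_m * k * temp_k)) - 1.0)

wavelengths = np.linspace(100e-9, 3000e-9, 2000)   # 100 nm to 3000 nm
for T in (3000, 5800):                              # red-hot metal vs. roughly sun-like
    peak = wavelengths[np.argmax(planck_radiance(wavelengths, T))]
    print(f"T = {T} K: emission peaks near {peak * 1e9:.0f} nm")
# Expected: roughly 970 nm at 3000 K and 500 nm at 5800 K.
```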

In order to explain how atoms were being prevented from producing certain colors, Planck made a novel assumption - that atoms absorb and emit energy in the form of indistinguishable energy units - what came to be called quanta.

At the time, Planck regarded quantization as a mathematical trick to make his theory work. However, a few years later, physicists proved that classical electromagnetism could never account for the observed spectrum. These proofs helped to convince physicists that Planck's notion of quantized energy levels may in fact be more than a mathematical "trick".

One of the proofs was given by Einstein, who published a paper in 1905 in which he envisioned light traveling not as a wave, but as a packet of "energy quanta" which could be absorbed or generated when an atom "jumps" between quantized vibration rates. In this model, the quanta contained the energy difference of the jump; when divided by Planck's constant, that energy difference determined the wavelength of light given off by those quanta.

In 1913 Niels Bohr applied Planck's hypothesis of quantization to Ernest Rutherford's 1911 "planetary" model of the atom. This model, which came to be called the Rutherford-Bohr model, postulated that electrons orbited the nucleus in a similar way to how planets orbit the sun. Bohr proposed that electrons could only orbit at certain distances from the nucleus, and could "jump" between the orbits; doing so would give off energy at certain wavelengths of light, which could be observed as spectral lines.
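That quantitative link between jumps and spectral lines can be checked directly. The sketch below (an illustration using the textbook Bohr energy levels for hydrogen, E_n = -13.6 eV / n^2, not a calculation from the article) takes the energy of a jump and converts it to a wavelength via lambda = hc / E, recovering the visible Balmer lines.

```python
h = 6.626e-34    # Planck's constant, J*s
c = 2.998e8      # speed of light, m/s
eV = 1.602e-19   # joules per electronvolt

def bohr_level_eV(n):
    """Energy of the n-th Bohr orbit of hydrogen, in electronvolts."""
    return -13.6 / n**2

def emitted_wavelength_nm(n_high, n_low):
    """Wavelength of the photon given off when an electron drops between orbits."""
    delta_E = (bohr_level_eV(n_high) - bohr_level_eV(n_low)) * eV   # energy of the jump, J
    return h * c / delta_E * 1e9                                    # lambda = h*c / E, in nm

# The first Balmer lines (jumps down to n = 2) land in the visible spectrum.
for n in (3, 4, 5):
    print(f"n = {n} -> 2: {emitted_wavelength_nm(n, 2):.0f} nm")
# Expected: about 656 nm (red), 486 nm (blue-green) and 434 nm (violet),
# matching hydrogen's observed spectral lines.
```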

It now appeared that light could act as a wave and as a particle. But what about matter?

In 1924, French physicist Louis de Broglie used the equations of Einstein's theory of special relativity to show that particles can exhibit wave-like characteristics, and vice-versa.

German physicist Werner Heisenberg met with Niels Bohr at the University of Copenhagen in 1925, and after this meeting, he applied de Broglie's reasoning to understand the spectrum intensity of an electron. At the same time, Austrian physicist Erwin Schrödinger, working independently, also used de Broglie's reasoning to explain how electrons moved around in atoms. The following year, Schrödinger demonstrated that the two approaches were equivalent.

In 1927, Heisenberg reasoned that if matter can act as a wave, there must be a limit to how precisely we can know some properties, such as an electron's position and speed. In what would later be called "Heisenberg's uncertainty principle," he reasoned that the more precisely an electron's position is known, the less precisely its speed can be known, and vice versa. This proved an important piece of the quantum puzzle.
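A quick back-of-the-envelope calculation (our own, using the standard form of the principle, delta_x * delta_p >= hbar / 2) shows how strong the effect is at atomic scales:

```python
hbar = 1.055e-34   # reduced Planck's constant, J*s
m_e = 9.109e-31    # electron mass, kg

# Pin an electron's position down to roughly the size of an atom (0.1 nm)...
delta_x = 1e-10
# ...and the uncertainty principle forces a minimum spread in its momentum,
# and hence in its speed.
delta_p = hbar / (2 * delta_x)
print(f"minimum speed uncertainty: {delta_p / m_e:.1e} m/s")   # about 5.8e5 m/s
# The more tightly the position is confined, the larger this spread becomes.
```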

In the Heisenberg-Schrödinger quantum mechanical model of the atom, each electron acts as a wave (or "cloud") around the nucleus of the atom, and only the speed or the position of an electron can be measured, each to a particular probability. This model replaced the Rutherford-Bohr model.

All these revelations regarding quantum theory revolutionized the world of physics and revealed important details about universal actions at atomic and subatomic levels.

Combining quantum mechanics with other phenomena in physics, such as relativity, gravitation and electromagnetism, has also increased our understanding of the physical world and of how construction and destruction occur within it.

For their exceptional contributions, Planck, Einstein, Bohr, Heisenberg, and Schrdinger were awarded the Nobel Prize in Physics in 1918, 1921, 1922, 1932, and 1933 respectively.

While it may seem as though quantum mechanics progressed in a fairly straightforward series of theoretical leaps, in reality, there was a lot of disagreement among physicists over its relevance.

These disagreements reached a peak at the 1927 Solvay Conference in Brussels, where 29 of the world's most brilliant scientists gathered to discuss the many seemingly contradictory observations in quantum theory that could not be reconciled. One major point of contention had to do with the theory that, until they are observed, the location and speed of entities such as electrons can only exist as a "probability".

Bohr, in particular, emphasized that quantum predictions founded on probability are able to accurately describe physical actions in the real world. In what later came to be called the Copenhagen interpretation, he proposed that while wave equations described the probability of where entities like electrons could be found, these entities didn't actually exist as particles unless they were observed. In Bohr's words, they had no "independent reality" in the ordinary physical sense.

He described how events that take place at the atomic level can alter the outcome of a quantum interaction. According to Bohr, a system behaves as a wave or a particle depending on context, but you cannot predict what it will do.

Einstein, in contrast, argued that an electron was an electron even if no one was looking at it, and that particles like electrons had an independent reality, prompting his famous claim that "God does not play dice with the universe."

Einstein and Bohr would debate their views until Einstein's death three decades later, but remained colleagues and good friends.

Einstein argued that the Copenhagen interpretation was incomplete. He theorized that there might be hidden variables or processes underlying quantum phenomena.

In 1935, Einstein, along with fellow physicists Boris Podolsky and Nathan Rosen, published a paper on what would become known as the Einstein-Podolsky-Rosen (EPR) paradox. The EPR paradox described in the paper again raised doubts about quantum theory.

The EPR paper considered predetermined values of a particle's momentum and velocity and suggested that the description of physical reality provided by the wave function in quantum theory is incomplete, and that, therefore, physical reality cannot be derived from the wave function or within the context of quantum-mechanical theory.

The same year, Bohr replied to the claims made by Einstein. In his response, published in the Physical Review, Bohr argued that the predetermined values of the second particle's velocity and momentum, as assumed in the EPR paradox, were incorrect. He also argued that the paradox failed to demonstrate any inability of quantum mechanics to explain physical reality.

The understanding of elementary particles and their behavior helped to create groundbreaking innovations in healthcare, communication, electronics, and various other fields. Moreover, there are numerous modern technologies that operate on the principles mentioned in quantum physics.

Laser-based equipment

Laser technology involves equipment that emits light by means of a process called optical amplification. Laser equipment works on the principle of photon emission, releasing light with a well-defined wavelength in a very narrow beam. Hence, laser beams function in alignment with principles (such as the photoelectric effect) described by quantum mechanics.

A report published in 2009 revealed that extreme ultraviolet lasers hitting a metal surface can cause electrons to move out of their atoms; this outcome is said to extend Einstein's photoelectric effect to the context of super-intense lasers.

Electronic Devices and Machines

From flash memory storage devices like USB drives to complex lab equipment such as electron microscopes, an understanding of quantum mechanics led to countless modern-day inventions. Light-emitting diodes, electric switches, transistors, quantum computers, etc., are examples of highly useful devices that resulted from the advent of quantum physics.

Consider the example of the Magnetic Resonance Imaging (MRI) machine, a piece of medical equipment that is very useful in diagnosing conditions of the brain and other body organs. MRI works on the principles of electromagnetism: a strong magnetic field acts on the spin of the protons in hydrogen atoms to analyze the composition of different tissues.

The magnetic field aligns the protons in the body according to their spin; the protons absorb energy and then re-emit it (a quantum process), and the MRI scanner uses the emitted energy signals received from all the water molecules to deliver a detailed image of the internal body parts.
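The energy the protons absorb and re-emit sits at a very specific resonance, the Larmor frequency, which scales with the magnetic field strength. A minimal sketch (our own illustration, using the standard proton gyromagnetic ratio of about 42.58 MHz per tesla) shows the frequencies a scanner must transmit and listen at:

```python
# Protons in a magnetic field precess at the Larmor frequency f = gamma_bar * B,
# and the scanner exchanges energy with them at exactly that frequency.
GAMMA_BAR_PROTON = 42.577   # MHz per tesla (proton gyromagnetic ratio / 2*pi)

def larmor_mhz(field_tesla):
    return GAMMA_BAR_PROTON * field_tesla

for b in (1.5, 3.0):        # common clinical field strengths
    print(f"{b} T scanner -> resonance near {larmor_mhz(b):.1f} MHz")
# -> about 63.9 MHz at 1.5 T and 127.7 MHz at 3 T
```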

X-Rays

Used in medical diagnosis, border inspection, industrial tomography, cancer treatment, and for many other purposes, X-rays are a form of electromagnetic radiation. While the discovery of X-rays predates quantum mechanics, quantum mechanical theory has allowed the use of X-rays in a practical way.

A beam of X-rays can be regarded as consisting of a stream of quanta. These quanta are projected out from the target of the X-ray tube, and, on penetrating tissue, there is an effect produced that is proportional to the number of the quanta multiplied by the energy carried by each quantum.

The emitted electrons also emit photons, which are able to penetrate matter and form its image on the X-ray screen. Therefore, the elementary particles described by quantum mechanics interact with X-ray energy to deliver a view inside an object.

Fluorescence-based Applications

Fluorescence refers to the emission of light that takes place when an electron, raised to a higher quantum state by UV exposure, falls back and emits a photon; fluorescent lamps and spectrometers work on the basis of this quantum behavior. Various minerals such as aragonite, calcite, and fluorite are also known to exhibit fluorescence.

Fluorescence is also used to light up synthetic gems and diamonds; jewelry manufacturers use this phenomenon to create artificial imitation stones that look brighter and more beautiful than the naturally occurring original stones.

Apart from these applications, quantum mechanics has contributed to our understanding of many areas of technology, biological systems, and cosmic forces and bodies. While there are several important questions remaining in quantum physics, the core concepts, which define the behavior of energy, particles, and matter, have continued to hold constant.

NIST’s Quantum Security Protocols Near the Finish Line The U.S. standards and technology authority is searching – IoT World Today

Posted: at 9:39 pm

The U.S. standards and technology authority is searching for a new encryption method to prevent the Internet of Things succumbing to quantum-enabled hackers

As quantum computing moves from academic circles to practical uses, it is expected to become the conduit for cybersecurity breaches.

The National Institute of Standards and Technology aims to nip these malicious attacks preemptively. Its new cybersecurity protocols would help shield networks from quantum computing hacks.

National Institute of Standards and Technology (NIST) has consulted with cryptography thought leaders on hardware and software options to migrate existing technologies to post-quantum encryption.

The consultation forms part of a wider national contest, which is due to report back with its preliminary shortlist later this year.

IT pros can download and evaluate the options through the open source repository at NIST's Computer Security Resource Center.

"[The message] is to educate the market but also to try to get people to start playing around with [quantum computers] and understanding it because, if you wait until it's a Y2K problem, then it's too late," said Chris Sciacca, IBM's communications manager for research in Europe, Middle East, Africa, Asia and South America. "So the message here is to start adopting some of these schemes."

Businesses need to know how to contend with quantum decryption, which could potentially jeopardize many Internet of Things (IoT) endpoints.

Quantum threatens society because IoT, in effect, binds our digital and physical worlds together. Worryingly, some experts believe hackers could already be recording scrambled IoT transmissions, to be ready when quantum decryption arrives.

Current protocols such as Transport Layer Security (TLS) will be difficult to upgrade, as they are often baked into the device's circuitry or firmware.

Estimates for when a quantum computer capable of running Shor's algorithm might arrive vary. An optimist in the field would say it may take 10 to 15 years. But then it could be another Y2K scenario, whose predicted problems never came to pass.

But it's still worth getting the enterprise's IoT network ready, to be on the safe side.

"Broadly speaking, all asymmetric encryption that's in common use today will be susceptible to a future quantum computer with adequate quantum volume," said Christopher Sherman, a senior analyst at Forrester Research. "Anything that uses prime factorization or discrete log to create separate encryption and decryption keys, those will all be vulnerable to a quantum computer potentially within the next 15 years."

Why Do We Need Quantum Security?

Quantum computers would answer queries existing technologies cannot resolve, by applying quantum mechanics to compute various combinations of data simultaneously.

As the quantum computing field remains largely in the prototyping phase, current models largely perform only narrow scientific or computational objectives.

All asymmetric cryptography systems, however, could one day be overridden by a quantum mechanical algorithm known as Shor's algorithm.

That's because the decryption ciphers rely on mathematical complexities such as factorization, which Shor's could hypothetically unravel in no time.

"In quantum physics, what you can do is construct a parameter that cancels some of the probabilities out," explained Luca De Feo, a researcher at IBM who is involved with the NIST quantum-security effort. "Shor's algorithm is such an apparatus. It makes many quantum particles interact in such a way that the probabilities of the things you are not interested in will cancel out."
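The quantum step of Shor's algorithm finds the period of modular exponentiation; everything after that is classical arithmetic. The toy sketch below (our own illustration, with brute-force period finding standing in for the quantum part) shows why knowing that period is enough to recover the factors an RSA-style key depends on.

```python
from math import gcd

def find_period(a, N):
    """Classical stand-in for the quantum step: the smallest r > 0 with a**r % N == 1."""
    x, r = a % N, 1
    while x != 1:
        x, r = (x * a) % N, r + 1
    return r

def shor_factor(N, a):
    if gcd(a, N) != 1:
        return gcd(a, N)            # the random guess already shares a factor
    r = find_period(a, N)
    if r % 2 == 1:
        return None                 # odd period: pick another a and retry
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None                 # trivial square root: retry with another a
    return gcd(y - 1, N)            # a non-trivial factor of N

print(shor_factor(15, 7))   # -> 3 (the period of 7 mod 15 is 4, and gcd(7**2 - 1, 15) = 3)
```

On a classical machine the period-finding loop is what blows up for key-sized numbers; a quantum computer with enough error-corrected qubits would do that step efficiently, which is the whole threat.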

Will Quantum Decryption Spell Disaster For IoT?

Businesses must have safeguards against quantum decryption, which threatens IoT endpoints secured by asymmetric encryption.

A symmetric encryption technique, the Advanced Encryption Standard (AES), is believed to be immune to Shor's algorithm attacks, but is considered computationally expensive for resource-constrained IoT devices.
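For reference, this is what that symmetric alternative looks like in a few lines, sketched with the widely available Python cryptography package (our own illustration, not an NIST or vendor reference implementation; the sensor reading and device ID are made up). Whether it is too heavy for a particular constrained device depends on the hardware, as the article notes.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Authenticated symmetric encryption with a 256-bit AES key. Against quantum
# attackers the usual hedge is simply a long key, since the known quantum
# speed-ups for symmetric ciphers are far milder than Shor's attack on RSA/ECC.
key = AESGCM.generate_key(bit_length=256)
aesgcm = AESGCM(key)

nonce = os.urandom(12)   # 96-bit nonce; must never be reused with the same key
reading = b'{"sensor": "valve-7", "temp_c": 81.4}'
ciphertext = aesgcm.encrypt(nonce, reading, b"device-id:valve-7")

plaintext = aesgcm.decrypt(nonce, ciphertext, b"device-id:valve-7")
assert plaintext == reading
```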

For businesses looking to quantum-secure IoT in specific verticals, there's a risk assessment model published by the University of Waterloo's quantum technology specialist Dr Michele Mosca. The model is designed to predict the risk and outline timescales for preparing a response, depending on the kind of organization involved.

As well as integrating a new quantum security standard, there's also a need for mechanisms to make legacy systems quantum-secure. Not only can encryption be broken, but there's also potential for quantum forgeries of digital identities, in sectors such as banking.

"I see a lot of banks now asking about quantum security, and definitely governments," Sherman said. "They are not just focused on replacing RSA, which includes HTTPS and TLS, but also elliptic curve cryptography (ECC), for example in blockchain-based systems. ECC-powered digital signatures will need to be replaced as well."

One option, which NIST is considering, is to blend post-quantum security at network level with standard ciphers on legacy nodes. The latter could then be phased out over time.

"A hybrid approach published by NIST [offers] guidance around using the old protocols that satisfy regulatory requirements at a security level that's been certified for a given purpose," Sherman said, "but then having an encapsulation technique that puts a crypto technique on top of that. It wraps up into that overall encryption scheme, so that in the future you can drop one that's vulnerable and just keep the post-quantum encryption."

Governments Must Defend Against Quantum Hacks

For national governments, it's becoming an all-out quantum arms race. And the U.S. may well be losing. Russia and China have both already unveiled initial post-quantum security options, Sherman said.

"They finished their competitions over the past couple of years. I wouldn't be surprised if the NIST standard also becomes something that Europe uses," he added.

The threats against IoT devices have only grown more pronounced with current trends.

More virtual health and connected devices deployed in COVID-19, for example, will mean more medical practices are now quantum-vulnerable.

According to analyst firm Omdia, there are three major fault lines in defending the IoT ecosystem: endpoint security, network security and public cloud security. With 46 billion things currently in operation globally, IoT already provides an enlarged attack surface for cybercriminals.

"The challenge is protecting any IoT device that's using secure communications or symmetric protocols," said Sherman. "Considering that by 2025 there's over a trillion IoT devices expected to be deployed, that's obviously quite large in terms of potential exposure. Wherever RSA or TLS is being used with IoT, there's a threat."

Weighing Up Post-Quantum And Quantum Cryptography Methods

Post-quantum cryptography differs from methods such as quantum key distribution (QKD), which use quantum mechanics to secure technology against the coming threat.

QKD is already installed on some government and research communications lines, and hypothetically it's impenetrable.

But the average business needs technology that can be implemented quickly and affordably. And, as we don't even know how a quantum decryption device would work in practice, it's unrealistic to transfer QKD onto every IoT network.
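
For context, QKD protocols such as BB84 derive their security from the fact that measuring a quantum state disturbs it. The toy simulation below covers only the key-sifting step, with no channel noise and no eavesdropper, just to show the mechanics; real deployments require dedicated optical hardware, which is exactly why rolling QKD out to every IoT network is unrealistic.

```python
import numpy as np

rng = np.random.default_rng(seed=1)
n = 1000

alice_bits = rng.integers(0, 2, n)     # Alice's raw random bits
alice_bases = rng.integers(0, 2, n)    # 0 = rectilinear, 1 = diagonal basis
bob_bases = rng.integers(0, 2, n)      # Bob picks his bases independently

# If Bob measures in the same basis Alice used, he recovers her bit;
# otherwise quantum mechanics gives him a uniformly random outcome.
bob_bits = np.where(bob_bases == alice_bases, alice_bits, rng.integers(0, 2, n))

# Sifting: over a public channel they compare bases (not bits) and keep matches.
keep = alice_bases == bob_bases
sifted_alice, sifted_bob = alice_bits[keep], bob_bits[keep]

print(f"kept {keep.sum()} of {n} bits; keys agree: "
      f"{np.array_equal(sifted_alice, sifted_bob)}")
```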

One of the main post-quantum cryptography candidates in the frame is lattice-based cryptography, an approach that is thought to be resistant to Shor's algorithm.

While lattice-based schemes are still based on mathematical problems and could be endangered by future quantum decryption algorithms, they might buy scientists enough time to come up with other economically viable techniques.
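
The hard problem underneath most lattice schemes is learning with errors (LWE). The Regev-style single-bit toy below is only a sketch of the idea, far simpler than the NIST candidates: the secret is masked by small random errors that the legitimate key holder can round away but an attacker cannot separate from the signal.

```python
import numpy as np

rng = np.random.default_rng(seed=42)
n, m, q = 16, 32, 3329          # toy parameters; real schemes use much larger n

# Key generation: secret s, public key (A, b = A*s + e mod q) with small errors e.
s = rng.integers(0, q, n)
A = rng.integers(0, q, (m, n))
e = rng.integers(-1, 2, m)                      # errors drawn from {-1, 0, 1}
b = (A @ s + e) % q

def encrypt(bit):
    """Encrypt one bit by summing a random subset of public-key rows."""
    r = rng.integers(0, 2, m)                   # random 0/1 row selector
    u = (r @ A) % q
    v = (r @ b + bit * (q // 2)) % q
    return u, v

def decrypt(u, v):
    """Recover the bit by removing u*s and rounding away the small error."""
    d = (v - u @ s) % q
    return int(min(d, q - d) > q // 4)          # near q/2 -> 1, near 0 -> 0

for bit in (0, 1):
    u, v = encrypt(bit)
    assert decrypt(u, v) == bit
print("toy LWE round-trip OK")
```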

"Another advantage would be in IoT applications that need the point-to-point security channel, such as connected vehicles," De Fao said.

"Probably the lattice-based schemes are the best right now to run on IoT devices. Some efforts will be needed in the chip-design process to make these even easier to run," he added. "But we should probably start thinking about this right now, because it will probably take around five to seven years after the algorithms have been found for the chips to reach people's homes or industrial systems."

And then, potentially [if the optimistic estimates are right], quantum computers will have arrived.

Read this article:

NIST's Quantum Security Protocols Near the Finish Line The U.S. standards and technology authority is searching - IoT World Today

Posted in Quantum Physics | Comments Off on NIST’s Quantum Security Protocols Near the Finish Line The U.S. standards and technology authority is searching – IoT World Today

Science Should Not Try to Absorb Religion and Other Ways of Knowing – Scientific American

Posted: at 9:39 pm

An edgy biography of Stephen Hawking has me reminiscing about science's good old days. Or were they bad? I can't decide. I'm talking about the 1990s, when scientific hubris ran rampant. As journalist Charles Seife recalls in Hawking Hawking: The Selling of a Scientific Celebrity, Hawking and other physicists convinced us that they were on the verge of a theory of everything that would solve the riddle of existence. It would reveal why there is something rather than nothing, and why that something is the way it is.

In this column, I'll look at an equally ambitious and closely related claim: that science will absorb other ways of seeing the world, including the arts, humanities and religion. Nonscientific modes of knowledge won't necessarily vanish, but they will become consistent with science, our supreme source of truth. The most eloquent advocate of this perspective is biologist Edward Wilson, one of our greatest scientist-writers.

In his 1998 bestseller Consilience: The Unity of Knowledge, Wilson prophesies that science will soon yield such a compelling, complete theory of nature, including human nature, that the humanities, ranging from philosophy and history to moral reasoning, comparative religion, and interpretation of the arts, will draw closer to the sciences and partly fuse with them. Wilson calls this unification of knowledge consilience, an old-fashioned term for coming together or converging. Consilience will resolve our age-old identity crisis, helping us understand once and for all who we are and why we are here, as Wilson puts it.

Dismissing philosophers' warnings against deriving "ought" from "is," Wilson insists that we can deduce moral principles from science. Science can illuminate our moral impulses and emotions, such as our love for those who share our genes, as well as give us moral guidance. This linkage of science to ethics is crucial, because Wilson wants us to share his desire to preserve nature in all its wild variety, a goal that he views as an ethical imperative.

At first glance you might wonder: Who could possibly object to this vision? Wouldn't we all love to agree on a comprehensive worldview, consistent with science, that tells us how to behave individually and collectively? And in fact, many scholars share Wilson's hope for a merger of science with alternative ways of engaging with reality. Some enthusiasts have formed the Consilience Project, dedicated to developing a body of social theory and analysis that explains and seeks solutions to the unique challenges we face today. Last year, poet-novelist Clint Margrave wrote an eloquent defense of consilience for Quillette, noting that he has often drawn inspiration from science.

Another consilience booster is psychologist and megapundit Steven Pinker, who praised Wilson's excellent book in 1998 and calls for consilience between science and the humanities in his 2018 bestseller Enlightenment Now. The major difference between Wilson and Pinker is stylistic. Whereas Wilson holds out an olive branch to postmodern humanities scholars who challenge science's objectivity and authority, Pinker scolds them. Pinker accuses postmodernists of defiant obscurantism, self-refuting relativism and suffocating political correctness.

The enduring appeal of consilience makes it worth revisiting. Consilience raises two big questions: (1) Is it feasible? (2) Is it desirable? Feasibility first. As Wilson points out, physics has been an especially potent unifier, establishing over the past few centuries that the heavens and earth are made of the same stuff ruled by the same forces. Now physicists seek a single theory that fuses general relativity, which describes gravity, with quantum field theory, which accounts for electromagnetism and the nuclear forces. This is Hawking's "theory of everything" and Steven Weinberg's "final theory."

Writing in 1998, Wilson clearly expected physicists to find a theory of everything soon, but today they seem farther than ever from that goal. Worse, they still cannot agree on what quantum mechanics means. As science writer Philip Ball points out in his 2018 book Beyond Weird: Why Everything You Thought You Knew about Quantum Physics Is Different, there are more interpretations of quantum mechanics now than ever.

The same is true of scientific attempts to bridge the explanatory chasm between matter and mind. In the 1990s, it still seemed possible that researchers would discover how physical processes in the brain and other systems generate consciousness. Since then, mind-body studies have undergone a paradigm explosion, with theorists espousing a bewildering variety of models, involving quantum mechanics, information theory and Bayesian mathematics. Some researchers suggest that consciousness pervades all matter, a view called panpsychism; others insist that the so-called hard problem of consciousness is a pseudoproblem because consciousness is an illusion.

There are schisms even within Wilson's own field of evolutionary biology. In Consilience and elsewhere, Wilson suggests that natural selection promotes traits at the level of tribes and other groups; in this way, evolution might have bequeathed us a propensity for religion, war and other social behaviors. Other prominent Darwinians, notably Richard Dawkins and Robert Trivers, reject group selection, arguing that natural selection operates only at the level of individual organisms and even individual genes.

If scientists cannot achieve consilience even within specific fields, what hope is there for consilience between, say, quantum chromodynamics and queer theory? (Actually, in her fascinating 2007 book Meeting the Universe Halfway: Quantum Physics and the Entanglement of Matter and Meaning, physicist-philosopher Karen Barad finds resonances between physics and gender politics; but Barad's book represents the kind of postmodern analysis deplored by Wilson and Pinker.) If consilience entails convergence toward a consensus, science is moving away from consilience.

So, consilience doesn't look feasible, at least not at the moment. Next question: Is consilience desirable? Although I've always doubted whether it could happen, I once thought consilience should happen. If humanity can agree on a single, rational worldview, maybe we can do a better job solving our shared problems, like climate change, inequality, pandemics and militarism. We could also get rid of bad ideas, such as the notion that God likes some of us more than others; or that racial and sexual inequality and war are inevitable consequences of our biology.

I also saw theoretical diversity, or pluralism, as philosophers call it, as a symptom of failure; the abundance of solutions to the mind-body problem, like the abundance of treatments for cancer, means that none works very well. But increasingly, I see pluralism as a valuable, even necessary counterweight to our yearning for certitude. Pluralism is especially important when it comes to our ideas about who we are, can be and should be. If we settle on a single self-conception, we risk limiting our freedom to reinvent ourselves, to discover new ways to flourish.

Wilson acknowledges that consilience is a reductionistic enterprise, which will eliminate many ways of seeing the world. Consider how he treats mystical visions, in which we seem to glimpse truths normally hidden behind the surface of things. To my mind, these experiences rub our faces in the unutterable weirdness of existence, which transcends all our knowledge and forms of expression. As William James says in The Varieties of Religious Experience, mystical experiences should forbid a premature closing of our accounts with reality.

Wilson disagrees. He thinks mystical experiences are reducible to physiological processes. In Consilience, he focuses on Peruvian shaman-artist Pablo Amaringo, whose paintings depict fantastical, jungly visions induced by ayahuasca, a hallucinogenic tea (which I happen to have taken) brewed from two Amazonian plants. Wilson attributes the snakes that slither through Amaringo's paintings to natural selection, which instilled an adaptive fear of snakes in our ancestors; it should not be surprising that snakes populate many religious myths, such as the biblical story of Eden.

Moreover, ayahuasca contains psychotropic compounds, including the potent psychedelic dimethyltryptamine, like those that induce dreams, which stem from, in Wilson's words, "the editing of information in the memory banks of the brain" that occurs while we sleep. These nightly neural discharges are arbitrary in content, that is, meaningless; but the brain desperately tries to assemble them into coherent narratives, which we experience as dreams.

In this way, Wilson explains Amaringo's visions in terms of evolutionary biology, psychology and neurochemistry. This is a spectacular example of what Paul Feyerabend, my favorite philosopher and a fierce advocate for pluralism, calls "the tyranny of truth." Wilson imposes his materialistic, secular worldview on the shaman, and he strips ayahuasca visions of any genuine spiritual significance. While he exalts biological diversity, Wilson shows little respect for the diversity of human beliefs.

Wilson is a gracious, courtly man in person as well as on the page. But his consilience project stems from excessive faith in science, or scientism. (Both Wilson and Pinker embrace the term "scientism," and they no doubt think that the phrase "excessive faith in science" is oxymoronic.) Given the failure to achieve consilience within physics and biology, not to mention the replication crisis and other problems, scientists should stop indulging in fantasies about conquering all human culture and attaining something akin to omniscience. Scientists, in short, should be more humble.

Ironically, Wilson himself questioned the desirability of final knowledge early in his career. At the end of his 1975 masterpiece Sociobiology, Wilson anticipates the themes of Consilience, predicting that evolutionary theory plus genetics will soon absorb the social sciences and humanities. But Wilson doesn't exult at this prospect. When we can explain ourselves in mechanistic terms, he warns, the result might be hard to accept; we might find ourselves, as Camus put it, "divested of illusions."

Wilson needn't have worried. Scientific omniscience looks less likely than ever, and humans are far too diverse, creative and contrary to settle for a single worldview of any kind. Inspired by mysticism and the arts, as well as by science, we will keep arguing about who we are and reinventing ourselves forever. Is consilience a bad idea, which we'd be better off without? I wouldn't go that far. Like utopia, another byproduct of our yearning for perfection, consilience, the dream of total knowledge, can serve as a useful goad to the imagination, as long as we see it as an unreachable ideal. Let's just hope we never think we've reached it.

This is an opinion and analysis article; the views expressed by the author or authors are not necessarily those of Scientific American.

Further Reading:

The Delusion of Scientific Omniscience

The End of Science (updated 2015 edition)

Mind-Body Problems: Science, Subjectivity and Who We Really Are

I just talked about consilience with science journalist Philip Ball on my podcast Mind-Body Problems.

I brood over the limits of knowledge in my new book Pay Attention: Sex, Death, and Science.

Read the original here:

Science Should Not Try to Absorb Religion and Other Ways of Knowing - Scientific American

Posted in Quantum Physics | Comments Off on Science Should Not Try to Absorb Religion and Other Ways of Knowing – Scientific American

Lars Jaeger: Quantum Computers Have Reached the Mainstream – finews.asia

Posted: at 9:39 pm

The discussion about quantum computers has reached the mainstream, including investors. This is one of numerous examples of how technological development is happening much faster today than it did 50 years ago, Lars Jaeger writes on finews.first.

This article is published on finews.first, a forum for authors specialized in economic and financial topics.

A term that is becoming more and more popular, but still sounds like science fiction, is "quantum computer." Only 10 to 15 years ago, building such a computer seemed impossible within any reasonable time frame, even as a future technology.

Thus, the discussion about it was limited to a small circle of experts or served merely as material for science fiction. Just as "transistor effect" or "von Neumann processor" were not even remotely familiar terms to non-physicists in the 1940s, the same was true of "quantum computer" until recently.

The discussion about quantum computers has even reached the mainstream, including investors. And this could become one of the numerous examples showing that such technological development is happening much faster today than 50 years ago.

The quantum world offers even more

However, most people are still completely unaware of what a quantum computer actually is, as in principle all computers today are still entirely based on classical physics, on the so-called von Neumann architecture from the 1940s.

In it, the individual computing steps are processed sequentially, bit by bit. The smallest possible unit of information (a so-called binary digit, or bit for short) thereby always takes a well-defined state of either 1 or 0. In contrast, quantum computers use the properties of quantum systems that are not reducible to classical bits but are based on quantum bits, or qubits for short.

These can assume the different states of bits, i.e. 0 and 1 and all values in between, simultaneously. So they can be half 1 and half 0, as well as any other possible combination of the two. This possibility is beyond our classical (everyday) imagination, according to which a state is either one or the other, tertium non datur, but it is very typical of quantum systems. Physicists call such mixed quantum states superpositions.
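
In the usual state-vector picture, that "half 1 and half 0" is a pair of complex amplitudes, and measurement probabilities are their squared magnitudes. A minimal numpy sketch (no quantum SDK assumed) of the equal superposition produced by a Hadamard gate:

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)                         # classical bit 0
ket1 = np.array([0, 1], dtype=complex)                         # classical bit 1

# A qubit can hold any normalized mix of the two, e.g. the equal superposition
# produced by applying a Hadamard gate to |0>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
psi = H @ ket0                                                 # (|0> + |1>) / sqrt(2)
assert np.allclose(psi, (ket0 + ket1) / np.sqrt(2))

# Born rule: measurement yields 0 or 1 with probability |amplitude|^2.
print(np.abs(psi) ** 2)                                        # [0.5 0.5]
```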

Quantum computers are supposed to be the crowning achievement

But the quantum world offers even more: different quantum particles can be in so-called entangled states. This is another property that does not exist in our classical world. It is as if the qubits are coupled to each other with an invisible spring. They are then all in direct contact with each other, without any explicit acting force. Each quantum bit knows, so to say, over any distance what the others are doing. Such entanglement was the subject of a heated debate in early quantum physics. Albert Einstein, for example, considered entanglement to be physically impossible and derisively called it "spooky action at a distance."
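
The "invisible spring" can be made concrete with a two-qubit Bell state: each qubit on its own measures 0 or 1 at random, yet the two outcomes always agree. A small numpy sketch that samples joint outcomes from such a state, again only an illustration and not tied to any quantum hardware:

```python
import numpy as np

rng = np.random.default_rng(seed=7)

# Bell state (|00> + |11>) / sqrt(2) over the basis {|00>, |01>, |10>, |11>}.
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
probs = np.abs(bell) ** 2                       # joint outcome probabilities

outcomes = rng.choice(["00", "01", "10", "11"], size=10, p=probs)
print(outcomes)   # only "00" and "11" ever appear: the qubits always agree,
                  # even though each one individually looks like a fair coin flip
```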

In the meantime, however, this controversial quantum property is already being exploited in many technical applications. Quantum computers are supposed to be the crowning achievement here. They could open completely new, fantastic possibilities in at least five fields:

Some physicists even believe that a quantum computer could be used to calculate, and thus solve, any problem in nature: from the behavior of black holes, the development of the very early universe and the collisions of high-energy elementary particles to the phenomenon of superconductivity and the modeling of the 100 billion neurons in our brain and their roughly thousand-times-larger number of connections. Quantum computers could therefore represent a revolution in science as well as in the technology world.

Some even spoke of a Sputnik moment in information technology

Less than two years ago, Google announced that its engineers had succeeded in building a quantum computer that, for the first time, was able to solve a problem that no conventional computer could. The corresponding computer chip, Sycamore, needed just 200 seconds for a special computing task that would have taken the world's best supercomputer 10,000 years.

It had been Google itself that, some years earlier, had christened this ability of a quantum computer to outperform any existing classical computer at certain tasks "quantum supremacy." The moment of such quantum supremacy seemed finally to have come. Some even spoke of a "Sputnik moment" in information technology.

However, this was more a symbolic milestone, since the problem solved by Sycamore was still a very special and purely academic one. But there is no doubt that it represented a significant step forward (which, however, was also called into question in some cases: IBM even doubted the quantum nature of this computing machine).

Jiuzhang was also controversial as a quantum computer

Then, in December 2020, a team based mainly at the University of Science and Technology of China in Hefei communicated in the journal Science that a new quantum computer they had developed, which they had named Jiuzhang, was up to 10 billion times faster than Google's Sycamore.

That this news came from China was not quite as surprising as it might have been to those with little familiarity with today's Chinese science. Partly still seen as a developing country and thus technologically behind, China has meanwhile invested heavily in quantum computing and other quantum technologies, as well as artificial intelligence, genetic engineering, and a bunch of other cutting-edge technologies. Communist Party General Secretary Xi Jinping's government is spending $10 billion over several years on the country's National Laboratory for Quantum Information Sciences.

Jiuzhang, too, was controversial as a quantum computer. But if both Sycamore and Jiuzhang could indeed solve their (still very specific) problems incomparably fast with quantum technologies, and this can no longer be easily dismissed, there would already be two quantum computers that have achieved the desired quantum supremacy.

Just these days, there was another big-money announcement

From here, we could then expect numerous further versions quite soon, able to solve more and more problems faster and faster. A few weeks ago, Google announced that it wants to have built a powerful quantum computer that can be used on a very broad scale (no longer limited to exotic peripheral problems) by 2029. To this end, it wants to bring together one million physical qubits that work together in an error-correcting quantum computer (in today's quantum computers, this number still stands at fewer than 100 qubits).

In addition to Google and the Chinese research center in Hefei, there are countless other quantum computer development sites. And they are increasingly supported by governments. Germany, for example, announced in 2020 that the country would invest billions in quantum computing technology.

The new entity could become another global leader

And just these days, there was another big-money announcement: Cambridge Quantum Computing, a British company founded in 2014, announced that it will partner with the quantum solutions division of U.S. industrial giant Honeywell to build a new quantum computer. This deal brings together Honeywell's expertise in (quantum) hardware with that of Cambridge Quantum in software and algorithms.

The new entity could become another global leader (along with Google, IBM, and the Chinese) in developing quantum computers. Without the belief that initial breakthroughs in quantum computing have already been achieved, it is unlikely that so much money would be flowing into the industry already.

These sums are likely to multiply again as further progress is made. One might feel transported back to the early 1970s, before commercial computers existed. Only this time, everything will probably happen much faster.

Lars Jaeger is a Swiss-German author and investment manager. He writes on the history and philosophy of science and technology and has in the past been an author on hedge funds, quantitative investing, and risk management.


Read this article:

Lars Jaeger: Quantum Computers Have Reached the Mainstream - finews.asia

Posted in Quantum Physics | Comments Off on Lars Jaeger: Quantum Computers Have Reached the Mainstream – finews.asia
