The Prometheus League
Breaking News and Updates
Daily Archives: February 14, 2021
Emerging face of facial recognition technology – DTNEXT
Posted: February 14, 2021 at 2:00 pm
Chennai:
In the 1960s, Star Trek dazzled audiences by demonstrating retina scans and facial recognition as part of its digital security systems. RoboCop, a 1987 Hollywood movie, showed futuristic police checking digital facial recognition instead of a driver's licence. Steven Spielberg's Minority Report (2002) showed Tom Cruise, who is on the run, walking into a retail store. The retailer has technology that recognises each arriving shopper and instantly displays images of clothing suited to the taste and preferences of that customer.
Facial recognition is a relatively new technology that law enforcement agencies worldwide have started adopting to identify persons of interest. Face recognition identifies or verifies an individual by comparing and analysing patterns, shapes and proportions of their facial characteristics and contours. Police organisations are regularly utilising facial recognition to uncover probable crime suspects and witnesses by skimming through millions of photos.
Authorities are also exploiting this technology for surveillance at public venues like concerts and stadiums and to gain entry into specific properties. Most police departments are today considering face recognition to be an indispensable tool to solve the most heinous crimes, like terrorist attacks and violent assaults.
For instance, in New York City in 2019, a man followed a young woman home from work. He attempted to kidnap and rape her at knifepoint, hauling her into a grassy area before ultimately letting her go. Investigators employed facial recognition technology to compare pictures from surveillance video at a nearby food shop with a mugshot database. A little more investigative work enabled the police to identify a suspect and arrest him within 24 hours. The 27-year-old suspect, previously arrested for raping a 73-year-old woman, was out on bail when the offence was committed. In February this year, Delhi police used facial recognition technology to identify more than 1,500 rioters who had created communal unrest in the north-east region of Delhi.
Further, the use of facial recognition technology in India helped police find 3,000 missing children in four days. Scanning 45,000 children in New Delhi would have been an almost impossible chore using conventional methods. But facial recognition technology could sift through the data in a matter of hours, enabling thousands of children to be identified, matched to missing-person complaints and reunited with their families.
The Interpol Face Recognition System (IFRS) contains facial images from more than 179 countries, making it a distinctive transnational criminal database. Since 2016, more than 1,000 criminals, fugitives, persons of interest or missing persons have been identified using the Interpol system.
Back in India, there have been several FRT endeavours. Telangana police have created a facial recognition system that enables them to identify offenders by comparing suspects' faces with digital photographs in a central database called the Crime and Criminal Tracking Network and Systems (CCTNS). Chennai police used face recognition software called FaceTagr, developed by a Chennai-based company, to police the Deepavali shopping crowds. Similarly, Amritsar police use a face recognition system developed by a Gurugram AI company, Staqu Technologies, called the Punjab Artificial Intelligence System (PAIS), which helped them crack a murder case within 24 hours.
Staqu claims that PAIS can match images with a precision of 98 per cent if the database has five photos of the person. Elsewhere, the Surat police are using NEC's state-of-the-art NeoFace technology to solve crimes. In 2018, Andhra Pradesh launched e-Pragati, a searchable database of millions of people containing e-KYC Aadhaar numbers. Uttar Pradesh police in December 2018 piloted Trinetra, an AI-based application with face recognition capabilities and a database of five lakh criminals.
Unlike DNA technology, facial recognition is neither expensive nor time-consuming; once installed, the system requires little by way of overheads or running expenses. The relative ease of the process makes it easy to incorporate into daily work. Much of the fear about facial recognition technology exists because the public knows little about how police are using it and whether it has effectively reduced crime. The police departments that use facial recognition have not been forthright about how they use the technology. As long as police departments continue to use face recognition under this information void, the backlash against the technology will likely grow stronger, no matter the potential upsides. Unfortunately, most of the time, police have been found using the technology to solve routine crimes.
The technology can add value if used properly. At present, people do not have a good understanding of the technology; a little education could help in gaining acceptance. Civil liberties groups have been crying hoarse that facial recognition contributes to privacy erosion, bolsters bias against minorities and is susceptible to misuse. San Francisco, Boston and a prominent police body-camera manufacturer have barred the use of FRT by law enforcement. IBM too has backed away from its work in this area. The biggest fear is that the government might misuse the technology for surveillance.
The Boston Marathon bombings brought out the limitations of facial recognition technology. Facial recognition is less accurate on people of colour. Further, the error rate is higher for men than for women. CyberExtruder, a reputable company supplying facial recognition software to law enforcement agencies, has accepted that some skin colours give high error rates. Some facial recognition systems have an accuracy rate of 99.31 per cent on still frontal face images, but changes in lighting, face positioning, facial expressions, profile shots and the like diminish that accuracy. A big smile can render the system less effective.
Facial recognition being a powerful technology, the State should consider its use only for law enforcement and national security, and even then with adequate safeguards. Aadhaar holds iris and fingerprint biometric information; there appears to be a move to strengthen it with facial recognition. Once that is done, Aadhaar will amount to a total surveillance infrastructure. Using facial recognition technology in the absence of any data protection or data privacy law could result in misuse of the technology. There is no legal provision to stop its misuse in India; the Information Technology Act, 2000 has no provisions to deal with the abuse of this technology. Cybercriminals appear to be taking advantage of this situation by making such data available on the darknet.
Finally, recognising our spiritual nature, a kind of spiritual recognition technology, could go a long way towards addressing facial recognition concerns. If we treat faces as just another unit of data to be harvested by the global surveillance machine, something sacred and spiritually deep within us is transgressed. To counteract the consequent dystopia, recognising our spiritual nature and connecting to the divine blueprint of the soul could help us experience utopia even in a dystopian world.
The author is director, Directorate of Vigilance and Anti-Corruption (DVAC)
Posted in New Utopia
Quantum Mechanics, Free Will and the Game of Life – Scientific American
Posted: at 1:59 pm
Before I get to the serious stuff, a quick story about John Conway, a.k.a. the mathematical magician. I met him in 1993 in Princeton while working on The Death of Proof. When I poked my head into his office, Conway was sitting with his back to me, staring at a computer. Hair tumbled down his back, and his sagging pants exposed his ass-cleft. His office overflowed with books, journals, food wrappers and paper polyhedrons, many dangling from the ceiling. When I tentatively announced myself, he yelled without turning: "What's your birthday!" "Uh, June 23," I said. "Year!" Conway shouted. "Year!" "1953," I replied. After a split second he blurted out, "Tuesday!" He tapped his keyboard, stared at the screen and exulted, "Yes!" Finally facing me, Conway explained that he belongs to a group of people who calculate the day of the week of any date, past or present, as quickly as possible. He, Conway informed me with a manic grin, is one of the world's fastest day-of-the-week calculators.
This encounter came back to me recently as I read a wonderful New York Times tribute to Conway, felled by COVID-19 last year at the age of 82. The Times focuses on the enduring influence of the Game of Life, a cellular automaton invented by Conway more than a half century ago. Scientific American's legendary math columnist Martin Gardner introduced the Game of Life, sometimes just called Life, to the world in 1970 after receiving a letter about it from Conway. The Times' riff on Life got me thinking anew about old riddles. Like: Does free will exist?
Some background. A cellular automaton is a grid of cells whose states depend on the states of neighboring cells, as determined by preset rules. The Game of Life is a two-dimensional cellular automaton with square cells that can be in one of two states, alive or dead (often represented by black or white). A given cell's state depends on the states of its eight immediate neighbors. A live cell stays alive if two or three of its neighbors are alive; a dead cell comes to life if exactly three of its neighbors are alive; otherwise the cell dies or remains dead, presumably from loneliness or overcrowding. So simple! And yet Life, when the rules are applied over and over, ideally by a computer, yields endlessly varied patterns, including quasianimated clusters of cells known as longboats, gliders, spaceships and my favorite, Speed Demonoids.
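The whole game fits in a few lines of code. Here is a minimal sketch (the function and variable names are mine, not from the article) implementing the standard rules, in which each cell has eight neighbors, a dead cell is born with exactly three live neighbors, and a live cell survives with two or three. It checks the classic glider against its known behavior:

```python
from collections import Counter

def step(live):
    """Advance Conway's Game of Life one generation.

    `live` is a set of (x, y) coordinates of live cells; every other
    cell on the (unbounded) grid is dead.
    """
    # Count live neighbors for every cell adjacent to a live cell.
    counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # Birth on exactly 3 live neighbors; survival on 2 or 3.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# The classic glider translates diagonally one cell every four generations.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
state = glider
for _ in range(4):
    state = step(state)
shifted = {(x + 1, y + 1) for (x, y) in glider}  # glider moved by (1, 1)
```

Everything else in Life, spaceships and all, emerges from those two clauses of the rule; the patterns are never stored anywhere, only regenerated step by step.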
Like the Mandelbrot set, the famous fractal icon, the Game of Life inspired the fields of chaos and complexity, which are so similar that I lump them together under a single term: chaoplexity. Chaoplexologists assume that just as Lifes odd digital fauna and flora stem from straightforward rules, so do many real-world things. With the help of computer simulations, chaoplexologists hoped to discover the rules, or algorithms, underpinning stuff that has long resisted conventional scientific analysis, from immune systems and brains to stock markets and whole civilizations. (The big data movement has recycled the hope, and hype, of chaoplexology.)
Of course, the Game of Life can be interpreted in different ways. It resembles a digital, animated Rorschach test upon which scholars project their biases. For example, philosopher Daniel Dennett, commenting on Conway's invention in the Times, points out that Life's higher-order patterns emerge from processes that are "completely unmysterious and explicable.... No psionic fields, no morphic resonances, no élan vital, no dualism."
Dennett's comment annoyed me at first; Life just gives him an excuse to reiterate his defense of hard-core materialism. But Life, Dennett goes on to say, shows that deterministic rules can generate "complex adaptively appropriate structures capable of action and control." Yes! I thought, my own bias coming into play. Dennett clearly means that deterministic processes can spawn phenomena that transcend determinism, like minds with free will.
Then another thought occurred to me, inspired by my ongoing effort to understand quantum mechanics. Conventional cellular automata, including Life, are strictly local, in the sense that what happens in one cell depends on what happens in its neighboring cells. But quantum mechanics suggests that nature seethes with nonlocal "spooky actions." Remote, apparently disconnected things can be entangled, influencing each other in mysterious ways, as if via the filaments of ghostly, hyperdimensional cobwebs.
I wondered: Can cellular automata incorporate nonlocal entanglements? And if so, might these cellular automata provide even more support for free will than the Game of Life? Google gave me tentative answers. Yes, researchers have created many cellular automata that incorporate quantum effects, including nonlocality. There are even quantum versions of the Game of Life. But, predictably, experts disagree on whether nonlocal cellular automata bolster the case for free will.
One prominent explorer of quantum cellular automata, Nobel laureate Gerard 't Hooft, flatly rules out the possibility of free will. In his 2015 monograph The Cellular Automaton Interpretation of Quantum Mechanics, 't Hooft argues that some annoying features of quantum mechanics (notably its inability to specify precisely where an electron will be when we observe it) can be eliminated by reconfiguring the theory as a cellular automaton. 't Hooft's model assumes the existence of hidden variables underlying apparently random quantum behavior. His model leads him to a position called superdeterminism, which eliminates (as far as I can tell; 't Hooft's arguments aren't easy for me to follow) any hope for free will. Our fates are fixed from the big bang on.
Another authority on cellular automata, Stephen Wolfram, creator of Mathematica and other popular mathematical programs, proposes that free will is possible. In his 2002 opus A New Kind of Science, Wolfram argues that cellular automata can solve many scientific and philosophical puzzles, including free will. He notes that many cellular automata, including the Game of Life, display the property of computational irreducibility. That is, you cannot predict in advance what the cellular automata are going to do; you can only watch and see what happens. This unpredictability is compatible with free will, or so Wolfram suggests.
John Conway, Life's creator, also defended free will. In a 2009 paper, "The Strong Free Will Theorem," Conway and Simon Kochen argue that quantum mechanics, plus relativity, provide grounds for belief in free will. At the heart of their argument is a thought experiment in which physicists measure the spin of particles. According to Conway and Kochen, the physicists are free to measure the particles in dozens of ways, which are not dictated by the preceding state of the universe. Similarly, the particles' spin, as measured by the physicists, is not predetermined.
Their analysis leads Conway and Kochen to conclude that the physicists possess free will, and so do the particles they are measuring. "Our provocative ascription of free will to elementary particles is deliberate," Conway and Kochen write, "since our theorem asserts that if experimenters have a certain freedom, then particles have exactly the same kind of freedom." That last part, which ascribes free will to particles, threw me at first; it sounded too woo. Then I recalled that prominent scientists are advocating panpsychism, the idea that consciousness pervades all matter, not just brains. If we grant electrons consciousness, why not give them free will, too?
To be honest, I have a problem with all these treatments of free will, pro and con. They examine free will within the narrow, reductionistic framework of physics and mathematics, and they equate free will with randomness and unpredictability. My choices, at least important ones, are not random, and they are all too predictable, at least for those who know me.
For example, here I am arguing for free will once again. I do so not because physical processes in my brain compel me to do so. I defend free will because the idea of free will matters to me, and I want it to matter to others. I am committed to free will for philosophical, ethical and even political reasons. I believe, for example, that deterministic views of human nature make us more likely to accept sexism, racism and militarism. No physics model, not even the most complex, nonlocal cellular automaton, can capture my rational and, yes, emotional motives for believing in free will, but that doesn't mean these motives lack causal power.
Just as it cannot prove or disprove God's existence, science will never decisively confirm or deny free will. In fact, 't Hooft might be right. I might be just a mortal, 3-D, analog version of the Speed Demonoid, plodding from square to square, my thoughts and actions dictated by hidden, superdeterministic rules far beyond my ken. But I can't accept that grim worldview. Without free will, life lacks meaning, and hope. Especially in dark times, my faith in free will consoles me, and makes me feel less bullied by the deadly Game of Life.
Further Reading:
I obsess over free will and related riddles in my two most recent books: Pay Attention: Sex, Death, and Science, and Mind-Body Problems: Science, Subjectivity & Who We Really Are.
Posted in Quantum Physics
Quantum Theory Proposes That Cause and Effect Can Go In Loops – Universe Today
Posted: at 1:59 pm
Causality is one of those difficult scientific topics that can easily stray into the realm of philosophy. Science's relationship with the concept started out simply enough: an event causes another event later in time. That had been the standard understanding of the scientific community until quantum mechanics was introduced. Then, with the famous "spooky action at a distance" that is a side effect of quantum entanglement, scientists began to question that simple interpretation of causality.
Now, researchers at the Université Libre de Bruxelles (ULB) and the University of Oxford have come up with a theory that further challenges the standard view of causality as a linear progression from cause to effect. In their new theoretical framework, cause and effect can sometimes take place in cycles, with the effect actually causing the cause.
The quantum realm, as it is currently understood, is inherently messy. There is no true picture of things at that scale, which are better thought of as a set of mathematical probabilities than as actualities. These probabilities do not lend themselves well to the idea of a definite cause-and-effect relation between events, either.
The researchers further muddied the waters using a tool known as a unitary transformation. Simply put, a unitary transformation is a mathematical change of description used to simplify some of the math needed to understand complex quantum systems. Using it makes solving the famous Schrödinger equation achievable on real computers.
To give a more complete explanation requires delving a bit into the space in which quantum mechanics operates. In quantum mechanics, time is simply another dimension that must be accounted for, much like the usual three dimensions of linear space. Physicists usually use another mathematical tool, called a Hamiltonian, to solve Schrödinger's equation.
A Hamiltonian, though a mathematical object, is often time-dependent. It is also the part of the equation that changes when a unitary transformation is applied. With a suitable transformation it is possible to eliminate the time dependence of the Hamiltonian, so that, instead of requiring time to run in a given direction (i.e., for action and reaction to take place linearly), the model becomes more of a circle than a straight line, with action causing reaction and reaction causing action.
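As a sketch of the mechanism (this is the textbook transformation rule from standard quantum mechanics, not the specific construction of the ULB/Oxford paper): applying a unitary U(t) to the state also transforms the Hamiltonian, and the extra term the transformation generates can be chosen to cancel the original time dependence.

```latex
i\hbar\,\partial_t \psi = H(t)\,\psi
\quad\xrightarrow{\ \psi' \,=\, U(t)\,\psi\ }\quad
i\hbar\,\partial_t \psi' = H'(t)\,\psi',
\qquad
H'(t) = U H U^{\dagger} + i\hbar\,(\partial_t U)\,U^{\dagger}.
```

If U is chosen so that H' comes out time-independent, nothing in the transformed equation singles out a preferred direction of time.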
If this isn't all confusing enough, the model has some implications that are extremely difficult to conceive of (and to be clear, at a macro level it is just a model). One important facet is that this finding has little to no relevance to everyday cause and effect. The causes and effects that would be cyclical in this framework are "not local in spacetime," according to the press release from ULB, so they are unlikely to have any impact on day-to-day life.
Even if it doesn't have any everyday impact now, this framework could hint at a combined theory of quantum mechanics and general relativity, the most sought-after prize in physics for decades. If that synthesis is ever fully realized, there will be more implications for everyday life than just the existential question of whether we are actually in control of our own actions.
Learn More:
- EurekAlert: Quantum Causal Loops
- Nature Communications: Cyclic Quantum Causal Models
- Florida News Times: Quantum Causal Loop
- UT: The three-body problem shows us why we can't accurately calculate the past
Lead Image: Artist's depiction of quantum causal loops. Credit: ULB
Posted in Quantum Physics
The search for dark matter gets a speed boost from quantum technology – The Conversation US
Posted: at 1:59 pm
Nearly a century after dark matter was first proposed to explain the motion of galaxy clusters, physicists still have no idea what it's made of.
Researchers around the world have built dozens of detectors in hopes of discovering dark matter. As a graduate student, I helped design and operate one of these detectors, aptly named HAYSTAC. But despite decades of experimental effort, scientists have yet to identify the dark matter particle.
Now, the search for dark matter has received an unlikely assist from technology used in quantum computing research. In a new paper published in the journal Nature, my colleagues on the HAYSTAC team and I describe how we used a bit of quantum trickery to double the rate at which our detector can search for dark matter. Our result adds a much-needed speed boost to the hunt for this mysterious particle.
There is compelling evidence from astrophysics and cosmology that an unknown substance called dark matter constitutes more than 80% of the matter in the universe. Theoretical physicists have proposed dozens of new fundamental particles that could explain dark matter. But to determine which if any of these theories is correct, researchers need to build different detectors to test each one.
One prominent theory proposes that dark matter is made of as-yet hypothetical particles called axions that collectively behave like an invisible wave oscillating at a very specific frequency through the cosmos. Axion detectors including HAYSTAC work something like radio receivers, but instead of converting radio waves to sound waves, they aim to convert axion waves into electromagnetic waves. Specifically, axion detectors measure two quantities called electromagnetic field quadratures. These quadratures are two distinct kinds of oscillation in the electromagnetic wave that would be produced if axions exist.
The main challenge in the search for axions is that nobody knows the frequency of the hypothetical axion wave. Imagine you're in an unfamiliar city searching for a particular radio station by working your way through the FM band one frequency at a time. Axion hunters do much the same thing: They tune their detectors over a wide range of frequencies in discrete steps. Each step can cover only a very small range of possible axion frequencies. This small range is the bandwidth of the detector.
Tuning a radio typically involves pausing for a few seconds at each step to see if you've found the station you're looking for. That's harder if the signal is weak and there's a lot of static. An axion signal in even the most sensitive detectors would be extraordinarily faint compared with static from random electromagnetic fluctuations, which physicists call noise. The more noise there is, the longer the detector must sit at each tuning step to listen for an axion signal.
Unfortunately, researchers can't count on picking up the axion broadcast after a few dozen turns of the radio dial. An FM radio tunes only from 88 to 108 megahertz (one megahertz is one million hertz). The axion frequency, by contrast, may be anywhere between 300 hertz and 300 billion hertz. At the rate today's detectors are going, finding the axion or proving that it doesn't exist could take more than 10,000 years.
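A rough sketch of the arithmetic behind estimates like this. All the numbers below are hypothetical stand-ins, not HAYSTAC's actual bandwidth or integration time; the point is only that a huge frequency range divided by a tiny per-step bandwidth yields an enormous total scan time, and that widening the bandwidth shortens it proportionally.

```python
import math

# Hypothetical scan parameters (illustrative, not HAYSTAC's real numbers).
f_min = 300.0            # Hz: lower end of the possible axion frequency range
f_max = 300.0e9          # Hz: upper end (300 billion hertz)
fractional_bw = 1e-6     # assumed detector bandwidth as a fraction of frequency
seconds_per_step = 60.0  # assumed integration time at each tuning step

# With a constant fractional bandwidth, each step multiplies the tuning
# frequency by (1 + fractional_bw), so the steps are spaced logarithmically.
n_steps = math.log(f_max / f_min) / math.log(1.0 + fractional_bw)
years = n_steps * seconds_per_step / (3600 * 24 * 365)

# Doubling the bandwidth (what squeezing later achieves) halves the number
# of steps and therefore roughly halves the total scan time.
years_doubled_bw = years / 2
```

Even with these made-up numbers the scan takes tens of millions of steps; with realistic noise forcing much longer dwell times per step, the article's 10,000-year figure is the same effect at larger scale.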
On the HAYSTAC team, we don't have that kind of patience. So in 2012 we set out to speed up the axion search by doing everything possible to reduce noise. But by 2017 we found ourselves running up against a fundamental minimum noise limit because of a law of quantum physics known as the uncertainty principle.
The uncertainty principle states that it is impossible to know the exact values of certain physical quantities simultaneously; for instance, you can't know both the position and the momentum of a particle at the same time. Recall that axion detectors search for the axion by measuring two quadratures, those specific kinds of electromagnetic field oscillations. The uncertainty principle prohibits precise knowledge of both quadratures by adding a minimum amount of noise to the quadrature oscillations.
In conventional axion detectors, the quantum noise from the uncertainty principle obscures both quadratures equally. This noise can't be eliminated, but with the right tools it can be controlled. Our team worked out a way to shuffle around the quantum noise in the HAYSTAC detector, reducing its effect on one quadrature while increasing its effect on the other. This noise manipulation technique is called quantum squeezing.
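In symbols, this is the standard textbook picture of squeezing (conventions for the numerical bound vary; this is one common choice, not notation from our paper):

```latex
% Uncertainty relation for the two field quadratures:
\Delta X_1 \, \Delta X_2 \;\ge\; \tfrac{1}{4}
% Squeezing with parameter r > 0 rescales the fluctuations:
\Delta X_1 \;\to\; e^{-r}\,\Delta X_1,
\qquad
\Delta X_2 \;\to\; e^{+r}\,\Delta X_2
```

The noise shrinks in one quadrature at the cost of growing in the other; the product, and hence the uncertainty bound, is unchanged, which is why squeezing redistributes quantum noise rather than eliminating it.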
In an effort led by graduate students Kelly Backes and Dan Palken, the HAYSTAC team took on the challenge of implementing squeezing in our detector, using superconducting circuit technology borrowed from quantum computing research. General-purpose quantum computers remain a long way off, but our new paper shows that this squeezing technology can immediately speed up the search for dark matter.
Our team succeeded in squeezing the noise in the HAYSTAC detector. But how did we use this to speed up the axion search?
Quantum squeezing doesn't reduce the noise uniformly across the axion detector bandwidth. Instead, it has the largest effect at the edges. Imagine you tune your radio to 88.3 megahertz but the station you want is actually at 88.1. With quantum squeezing, you would be able to hear your favorite song playing one station away.
In the world of radio broadcasting this would be a recipe for disaster, because different stations would interfere with one another. But with only one dark matter signal to look for, a wider bandwidth allows physicists to search faster by covering more frequencies at once. In our latest result we used squeezing to double the bandwidth of HAYSTAC, allowing us to search for axions twice as fast as we could before.
Quantum squeezing alone isn't enough to scan through every possible axion frequency in a reasonable time. But doubling the scan rate is a big step in the right direction, and we believe further improvements to our quantum squeezing system may enable us to scan 10 times faster.
Nobody knows whether axions exist or whether they will resolve the mystery of dark matter; but thanks to this unexpected application of quantum technology, we're one step closer to answering these questions.
Posted in Quantum Physics
Microsoft's Big Win in Quantum Computing Was an Error After All – WIRED
Posted: at 1:59 pm
Whatever happened, the Majorana drama is a setback for Microsoft's ambitions to compete in quantum computing. Leading computing companies say the technology will define the future by enabling new breakthroughs in science and engineering.
Quantum computers are built from devices called qubits that encode 1s and 0s of data but can also use a quantum state called a superposition to perform math tricks not possible for the bits in a conventional computer. The main challenge to commercializing that idea is that quantum states are delicate and easily quashed by thermal or electromagnetic noise, making qubits error-prone.
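What superposition means for a single qubit can be shown with plain state-vector arithmetic (an illustrative sketch only, not any company's hardware or API):

```python
import math

# |0> and |1> as two-component complex amplitude vectors.
zero = [1 + 0j, 0 + 0j]
one = [0 + 0j, 1 + 0j]

# An equal superposition: the state a Hadamard gate prepares from |0>.
h = 1 / math.sqrt(2)
plus = [h * (zero[i] + one[i]) for i in range(2)]

# Measurement outcome probabilities are the squared magnitudes of the
# amplitudes (the Born rule); here each outcome is equally likely.
probs = [abs(amp) ** 2 for amp in plus]
```

Thermal or electromagnetic noise perturbs these delicate amplitudes, which is exactly why real qubits are so error-prone and why so much of a machine's capacity must go to error correction.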
Google, IBM, and Intel have all shown off prototype quantum processors with around 50 qubits, and companies including Goldman Sachs and Merck are testing the technology. But thousands or millions of qubits are likely required for useful work. Much of a quantum computer's power would probably have to be dedicated to correcting its own glitches.
Microsoft has taken a different approach, claiming qubits based on Majorana particles will be more scalable, allowing it to leap ahead. But after more than a decade of work, it does not have a single qubit.
"From the fuller data, there's no doubt that there's no Majorana."
Sergey Frolov, University of Pittsburgh
Majorana fermions are named after Italian physicist Ettore Majorana, who hypothesized in 1937 that particles should exist with the odd property of being their own antiparticles. Not long after, he boarded a ship and was never seen again. Physicists wouldn't report a good glimpse of one of his eponymous particles until the next millennium, in Kouwenhoven's lab.
Microsoft got interested in Majoranas after company researchers in 2004 approached tech strategy chief Craig Mundie and said they had a way to solve one problem holding back quantum computers: qubits' flakiness.
The researchers seized on theoretical physics papers suggesting a way to build qubits that would make them more dependable. These so-called topological qubits would be built around unusual particles, of which Majorana particles are one example, that can pop into existence in clumps of electrons inside certain materials at very low temperatures.
Microsoft created a new team of physicists and mathematicians to flesh out the theory and practice of topological quantum computing, centered on an outpost in Santa Barbara, California, christened Station Q. They collaborated with and funded leading experimental physicists hunting for the particles needed to build this new form of qubit.
Kouwenhoven, in Delft, was one of the physicists who got Microsoft's backing. His 2012 paper reporting signatures of Majorana particles inside nanowires started chatter about a future Nobel Prize for proving the elusive particles' existence. In 2016, Microsoft stepped up its investment, and the hype.
Kouwenhoven and another leading physicist, Charles Marcus, at the University of Copenhagen were hired as corporate Majorana hunters. The plan was to first detect the particles and then invent more complex devices that could control them and function as qubits. Todd Holmdahl, who previously led hardware for Microsoft's lucrative Xbox games console, took over as leader of the topological quantum computing project. Early in 2018, he told Barrons he would have a topological qubit by the end of the year. The now-disputed paper appeared a month later.
While Microsoft sought Majoranas, competitors working on established qubit technologies reported steady progress. In 2019, Google announced it had reached a milestone called quantum supremacy, showing that a chip with 53 qubits could perform a statistical calculation in minutes that would take a supercomputer millennia. Soon after, Microsoft appeared to hedge its quantum bet, announcing it would offer access to quantum hardware from other companies via its cloud service Azure. The Wall Street Journal reported that Holmdahl left the project that year after missing an internal deadline.
Microsoft has been quieter about its expected pace of progress on quantum hardware since Holmdahl's departure. Competitors in quantum computing continue to tout hardware advances and urge software developers to access prototypes over the internet, but none appear close to creating a quantum computer ready for prime time.
Go here to read the rest:
Microsoft's Big Win in Quantum Computing Was an Error After All - WIRED
Kangaroo Court: Quantum Computing Thinking on the Future – JD Supra
Posted: at 1:59 pm
The promise of quantum computers is that certain computational tasks might be executed exponentially faster on a quantum processor than on a classical processor.
Quantum computing is a beautiful fusion of quantum physics with computer science. It incorporates some of the most stunning ideas of physics from the twentieth century into an entirely new way of thinking about computation. Quantum computers have the potential to resolve problems of a high complexity and magnitude across many different industries and applications, including finance, transportation, chemicals, and cybersecurity, solving the impossible in a few hours of computing time.
Quantum computing is often in the news: China teleported a qubit from earth to a satellite; Shor's algorithm has put our current encryption methods at risk; quantum key distribution will make encryption safe again; Grover's algorithm will speed up data searches. But what does all this really mean? How does it all work?
Today's computers operate in a very straightforward fashion: they manipulate a limited set of data with an algorithm and give you an answer. Quantum computers are more complicated. After multiple units of data are input into qubits, the qubits are manipulated to interact with other qubits, allowing several calculations to be done simultaneously. That's where quantum computers are a lot faster than today's machines.
Quantum computers have fundamental capabilities that differentiate them from today's classical computers:
All computations involve inputting data, manipulating it according to certain rules, and then outputting the final answer. For classical computation, the bit is the basic unit of data. For quantum computation, this unit is the quantum bit, usually shortened to qubit.
A classical bit is either 0 or 1. If it's 0 and we measure it, we get 0; if it's 1 and we measure it, we get 1. In both cases the bit remains unchanged. The standard example is an electrical switch that can be either on or off. The situation is totally different for qubits. Qubits are volatile. A qubit can be in one of an infinite number of states, a superposition of both 0 and 1, but when we measure it, as in the classical case, we get just one of two values, either 0 or 1. Moreover, the act of measurement changes the qubit. Qubits can also become entangled: when we make a measurement of one of them, it affects the state of the other. These interactions between qubits are what make it possible to conduct multiple calculations at once.
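The superposition-then-measurement behaviour described above can be sketched numerically. The snippet below (plain NumPy, not a quantum library, and a simplification that ignores complex phases beyond what the example needs) represents a qubit as a two-component unit vector and samples measurement outcomes from the squared amplitudes:

```python
import numpy as np

rng = np.random.default_rng(0)

def measure(state):
    """Measure a qubit (a, b): outcome 0 with probability |a|^2, else 1.
    The state collapses onto the observed outcome."""
    p0 = abs(state[0]) ** 2
    outcome = 0 if rng.random() < p0 else 1
    collapsed = np.zeros(2, dtype=complex)
    collapsed[outcome] = 1.0
    return outcome, collapsed

# An equal superposition of 0 and 1: each outcome occurs about half the
# time, but any single measurement returns just one definite bit.
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
counts = sum(measure(plus)[0] for _ in range(10_000))
print(counts / 10_000)  # close to 0.5
```

Note how a classical bit would behave like the collapsed state: measuring it again always returns the same value.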
Nobody really knows quite how or why entanglement works. It even baffled Einstein, who famously described it as "spooky action at a distance." But it's key to the power of quantum computers. In a conventional computer, doubling the number of bits doubles its processing power. But thanks to entanglement, adding extra qubits to a quantum machine produces an exponential increase in its number-crunching ability.
These three things, superposition, measurement, and entanglement, are the key quantum mechanical ideas. Controlling these interactions, however, is very complicated. The volatility of qubits can cause inputs to be lost or altered, which can throw off the accuracy of results. And creating a computer of meaningful scale would require hundreds of thousands or even millions of qubits to be connected coherently. The few quantum computers that exist today can handle nowhere near that number. But the good news is we're getting very, very close.
Quantum computing and classical computing are not two distinct disciplines. Quantum computing is the more fundamental form of computing: anything that can be computed classically can be computed on a quantum computer. The qubit, not the bit, is the basic unit of computation; computation, in its essence, really means quantum computing. A qubit can be represented by the spin of an electron or the polarization of a photon.
In 2019 Google achieved a level of quantum supremacy when it reported using a processor with programmable superconducting qubits to create quantum states on 53 qubits, corresponding to a computational state-space of dimension 2^53 (about 10^16). This incredible achievement was slightly short of the mission goal of creating quantum states on 72 qubits. What is so special about this number? Classical computers can simulate quantum computers if the quantum computer doesn't have too many qubits, but as the number of qubits increases we reach the point where that is no longer possible.
There are 8 possible three-bit combinations: 000, 001, 010, 011, 100, 101, 110, 111. The number 8 comes from 2^3: there are two choices for the first bit, two for the second, and two for the third, and we multiply these three 2s together. If instead of bits we switch to qubits, each of these 8 three-bit strings is associated with a basis vector, so the vector space is 8-dimensional. If we have 72 qubits, the number of basis elements is 2^72. This is about 4,700,000,000,000,000,000,000. It is a large number and is considered to be the point at which classical computers cannot simulate quantum computers. Once quantum computers have more than 72 or so qubits, we truly enter the age of quantum supremacy, when quantum computers can do computations that are beyond the ability of any classical computer.
To provide a little more perspective, let's consider a machine with 300 qubits. This doesn't seem an unreasonable number in the not-too-distant future. But 2^300 is an enormous number; it's more than the number of elementary particles in the known universe. A computation using 300 qubits would be working with 2^300 basis elements.
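The counting argument above is easy to reproduce: each additional qubit doubles the number of basis states, so n qubits give 2^n basis elements. A quick sketch:

```python
# n qubits span a state space with 2**n basis elements;
# each extra qubit doubles the count, hence the exponential growth.
for n in (3, 53, 72, 300):
    print(f"{n:>3} qubits -> {float(2**n):.3e} basis elements")

# The figures quoted in the text:
assert 2**3 == 8                  # the eight three-bit strings
assert 4.7e21 < 2**72 < 4.8e21    # roughly 4.7 sextillion
```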
Some calculations required for the effective simulation of real-life scenarios are simply beyond the capability of classical computers, what are known as intractable problems. Quantum computers, with their huge computational power, are ideally suited to solving these problems. Indeed, some problems, like factoring, are hard on a classical computer but easy on a quantum computer. This creates a world of opportunities, across almost every aspect of modern life.
Healthcare: classical computers are limited in terms of the size and complexity of the molecules they can simulate and compare (an essential process of early drug development). Quantum computers will allow much larger molecules to be simulated. At the same time, researchers will be able to model and simulate interactions between drugs and all 20,000+ proteins encoded in the human genome, leading to greater advancements in pharmacology.
Finance: one potential application is algorithmic trading using complex algorithms to automatically trigger share dealings based on a wide variety of market variables. The advantages, especially for high-volume transactions, are significant. Another application is fraud detection. Like diagnostics in healthcare, fraud detection is reliant upon pattern recognition. Quantum computers could deliver a significant improvement in machine learning capabilities, dramatically reducing the time taken to train a neural network and improving the detection rate.
Logistics: Improved data analysis and modelling will enable a wide range of industries to optimize workflows associated with transport, logistics and supply-chain management. The calculation and recalculation of optimal routes could have an impact on applications as diverse as traffic management, fleet operations, air traffic control, freight and distribution.
It is, of course, impossible to predict the long-term impact of quantum computing with any accuracy. Quantum computing is now in its infancy, and the comparison to the first computers seems apt. The machines that have been constructed so far tend to be large and not very powerful, and they often involve superconductors that need to be cooled to extremely low temperatures. To minimize the interaction of quantum computers with the environment, they are always protected from light and heat: they are shielded against electromagnetic radiation, and they are cooled. One thing that can happen in cold places is that certain materials become superconductors, losing all electrical resistance, and superconductors have quantum properties that can be exploited.
Many countries are experimenting with small quantum networks using optic fiber. There is the potential to connect these via satellite and form a worldwide quantum network. This work is of great interest to financial institutions. One early impressive result involves a Chinese satellite that is devoted to quantum experiments. It's named Micius after a Chinese philosopher who did work in optics. A team in China connected to a team in Austria, the first time that intercontinental quantum key distribution (QKD) had been achieved. Once the connection was secured, the teams sent pictures to one another: the Chinese team sent the Austrians a picture of Micius, and the Austrians sent a picture of Schrödinger to the Chinese.
To actually make practical quantum computers you need to solve a number of problems, the most serious being decoherence: your qubit interacting with something from the environment that is not part of the computation. You need to set a qubit to an initial state and keep it in that state until you need to use it, but a qubit's quantum state is extremely fragile. The slightest vibration or change in temperature (disturbances known as "noise" in quantum-speak) can cause it to tumble out of superposition before its job has been properly done. That's why researchers are doing their best to protect qubits from the outside world in supercooled fridges and vacuum chambers.
Alan Turing is one of the fathers of the theory of computation. In his landmark paper of 1936 he carefully thought about computation. He considered what humans did as they performed computations and broke it down to its most elemental level. He showed that a simple theoretical machine, which we now call a Turing machine, could carry out any algorithm. But remember, Turing was analyzing computation based on what humans do. With quantum computation the focus changes from how humans compute to how the universe computes. Therefore, we should think of quantum computation not as a new type of computation but as the discovery of the true nature of computation.
See the original post:
Kangaroo Court: Quantum Computing Thinking on the Future - JD Supra
New EU Consortium shaping the future of Quantum Computing USA – PRNewswire
Posted: at 1:59 pm
Europe has always been excellent in academic research, but over the past few decades commercializing research projects has been slow compared to international competition. This is starting to change with quantum technologies. In one of the largest efforts in Europe and worldwide, Germany announced €2 billion in funding for quantum programs in June 2020, of which €120 million is being invested in this current round of research grants.
Today, IQM announced that a quantum project consortium including Europe's leading startups (ParityQC, IQM), industry leaders (Infineon Technologies), research centers (Forschungszentrum Jülich), supercomputing centers (Leibniz Supercomputing Centre), and academia (Freie Universität Berlin) has been awarded €12.4 million from the German Ministry of Education and Research (BMBF) (announcement in German).
The scope of the project is to accelerate commercialization through an innovative co-design concept. This project focuses on application-specific quantum processors, which have the potential to create a fast lane to quantum advantage. The digital-analog concept used to operate the processors will further lay the foundation for commercially viable quantum computers. This project will run for four years and aims to develop a 54-qubit quantum processor.
The project is intended to support the European FET Flagship project EU OpenSuperQ, announced in 2018, which is aimed at designing, building, and operating a quantum information processing system of up to 100 qubits. Deploying digital-analog quantum computing, this consortium adds a new angle to the OpenSuperQ project and widens its scope. With efforts from Munich, Berlin and Jülich, as well as ParityQC from Austria, the project builds bridges and seamlessly integrates into the European quantum landscape.
"The grant from the Federal Ministry of Education and Research of Germanyis a huge recognition of our unique co-design approach for quantum computers. Last year when we established our office in Munich, this was one of our key objectives. The concept allows us to become a system integrator for full-stack quantum computers by bringing together all the relevant players. As Europe's leading startup in quantum technologies, this gives us confidence to further invest in Germany and other European countries" said Dr. Jan Goetz, CEO of IQM Quantum Computers.
As a European technology leader, Germany is taking several steps to lead the quantum technology race. An important part of such leadership is bringing together European startups, industry, research and academic partners. This project will give the quantum landscape in Germany an accelerated push and will create a vibrant quantum ecosystem in the region for the future.
Additional Quotes:
"DAQC is an important project for Germany and Europe. It enables us to take a leading role in the area of quantum technologies. It also allows us to bring quantum computing into one of the prime academic supercomputing centres to more effectively work on the important integration of high-performance computing and quantum computing. We are looking forward to a successful collaboration," said Prof. DrMartinSchulz, Member of the Board of Directors, Leibniz Supercomputing Centre (LRZ).
"The path towards scalable and fully programmable quantum computing will be the parallelizability of gates and building with reduced complexity in order to ensure manageable qubit control. Our ParityQC architecture is the blueprint for a fully parallelizable quantum computer, which comes with the associated ParityOS operating system. With the team of extraordinary members of the DAQC consortium this will allow us to tackle the most pressing and complex industry-relevant optimization problems." saidMagdalena Hauser & Wolfgang Lechner, CEOs & Co-founder ParityQC
"We are looking forward to exploring and realizing a tight connection between hardware and applications, and having DAQC quantum computers as a compatible alternative within the OpenSuperQ laboratory. Collaborations like this across different states, and including both public and private partners, have the right momentum to move quantum computing in Germany forward." saidProf. Frank Wilhelm-Mauch, Director, Institute for Quantum Computing Analytics, Forschungszentrum Jlich
"At Infineon, we are looking forward to collaborating with top-class scientists and leading start-ups in the field of quantum computing in Europe. We must act now if we in Germany and Europe do not want to become solely dependent on American or Asian know-how in this future technology area. We are very glad to be part of this highly innovative project and happy to contribute with our expertise in scaling and manufacturing processes." saidDr.Sebastian Luber, Senior Director Technology & Innovation, Infineon Technologies AG
"This is a hugely exciting project. It is a chance of Europe and Germany to catch up in the development of superconducting quantum computers. I am looking forward to adventures on understanding how such machines can be certified in their precise functioning." said Prof.Jens Eisert, Professor of Quantum Physics, Freie Universitt Berlin
About IQM Quantum Computers:
IQM is the European leader in superconducting quantum computers, headquartered in Espoo, Finland. Since its inception in 2018, IQM has grown to 80+ employees and has established a subsidiary in Munich, Germany, to lead the co-design approach. IQM delivers on-premises quantum computers for research laboratories and supercomputing centers and provides complete access to its hardware. For industrial customers, IQM delivers quantum advantage through a unique application-specific co-design approach. IQM has raised €71 million from VC firms and public grants and is building Finland's first quantum computer.
For more information, visit http://www.meetiqm.com.
Registered offices:
IQM Finland Oy, Keilaranta 19, 02150 Espoo, Finland, http://www.meetiqm.com
IQM Germany GmbH, Nymphenburger Str. 86, 80636 München, Germany
IQM: Facts and Figures
Founders:
Media Contact: Raghunath Koduvayur, Head of Marketing and Communications, [emailprotected], +358504876509
Photo - https://mma.prnewswire.com/media/1437806/IQM_Quantum_Computers_Founders.jpg Photo - https://mma.prnewswire.com/media/1437807/IQM_Quantum_computer_design.jpg Logo - https://mma.prnewswire.com/media/1121497/IQM_Logo.jpg
SOURCE IQM Finland Oy
Read the original here:
New EU Consortium shaping the future of Quantum Computing USA - PRNewswire
2020 Quantum Communications in Space Research Report: Quantum Communications are Expected to Solve the Problem of Secure communications First on…
Posted: at 1:59 pm
Dublin, Feb. 11, 2021 (GLOBE NEWSWIRE) -- The "Quantum Communications in Space" report has been added to ResearchAndMarkets.com's offering.
The modern world relies more and more on information exchange using data transfer technologies.
Private and secure communications are fundamental for the Internet, national defence and e-commerce, thus justifying the need for a secure network with the global protection of data. Information exchange through existing data transfer channels is becoming prone to hacker attacks causing problems on an international scale, such as interference with democratic elections, etc.
In reality, the scale of the "hacking" problem is considerable: in 2019, British companies were reportedly hit by about 5,000 "ransomware" attacks and paid out more than $200 million to cyber criminals [1]. During the first half of 2020, $144.2 million was already lost in 11 of the biggest ransomware attacks [2]. Communications privacy is therefore of great concern at present.
The reasons for the growing privacy concerns are [3]: the planned increase of secure (encryption-requiring) data traffic rates from the current 10 Gbit/s to a future 100 Gbit/s; annual increases in data traffic of 20-25%; and the application of fibre optic cables not only for mainstream network lines but also for the "final mile" to the end-user. These developments are accompanied by [3]: growing software vulnerabilities; more powerful computational resources available to hackers at lower costs; possible quantum computer applications for encryption cracking; and the poor administration of computer networks.
Conventional public key cryptography relies on the computational intractability of certain mathematical functions.
Applied conventional encryption algorithms (DH, RSA, ECDSA, TLS/SSL, HTTPS, IPsec, X.509) are good in that there is currently no way to find a key (of sufficient length) in any acceptable time. Nevertheless, in principle it is possible, and there are no guarantees against the future discovery of a fast factorization algorithm for classical computers, or against the implementation of already known algorithms on a quantum computer, which would make "hacking" conventional encryption possible. Another "hacking" strategy involves substituting the original data. A final vulnerability comes from encryption keys being potentially stolen. Hence, the demand exists for a truly reliable and convenient encryption system.
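To make the "computational intractability" point concrete, here is a toy RSA round trip using the standard textbook primes (illustration only; these numbers are far too small to be secure). The scheme's safety rests entirely on the difficulty of factoring n, which is exactly what Shor's algorithm on a large quantum computer would undo:

```python
# Toy RSA with tiny textbook primes (never use sizes like this in practice).
p, q = 61, 53
n = p * q                  # public modulus: easy to publish, hard to factor at scale
phi = (p - 1) * (q - 1)
e = 17                     # public exponent, coprime with phi
d = pow(e, -1, phi)        # private exponent (modular inverse; Python 3.8+)

message = 65
ciphertext = pow(message, e, n)    # encrypt with the public key
recovered = pow(ciphertext, d, n)  # decrypt with the private key
print(recovered)  # 65

# Anyone who factors n = 61 * 53 can recompute phi and d,
# turning the public key into the private one.
```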
Quantum communications are expected to solve the problem of secure communications first on international and national scales and then down to the personal level.
Quantum communication is a field of applied quantum physics closely related to quantum information processing and quantum teleportation [4]. Its most interesting application is protecting information channels against eavesdropping by means of quantum cryptography [4].
Quantum communications are considered secure because any tampering with them is detectable. Parties can thus trust quantum communications, safe in the knowledge that any eavesdropping would leave its mark.
Using quantum communications, two parties can communicate secretly by sharing a quantum encryption key encoded in the polarization of a string of photons.
This quantum key distribution (QKD) idea was proposed in the mid-1980s [5]. QKD theoretically offers a radically new, information-theoretically secure solution to the key exchange problem, ensured by the laws of quantum physics. In particular, QKD allows two distant users, who do not share a long secret key initially, to generate a common, random string of secret bits, called a secret key.
Using one-time pad encryption, this key is provably secure [6] for encrypting and decrypting a message, which can then be transmitted over a standard communication channel. The information is encoded in the superposition states of physical carriers at the single-quantum level, where photons, the fastest traveling qubits, are usually used. Any eavesdropper on the quantum channel attempting to gain information about the key will inevitably introduce disturbance to the system that can be detected by the communicating users.
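The one-time pad step itself is simple to sketch: given a shared random key as long as the message (the part QKD is meant to provide), encryption and decryption are the same XOR operation. The key below is generated locally only to stand in for a QKD-distributed one:

```python
import secrets

message = b"attack at dawn"
key = secrets.token_bytes(len(message))  # stands in for a QKD-distributed key

ciphertext = bytes(m ^ k for m, k in zip(message, key))     # encrypt
plaintext = bytes(c ^ k for c, k in zip(ciphertext, key))   # decrypt

print(plaintext)  # b'attack at dawn'
```

The security of the whole scheme then reduces to the key being truly random, secret, and used only once, which is why detecting eavesdropping during key distribution matters so much.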
Key Topics Covered:
1. INTRODUCTION
2. Quantum Experiments at a Space Scale (QUESS)
2.1. European root of the Chinese project
2.2. Chinese Counterpart
2.3. The QUESS Mission set-up
2.3.1. Spacecraft
2.3.2. Ground stations
2.3.3. Project budget
2.4. International cooperation
2.5. Results
2.6. Tiangong-2 Space Lab QKD
3. Future plans
4. Comparison to alternatives
4.1. Small Photon-Entangling Quantum System
4.2. Hyperentangled Photon Pairs
4.3. QEYSSat
4.4. Reflector satellites
4.5. GEO satellite communications
4.6. Airborne
4.7. Ground
4.7.1. Moscow quantum communications line
4.7.2. Telephone & optical line communications
5. CONCLUSIONS
REFERENCES
Companies Mentioned
For more information about this report visit https://www.researchandmarkets.com/r/li9vd4
Mutually unbiased bases and symmetric informationally complete measurements in Bell experiments – Science Advances
Posted: at 1:59 pm
INTRODUCTION
Measurements are crucial and compelling processes at the heart of quantum physics. Quantum measurements, in their diverse shapes and forms, constitute the bridge between the abstract formulation of quantum theory and concrete data produced in laboratories. Crucially, the quantum formalism of measurement processes gives rise to experimental statistics that elude classical models. Therefore, appropriate measurements are indispensable for harvesting and revealing quantum phenomena. Sophisticated manipulation of quantum measurements is at the heart both of the most well-known features of quantum theory, such as contextuality (1) and the violation of Bell inequalities (2), and of its most groundbreaking applications, such as quantum cryptography (3) and quantum computation (4). In the broad landscape of quantum measurements (5), certain classes of measurements are outstanding because of their breadth of relevance in foundations of quantum theory and applications in quantum information processing.
Two widely celebrated, intensively studied, and broadly useful classes of measurements are known as mutually unbiased bases (MUBs) and symmetric informationally complete measurements (SICs). Two measurements are said to be mutually unbiased if by preparing any eigenstate of the first measurement and then performing the second measurement, one finds that all outcomes are equally likely (6). A typical example of MUBs corresponds to measuring two perpendicular components of the polarization of a photon. A SIC is a quantum measurement with the largest number of possible outcomes such that all measurement operators have equal magnitude overlaps (7, 8). Thus, the former is a relationship between two different measurements, whereas the latter is a relationship within a single measurement. Since MUBs and SICs are both conceptually natural, elegant, and (as it turns out) practically important classes of measurements, they are often studied in the same context (9–14). Let us briefly review their importance to foundational and applied aspects of quantum theory.
MUBs are central to the concept of complementarity in quantum theory, i.e., how the knowledge of one quantity limits (or erases) the knowledge of another quantity [see, e.g., (15) for a review of MUBs]. This is often highlighted through variants of the famous Stern-Gerlach experiment in which different Pauli observables are applied to a qubit. For instance, after first measuring (say) σx, we know whether our system points up or down the x axis. If we then measure σz, our knowledge of the outcome of yet another σx measurement is entirely erased since σz and σx are MUBs. This phenomenon leads to an inherent uncertainty for the outcomes of MUB measurements on all quantum states, which can be formalized in terms of entropic quantities, leading to so-called entropic uncertainty relations. It is then natural that MUBs give rise to the strongest entropic uncertainties in quantum theory (16). Moreover, MUBs play a prominent role in quantum cryptography, where they are used in many of the most well-known quantum key distribution protocols (17–21) and in secret sharing protocols (22–24). Their appeal to cryptography stems from the idea that eavesdroppers who measure an eigenstate of one basis in another basis unbiased to it obtain no useful information, while they also induce a large disturbance in the state that allows their presence to be detected. Furthermore, complete (i.e., largest possible in a given dimension) sets of MUBs are tomographically complete, and their symmetric properties make them pivotal for quantum state tomography (25, 26). In addition, MUBs are useful for a range of other problems such as quantum random access coding (27–31), quantum error correction (32, 33), and entanglement detection (34). This broad scope of relevance has motivated much effort toward determining the largest number of MUBs that exist in general Hilbert space dimensions (15).
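The defining condition is easy to verify numerically: two orthonormal bases {e_i}, {f_j} in dimension d are mutually unbiased when |&lt;e_i|f_j&gt;|^2 = 1/d for all i, j. The sketch below checks this for the Pauli Z and X eigenbases of a qubit, the d = 2 example from the Stern-Gerlach discussion:

```python
import numpy as np

d = 2
Z_basis = np.eye(d, dtype=complex)                                 # rows: |0>, |1>
X_basis = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # rows: |+>, |->

# Mutual unbiasedness: every squared overlap |<e_i|f_j>|^2 equals 1/d.
overlaps = np.abs(Z_basis.conj() @ X_basis.T) ** 2
print(overlaps)  # every entry is 0.5 = 1/d
assert np.allclose(overlaps, 1 / d)
```

Preparing any Z eigenstate and measuring in the X basis therefore yields both outcomes with probability 1/2, which is exactly the "all outcomes equally likely" property stated above.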
The motivations behind the study of SICs are quite similar to the ones discussed for MUBs. It has been shown that SICs are natural measurements for quantum state tomography (35), which has also prompted several experimental realizations of SICs (36–38). In addition, some protocols for quantum key distribution derive their success directly from the defining properties of SICs (39, 40), which have also been experimentally demonstrated (41). Furthermore, a key property of SICs is that they have the largest number of outcomes possible while still being extremal measurements, i.e., they cannot be simulated by stochastically implementing other measurements. This gives SICs a central role in a range of applications, which include random number generation from entangled qubits (42), certification of nonprojective measurements (43–46), semidevice-independent self-testing (45), and entanglement detection (47, 48). Moreover, SICs have a key role in quantum Bayesianism (49), and they exhibit interesting connections to several areas of mathematics, for instance, Lie and Jordan algebras (50) and algebraic number theory (51). Because of their broad interest, much research effort has been directed toward proving the existence of SICs in all Hilbert space dimensions (presently known, at least, up to dimension 193) (7, 8, 52–55). See, e.g., (54) for a recent review of SICs.
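A SIC's defining overlap condition can likewise be checked directly: in dimension d, its d^2 unit vectors satisfy |&lt;psi_i|psi_j&gt;|^2 = 1/(d+1) for i ≠ j. Below is one standard qubit (d = 2) construction, four states whose Bloch vectors form a regular tetrahedron:

```python
import numpy as np

# Four qubit states whose Bloch vectors form a regular tetrahedron:
# |0> together with sqrt(1/3)|0> + sqrt(2/3) w^k |1> for k = 0, 1, 2.
w = np.exp(2j * np.pi / 3)
states = [np.array([1, 0], dtype=complex)] + [
    np.array([1 / np.sqrt(3), np.sqrt(2 / 3) * w**k]) for k in range(3)
]

# Symmetric condition: all pairwise overlaps equal 1/(d+1) = 1/3.
for i in range(4):
    for j in range(i + 1, 4):
        assert np.isclose(abs(np.vdot(states[i], states[j])) ** 2, 1 / 3)
print("all 6 pairwise overlaps equal 1/3")
```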
In this work, we broadly investigate MUBs and SICs in the context of Bell nonlocality experiments. In these experiments, two separated observers perform measurements on entangled quantum systems that can produce nonlocal correlations that elude any local hidden variable model (56). In recent years, Bell inequalities have played a key role in the rise of device-independent quantum information processing where they are used to certify properties of quantum systems. Naturally, certification of a physical property can be achieved under different assumptions of varying strength. Device-independent approaches offer the strongest form of certification since the only assumptions made are space-like separation and the validity of quantum theory. The advent of device-independent quantum information processing has revived interest in Bell inequalities, as these can now be tailored to the purpose of certifying useful resources for quantum information processing. The primary focus of such certification has been on various types of entangled states (57). However, quantum measurements are equally important building blocks for quantum information processing. Nevertheless, our understanding of which arrangements of high-dimensional measurements can be certified in a device-independent manner is highly limited. We speak of arrangements of measurements because for a single measurement (acting on a quantum system with no internal structure), no interesting property can be certified. The task becomes nontrivial when at least two measurements are present and we can certify the relation between them. The simplest approach relies on combining known self-testing results for two-qubit systems, which allows us to certify high-dimensional measurements constructed out of qubit building blocks (58, 59). 
Alternatively, device-independent certification of high-dimensional structures can be proven from scratch, but to the best of our knowledge, only two results of this type have been proven: (i) a triple of MUBs in dimension three (60) and (ii) the measurements conjectured to be optimal for the Collins-Gisin-Linden-Massar-Popescu Bell inequality (the former is a single result, while the latter is a family parameterized by the dimension d ≥ 2) (61). None of these results can be used to certify MUBs in dimension d ≥ 4.
Since mutual unbiasedness and symmetric informational completeness are natural and broadly important concepts in quantum theory, they are prime candidates of interest for such certification in general Hilbert space dimensions. This challenge is increasingly relevant because of the broader experimental advances toward high-dimensional systems along the frontier of quantum information theory. This is also reflected in the fact that recent experimental implementations of MUBs and SICs can go well beyond the few lowest Hilbert space dimensions (38, 41, 62).
Focusing on mutual unbiasedness and symmetric informational completeness, we solve the above challenges. To this end, we first construct Bell inequalities that are maximally violated using a maximally entangled state of local dimension d and, respectively, a pair of d-dimensional MUBs and a d-dimensional SIC. In the case of MUBs, we show that the maximal quantum violation of the proposed Bell inequality device independently certifies that the measurements satisfy an operational definition of mutual unbiasedness as well as that the shared state is essentially a maximally entangled state of local dimension d. Similarly, in the case of SICs, we find that the maximal quantum violation device independently certifies that the measurements satisfy an analogous operational definition of symmetric informational completeness. Moreover, we also show that our Bell inequalities are useful in two practically relevant tasks. For the case of MUBs, we consider a scheme for device-independent quantum key distribution and prove a key rate of log d bits, which is optimal for any protocol that extracts key from a d-outcome measurement. For SICs, we construct a scheme for device-independent random number generation. For two-dimensional SICs, we obtain the largest amount of randomness possible for any protocol based on qubits. For three-dimensional SICs, we obtain more randomness than can be obtained in any protocol based on projective measurements and quantum systems of dimension up to seven. For low dimensions, we numerically show that both protocols are robust to noise, which is imperative to any experiment. The implementation of these two protocols involves performing a Bell-type experiment, estimating the outcome statistics, and computing the resulting Bell inequality violation. The efficiency and security of the protocol are then deduced only from the observed Bell inequality violation, i.e., they do not require a complete characterization of the devices.
Device-independent protocols can, in principle, be implemented on any experimental platform suitable for Bell nonlocality experiments, such as entangled spins (63), entangled photons (64, 65), and entangled atoms (66).
The task of finding Bell inequalities that are maximally violated by MUBs for d ≥ 3 has been attempted several times (67–70) but with limited success. The only convincing candidate is the inequality corresponding to d = 3 studied in (67), and even then, there is only numerical evidence (no analytical proof is known). Some progress has been made in (60), which considers the case of prime d and proposes a family of Bell inequalities maximally violated by a specific set of d MUBs in dimension d. These inequalities, however, have two drawbacks: (i) There is no generalization to the case of nonprime d, and (ii) even for the case of prime d, we have no characterization of the quantum realizations that achieve the maximal violation.
In this work, we present a family of Bell inequalities in which the maximal quantum violation is achieved with a maximally entangled state and any pair of d-dimensional MUBs. These Bell inequalities have been constructed so that their maximal quantum violation can be computed analytically, which then enables us to obtain a detailed characterization of the optimal realizations. As a result we find a previously unidentified, intermediate form of device-independent certification.
We formally define a pair of MUBs as two orthonormal bases on a d-dimensional Hilbert space ℂ^d, namely, {|e_j⟩}_{j=1}^d and {|f_k⟩}_{k=1}^d, with the property that

|⟨e_j|f_k⟩|² = 1/d    (1)

for all j and k. The constant on the right-hand side is merely a consequence of the two bases being normalized. Turning to the Bell test, consider a bipartite scenario parameterized by an integer d ≥ 2. Alice randomly receives one of d² possible inputs labeled by x ≡ x₁x₂ ∈ [d]² (where [s] ≡ {1, …, s}) and produces a ternary output labeled by a ∈ {1, 2, ⊥}. Bob receives a random binary input labeled by y ∈ {1, 2} and produces a d-valued output labeled by b ∈ [d]. The joint probability distribution in the Bell scenario is denoted by p(a, b|x, y), and the scenario is illustrated in Fig. 1.
Alice receives one of d2 inputs and produces a ternary output, while Bob receives a binary input and produces a d-valued output.
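As a concrete illustration (ours, not part of the original construction), the computational basis together with its discrete Fourier transform forms a standard pair of MUBs in any dimension; a minimal NumPy sketch confirms the overlap condition of Eq. 1:

```python
import numpy as np

d = 5
# Computational basis {|e_j>} and its discrete Fourier transform {|f_k>}:
# a standard pair of mutually unbiased bases in any dimension d.
E = np.eye(d)
omega = np.exp(2j * np.pi / d)
F = np.array([[omega ** (j * k) for k in range(d)] for j in range(d)]) / np.sqrt(d)

# All overlaps |<e_j|f_k>|^2 should equal 1/d (Eq. 1).
overlaps = np.abs(E.conj().T @ F) ** 2
assert np.allclose(overlaps, 1 / d)
print("max deviation from 1/d:", np.abs(overlaps - 1 / d).max())
```

The same check applies verbatim to any other candidate pair of bases, which makes it a convenient sanity test when implementing the measurements below.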
To make our choice of Bell functional transparent, we will phrase it as a game in which Alice and Bob collectively win or lose points. If Alice outputs a = ⊥, then no points will be won or lost. If she outputs a ∈ {1, 2}, then points will be won or lost whenever b = x_y. More specifically, Alice and Bob win a point if a = y and lose a point if a = ȳ, where the bar flips the value of y ∈ {1, 2}. This leads to the score

R_d^MUB ≡ Σ_{x,y} [p(a = y, b = x_y | x, y) − p(a = ȳ, b = x_y | x, y)]    (2)

where the sum goes over x = x₁x₂ ∈ [d]² and y ∈ {1, 2}.
At this point, the outcome a = ⊥ might seem artificial, so let us show why it plays a crucial role in the construction of the game. To this end, we use intuition based on the hypothetical case in which Alice and Bob share a maximally entangled state

|φ_d^max⟩ = (1/√d) Σ_{k=1}^d |k, k⟩    (3)
The reason that we consider the maximally entangled state is that we aim to tailor the Bell inequalities so that this state is optimal. Then, we would like to ensure that Alice, via her measurement and for her outcomes a ∈ {1, 2}, remotely prepares Bob's system in a pure state. This would allow Bob to create stronger correlations than if Alice remotely prepared his system in a mixed state. Hence, this corresponds to Alice's outcomes a ∈ {1, 2} being represented by rank-one projectors. Since the subsystems of |φ_d^max⟩ are maximally mixed, it follows that p(a = 1|x) = p(a = 2|x) = 1/d for all x. Thus, we want to motivate Alice to use a strategy in which she outputs a = ⊥ with probability p(a = ⊥|x) = 1 − 2/d. Our tool for this purpose is to introduce a penalty. Specifically, whenever Alice decides to output a ∈ {1, 2}, she is penalized by losing α_d points. Thus, the total score (the Bell functional) reads

S_d^MUB ≡ R_d^MUB − α_d Σ_x [p(a = 1|x) + p(a = 2|x)]    (4)
Now, outputting a ∈ {1, 2} not only contributes toward R_d^MUB but also incurs the penalty α_d. Therefore, we expect a trade-off between α_d and the rate at which Alice outputs a = ⊥. We must suitably choose α_d such that Alice's best strategy is to output a = ⊥ with (on average over x) the desired probability p(a = ⊥|x) = 1 − 2/d. This accounts for the intuition that leads us to the following Bell inequalities for MUBs.
Theorem II.1 (Bell inequalities for MUBs). The Bell functional S_d^MUB in Eq. 4 with

α_d = (1/2)√((d−1)/d)    (5)

obeys the tight local bound

S_d^MUB ≤_LHV 2(d−1)(1 − (1/2)√((d−1)/d))    (6)

and the quantum bound

S_d^MUB ≤_Q √(d(d−1))    (7)
Moreover, the quantum bound can be saturated by sharing a maximally entangled state of local dimension d and Bob performing measurements in any two MUBs.
Proof. A complete proof is presented in the Supplementary Materials (section S1A). The essential ingredient to obtain the bound in Eq. 7 is the Cauchy-Schwarz inequality. Furthermore, for local models, by inspecting the symmetries of the Bell functional S_d^MUB, one finds that the local bound can be attained by Bob always outputting b = 1. This greatly simplifies the evaluation of the bound in Eq. 6.
To see that the bound in Eq. 7 can be saturated in quantum theory, let us evaluate the Bell functional for a particular quantum realization. Let |ψ⟩ be the shared state, {P_{x₁}}_{x₁=1}^d and {Q_{x₂}}_{x₂=1}^d be the measurement operators of Bob corresponding to y = 1 and y = 2, respectively, and A_x be the observable of Alice defined as the difference between Alice's outcome-one and outcome-two measurement operators, i.e., A_x = A_x^1 − A_x^2. Then, the Bell functional reads

S_d^MUB = Σ_x ⟨ψ| A_x ⊗ (P_{x₁} − Q_{x₂}) − α_d (A_x^1 + A_x^2) ⊗ 𝟙 |ψ⟩    (8)
Now, we choose the maximally entangled state of local dimension d, i.e., |ψ⟩ = |φ_d^max⟩, and define Bob's measurements as rank-one projectors P_{x₁} = |e_{x₁}⟩⟨e_{x₁}| and Q_{x₂} = |f_{x₂}⟩⟨f_{x₂}|, which correspond to MUBs, i.e., |⟨e_{x₁}|f_{x₂}⟩|² = 1/d. Last, we choose Alice's observables as A_x = √(d/(d−1)) (P_{x₁} − Q_{x₂})^T, where the prefactor ensures the correct normalization and T denotes the transpose in the standard basis. Note that A_x is a rank-two operator; the corresponding measurement operator A_x^1 (A_x^2) is a rank-one projector onto the eigenvector of A_x associated to the positive (negative) eigenvalue. Since the subsystems of |φ_d^max⟩ are maximally mixed, this implies ⟨φ_d^max| (A_x^1 + A_x^2) ⊗ 𝟙 |φ_d^max⟩ = 2/d. Inserting all this into the above quantum model and exploiting the fact that for any linear operator O, we have O ⊗ 𝟙 |φ_d^max⟩ = 𝟙 ⊗ O^T |φ_d^max⟩, we straightforwardly saturate the bound in Eq. 7.
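This optimal realization can be checked numerically. The sketch below (ours, not from the original text; it assumes the computational and Fourier bases as the MUB pair for d = 3) builds the state and measurements just described, evaluates the Bell functional of Eq. 8, and recovers the quantum bound √(d(d−1)) of Eq. 7:

```python
import numpy as np

d = 3
alpha = 0.5 * np.sqrt((d - 1) / d)            # penalty of Eq. 5
omega = np.exp(2j * np.pi / d)

# Bob's two MUBs: computational basis (P) and Fourier basis (Q).
P = [np.outer(np.eye(d)[:, j], np.eye(d)[:, j]) for j in range(d)]
f = [np.array([omega ** (j * k) for j in range(d)]) / np.sqrt(d) for k in range(d)]
Q = [np.outer(v, v.conj()) for v in f]

# Maximally entangled state |phi_d^max> as a vector in C^d (x) C^d.
phi = np.eye(d).reshape(-1) / np.sqrt(d)

S = 0.0
for x1 in range(d):
    for x2 in range(d):
        G = P[x1] - Q[x2]
        A = np.sqrt(d / (d - 1)) * G.T        # Alice's observable (transpose, not dagger)
        vals, vecs = np.linalg.eigh(A)        # eigenvalues are (approximately) -1, 0, +1
        A1 = np.outer(vecs[:, -1], vecs[:, -1].conj())   # projector for eigenvalue +1
        A2 = np.outer(vecs[:, 0], vecs[:, 0].conj())     # projector for eigenvalue -1
        op = np.kron(A, G) - alpha * np.kron(A1 + A2, np.eye(d))
        S += (phi.conj() @ op @ phi).real

print(S, np.sqrt(d * (d - 1)))                # both ≈ 2.449
```

Swapping in any other pair of MUBs for P and Q leaves the value unchanged, in line with the theorem's claim that any pair of d-dimensional MUBs saturates the bound.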
We remark that for the case of d = 2, one could also choose α₂ = 0 and retain the property that qubit MUBs are optimal. In this case, the marginal term is not necessary because in the optimal realization, Alice never outputs ⊥. Then, the quantum bound becomes 2√2, and the local bound becomes 2. The resulting Bell inequality resembles the Clauser-Horne-Shimony-Holt (CHSH) inequality (71) not only because it gives the same local and quantum values but also because the optimal realizations coincide. More specifically, the measurements of Bob are precisely the optimal CHSH measurements, whereas the four measurements of Alice correspond to two pairs of optimal CHSH measurements.
Theorem II.1 establishes that a pair of MUBs of any dimension can generate a maximal quantum violation in a Bell inequality test. We now turn to the converse matter, namely, that of device-independent certification. Specifically, given that we observe the maximal quantum violation, i.e., equality in Eq. 7, what can be said about the shared state and the measurements? Since the measurement operators can only be characterized on the support of the state, to simplify the notation, let us assume that the marginal states of Alice and Bob are full rank. (Note that this is not a physical assumption but a mathematical convention that simplifies the notation in the rest of this work. Whenever the marginal state is not full rank, the local Hilbert space naturally decomposes as a direct sum of two terms, where the state is only supported on one of them. The measurement operators can only be characterized on the support of the state, and that is precisely what we achieve. This convention allows us to only write out the part that can be characterized and leave out the rest.)
Theorem II.2 (Device-independent certification). The maximal quantum value of the Bell functional S_d^MUB in Eq. 4 implies that (i) there exist local isometries that allow Alice and Bob to extract a maximally entangled state of local dimension d, and (ii) if the marginal state of Bob is full rank, the two d-outcome measurements that he performs satisfy the relations

P_a = d P_a Q_b P_a  and  Q_b = d Q_b P_a Q_b    (9)

for all a and b.
Proof. The proof is detailed in the Supplementary Materials (section S1A). Here, we briefly summarize the part concerning Bob's measurements. Since the Cauchy-Schwarz inequality is the main tool for proving the quantum bound in Eq. 7, saturating the bound implies that the Cauchy-Schwarz inequality is also saturated. This allows us to deduce that the measurements of Bob are projective, and moreover, we obtain the following optimality condition

A_x ⊗ 𝟙 |ψ⟩ = √(d/(d−1)) 𝟙 ⊗ (P_{x₁} − Q_{x₂}) |ψ⟩    (10)

for all x₁, x₂ ∈ [d], where the factor √(d/(d−1)) can be regarded as a normalization. Since we do not attempt to certify the measurements of Alice, we can, without loss of generality, assume that they are projective. This implies that the spectrum of A_x only contains {+1, −1, 0} and therefore (A_x)³ = A_x. This allows us to obtain a relation that only contains Bob's operators. Tracing out Alice's system and subsequently eliminating the marginal state of Bob (it is assumed to be full rank) leads to

P_{x₁} − Q_{x₂} = (d/(d−1)) (P_{x₁} − Q_{x₂})³    (11)
Expanding this relation and then using projectivity and the completeness of measurements, one recovers the result in Eq. 9.
We have shown that observing the maximal quantum value of SdMUB implies that the measurements of Bob satisfy the relations given in Eq. 9. It is natural to ask whether a stronger conclusion can be derived, but the answer turns out to be negative. In the Supplementary Materials (section S1B), we show that any pair of d-outcome measurements (acting on a finite-dimensional Hilbert space) satisfying the relations in Eq. 9 is capable of generating the maximal Bell inequality violation. For d = 2,3, the relations given in Eq. 9 imply that the unknown measurements correspond to a direct sum of MUBs (see section S2C) and since, in these dimensions, there exists only a single pair of MUBs (up to unitaries and complex conjugation), our results imply a self-testing statement of the usual kind. However, since, in higher dimensions, not all pairs of MUBs are equivalent (72), our certification statement is less informative than the usual formulation of self-testing. In other words, our inequalities allow us to self-test the quantum state, but we cannot completely determine the measurements [see (73, 74) for related results]. Note that we could also conduct a device-independent characterization of the measurements of Alice. Equation 61 from the Supplementary Materials enables us to relate the measurements of Alice to the measurements of Bob, which we have already characterized. However, since we do not expect the observables of Alice to satisfy any simple algebraic relations and since they are not directly relevant for the scope of this work (namely, MUBs and SICs), we do not pursue this direction.
The certification provided in Theorem II.2 turns out to be sufficient to determine all the probabilities p(a, b|x, y) that arise in the Bell experiment (see section S1C), which means that the maximal quantum value of S_d^MUB is achieved by a single probability distribution. Because of the existence of inequivalent pairs of MUBs in certain dimensions (e.g., for d = 4), this constitutes the first example of an extremal point of the quantum set that admits inequivalent quantum realizations. Recall that the notion of equivalence that we use is precisely the one that appears in the context of self-testing, i.e., we allow for additional degrees of freedom, local isometries, and a transposition.
It is important to understand the relation between the condition given in Eq. 9 and the concept of MUBs. Naturally, if {P_a}_{a=1}^d and {Q_b}_{b=1}^d are d-dimensional MUBs, then the relations in Eq. 9 are satisfied. However, there exist solutions to Eq. 9 that are neither MUBs nor direct sums thereof. While, as mentioned above, for d = 2, 3, one can show that any measurements satisfying the relations in Eq. 9 must correspond to a direct sum of MUBs, this is not true in general. For d = 4, 5, we have found explicit examples of measurement operators satisfying Eq. 9 that cannot be written as a direct sum of MUBs. They cannot even be transformed into a pair of MUBs via a completely positive unital map (see section S2 for details). These results raise a crucial question: How should one interpret the condition given in Eq. 9?
To answer this question, we resort to an operational formulation of what it means for two measurements to be mutually unbiased. An operational approach must rely on observable quantities (i.e., probabilities), as opposed to algebraic relations between vectors or operators. This notion, which we refer to as mutually unbiased measurements (MUMs), was recently formalized by Tasca et al. (75). Note that in what follows, we use the term eigenvector to refer to eigenvectors corresponding to nonzero eigenvalues.
Definition II.3 (MUMs). We say that two n-outcome measurements {P_a}_{a=1}^n and {Q_b}_{b=1}^n are mutually unbiased if they are projective and the following implications hold:

P_a|ψ⟩ = |ψ⟩ ⟹ ⟨ψ|Q_b|ψ⟩ = 1/n,
Q_b|ψ⟩ = |ψ⟩ ⟹ ⟨ψ|P_a|ψ⟩ = 1/n    (12)

for all a and b. That is, two projective measurements are mutually unbiased if the eigenvectors of one measurement give rise to a uniform outcome distribution for the other measurement.
Note that this definition precisely captures the intuition behind MUBs without the need to specify the dimension of the underlying Hilbert space. MUMs admit a simple algebraic characterization.
Theorem II.4. Two n-outcome measurements {P_a}_{a=1}^n and {Q_b}_{b=1}^n are mutually unbiased if and only if

P_a = n P_a Q_b P_a  and  Q_b = n Q_b P_a Q_b    (13)

for all a and b.
Proof. Let us first assume that the algebraic relations hold. By summing over the middle index, one finds that both measurements are projective. Moreover, if |ψ⟩ is an eigenvector of P_a, then

⟨ψ|Q_b|ψ⟩ = ⟨ψ|P_a Q_b P_a|ψ⟩ = (1/n)⟨ψ|P_a|ψ⟩ = 1/n
By symmetry, the analogous property holds if |ψ⟩ is an eigenvector of Q_b. Conversely, let us show that MUMs must satisfy the above algebraic relations. Since Σ_a P_a = 𝟙, we can choose an orthonormal basis of the Hilbert space composed only of the eigenvectors of the measurement operators. Let {|e_j^a⟩}_{a,j} be such an orthonormal basis, where a ∈ [n] tells us which projector the eigenvector corresponds to and j labels the eigenvectors within a fixed projector (if P_a has finite rank, then j ∈ [tr P_a]; otherwise, j ∈ ℕ). By construction, for such a basis, we have P_{a′}|e_j^a⟩ = δ_{aa′}|e_j^a⟩. To show that P_a = n P_a Q_b P_a, it suffices to show that the two operators have the same coefficients in this basis. Since

⟨e_j^a| n P_{a′} Q_b P_{a′} |e_k^{a″}⟩ = n δ_{aa′} δ_{a″a′} ⟨e_j^a| Q_b |e_k^{a″}⟩    (14)

⟨e_j^a| P_{a′} |e_k^{a″}⟩ = δ_{aa′} δ_{a″a′} δ_{jk}    (15)

it suffices to show that n⟨e_j^a|Q_b|e_k^a⟩ = δ_{jk}. For j = k, this is a direct consequence of the definition in Eq. 12. To prove the other case, define |ψ⟩ = (|e_j^a⟩ + e^{iθ}|e_k^a⟩)/√2 for θ ∈ [0, 2π). Since P_a|ψ⟩ = |ψ⟩, we have ⟨ψ|Q_b|ψ⟩ = 1/n. Writing this equality out gives

1/n = (1/2)(2/n + e^{iθ}⟨e_j^a|Q_b|e_k^a⟩ + e^{−iθ}⟨e_k^a|Q_b|e_j^a⟩)    (16)

Choosing θ = 0 implies that the real part of ⟨e_j^a|Q_b|e_k^a⟩ vanishes, while θ = π/2 implies that the imaginary part vanishes. Proving the relation Q_b = n Q_b P_a Q_b proceeds in an analogous fashion.
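As an illustration of Theorem II.4 (ours, not from the original text), the relations in Eq. 13 can be verified numerically for a pair of MUMs built as a direct sum of two qubit MUB pairs, so that the measurement operators are rank-two projectors on a four-dimensional space rather than MUBs:

```python
import numpy as np

def proj(v):
    """Rank-one projector onto the unit vector parallel to v."""
    v = v / np.linalg.norm(v)
    return np.outer(v, v.conj())

# Qubit basis vectors for the Z, X, and Y eigenbases (pairwise mutually unbiased).
z = [np.array([1, 0]), np.array([0, 1])]
x = [np.array([1, 1]), np.array([1, -1])]
y = [np.array([1, 1j]), np.array([1, -1j])]

# Two 2-outcome measurements on C^4, each a direct sum of qubit MUB projectors:
# block 1 uses the (Z, X) pair, block 2 uses the (X, Y) pair.
n = 2
O = np.zeros((2, 2))
P = [np.block([[proj(z[a]), O], [O, proj(x[a])]]) for a in range(n)]
Q = [np.block([[proj(x[b]), O], [O, proj(y[b])]]) for b in range(n)]

# Eq. 13 holds for all a, b, even though P_a and Q_b are rank two.
for a in range(n):
    for b in range(n):
        assert np.allclose(P[a], n * P[a] @ Q[b] @ P[a])
        assert np.allclose(Q[b], n * Q[b] @ P[a] @ Q[b])
print("Eq. 13 holds for the direct-sum example")
```

This example also makes the dimension-independence of Definition II.3 concrete: the number of outcomes n = 2 is decoupled from the Hilbert space dimension 4.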
Theorem II.4 implies that the maximal violation of the Bell inequality for MUBs certifies precisely that Bob's measurements are mutually unbiased. To provide further evidence that MUMs constitute the correct device-independent generalization of MUBs, we give two specific situations in which the two objects behave in the same manner.
Maassen and Uffink (16) considered a scenario in which two measurements (with a finite number of outcomes) are performed on an unknown state. Their famous uncertainty relation provides a state-independent lower bound on the sum of the Shannon entropies of the resulting distributions. While the original result only applies to rank-one projective measurements, a generalization to nonprojective measurements reads (76)

H(P) + H(Q) ≥ −log c    (17)

where H denotes the Shannon entropy and c = max_{a,b} ‖√P_a √Q_b‖², with ‖·‖ the operator norm. If we restrict ourselves to rank-one projective measurements on a Hilbert space of dimension d, then one finds that the largest uncertainty, corresponding to c = 1/d, is obtained only by MUBs. It turns out that precisely the same value is achieved by any pair of MUMs with d outcomes regardless of the dimension of the Hilbert space:

c = max_{a,b} ‖√P_a √Q_b‖² = max_{a,b} ‖P_a Q_b‖² = max_{a,b} ‖P_a Q_b P_a‖ = max_a ‖P_a/d‖ = 1/d    (18)
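The chain of equalities in Eq. 18 can be checked directly (our sketch, not from the original text; it assumes a MUM pair built as a direct sum of qubit MUBs on a four-dimensional space):

```python
import numpy as np

def proj(v):
    v = v / np.linalg.norm(v)
    return np.outer(v, v.conj())

# A pair of two-outcome MUMs on C^4: direct sums of qubit MUB projectors
# (block 1: Z and X bases; block 2: X and Y bases).
z = [np.array([1, 0]), np.array([0, 1])]
x = [np.array([1, 1]), np.array([1, -1])]
y = [np.array([1, 1j]), np.array([1, -1j])]
O = np.zeros((2, 2))
P = [np.block([[proj(z[a]), O], [O, proj(x[a])]]) for a in range(2)]
Q = [np.block([[proj(x[b]), O], [O, proj(y[b])]]) for b in range(2)]

# c = max_{a,b} ||sqrt(P_a) sqrt(Q_b)||^2; the measurements are projective,
# so sqrt(P_a) = P_a and the operator norm can be taken of P_a Q_b directly.
c = max(np.linalg.norm(Pa @ Qb, 2) ** 2 for Pa in P for Qb in Q)
print(c)   # ≈ 0.5 = 1/n for n = 2 outcomes, despite acting on dimension 4
```

The value 1/2 coincides with the qubit MUB value, illustrating that the Maassen-Uffink uncertainty of a MUM pair depends only on the number of outcomes, not on the Hilbert space dimension.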
A closely related concept is that of measurement incompatibility, which captures the phenomenon that two measurements cannot be performed simultaneously on a single copy of a system. The extent to which two measurements are incompatible can be quantified, e.g., by so-called incompatibility robustness measures (77). In the Supplementary Materials (section S2D), we show that according to these measures, MUMs are exactly as incompatible as MUBs. Moreover, we can show that for the so-called generalized incompatibility robustness (78), MUMs are among the most incompatible pairs of d-outcome measurements.
The fact that the maximal quantum violation of the Bell inequalities introduced above requires a maximally entangled state and MUMs and, moreover, that it is achieved by a unique probability distribution suggests that these inequalities might be useful for device-independent quantum information processing. In the task of quantum key distribution (3, 17, 18), Alice and Bob aim to establish a shared dataset (a key) that is secure against a malicious eavesdropper. Such a task requires the use of incompatible measurements, and MUBs in dimension d = 2 constitute the most popular choice. Since, in the ideal case, the measurement outcomes of Alice and Bob that contribute to the key should be perfectly correlated, most protocols are based on maximally entangled states. In the device-independent approach to quantum key distribution, the amount of key and its security is deduced from the observed Bell inequality violation.
We present a proof-of-principle application to device-independent quantum key distribution based on the quantum nonlocality witnessed through the Bell functional in Eq. 4. In the ideal case, Alice and Bob follow the strategy that gives them the maximal violation, i.e., they share a maximally entangled state of local dimension d and Bob measures two MUBs. To generate the key, we provide Alice with an extra setting that produces outcomes that are perfectly correlated with the outcomes of the first setting of Bob. This will be the only pair of settings from which the raw key will be extracted, and let us denote them by x = x* and y = y* = 1. In most rounds of the experiment, Alice and Bob choose these settings and therefore contribute toward the raw key. However, to ensure security, a small number of rounds is used to evaluate the Bell functional. In these rounds, which are chosen at random, Alice and Bob randomly choose their measurement settings. Once the experiment is complete, the resulting value of the Bell functional is used to infer the amount of secure raw key shared between Alice and Bob. The raw key can then be turned into the final key by standard classical postprocessing. For simplicity, we consider only individual attacks, and moreover, we focus on the limit of asymptotically many rounds in which fluctuations due to finite statistics can be neglected.
The key rate, K, can be lower bounded by (79)

K ≥ −log(P_g) − H(B_{y*}|A_{x*})    (19)

where P_g denotes the highest probability that the eavesdropper can correctly guess Bob's outcome for the setting y* given the observed Bell inequality value, and H(·|·) denotes the conditional Shannon entropy. The guessing probability P_g is defined as

P_g ≡ sup { Σ_{c=1}^d ⟨ψ_ABE| 𝟙 ⊗ P_c ⊗ E_c |ψ_ABE⟩ }    (20)

where {E_c}_{c=1}^d is the measurement used by the eavesdropper to produce her guess, {P_c}_{c=1}^d are Bob's measurement operators for the setting y*, the expression inside the curly braces is the probability that her outcome is the same as Bob's for a particular realization, and the supremum is taken over all quantum realizations (the tripartite state and measurements of all three parties) compatible with the observed Bell inequality value.
Let us first focus on the key rate in a noise-free scenario, i.e., in a scenario in which SdMUB attains its maximal value. Then, one straightforwardly arrives at the following result.
Theorem II.5 (Device-independent key rate). In the noiseless case, the quantum key distribution protocol based on S_d^MUB achieves the key rate

K = log d    (21)

for any integer d ≥ 2.
Proof. In the noiseless case, Alice and Bob observe exactly the correlations predicted by the ideal setup. In this case, the outcomes for the settings (x*, y*) are perfectly correlated, which implies that H(B_{y*}|A_{x*}) = 0. Therefore, the only nontrivial task is to bound the guessing probability.
Since the actions of the eavesdropper commute with the actions of Alice and Bob, we can assume that she performs her measurement first. If the probability of the eavesdropper observing outcome c ∈ [d], which we denote by p(c), is nonzero, then the (normalized) state of Alice and Bob conditioned on the eavesdropper observing that outcome is given by

ρ_AB(c) = (1/p(c)) tr_E[(𝟙 ⊗ 𝟙 ⊗ E_c) |ψ_ABE⟩⟨ψ_ABE|]    (22)
Now, Alice and Bob share one of the postmeasurement states ρ_AB(c), and when they perform their Bell inequality test, they will obtain different distributions depending on c, which we write as p_c(a, b|x, y). However, since the statistics achieve the maximal quantum value of S_d^MUB and we have previously shown that the maximal quantum value is achieved by a single probability point, all the probability distributions p_c(a, b|x, y) must be the same. Moreover, we have shown that for this probability point, the marginal distribution of outcomes on Bob's side is uniform over [d] for both inputs. This implies that

P_g = Σ_{c=1}^d p(c) p_c(b = c|y = 1) = 1/d    (23)

because p_c(b = c|y = 1) = p(b = c|y = 1) = 1/d for all c.
We remark that the argument above is a direct consequence of a more general result that states that if a bipartite probability distribution is a nonlocal extremal point of the quantum set, then no external party can be correlated with the outcomes (80). The obtained key rate is the largest possible for general setups in which the key is generated from a d-outcome measurement. In addition, the key rate is optimal for all protocols based on a pair of entangled d-dimensional systems subject to projective measurements. This follows from the fact that projective measurements on a d-dimensional Hilbert space cannot have more than d outcomes. It has recently been shown that the same amount of randomness can be generated using a modified version of the Collins-Gisin-Linden-Massar-Popescu inequalities (61), but note that the measurements used there do not correspond to MUBs (except for the special case of d = 2).
Let us now depart from the noise-free case and estimate the key rate in the presence of noise. To ensure that both the guessing probability and the conditional Shannon entropy can be computed in terms of a single noise parameter, we have to introduce an explicit noise model. We use the standard approach in which the measurements remain unchanged, while the maximally entangled state is replaced with an isotropic state given by

ρ_v = v |φ_d^max⟩⟨φ_d^max| + ((1 − v)/d²) 𝟙    (24)

where v ∈ [0, 1] is the visibility of the state. Using this state and the ideal measurements for Alice and Bob, the relation between v and S_d^MUB can be easily derived from Eq. 8, namely

v = (1/2)(1 + S_d^MUB/√(d(d−1)))    (25)
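The linear relation in Eq. 25 can be cross-checked numerically. The sketch below (ours, assuming the computational and Fourier bases as the MUB pair for d = 3) evaluates the Bell operator corresponding to Eq. 8 on the isotropic state of Eq. 24 and inverts Eq. 25 to recover the visibility:

```python
import numpy as np

d = 3
alpha = 0.5 * np.sqrt((d - 1) / d)
omega = np.exp(2j * np.pi / d)

# Ideal measurements for Bob: computational (P) and Fourier (Q) bases.
P = [np.outer(np.eye(d)[:, j], np.eye(d)[:, j]) for j in range(d)]
f = [np.array([omega ** (j * k) for j in range(d)]) / np.sqrt(d) for k in range(d)]
Q = [np.outer(v, v.conj()) for v in f]

phi = np.eye(d).reshape(-1) / np.sqrt(d)
phi_proj = np.outer(phi, phi.conj())

# Bell operator corresponding to Eq. 8.
bell = np.zeros((d * d, d * d), dtype=complex)
for x1 in range(d):
    for x2 in range(d):
        G = P[x1] - Q[x2]
        A = np.sqrt(d / (d - 1)) * G.T
        vals, vecs = np.linalg.eigh(A)
        A1 = np.outer(vecs[:, -1], vecs[:, -1].conj())
        A2 = np.outer(vecs[:, 0], vecs[:, 0].conj())
        bell += np.kron(A, G) - alpha * np.kron(A1 + A2, np.eye(d))

v = 0.9                                          # visibility of the isotropic state (Eq. 24)
rho = v * phi_proj + (1 - v) * np.eye(d * d) / d**2
S = np.trace(rho @ bell).real
v_rec = 0.5 * (1 + S / np.sqrt(d * (d - 1)))     # inverting Eq. 25
print(v, v_rec)
```

The recovered visibility matches the one used to build the state, confirming that the Bell value depends linearly on v with the coefficients of Eq. 25.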
Using this formula, we also obtain the value of H(B_{y*}|A_{x*}) as a function of the Bell violation. The remaining part of Eq. 19 is the guessing probability (Eq. 20). In the case of d = 3, we proceed to bound this quantity through semidefinite programming.
Concretely, we implement the three-party semidefinite relaxation (81) of the set of quantum correlations at local level 1 (we attribute one operator to each outcome of Bob and the eavesdropper but only take into account the first two outcomes of Alice). This results in a moment matrix of size 532 × 532 with 15,617 variables. The guessing probability is directly given by the sum of three elements of the moment matrix. It can then be maximized under the constraints that the value of the Bell functional S₃^MUB is fixed and the moment matrix is positive semidefinite. However, we notice that this problem is invariant under the following relabeling: b → π(b) for y = 1, c → π(c), and x₁ → π(x₁), where π ∈ S₃ is a permutation of three elements. Therefore, it is possible to simplify this semidefinite program by requiring the matrix to be invariant under the group action of S₃ on the moment matrix (i.e., it is a Reynolds matrix) (43, 82, 83). This reduces the number of free variables in the moment matrix to 2823. With the Self-Dual Minimization (SeDuMi) (84) solver, this lowers the precision (1.1 × 10⁻⁶ instead of 8.4 × 10⁻⁸) but speeds up the computation (155 s instead of 8928 s) and requires less memory (0.1 gigabytes instead of 5.5 gigabytes). For the maximal value of S₃^MUB, we recover the noise-free result of K = log 3 up to the fifth digit. In addition, we have a key rate of at least one bit when S₃^MUB ≥ 2.432 and a nonzero key rate when S₃^MUB ≥ 2.375. The latter is close to the local bound, which is S₃^MUB ≤ 2.367. The resulting lower bound on the key rate as a function of the Bell inequality violation is plotted in Fig. 2.
We now shift our focus from MUBs to SICs. We construct Bell inequalities whose maximal quantum violations are achieved with SICs. We formally define a SIC as a set of d² unit vectors {|r_j⟩}_{j=1}^{d²} in ℂ^d with the property that

|⟨r_j|r_k⟩|² = 1/(d+1)    (26)

for all j ≠ k, where the constant on the right-hand side is fixed by normalization. The reason for there being precisely d² elements in a SIC is that this is the largest number of unit vectors in ℂ^d that could possibly admit the uniform overlap property (Eq. 26). Moreover, we formally distinguish between a SIC as the presented set of rank-one projectors and a SIC-POVM (positive operator-valued measure), which is the generalized quantum measurement with d² possible outcomes corresponding to the subnormalized projectors {(1/d)|r_k⟩⟨r_k|}_{k=1}^{d²}.
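For concreteness (our illustration, not part of the original text), in d = 2, the four states whose Bloch vectors form a regular tetrahedron constitute a SIC; a quick check of Eq. 26:

```python
import numpy as np

# Qubit SIC: four Bloch vectors forming a regular tetrahedron.
bloch = np.array([[1, 1, 1], [1, -1, -1], [-1, 1, -1], [-1, -1, 1]]) / np.sqrt(3)
sx = np.array([[0, 1], [1, 0]])
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]])

# Rank-one projectors |r_k><r_k| = (I + n_k . sigma)/2.
R = [(np.eye(2) + n[0] * sx + n[1] * sy + n[2] * sz) / 2 for n in bloch]

# Eq. 26: |<r_j|r_k>|^2 = tr(R_j R_k) = 1/(d+1) = 1/3 for all j != k.
for j in range(4):
    for k in range(4):
        if j != k:
            assert np.isclose(np.trace(R[j] @ R[k]).real, 1 / 3)
print("all pairwise overlaps equal 1/3")
```

Dividing each projector by d = 2 yields the corresponding SIC-POVM, whose four elements sum to the identity.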
Since the treatment of SICs in Bell nonlocality turns out to be more challenging than for the case of MUBs, we first establish the relevance of SICs in a simplified Bell scenario subject to additional constraints. This serves as a stepping stone to a subsequent relaxation, which gives a standard (unconstrained) Bell inequality for SICs. We then focus on the device-independent certification power of these inequalities, which leads us to an operational notion of symmetric informational completeness. Last, we extend the Bell inequalities so that their maximal quantum violations are achieved with both projectors forming SICs and a single generalized measurement corresponding to a SIC-POVM.
Stepping stone: Quantum correlations for SICs. Consider a Bell scenario, parameterized by an integer d ≥ 2, involving two parties Alice and Bob who share a physical system. Alice receives an input labeled by a tuple (x₁, x₂) representing one of d²(d²−1)/2 possible inputs, which we collectively refer to as x = x₁x₂. The tuple is randomly taken from the set Pairs(d²) ≡ {x : x₁, x₂ ∈ [d²] and x₁ < x₂}. Alice performs a measurement on her part of the shared system and produces a ternary output labeled by a ∈ {1, 2, ⊥}. Bob receives an input labeled by y ∈ [d²], and the associated measurement produces a binary outcome labeled by b ∈ {1, ⊥}. The joint probability distribution is denoted by p(a, b|x, y), and the Bell scenario is illustrated in Fig. 3.
Alice receives one of d²(d²−1)/2 inputs and returns a ternary outcome, while Bob receives one of d² inputs and returns a binary outcome.
Similar to the case of MUBs, to make our choice of Bell functional transparent, we phrase it as a game played by Alice and Bob. We imagine that their inputs are supplied by a referee, who promises to provide x = x₁x₂ and y such that either y = x₁ or y = x₂. Similar to the previous game, Alice can output a = ⊥ to ensure that no points are won or lost. However, in this game, Bob can also ensure that no points are won or lost, by outputting b = ⊥. If neither of them outputs ⊥, then a point is either won or lost. Specifically, when a = 1, a point is won if y = x₁ (and lost otherwise), whereas if a = 2, then a point is won if y = x₂ (and lost otherwise). Let us remark that in this game, Bob's only role is to decide whether, in a given round, points can be won/lost or not. For this game, the total number of points (the Bell functional) reads

R_d^SIC ≡ Σ_{x₁<x₂} [p(a = 1, b = 1|x, x₁) − p(a = 1, b = 1|x, x₂) + p(a = 2, b = 1|x, x₂) − p(a = 2, b = 1|x, x₁)]    (27)

Let us now impose additional constraints on the marginal distributions of the outputs. More specifically, we require that

∀x: p(a = 1|x) + p(a = 2|x) = 2/d  and  ∀y: p(b = 1|y) = 1/d    (28)

The intuition behind these constraints is analogous to that discussed for the case of MUBs. Namely, we imagine that Alice and Bob perform measurements on a maximally entangled state of local dimension d. Then, we wish to fix the marginals such that the measurements of Alice (Bob) for the outcomes a ∈ {1, 2} (b = 1) remotely prepare Bob's (Alice's) subsystem in a pure state. This corresponds to the marginals p(a = 1|x) = p(a = 2|x) = p(b = 1|y) = 1/d, which is reflected in the marginal constraints in Eq. 28. We remark that imposing these constraints simplifies both the intuitive understanding of the game and the derivation of the results below. However, it merely serves as a stepping stone to a more general subsequent treatment in which the constraints (Eq. 28) will be removed. To write the value of the Bell functional of a quantum realization, let us introduce two simplifications.
The measurement operators of Alice are denoted by {A_x^a}, and as before, it is convenient to work with the observables defined as A_x = A_x^1 − A_x^2. The measurements of Bob are denoted by {B_y^b}, but since they only have two outcomes, all the expressions can be written in terms of a single operator for each input y. In our case, it is convenient to use the outcome-one operator, and for brevity, we will skip the superscript, i.e., we will write B_y ≡ B_y^1 for all y. Then, the Bell functional evaluated on a specific quantum realization reads

R_d^SIC = Σ_{x₁<x₂} ⟨A_x ⊗ (B_{x₁} − B_{x₂})⟩  (29)

Note that the Bell functional, in particular when written in a quantum model, is strongly reminiscent of the expression R_d^MUB (Eq. 2) encountered for MUBs, with the key difference that the roles of the inputs and outputs of Bob are swapped. Let us consider a quantum strategy in which Alice and Bob share a maximally entangled state |φ_max^d⟩. Moreover, Bob's measurements are defined as B_y = |ψ_y⟩⟨ψ_y|, where {|ψ_y⟩}_{y=1}^{d²} is a set of unit vectors forming a SIC (assuming it exists in dimension d), i.e., |⟨ψ_y|ψ_{y'}⟩|² = 1/(d + 1) for all y ≠ y'. In addition, we define Alice's observables as A_x = √((d+1)/d) (B_{x₁} − B_{x₂})^T, where the prefactor ensures normalization. First, since the subsystems of Alice and Bob are maximally mixed and the outcomes a ∈ {1, 2} and b = 1 each correspond to rank-one projectors, the marginal constraints in Eq. 28 are satisfied. Using the fact that for any linear operator O we have O ⊗ 1 |φ_max^d⟩ = 1 ⊗ O^T |φ_max^d⟩, we find that

R_d^SIC = √((d+1)/d) Σ_{x₁<x₂} (1/d) tr[(B_{x₁} − B_{x₂})²] = d(d−1)√(d(d+1))  (30)

This strategy, relying on a maximally entangled state and a SIC, achieves the maximal quantum value of R_d^SIC under the constraints of Eq. 28. In the Supplementary Materials (section S3A), we prove that under these constraints, the tight quantum and no-signaling bounds on R_d^SIC read

R_d^SIC ≤_Q d(d−1)√(d(d+1))  (31)
R_d^SIC ≤_NS d(d²−1)  (32)

We remark that SICs are not known to exist in all Hilbert space dimensions. However, their existence in all dimensions is strongly conjectured, and explicit SICs have been found in all dimensions up to 193 (53–55).
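As a sanity check on the strategy above, Eq. 30 can be verified numerically. The sketch below (our own illustration, assuming NumPy) builds the SIC projectors B_y as the Weyl-Heisenberg orbit of a known fiducial (the standard qubit SIC fiducial for d = 2 and the Hesse fiducial (0, 1, −1)/√2 for d = 3), forms the observables A_x = √((d+1)/d)(B_{x₁} − B_{x₂})^T and the maximally entangled state, and recovers R_d^SIC = d(d−1)√(d(d+1)):

```python
import itertools
import numpy as np

def sic_projectors(d):
    # Weyl-Heisenberg orbit of a known SIC fiducial (d = 2 and d = 3 only).
    if d == 2:
        fid = np.array([np.sqrt((3 + np.sqrt(3)) / 6),
                        np.exp(1j * np.pi / 4) * np.sqrt((3 - np.sqrt(3)) / 6)])
    else:
        fid = np.array([0.0, 1.0, -1.0]) / np.sqrt(2)  # Hesse SIC, d = 3
    X = np.roll(np.eye(d), 1, axis=0)                   # shift operator
    Z = np.diag(np.exp(2j * np.pi * np.arange(d) / d))  # clock operator
    vecs = [np.linalg.matrix_power(X, j) @ np.linalg.matrix_power(Z, k) @ fid
            for j in range(d) for k in range(d)]
    return [np.outer(v, v.conj()) for v in vecs]

def R_sic(d):
    B = sic_projectors(d)
    # SIC overlap condition: |<psi_y|psi_y'>|^2 = 1/(d+1) for all y != y'.
    for P, Q in itertools.combinations(B, 2):
        assert np.isclose(np.trace(P @ Q).real, 1 / (d + 1))
    phi = np.eye(d).reshape(-1) / np.sqrt(d)  # maximally entangled state
    R = 0.0
    for x1, x2 in itertools.combinations(range(d * d), 2):
        Ax = np.sqrt((d + 1) / d) * (B[x1] - B[x2]).T
        R += (phi.conj() @ np.kron(Ax, B[x1] - B[x2]) @ phi).real
    return R

for d in (2, 3):
    assert np.isclose(R_sic(d), d * (d - 1) * np.sqrt(d * (d + 1)))
```

Each pair term contributes ⟨A_x ⊗ (B_{x₁} − B_{x₂})⟩ = √((d+1)/d) tr[(B_{x₁} − B_{x₂})²]/d = 2/√(d(d+1)), and summing over the (d² choose 2) pairs reproduces Eq. 30.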
Bell inequalities for SICs. The marginal constraints in Eq. 28 allowed us to prove that the quantum realization based on SICs achieves the maximal quantum value of R_d^SIC. Our goal now is to remove these constraints to obtain a standard Bell functional. Analogously to the case of MUBs, we add marginal terms to the original functional R_d^SIC. To this end, we introduce penalties for both Alice and Bob. Specifically, if Alice outputs a ∈ {1, 2}, then they lose α_d points, whereas if Bob outputs b = 1, then they lose β_d points. The total number of points in the modified game constitutes our final Bell functional

S_d^SIC ≡ R_d^SIC − α_d Σ_x [p(a = 1 | x) + p(a = 2 | x)] − β_d Σ_y p(b = 1 | y)  (33)

Hence, our aim is to suitably choose the penalties α_d and β_d so that the maximal quantum value of S_d^SIC is achieved with a strategy that closely mimics the marginal constraints (Eq. 28) and thus maintains the optimality of Bob performing a SIC.

Theorem II.6 (Bell inequalities for SICs). The Bell functional S_d^SIC in Eq. 33 with

α_d = (1 − δ_{d,2})/2 · √(d/(d+1)),  β_d = (d − 2)/2 · √(d(d+1))  (34)

obeys the tight local bound

S_d^SIC ≤_LHV 4 for d = 2,  S_d^SIC ≤_LHV d²(d−1) − d(d²−d−1)√(d/(d+1)) for d ≥ 3  (35)

and the quantum bound

S_d^SIC ≤_Q (d + 2δ_{d,2})/2 · √(d(d+1))  (36)

Moreover, the quantum bound is tight and can be saturated by sharing a maximally entangled state of local dimension d and choosing Bob's outcome-one projectors to form a SIC.

Proof. The proof is presented in the Supplementary Materials (section S3B). To obtain the quantum bound in Eq. 36, the key ingredients are the Cauchy-Schwarz inequality and semidefinite relaxations of polynomial optimization problems. To derive the local bound in Eq. 35, the key observation is that the symmetries of the Bell functional allow us to notably simplify the problem. The fact that the quantum bound is saturated by a maximally entangled state and Bob performing a SIC can be seen immediately from the previous discussion that led to Eq. 30. With that strategy, we find R_d^SIC = d(d−1)√(d(d+1)). Since it also respects p(a = 1 | x) + p(a = 2 | x) = 2/d for all x, as well as p(b = 1 | y) = 1/d for all y, a direct insertion into Eq.
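The quantities in Theorem II.6 can be cross-checked numerically. The sketch below (our own consistency check, assuming NumPy; the helper name `bounds` is ours) subtracts the penalty terms of Eq. 34, evaluated at the optimal marginals 2/d and 1/d, from the SIC strategy value of Eq. 30, and confirms that the result lands exactly on the quantum bound of Eq. 36 while the local bound of Eq. 35 stays strictly below it:

```python
import numpy as np

def bounds(d):
    # Penalties of Eq. 34 (the Kronecker delta removes alpha_d at d = 2).
    alpha = 0.0 if d == 2 else 0.5 * np.sqrt(d / (d + 1))
    beta = (d - 2) / 2 * np.sqrt(d * (d + 1))
    # SIC strategy: R from Eq. 30; marginal sums are 2/d per x and 1/d per y,
    # over (d^2 choose 2) inputs x and d^2 inputs y.
    R = d * (d - 1) * np.sqrt(d * (d + 1))
    S = R - alpha * d * (d * d - 1) - beta * d          # value of Eq. 33
    Q = (d + 2 * (d == 2)) / 2 * np.sqrt(d * (d + 1))   # quantum bound, Eq. 36
    L = 4.0 if d == 2 else d * d * (d - 1) - d * (d * d - d - 1) * np.sqrt(d / (d + 1))
    return S, Q, L

for d in range(2, 8):
    S, Q, L = bounds(d)
    assert np.isclose(S, Q) and L < Q   # strategy saturates Q; LHV bound < Q
```

For d = 2 this gives Q = 2√6 ≈ 4.899 against the local bound 4; for d = 3, Q = (3/2)√12 ≈ 5.196 against L ≈ 5.01.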
33 saturates the bound in Eq. 36. Note that in the limit d → ∞, both the local bound and the quantum bound grow quadratically in d. We remark that for the special case of d = 2, no penalties are needed to maintain the optimality of SICs (which is why the Kronecker delta appears in Eq. 34). The derived Bell inequality for a qubit SIC (which corresponds to a tetrahedron configuration on the Bloch sphere) can be compared to the so-called elegant Bell inequality (85), whose maximal violation is also achieved using the tetrahedron configuration. While we require six settings for Alice and four settings for Bob, the elegant Bell inequality requires only four settings for Alice and three settings for Bob. However, the additional complexity of our setup carries an advantage when considering the critical visibility of the shared state, i.e., the smallest value of v in Eq. 24 (defining an isotropic state) for which the Bell inequality is violated. The critical visibility for violating the elegant Bell inequality is 86.6%, whereas for our Bell inequality, it is lowered to 81.6%. We remark that on the Bloch sphere, the antipodal points corresponding to the four measurements of Bob and the six measurements of Alice form a cube and a cuboctahedron, respectively, which constitutes an instance of the type of Bell inequalities proposed in (86). Device-independent certification. Theorem II.6 shows that for any dimension d ≥ 2, we can construct a Bell inequality that is maximally violated by a SIC in that dimension (provided that a SIC exists). Let us now consider the converse question, namely, that of device-independent certification. In analogy with the case of MUBs (Eq. 9), we find a simple description of Bob's measurements. Theorem II.7 (Device-independent certification).
The maximal quantum value of the Bell functional S_d^SIC, provided that the marginal state of Bob is full rank, implies that his measurement operators {B_y}_{y=1}^{d²} are projective and satisfy

Σ_y B_y = d·1  (37)

and

B_y = (d + 1) B_y B_{y'} B_y  (38)

for all y ≠ y'. A complete proof, which is similar in spirit to the proof of Theorem II.2, can be found in the Supplementary Materials (section S3C). For the special case of d = 2, the conclusion can be made even more precise: the maximal quantum violation of S_2^SIC implies that Bob's outcome-one projectors are rank-one projectors acting on a qubit whose Bloch vectors form a regular tetrahedron (up to the three standard equivalences used in self-testing). Similar to the case of MUBs, we face the key question of interpreting the condition in Eq. 38 and its relation to SICs. Again, in analogy with the case of MUBs, we note that the concept of a SIC references the dimension of the Hilbert space, which should not appear explicitly in a device-independent scenario. Hence, we consider an operational approach to SICs, which must rely on observable quantities (i.e., probabilities). This leads us to the following natural definition of a set of projectors being operationally symmetric informationally complete (OP-SIC). Definition II.8 (Operational SIC). We say that a set of projectors {B_a}_{a=1}^{n²} is OP-SIC if

Σ_a B_a = n·1  (39)

and

⟨ψ|B_a|ψ⟩ = 1 ⟹ ⟨ψ|B_b|ψ⟩ = 1/(n + 1)  (40)

for all a ≠ b and all states |ψ⟩. This definition trivially encompasses SICs as special instances of OP-SICs. An argument analogous to the proof of Theorem II.4 shows that this definition is in fact equivalent to the relations given in Eqs. 37 and 38. Hence, in analogy with the case of MUBs, the property of Bob's measurements certified by the maximal violation of our Bell inequality is precisely the notion of OP-SICs. Adding a SIC-POVM. The Bell inequalities proposed above (Bell functional S_d^SIC) are tailored to sets of rank-one projectors forming a SIC.
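The algebraic relations of Eqs. 37 and 38 can be checked directly on an explicit SIC. A minimal sketch (our own illustration, assuming NumPy), using the d = 3 Hesse SIC generated as the Weyl-Heisenberg orbit of the fiducial (0, 1, −1)/√2:

```python
import itertools
import numpy as np

d = 3
fid = np.array([0.0, 1.0, -1.0]) / np.sqrt(2)       # Hesse SIC fiducial
X = np.roll(np.eye(d), 1, axis=0)                   # shift operator
Z = np.diag(np.exp(2j * np.pi * np.arange(d) / d))  # clock operator
B = [np.outer(v, v.conj()) for v in
     (np.linalg.matrix_power(X, j) @ np.linalg.matrix_power(Z, k) @ fid
      for j in range(d) for k in range(d))]

# Eq. 37: the d^2 outcome-one projectors sum to d times the identity.
assert np.allclose(sum(B), d * np.eye(d))
# Eq. 38: B_y = (d+1) B_y B_y' B_y for all y != y'.
assert all(np.allclose(P, (d + 1) * P @ Q @ P)
           for P, Q in itertools.permutations(B, 2))
```

Eq. 38 holds because B_y B_{y'} B_y = |⟨ψ_y|ψ_{y'}⟩|² B_y, and the SIC overlap is exactly 1/(d + 1).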
However, it is also interesting to consider a closely related entity, namely, a SIC-POVM, which is obtained simply by normalizing these projectors so that they can be collectively interpreted as arising from a single measurement. That is, a SIC-POVM on ℂ^d is a measurement {E_a}_{a=1}^{d²} in which every measurement operator can be written as E_a = (1/d)|ψ_a⟩⟨ψ_a|, where the set of rank-one projectors {|ψ_a⟩⟨ψ_a|}_a forms a SIC. Because of this simple relation between SICs and SIC-POVMs, we can extend the Bell inequalities for SICs proposed above such that they are optimally implemented with both a SIC (as before) and a SIC-POVM. It is clear that to make SIC-POVMs relevant to the Bell experiment, it must involve at least one setting that corresponds to a d²-outcome measurement. In the Bell scenario previously considered for SICs (see Fig. 3), no such measurement is present. Therefore, we supplement the original Bell scenario with a single additional measurement setting for Alice, labeled povm, which has d² outcomes labeled by a ∈ [d²]. The modified Bell scenario is illustrated in Fig. 4. We construct the Bell functional T_d^SIC for this scenario by modifying the previously considered Bell functional S_d^SIC:

T_d^SIC = S_d^SIC − Σ_{y=1}^{d²} p(a = y, b = ⊥ | povm, y)  (41)

This modifies the original Bell scenario for SICs (see Fig. 3) by supplying Alice with an extra setting, labeled povm, which has d² possible outcomes. Hence, whenever Bob outputs ⊥ and the outcome associated with the setting povm coincides with the input of Bob, a point is lost. Evidently, the largest quantum value of T_d^SIC is no greater than the largest quantum value of S_d^SIC. For the former to equal the latter, we require (i) that S_d^SIC reaches its maximal quantum value (which is given in Eq. 36) and (ii) that p(a = y, b = ⊥ | povm, y) = 0 for all y. We have already seen that condition (i) can be satisfied by sharing a maximally entangled state with Bob's outcome-one projectors {B_y}_y forming a SIC.
By normalization, Bob's outcome-⊥ projectors are B_y^⊥ = 1 − B_y. Again, noting that for any linear operator O we have O ⊗ 1 |φ_max^d⟩ = 1 ⊗ O^T |φ_max^d⟩, observe that if Bob obtains the outcome ⊥ for the input y, then Alice's local state is orthogonal to B_y^T. Hence, if Alice chooses her POVM {E_a}, corresponding to the setting povm, as the SIC-POVM defined by E_a = (1/d)B_a^T, the probability of finding a = y vanishes. This satisfies condition (ii). Hence, we conclude that in a general quantum model

T_d^SIC ≤_Q (d + 2δ_{d,2})/2 · √(d(d+1))  (42)

and that the bound can be saturated by supplementing the previous optimal realization with a SIC-POVM on Alice's side. The fact that the Bell functionals S_d^SIC and T_d^SIC achieve their maximal quantum values with a SIC and a SIC-POVM, respectively, opens up the possibility of device-independent quantum information protocols for tasks in which SICs and SIC-POVMs are desirable. We focus on one such application, namely, device-independent quantum random number generation (87). This is the task of certifying that the data generated by a party cannot be predicted by a malicious eavesdropper. In the device-independent setting, both the amount of randomness and its security are derived from the violation of a Bell inequality. Nonprojective measurements, such as SIC-POVMs, are useful for this task. The reason is that a Bell experiment implemented with entangled systems of local dimension d and standard projective measurements cannot have more than d outcomes. Consequently, one cannot hope to certify more than log d bits of local randomness. However, a Bell experiment relying on d-dimensional entanglement implemented with (extremal) nonprojective measurements can have up to d² outcomes (88). This opens the possibility of generating up to 2 log d bits of local randomness without increasing the dimension of the shared entangled state. Notably, for the case of d = 2, such optimal quantum random number generation has been demonstrated using a qubit SIC-POVM (42).
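Condition (ii) can likewise be verified numerically. In the sketch below (our own illustration, assuming NumPy; qubit case with the standard Weyl-Heisenberg qubit SIC fiducial), Bob's outcome-⊥ projector 1 − B_y leaves Alice's conditional state orthogonal to B_y^T, so the SIC-POVM element E_y = B_y^T/d never fires on it:

```python
import numpy as np

d = 2
fid = np.array([np.sqrt((3 + np.sqrt(3)) / 6),
                np.exp(1j * np.pi / 4) * np.sqrt((3 - np.sqrt(3)) / 6)])
X = np.roll(np.eye(d), 1, axis=0)                   # shift operator
Z = np.diag(np.exp(2j * np.pi * np.arange(d) / d))  # clock operator
B = [np.outer(v, v.conj()) for v in
     (np.linalg.matrix_power(X, j) @ np.linalg.matrix_power(Z, k) @ fid
      for j in range(d) for k in range(d))]

E = [P.T / d for P in B]                  # SIC-POVM elements E_a = B_a^T / d
assert np.allclose(sum(E), np.eye(d))     # resolution of the identity

phi = np.eye(d).reshape(-1) / np.sqrt(d)  # maximally entangled state
for y in range(d * d):
    # p(a = y, b = perp | povm, y) for the optimal realization: condition (ii)
    p = (phi.conj() @ np.kron(E[y], np.eye(d) - B[y]) @ phi).real
    assert abs(p) < 1e-12
```

The vanishing probability follows from ⟨φ| E_y ⊗ (1 − B_y) |φ⟩ = tr[B_y(1 − B_y)]/d² = 0, since B_y is a projector.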
Here, we use our Bell inequalities for SIC-POVMs to significantly outperform standard protocols relying on projective measurements on d-dimensional entangled states. To this end, we briefly summarize the scenario for randomness generation. Alice and Bob perform many rounds of the Bell experiment illustrated in Fig. 4. Alice attempts to generate local randomness from the outcomes of her setting labeled povm. In most rounds of the Bell experiment, Alice performs povm and records the outcome a. In a smaller number of rounds, she randomly chooses her measurement setting, and the data are used toward estimating the value of the Bell functional T_d^SIC defined in Eq. 41. A malicious eavesdropper may attempt to guess Alice's relevant outcome a. To this end, the eavesdropper may entangle her system with those of Alice and Bob and perform a well-chosen POVM {E_c}_c to enhance her guess. In analogy to Eq. 20, the eavesdropper's guessing probability reads

P_g ≡ sup { Σ_{c=1}^{d²} ⟨ψ_ABE| A_povm^c ⊗ 1 ⊗ E_c |ψ_ABE⟩ }  (43)

where {E_c}_{c=1}^{d²} is the measurement used by the eavesdropper to produce her guess, the expression inside the curly braces is the probability that her outcome is the same as Alice's outcome for the setting povm for a particular realization, and the supremum is taken over all quantum realizations (the tripartite state and the measurements of all three parties) compatible with the observed value of the Bell functional T_d^SIC. We quantify the randomness generated by Alice using the conditional min-entropy H_min(A | povm, E) = −log₂(P_g). To obtain a device-independent lower bound on the randomness, we must evaluate an upper bound on P_g for a given observed value of the Bell functional. We saw in the "Application: Device-independent quantum key distribution" section that if the eavesdropper is only trying to guess the outcome of a single measurement setting, we can, without loss of generality, assume that she is only classically correlated with the systems of Alice and Bob.
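The conversion from guessing probability to certified bits is a one-liner; for instance (a trivial sketch, our own helper name), a guessing probability of 1/4, the qubit SIC-POVM optimum, corresponds to two bits:

```python
import math

def h_min(p_guess):
    # Conditional min-entropy H_min = -log2(P_g), in bits.
    return -math.log2(p_guess)

assert h_min(0.25) == 2.0  # two bits of randomness at P_g = 1/4
```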
As before, we restrict ourselves to the asymptotic limit of many rounds, in which fluctuations due to finite statistics can be neglected. To bound the randomness for a given value of T_d^SIC, we use the hierarchy of quantum correlations (81). We restrict ourselves to the cases of d = 2 and d = 3. For the case of d = 2, we construct a moment matrix with the operators {(1, A_x) ⊗ (1, B_y) ⊗ (1, E_c)} ∪ {A_povm^a ⊗ (1, B_y, E_c)}, neglecting the ⊥ outcome. The matrix is of size 361 × 361 with 10,116 variables. Again, we can make use of symmetry to simplify the semidefinite program. In this case, the following permutation leaves the problem invariant: x₁ → π(x₁), x₂ → π(x₂), a → f(a, x₁, x₂), y → π(y), and c → π(c), where

f(a, x₁, x₂) = { a if π(x₁) < π(x₂); 2 if π(x₁) > π(x₂) and a = 1; 1 if π(x₁) > π(x₂) and a = 2; ⊥ if π(x₁) > π(x₂) and a = ⊥ }  (44)

and π ∈ S₄. Using this symmetry reduces the number of free variables to 477. The trade-off between the amount of certified randomness and the nonlocality is illustrated in Fig. 5. We find that for sufficiently large values of T₂^SIC (roughly T₂^SIC ≳ 4.8718), we outperform the one-bit limitation associated with projective measurements on entangled qubits. Notably, for even larger values of T₂^SIC, we also outperform the restriction of log 3 bits associated with projective measurements on entangled systems of local dimension three. For the optimal value of T₂^SIC, we find H_min(A | povm, E) ≈ 1.999, which is compatible up to numerical precision with the largest possible amount of randomness obtainable from qubit systems under general measurements, namely, two bits. This two-bit limit stems from the fact that every qubit measurement with more than four outcomes can be stochastically simulated with measurements of at most four outcomes (88). For the case of d = 3, we bound the guessing probability following the method of (87). This has the advantage of requiring only a bipartite, and hence smaller, moment matrix than the tripartite formulation.
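The outcome relabeling of Eq. 44 can be sketched as follows (our own illustration, standard library only; we encode ⊥ as the string 'perp'). Each permutation π ∈ S₄ of Bob's inputs reorders each pair (x₁, x₂), and f swaps Alice's outcomes 1 and 2 exactly when the order of the pair is reversed:

```python
import itertools

def f(a, x1, x2, perm):
    # Eq. 44: swap outcomes 1 <-> 2 when the permuted pair changes order.
    if a == 'perp' or perm[x1] < perm[x2]:
        return a
    return {1: 2, 2: 1}[a]

# Every permutation in S_4 acts as a bijection on {1, 2, perp} for each pair,
# so the relabeling is a valid symmetry of the outcome set.
for perm in itertools.permutations(range(4)):
    for x1, x2 in itertools.combinations(range(4), 2):
        assert {f(a, x1, x2, perm) for a in (1, 2, 'perp')} == {1, 2, 'perp'}
```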
However, the amount of symmetry leaving the problem invariant is reduced because the objective function involves only one outcome. Concretely, we construct a moment matrix of size 820 × 820 with 263,549 variables. We then write the guessing probability as P(a = 1 | povm) and identify the following group of permutations leaving the problem invariant: x₁ → π(x₁), x₂ → π(x₂), a → f(a, x₁, x₂), and y → π(y), where π ∈ S₉ leaves element 1 invariant and permutes elements 2, …, 9 in all possible ways. Taking this symmetry into account reduces the number of free variables to 460. To further simplify the problem, we make use of RepLAB, a recently developed tool that decomposes representations of finite groups into irreducible representations (89, 90). This allows us to write the moment matrix in a preferred basis in which it is block diagonal. The semidefinite constraint can then be imposed on each block independently, with the largest block of size 28 × 28 instead of 820 × 820. Solving one semidefinite program with SeDuMi (84) then takes 0.7 s with <0.1 gigabytes of memory, instead of 162 s and 0.2 gigabytes without block diagonalization; without any symmetrization, the computation fails because of lack of memory (>400 gigabytes required). Using entangled states of dimension 3 and corresponding SIC-POVMs, one can attain the full range of values of T₃^SIC. The guessing probability is independent of the outcome guessed by the eavesdropper, and we can verify that the bound we obtain is convex, hence guaranteeing that no mixture of strategies by the eavesdropper needs to be considered (87). The randomness is shown in Fig. 6, which indicates that by increasing the value of T₃^SIC, we can obtain more randomness than the best possible schemes relying on standard projective measurements and entangled systems of dimensions 3, 4, 5, 6, and 7. In particular, when T₃^SIC is maximal, we find that H_min(A | povm, E) ≈ 3.03 bits.
This is larger than what can be obtained by performing projective measurements on eight-dimensional systems (since log 8 = 3 bits). It is, however, worth noting that this last value is obtained at the boundary of the set of quantum correlations, where the precision of the solver is significantly reduced (in particular, the DIMACS errors at this point are of the order of 10⁻⁴). It is not straightforward to estimate the extent to which this reduced precision may influence the guessing probability, so it would be interesting to reproduce this computation with a more precise solver such as SDPA (91).

Acknowledgments: We would like to thank T. de Lima Silva and N. Gisin for fruitful discussions. We thank M. Araújo for helpful comments. Funding: This work was supported by the Swiss National Science Foundation (starting grant DIAQ, NCCR QSIT). A.T. acknowledges support from the Swiss National Science Foundation (Early PostDoc Mobility fellowship P2GEP2 194800). The project "Robust certification of quantum devices" is carried out within the HOMING programme of the Foundation for Polish Science cofinanced by the European Union under the European Regional Development Fund. M.F. acknowledges support from the Polish NCN grant Sonata UMO-2014/14/E/ST2/00020, the European Research Council (ERC) under the European Union's Horizon 2020 research and innovation programme ERC AdG CERQUTE (grant agreement no. 834266), the State Research Agency (AEI) TRANQI (PID2019-106888GB-I00/10.13039/501100011033), the Government of Spain (FIS2020-TRANQI; Severo Ochoa CEX2019-000910-S), Fundació Cellex, Fundació Mir-Puig, and Generalitat de Catalunya (CERCA, AGAUR). Author contributions: A.T. and J.K. proposed the basic concept. A.T., M.F., J.-D.B., and J.K. developed the theory and the proofs. D.R. developed software that was used to facilitate particular computations. A.T., M.F., J.-D.B., and J.K. discussed the results and participated in the writing of the manuscript.
Competing interests: The authors declare that they have no competing interests. Data and materials availability: All data needed to evaluate the conclusions in the paper are present in the paper and/or the Supplementary Materials. Additional data related to this paper may be requested from the authors.
Comments Off on Mutually unbiased bases and symmetric informationally complete measurements in Bell experiments – Science Advances
Yale Quantum Institute Co-sponsored Event – Alternative Realities for the Living – Quantum Physics & Fiction – Yale News
Friday February 12 at 4 pm Zoom Webinar
Join us for the 11th talk of the Yale Quantum Institute's series of nontechnical talks, which aims to bring a new regard to quantum physics and STEM by having experts cast new light on often-overlooked aspects of scientific work.
The acclaimed Nigerian poet and novelist Ben Okri, one of the foremost postmodern authors, is joining us to talk about his newest book, Prayer for the Living, which includes the quantum physics murder mystery "Alternative Realities Are True." During this event, Ben will read this short story, share how quantum physics came to muddy the waters of this British police investigation, and answer the audience's questions about his extensive body of work.
This talk, co-sponsored by The Franke Program in Science and the Humanities, is open to all and will be accessible to students, researchers, the wider university public, and the New Haven community.
Order Ben Okri's newest book, Prayer for the Living, on Bookshop.org, Akashic, or Amazon.
Register here: https://yale.zoom.us/webinar/register/8816106641445/WN_1LIwOVD3SrKsHrThae3caA