The Prometheus League
Breaking News and Updates
Category Archives: Quantum Physics
Quantum Computing In Finance Where We Stand And Where We Could Go – Science 2.0
Posted: May 14, 2021 at 6:23 am
Quantum computers (QCs) operate totally differently than classical computers. Due to the quantum effects known as superposition and entanglement, quantum bits (called qubits) can take on non-binary states represented by complex numbers. This facilitates computational solutions to mathematical problems that cannot be solved by classical computers because they require sequentially computing an astronomical number of combinations or permutations.
This ability of QCs means that they particularly excel at optimization problems, where the optimal combination is only found after trying out an enormous number of possible combinations. Several important problems in finance are in essence optimization problems which meet this description. The portfolio-optimization problem in finance is one good example of such a problem. Asset pricing, credit scoring, and Monte Carlo-type risk analysis are other examples. For example, it is estimated that running a risk assessment of a large portfolio, which needs to be done overnight or can even take days with classical computers, could one day be done in real time by a full-scale QC. That explains the keen interest of the finance industry in quantum solutions.
The calculating power of a QC grows exponentially with the number of qubits. Quantum-computing roadmaps cite the number of qubits or competing metrics to indicate the rising power of these machines, with some setting thresholds for so-called quantum supremacy, the point at which QCs will surpass classical supercomputers. But there are still enormous technical challenges to solve before at-scale QCs can be commercialized, most notably the challenges of stability and error correction.
However, quantum-inspired software, which is software running on classical computers but based on novel algorithms that reframe mathematical problems in terms of quantum principles, is already here. Several quantum-inspired solutions are currently focused on portfolio-optimization problems, and seem well positioned for near-future adoption by the financial asset-management industry.
Even the limited-size, noisy QCs currently available lend themselves to portfolio-optimization solutions. Early proofs-of-concept (POCs) of hybrid or full quantum solutions to asset-portfolio optimizations such as stock selection have already been demonstrated with encouraging results. Many of the largest names in finance are already investing in quantum, or at least partnering with technology providers to explore finance applications. Financial services companies that wait too long to gain experience in the field run the risk of getting left behind.
Quantum computing exploits quantum mechanics, the properties and behavior of fundamental particles at the subatomic level, as predicted by our best current understanding of quantum physics. The goal of quantum computing is to build hardware and develop suitable algorithms that process information in ways that are superior to so-called classical computers, i.e. the ubiquitous digital computers that the Information Age was built on.
The essential elements of a QC were postulated in the early 1980s, but of late work in this area has accelerated, with several large established companies and start-ups building quantum-computing hardware. An even larger ecosystem of software platforms and solution providers exists around the hardware providers. Collaboration models such as alliances and partnerships are common. Many universities are involved, while governments are also supporting quantum-computing research.
Typical of a new industry, standards and metrics are still in flux, and competing architectures, which leverage different mechanisms and implementations of quantum principles, vie for technical supremacy and investment dollars. Announcements of new breakthroughs are made almost daily, which makes it important to distinguish the hype from real progress.
This paper attempts to demystify the technology by explaining the basic principles of quantum computing and the competing technologies vying for quantum supremacy. An overview of the current quantum-computing industry and the main players is provided, as well as a look at the first applications and the different industries that could benefit. The focus then turns to the finance industry, with an overview of the most important computational problems in finance that lend themselves to quantum computing, with a deeper dive into portfolio optimization. Notable recent case studies and their participants are reviewed. The paper concludes with an assessment of the current state of quantum computing and the business impact that can be expected in the short and medium term.
What we now call classical (or conventional) digital computers perform all their calculations in an aggregate of individual bits that are either 0 or 1 in value, because they are implemented by transistors that are each either switched completely on or off. This is called binary logic, which is the essence of any digital computer, and implemented in a longstanding computer-science paradigm originating with Turing and Von Neumann. Conventional computers operate by switching billions of little transistors on and off, with all state changes governed by the computer's clock cycle. With n transistors, there are 2^n possible states for the computer to be in at any given time. Importantly, the computer can only be in one of these states at a time. Digital computers are highly complex, with typical computer chips holding 2.0 x 10^19 bits, yet incredibly reliable at the semiconductor level, with fewer than one error in 10^24 operations. (Software- and mechanical-related errors are far more common in computers.)
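To make the 2^n state count concrete, here is a minimal Python sketch (the function name is illustrative, not from the article) that enumerates every configuration an n-bit register can occupy; a classical machine holds exactly one of these at any instant:

```python
from itertools import product

def classical_states(n):
    """All 2**n configurations an n-bit register can occupy."""
    return list(product((0, 1), repeat=n))

states = classical_states(3)
print(len(states))   # 2**3 = 8 configurations
print(states[0])     # but the register holds exactly one at a time
```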
Analog computers precede digital computers. In contrast to digital computers, classical analog computers perform calculations with electrical parameters (voltage or current) that take a full range of values along a continuous linear scale. Analog computers do not necessarily need to be electrical (they can be mechanical too, such as the first ones built by the ancient Greeks), but the most sophisticated ones from the 20th century were electrical. Unlike digital computers, analog computers do not need a clock cycle, and all values change continuously. Before the digital revolution was enabled through the mass integration of transistors on chips, analog computers were used in several applications, for example, to calculate flight trajectories or in early autopilot systems. But since the 1960s analog computers have largely fallen into disuse due to the dominance of digital computers over the last few decades.
Both classical digital and analog computers are at their core electrical devices, in the sense that they perform logic operations that are reflected by the electrical state of devices, typically semiconductor devices such as transistors (or vacuum tubes for mid-20th century analog computers), which comes about because of voltage differences and current flow. Current flow is physically manifested in terms of the flow of electrons in an electrical circuit.
Quantum computers (QCs), on the other hand, directly exploit the strange and counterintuitive behavior of sub-atomic particles (electrons, nuclei or photons), as predicted by quantum theory, to implement a new type of mathematics. In a QC, quantum bits called qubits can be measured as |0> or |1>, which are the quantum equivalents of the binary 0 and 1 in classical computers. However, due to a quantum property called superposition, qubits can be non-binary in a superposition state and interact with one another in that state during processing. It is this special property that allows QCs to theoretically offer exponentially more processing power than classical computers in some applications. Once the processing is complete, the result can only be measured in the binary states, |0> or |1>, because superpositioning is always collapsed by the measurement process.
Because of another curious quantum property called entanglement, the behavior of two or more quantum objects is correlated even if they are physically separated. According to the laws of quantum mechanics, this pattern is consistent whether a millimeter, a kilometer or an astronomical distance separates them. While one qubit is situated in a superposition between two basis states, 10 qubits utilizing entanglement could be in a superposition of 1,024 basis states.
Unlike the linearity of classical computers, the calculating power of a QC grows exponentially with the number of qubits. It is this ability that gives QCs the extraordinary power of processing a huge number of possible outcomes simultaneously. When in the unobserved state of superposition, n qubits can contain the same amount of information as 2^n classical bits. So, four qubits are equivalent to 16 classical bits, which might not sound like a big improvement. But 16 qubits are equivalent to 65,536 classical bits, and 300 qubits can contain more states than all the atoms estimated to be in the universe. That is not only an astronomical number; it is beyond astronomical. This exponential effect is why there is so much hope for the future of quantum computing. With single- or double-digit numbers of qubits, the advantage over classical computing is not immediately clear, but the power of quantum computing scales exponentially beyond that in ways that are truly hard to imagine. This explains why there is so much anticipation about the technology exploding once a certain number of qubits have been reached in a reliable QC.
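The scaling is easy to verify with Python's arbitrary-precision integers. The ~10^80 figure used below is a commonly cited estimate for the number of atoms in the observable universe, not a number from this article:

```python
def basis_states(n_qubits):
    """Number of basis states n qubits can span in superposition: 2**n."""
    return 2 ** n_qubits

assert basis_states(4) == 16
assert basis_states(16) == 65_536
# ~10**80 is a commonly cited estimate of atoms in the observable universe
assert basis_states(300) > 10 ** 80
print(basis_states(300))
```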
However, to reliably encode information and expect it to be returned upon measurement, there are only two acceptable states for a qubit: 0 and 1. This means a qubit can only store 1 bit of information at a time. Even with many qubits, the scaling of information storage doesn't improve beyond what you'd get classically: ten qubits can store 10 bits of information and one thousand qubits can store 1,000 bits. Because a qubit can only be measured in one of these two states, qubits cannot store any more data than conventional computer bits. There is thus no quantum advantage in data storage. The advantage is in information processing, and that advantage comes from the special quantum property of a qubit that it can occupy a superposition of states when not being measured.
Another point to keep in mind is that due to the probabilistic waveform properties of qubits, QCs do not typically deliver one answer, but rather a narrow range of possible answers. Multiple runs of the same calculation can further narrow the range, but at the expense of lessening speed gains.
Classical computers will not be replaced by QCs. A primary reason for this is that QCs cannot run the if/then/else logic functions that are a cornerstone of the classical Von Neumann computer architecture. Instead, QCs will be used alongside classical computers to solve those problems that they are particularly good at, such as optimization problems.
The strengths of QCs in simultaneous calculations mean that they excel at finding optimal solutions to problems with a large number of variables, where the optimal combination is only found after trying out an enormous number of possible combinations or permutations. Such problems are found, for example, in optimizing any portfolio composition, or trying out millions of possible new molecular combinations for drugs, or in routing many aircraft between many hubs. In such problems there are typically 2^n possibilities, and they all have to be tried out to find an optimal solution. If there are 100 elements to combine, it becomes a 2^100 computation, which is almost impossible to solve with a classical computer, but a 100-qubit computer could solve it in one operation.
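A toy portfolio-selection sketch (asset names and numbers invented for illustration) shows why exhaustive search blows up: every one of the 2^n asset subsets must be scored.

```python
from itertools import combinations

# Invented toy data: asset -> (expected return, risk units)
assets = {"A": (0.08, 3), "B": (0.12, 7), "C": (0.05, 2), "D": (0.10, 5)}
risk_budget = 9

best_ret, best_combo = -1.0, ()
# Exhaustive search scores all 2**n subsets -- feasible for 4 assets,
# hopeless for 100 (2**100 candidates)
for r in range(len(assets) + 1):
    for combo in combinations(assets, r):
        ret = sum(assets[a][0] for a in combo)
        risk = sum(assets[a][1] for a in combo)
        if risk <= risk_budget and ret > best_ret:
            best_ret, best_combo = ret, combo

print(sorted(best_combo), round(best_ret, 2))  # best subset within the risk budget
```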
Quite a few hard problems in finance are in essence optimization problems and therefore meet the description of problems that can be solved by QCs. The portfolio-optimization problem in finance is one good example of such a problem. Asset pricing, credit scoring, and Monte Carlo-type risk analysis are other examples. That explains the keen interest of the finance industry in quantum solutions. The finance industry is also well positioned to be an early adopter, because financial algorithms are much quicker to deploy than algorithms that drive industrial or other physical processes.
A QC architecture can be seen as a stack with the following typical layers:
- At the bottom is the actual quantum hardware (usually held at near-absolute zero temperatures to minimize thermal noise, and/or in a vacuum)
- The next level up comprises the control systems that regulate the quantum hardware and enable the calculation
- Above those comes the software layer that implements the algorithms (and in future, also will do the error correction). It includes a quantum-classical interface that compiles source code into executable programs
- The top of the stack comprises the wider variety of services to utilize the QC, e.g. the operating systems and software platforms that help translate real-life problems into a format suitable for quantum computing
There are many different ways to physically realize qubits, from using trapped calcium ions to superconducting structures. In each case, quantum states are being manipulated to perform calculations. Quantum computers can entangle qubits by passing them through quantum logic gates. For example, a CNOT (conditional NOT) gate flips, or doesn't flip, a qubit based on the state of another qubit. Stringing multiple quantum logic gates together creates a quantum circuit.
The designers of QCs need to master and control both superposition and entanglement:
Without superposition, qubits would behave like classical bits, and would not be in the multiple states that allow quantum programmers to run the equivalent of many calculations at once. Without entanglement, the qubits would sit in superposition without generating additional insight by interacting. No calculation would take place because the state of each qubit would remain independent from the others. The key to creating business value from qubits is to manage superposition and entanglement effectively.[i]
The simplest and most typical physical property that can serve as a qubit is the electron's internal angular momentum, spin for short. It has the quantum property of having only two possible projections on any coordinate axis, +1/2 or -1/2, in units of the Planck constant. For any chosen axis the two basic quantum states of the electron's spin can be denoted as ↑ (up) or ↓ (down). But these are not the only states possible for a quantum bit, because the spin state of an electron is described by a quantum-mechanical wave function. That function includes two complex numbers, called quantum amplitudes, α and β, each with its own magnitude. The rules of quantum mechanics dictate that |α|^2 + |β|^2 = 1. Both α and β have real and imaginary parts. The squared magnitudes |α|^2 and |β|^2 correspond to the probabilities of the spin of the electron being in the basic states ↑ or ↓ when it is measured. Since those are the only two outcomes possible, their squared magnitudes must sum to 1. In contrast to a classical bit, which can only be in one of its two binary states, a qubit can be in any of a continuum of possible states, as defined by the quantum amplitudes α and β. In the popular press this is often explained by the oversimplified, and somewhat mystical, statement that a qubit can exist simultaneously in both its ↑ and ↓ states. That is analogous to saying that a plane flying northwest is simultaneously flying both west and north, which is not incorrect strictly speaking, but not a particularly helpful mental model either.
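The normalization rule for the two amplitudes can be checked numerically. The sketch below uses the standard Bloch-sphere parametrization, a textbook convention rather than something defined in this article:

```python
import cmath
import math

def make_qubit(theta, phi):
    """Qubit amplitudes from Bloch-sphere angles:
    alpha = cos(theta/2), beta = e^(i*phi) * sin(theta/2)."""
    alpha = complex(math.cos(theta / 2))
    beta = cmath.exp(1j * phi) * math.sin(theta / 2)
    return alpha, beta

alpha, beta = make_qubit(theta=math.pi / 3, phi=math.pi / 4)
p_up, p_down = abs(alpha) ** 2, abs(beta) ** 2   # measurement probabilities
assert abs(p_up + p_down - 1.0) < 1e-12          # |alpha|^2 + |beta|^2 = 1
print(round(p_up, 2), round(p_down, 2))          # 0.75 0.25
```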
During computation, qubits can interact with one another while in their superposition state. For example, a set of 6 qubits can occupy any linear combination of all the 2^6 = 64 different length-6 bit strings. With 64 continuous variables describing this state, the space of configurations available to a QC during a calculation is much greater than a classical one. The measurement limitations on storing information do not apply during the runtime execution of a quantum algorithm: during processing, every qubit in a quantum algorithm can occupy a superposition. Thus, in a superposition state, every possible bit string (in this example, 2^6 = 64 different strings) can be combined. Each bit string in the superposition has an independent complex-number coefficient with a magnitude (A) and a phase (φ):
c_i = A_i · e^(iφ_i)
A modern digital computer, with billions of transistors in its processors, typically has 64-bit words, not 6 as in our quantum example above. This allows it to consider 64 bits at once, which allows for 2^64 states. While 2^64 is a large number, equal to approximately 2 x 10^19, quantum computing can offer much more. The space of continuous states of QCs is much larger than the space of classical bit states. That is because of the possibility of many particles interacting at the quantum level to form a common wave function, allowing changes in one particle to affect all others instantaneously and in a well-ordered manner. That is akin to massive parallel computing, which can beat classical multicore systems.
Quantum-computing operations can mostly be handled according to the standard rules of linear algebra, in particular matrix multiplication. The quantum state is represented by a state vector written in matrix form, and the gates in the quantum circuit (whereby the calculations are executed) are represented as matrices too. Multiplying a state vector by a gate matrix yields another state vector. Recent progress has been made to use quantum algorithms to crack non-linear equations, by using techniques that disguise non-linear systems as linear ones.[ii]
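As a concrete instance of "gate matrix times state vector", the NumPy sketch below (NumPy is assumed available; it is not mentioned in the article) applies a Hadamard gate and then a CNOT to build an entangled two-qubit Bell state:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
CNOT = np.array([[1, 0, 0, 0],                 # conditional NOT gate
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

ket0 = np.array([1.0, 0.0])        # |0> as a state vector
plus = H @ ket0                    # H|0> = (|0> + |1>)/sqrt(2)

state = np.kron(plus, ket0)        # two-qubit state (|00> + |10>)/sqrt(2)
bell = CNOT @ state                # (|00> + |11>)/sqrt(2): entangled

print(np.round(bell, 3))
```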
The possibility of quantum computing was raised by Caltech physicist Richard Feynman in 1981. The person considered by most to be the founder of quantum computing, David Deutsch, first defined a QC in a seminal paper in 1985.[iii]
In 1994, a Bell Labs mathematician, Peter Shor, developed a quantum-computing algorithm that can efficiently decompose any integer number into its prime factors.[iv] It has since become known as the Shor algorithm and has great significance for quantum computing. Shor's algorithm was a purely theoretical exercise at the time, but it anticipated that a hypothetical QC could one day solve NP-hard problems of the type used as the basis for modern cryptography. Shor's algorithm relies on the special properties of a quantum machine. While the most efficient classical factoring algorithm, known as the general number field sieve, uses an exponential function of a constant x d^(1/3) to factor an integer with d digits, Shor's algorithm can do that by executing a runtime function that is only a polynomial function, namely a constant x d^3. Accordingly, classical computers are limited to factoring integers with only a few hundred digits, which is why using integers in the thousands in cryptography keys is considered to make for practically unbreakable codes. But a QC using the Kitaev version of Shor's algorithm only needs 10d qubits, and will have a runtime roughly equal to d^3.[v]
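The exponential-versus-polynomial gap can be sketched numerically. The constants and logarithmic factors of the real number field sieve are omitted here, so the crossover point is purely illustrative:

```python
import math

def classical_growth(d):
    """Illustrative stand-in for exp(c * d**(1/3)), with c = 1."""
    return math.exp(d ** (1 / 3))

def quantum_growth(d):
    """Illustrative stand-in for Shor's polynomial runtime, c * d**3 with c = 1."""
    return d ** 3

# For small d the polynomial can exceed the exponential...
assert classical_growth(1_000) < quantum_growth(1_000)
# ...but an exponential in d**(1/3) eventually dominates any polynomial in d
assert classical_growth(100_000) > quantum_growth(100_000)
```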
In summary, the Shor algorithm means that a QC can solve an NP-hard mathematical problem in polynomial time that classical computers can only solve in exponential time. Therefore, Shor's algorithm can demonstrate by how much quantum computing can improve processing time over classical computing. While a full-scale QC with the thousands of qubits needed to employ Shor's algorithm in practice to crack codes is not yet available, many players are working towards machines of that size.
Another important early QC algorithm is Grover's algorithm, a search algorithm which finds a particular register in an unordered database. This problem can be visualized as a phonebook with N names arranged in completely random order. In order to find someone's phone number with a probability of 1/2, any classical algorithm (whether deterministic or probabilistic) will need to look at a minimum of N/2 names. But the quantum algorithm needs only on the order of √N steps.[vi] This algorithm can also be adapted for optimization problems.
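The classical-versus-Grover lookup counts for the phonebook example are easy to tabulate. The √N figure is an order-of-magnitude count; the optimal Grover iteration number is closer to (π/4)√N:

```python
import math

N = 1_000_000                 # phonebook entries in random order
classical_lookups = N // 2    # lookups needed for 1/2 success probability
grover_steps = math.isqrt(N)  # order-sqrt(N) quantum steps

print(classical_lookups, grover_steps)   # 500000 1000
```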
Most quantum calculations are performed in what is called a quantum circuit. The quantum circuit is a series of quantum gates that operate on a system of qubits. Each quantum gate has inputs and outputs and operates akin to the hardware logic gates in classical digital computers. Like digital logic gates, the quantum gates are connected sequentially to implement quantum algorithms.
Quantum algorithms are algorithms that run on QCs, and which are structured to use the unique properties of quantum mechanics, such as superposition or quantum entanglement, to solve particular problem statements. Major quantum algorithms include the quantum evolutionary algorithm (QEA), the quantum particle swarm optimization algorithm (QPSO), the quantum annealing algorithm (QAA), the quantum neural network (QNN), the quantum Bayesian network (QBN), the quantum wavelet transform (QWT), and the quantum clustering algorithm (QC).[vii] A comprehensive catalog of quantum algorithms can be found online in the Quantum Algorithm Zoo.[viii]
Quantum software is the umbrella term used to describe the full collection of QC instructions, from hardware-related code, to compilers, to circuits, all algorithms and workflow software.
Quantum annealing is an alternative model to circuit-based algorithms, as it is not built up out of gates. Quantum annealing naturally returns low-energy solutions by utilizing a fundamental law of physics that any system will tend to seek its minimum state. In the case of optimization problems, quantum annealing uses quantum physics to find the minimum energy state of the problem, which equates to the optimal or near-optimal combination of its constituent elements.[ix]
An Ising machine is a non-circuit alternative that works for optimization problems specifically. In the Ising model, the energy from interactions between the spins of every pair of electrons in a collection of atoms is summed. Since the amount of energy depends on whether spins are aligned or not, the total energy of the collection depends on the direction in which each spin in the system points. The general Ising optimization problem is determining in which state the spins should be so that the total energy of the system is minimized. To use the Ising model for optimization requires mapping parameters of the original optimization problem, such as an optimal route for the Traveling Salesman, into a representative set of spins, and to define how the spins influence one another.[x]
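A minimal Ising sketch (the coupling values are invented for illustration) makes the mapping concrete: the energy is a pairwise sum over spins, and the brute-force minimization over all 2^n configurations shown here is exactly what annealers try to avoid:

```python
from itertools import product

# Invented ferromagnetic couplings: J[i, j] > 0 favors aligned spins
J = {(0, 1): 1.0, (1, 2): 1.0, (0, 2): 0.5}

def energy(spins):
    """Ising energy: E(s) = -sum over pairs (i, j) of J_ij * s_i * s_j."""
    return -sum(j * spins[a] * spins[b] for (a, b), j in J.items())

# Exhaustive minimization over all 2**3 = 8 spin configurations
best = min(product((-1, +1), repeat=3), key=energy)
print(best, energy(best))   # all spins aligned minimizes the energy
```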
Hybrid computing typically entails transferring the problem (say optimization) into a quantum algorithm, of which the first iteration is run on a QC. This provides a very fast answer, but only a rough assessment of the valid total solution space. The refined answer is then found with a powerful classical computer, which only has to examine a subset of the original solution space.[xi]
The Achilles heel of the QC is the loss of coherence, or decoherence, caused by mechanical (vibration), thermal (temperature fluctuations), or electromagnetic disturbance of the subatomic particles used as qubits. Until the technology improves, various workarounds are needed. Commonly, algorithms are designed to reduce the number of gates in an attempt to finish execution before decoherence and other sources of errors can corrupt the results.[xii] This often entails a hybrid computing scheme which moves as much work as possible from the QC to classical computers.
Current guesstimates by experts are that truly useful QCs would need between 1,000 and 100,000 qubits. However, quantum-computing skeptics such as Mikhail Dyakonov, a noted quantum physicist, point out that the enormous number of continuous parameters that would describe the state of a useful QC might also be its Achilles heel. Taking the low end, a 1,000-qubit machine would imply a QC with 2^1,000 parameters describing its state at any moment. That is roughly 10^300, a number greater than the number of subatomic particles in the universe: A useful QC needs to process a set of continuous parameters that is larger than the number of subatomic particles in the observable universe.[xiii] How would error control be done for 10^300 continuous parameters? According to quantum-computing theorists, the threshold theorem proves that it can be done. Their argument is that once the error per qubit per quantum gate is below a certain threshold value, indefinitely long quantum computation becomes possible, at a cost of substantially increasing the number of qubits needed. The extra qubits are needed to handle errors by forming logical qubits using multiple physical qubits. (This is a bit like error correction in current telecom systems, which use extra bits to validate data.) But that greatly increases the number of physical qubits to handle, which, as we have seen, are already more than astronomical. At the very least, this brings into perspective the magnitude of the technological problems that scientists and engineers will have to overcome.
To put the comparative size of the QC error-correction problem in practical terms: for a typical 3-Volt CMOS logic circuit used in classical digital computers, a binary 0 would be any voltage measured between 0V and 1V, while a binary 1 would be any voltage measured between 2V and 3V. Thus when e.g. 0.5V of noise is added to the signal for binary 0, the measurement would be 0.5V, which would still correctly indicate a binary value of 0. For this reason, digital computers are very robust to noise. However, for a typical qubit, the difference in energy between a zero and a one is just 10^-24 Joules, one ten-trillionth as much energy as an X-ray photon. Error correction is one of the biggest hurdles to overcome in quantum computing, the concern being that it will impose such a huge overhead, in terms of auxiliary calculations, that it will make it very hard to scale QCs.
After Dyakonov published the skeptics' viewpoint two years ago, a vigorous debate followed.[xiv] A typical response to the skeptics' case comes from an industry insider, Richard Versluis, systems architect at QuTech, a Dutch QC collaboration. Versluis acknowledges the engineering challenges to control a QC and to make sure its state is not affected. However, he states that the challenge is to make sure that the control signals and qubits perform as desired. Major sources of potential errors are quantum rotations that are not perfectly accurate, and decoherence as qubits lose their entanglement and the information they contain. Versluis goes on to define a five-layered QC architecture that he believes will be up to the task. From top to bottom, the layers are 1. Application layer, 2. Classical processing, 3. Digital processing, 4. Analog processing, and 5. Quantum processing. Together the digital-, analog-, and quantum-processing layers comprise the quantum processing unit (QPU). But Versluis also has to acknowledge that quantum error correction could solve the fundamental problem of decoherence only at the expense of 100 to 10,000 error-correcting physical qubits per logical (calculating) qubit. Furthermore, each of these millions of qubits will need to be controlled by continuous analog signals. And the biggest challenge of all is doing the thousands of measurements per second in a way that they do not disturb quantum information (which must remain unknown until the end of the calculation), while catching and correcting errors. The current paradigm of measuring all qubits with analog signals will not scale up to larger machines, and a major advance in the technology will be required.[xv]
Most experts agree that we will have to live with QCs over the next few years that will have high levels of errors that go uncorrected. There is even an accepted industry term and acronym for such QCs: NISQ (Noisy Intermediate-Scale Quantum) devices. The NISQ era is expected to last for at least the next five years, barring any major breakthroughs that might shorten that timeline.
Once critical technical breakthroughs are made, QC adoptionmay happen faster than expected due to the prevalence of cloud computing.Making QC services easily accessible over the cloud speeds both adoption andlearning. It has the added advantage that it forces hardware makers to focus onbuilding QCs with a high percentage of uptime, so as to ensure continuedavailability over the cloud.
Most QC makers already offer cloud access to their latest QCs. There are programming environments, i.e. software development kits (SDKs) that facilitate the building of quantum circuits, available over the cloud for QC programmers to learn how to write the software that unleashes the magic of quantum computing, and to experiment with it. As more functionality is added to the hardware, these SDKs are continually updated.
The implication is that a whole ecosystem is being brought up to speed on how to make the best use of a quantum capability that does not quite exist yet. An analogy would be having had flight simulators to train future pilots while the Wright brothers were still figuring out how to keep their plane in the air for more than a few hundred feet. The upside of this approach is that any real advances in making reliable QCs with capabilities superior to classical computers will be very quickly exploited by real-world applications. This situation is in contrast to most major technological breakthroughs we have seen in the past. For example, it took a generation or two for industrial engineers to learn how to properly use electrical power in the place of steam power in factories. More recently, it took a generation to fully exploit the capabilities of digital computing in business and elsewhere. But in the case of quantum computing, all the knowledge building in anticipation of a successful QC could be rapidly translated into applications by a corps of developers who are all trained up and ready to fly the plane once it is finally built. That is the optimistic perspective.
Quantum circuits are already being developed using quantum programming languages and so-called quantum development kits (QDKs), such as IBM's Qiskit and Google's Cirq, based on Python, and Microsoft's Q#, based on the C# language. The next step is to develop libraries and workflows for different application domains. Examples of the former are IBM's Aqua and the Q# libraries. Examples of the latter are D-Wave's Ocean development tool kit, for hybrid quantum-classical applications and for translating quantum optimization problems into quantum circuits, and Zapata's Orquestra, for composing, running, and analyzing quantum workflows. On top of the circuits and libraries come the domain-specific application platforms. Orchestrating and integrating classical and quantum workflows to solve real problems with hybrid quantum-classical algorithms is the name of the game for the next few years.[xvi]
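To give a flavor of what these SDKs express, the sketch below simulates by hand the canonical two-qubit circuit they all use as a first example: a Hadamard gate followed by a CNOT, producing an entangled Bell state. This is a plain linear-algebra illustration, not the API of Qiskit, Cirq, or Q#:

```python
import numpy as np

# State-vector simulation of H on qubit 0 followed by CNOT(0 -> 1),
# yielding the Bell state (|00> + |11>)/sqrt(2). Illustrative only;
# real SDKs add transpilation, noise models, and hardware backends.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I2 = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])                # control = first qubit

state = np.array([1, 0, 0, 0], dtype=complex)  # start in |00>
state = np.kron(H, I2) @ state                 # superpose qubit 0
state = CNOT @ state                           # entangle the pair
probs = np.abs(state) ** 2                     # measurement probabilities
print(probs)  # ~[0.5, 0, 0, 0.5]: only |00> and |11> are ever observed
```

The same five lines of gate applications are what a one-line Qiskit or Cirq circuit compiles down to conceptually.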
Quantum-inspired software is already in operation, because these applications run on classical computers, not on quantum machines. A major example is Fujitsu's Quantum-Inspired Digital Annealer Services.[xvii] Even on a theoretical level, quantum ideas have already been fruitful in several problem areas, where restructuring problems using quantum principles has resulted in improved algorithms, new proofs, and the refutation of erroneous old algorithms.[xviii] Quantum-inspired software is closely related to quantum-ready software, which can be run on suitable QCs once they are available.
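As a flavor of what "quantum-inspired on classical hardware" means in practice, the sketch below runs a plain simulated-annealing loop on a tiny Ising-style energy function, the kind of optimization problem annealers target. It is a generic classical illustration, not Fujitsu's Digital Annealer API, and the coupling matrix J is an arbitrary example:

```python
import math
import random

# Toy "quantum-inspired" optimization on a classical CPU: simulated
# annealing over spins s[i] in {-1, +1} minimizing an Ising energy.
random.seed(0)
J = [[0, 1, -1],
     [1, 0, 1],
     [-1, 1, 0]]          # symmetric couplings between three spins (example)
n = len(J)

def energy(s):
    """Ising energy E(s) = -sum_ij J[i][j] * s[i] * s[j]."""
    return -sum(J[i][j] * s[i] * s[j] for i in range(n) for j in range(n))

s = [random.choice([-1, 1]) for _ in range(n)]
temp = 2.0
for _ in range(2000):
    i = random.randrange(n)
    old = energy(s)
    s[i] = -s[i]                          # propose flipping one spin
    delta = energy(s) - old
    if delta > 0 and random.random() > math.exp(-delta / temp):
        s[i] = -s[i]                      # reject the uphill move: revert
    temp *= 0.999                         # cool the system down

print(s, energy(s))
```

Hardware annealers and their quantum-inspired digital cousins accelerate exactly this kind of energy-minimization search, at far larger problem sizes.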
The industrialization of QCs has entered a critical period.Major countries and leading enterprises in the world are investing huge humanand material resources to advance research in quantum computing.
Google perhaps prematurely used the term quantum supremacy in October 2019 when it announced the results of its quantum supremacy experiment in a blog post[xix] and an article in Nature.[xx] The experiment used Google's 54-qubit processor, named Sycamore, to perform a contrived benchmark test in 200 seconds that would take the fastest supercomputer 10,000 years. But at some point in the future, true quantum supremacy may indeed be achieved.
Quantum supremacy was originally defined by Caltech's John Preskill[xxi] as the point at which the capabilities of a QC exceed those of any available classical computer; the latter is usually understood to be the most advanced supercomputer built on classical architecture. At one point, this was estimated to be the point when a QC with 50 or more qubits could be demonstrated. But some experts say it depends more on how many logical operations (gates) can be implemented in a system of qubits before their coherence decays, at which point errors proliferate and further computation becomes impossible. How the qubits are connected also matters.[xxii]
This led IBM researchers to formulate the concept of quantum volume (QV) in 2017. More QV means a more powerful computer, but QV cannot be increased by increasing the number of qubits alone. QV is a hardware-agnostic performance measurement for gate-based QCs that considers a number of elements, including the number of qubits, the connectivity of the qubits, gate fidelity, crosstalk, and circuit compiler efficiency. In late 2020, IonQ announced that it had calculated a QV of 4 million for its fifth-generation QC. Before this announcement, Honeywell's 7-qubit ion-trap QC had the industry's highest published quantum volume of 128, and IBM had the next highest QV of 64 with its 27-qubit superconducting quantum machine.[xxiii] In early March 2021, Honeywell claimed to have regained the lead by achieving a QV of 512 with an updated version of its System Model H1 QC.[xxiv] Alternating announcements like these from the major QC developers are likely to continue for the time being, as each competes for the title of most powerful QC.
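Since QV is conventionally reported as a power of two (QV = 2^n, where n is the largest square circuit, n qubits wide and n gate-layers deep, that the machine runs successfully), the headline figures above can be converted back to an effective circuit size. Taking the reported "4 million" as the nearest power of two, 2^22 = 4,194,304:

```python
import math

# Effective square-circuit size n implied by a published quantum volume,
# using the convention QV = 2**n. Values are the ones quoted above; the
# "4 million" figure is taken as 2**22 = 4,194,304.
published_qv = {
    "IonQ (claimed)": 4_194_304,
    "Honeywell System Model H1": 512,
    "IBM 27-qubit machine": 64,
}

for machine, qv in published_qv.items():
    n = int(math.log2(qv))
    print(f"{machine}: QV {qv:,} -> succeeds on {n}-qubit, depth-{n} circuits")
```

Seen this way, the gap between headline numbers is less dramatic: QV 4 million corresponds to 22-qubit circuits, versus 9 for QV 512 and 6 for QV 64.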
Rather than thinking about quantum supremacy as an absolute threshold or milestone, it is wiser to think of so-called quantum supremacy experiments as benchmarking experiments for the new technology, perhaps similar to the way we came to express automobile engine power in measures of horsepower. There is also an intriguing question lingering over the whole concept of quantum supremacy: How could anyone know that a quantum computer is genuinely doing something that is impossible for a classical one to do, rather than that we just haven't yet found a classical algorithm clever enough to do the job?[xxv] It may be that the advent of quantum computing will force and inspire new developments in classical computing algorithms, something we are already seeing in the concept of quantum-inspired computing software, which will be discussed further in a later section.
There is a difference between quantum advantage and quantum supremacy. Quantum supremacy is when it can be demonstrated that a QC can do something that cannot be done on a classical computer. Quantum advantage means that a quantum solution provides a real-world advantage over the classical approach; it does not imply that a classical computer could not do it at all.
There is a second meaning one could attach to quantum supremacy: which nation will hold the technological advantage in this technology of the future. The current list of the Top 500 (classical) supercomputers[xxvi] provides a good indication of where the hot spots of quantum computing will likely be, since no country or region will want to cede a hard-gained advantage in classical computing. Currently, 43 percent of supercomputers are in China, 23 percent in the United States, 7 percent in Japan, and about 19 percent in Europe (including the United Kingdom but excluding Russia).
In the European Union, the European Commission founded the Quantum Flagship, a ten-year coordinated research initiative with at least €1 billion in funding. The long-term vision is the creation of a Quantum Web: quantum computers, simulators, and sensors interconnected via quantum networks distributing information and quantum resources such as coherence and entanglement.[xxvii]
The equivalent U.S. initiative is known as the National Quantum Initiative (NQI); its $1.2 billion of U.S. government funds are going to the National Institute of Standards and Technology (NIST), the National Science Foundation (NSF) Multidisciplinary Centers for Quantum Research and Education, and the Department of Energy's research programs and National Quantum Information Science Research Centers.[xxviii] NIST partners with the University of Colorado Boulder on quantum computing research through JILA's Quantum Information Science & Technology (QIST) program.[xxix] NIST, the Laboratory for Physical Sciences (LPS), and the University of Maryland have formed the Joint Quantum Institute (JQI)[xxx] to conduct fundamental quantum research. The Joint Center for Quantum Information and Computer Science (QuICS)[xxxi] was founded in another partnership between NIST and the University of Maryland, specifically to advance research in QC science and quantum information theory.
The Chinese government is investing upwards of $10 billion in quantum computing, an order of magnitude greater than the respective $1.2 billion investments by the U.S. government and the E.U. The U.K. and Japanese governments are each investing on the order of $300 million, with Canada and South Korea investing about $40 million each.[xxxii]
China's multi-billion-dollar quantum computing initiative aims to achieve significant breakthroughs by 2030. President Xi has committed billions to establish the Chinese National Laboratory for Quantum Information Sciences.
The implication of the funding gap with China is that the United States is mostly relying on private investment by its tech giants to remain competitive. Time will tell whether that is a wise strategy. It is not as if large tech companies in China are not investing in quantum computing too: Alibaba, Tencent, and Baidu are all known to be investing heavily in the technology. According to some metrics, China has already gained an early advantage by accumulating more quantum computing-related patents than the United States.[xxxiii] In 2019, Google announced that its QC had performed a particular computation in 200 seconds that would take today's fastest supercomputers 10,000 years. But in December 2020, Chinese researchers at the University of Science and Technology of China (USTC) claimed that their prototype QC (based on photons) is 10 billion times faster than Google's.[xxxiv]
The Chinese desire to lead the world in quantum computing is not motivated purely by industrial competitiveness and economic power. Threat assessments[xxxv] point to Chinese quantum research and experiments in defense applications such as:
Using entanglement for secure long-distance military communications, e.g., between satellites and earth stations
Quantum radar that could nullify current U.S. advantages in stealth technology against conventional radars
Quantum submarine detection at ranges of over five kilometers, which would limit the operations of U.S. nuclear submarines
Quantum computers are very hard to build. They require intricate manipulation of subatomic particles and must operate in a vacuum environment or at cryogenic temperatures.
The state of quantum computing resembles the early days of the aircraft and automobile industries, when there was a similar proliferation of diverse architectures and exotic designs. Eventually, as quantum technology matures, a convergence can be expected similar to what we have seen in those industries. In fact, the arrival of such a technological convergence would be a good measure of the growing maturity of quantum computing technology.
There are a number of technical criteria[xxxvi] for making a good QC:
Qubits must stay coherent long enough to allow the computation to be completed in the state of superposition. That requires isolation, because decoherence occurs when qubits interact with the outside world
Qubits must be highly connected. This occurs through entanglement and is needed for operations to act on multiple qubits
High-fidelity operations are needed. As pointed out above, classical digital computers rely on the digital nature of signals for noise resistance. However, since qubits need to precisely represent numbers that are not just zero and one during the computation, digital noise reduction is not possible, and the noise problem is more analogous to that in an old-fashioned analog computer. Since noise cannot be easily prevented and must therefore be mitigated, the focus of current research is on noise-correction techniques
Gate operations must be fast. In practice, this is a trade-off between maintaining coherence and achieving high fidelity
High scalability. It should be obvious that QCs will only be useful when they can be scaled large enough to solve valuable problems
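The trade-off between coherence and gate speed in the criteria above can be summarized as a gate budget: roughly, the coherence time divided by the gate duration is how many operations fit before the quantum state degrades. The figures below are illustrative order-of-magnitude assumptions, not measurements of any specific machine:

```python
# Rough gate budget: operations possible before decoherence, estimated
# as coherence_time / gate_time (both in seconds). Figures are
# order-of-magnitude assumptions for illustration only.
platforms = {
    "superconducting (fast gates, short coherence)": (100e-6, 20e-9),
    "trapped ion (slow gates, long coherence)": (1.0, 10e-6),
}

for name, (t_coherence, t_gate) in platforms.items():
    budget = round(t_coherence / t_gate)
    print(f"{name}: ~{budget:,} gates before coherence is lost")
```

The point of the sketch is that neither fast gates nor long coherence wins on its own; it is the ratio that bounds how deep a circuit can run.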
Currently, the two quantum technologies showing the greatest promise and attracting the most interest and investment dollars are superconducting qubits and trapped ions. These and other more nascent or theoretical technologies are presented in Table 1, along with the main proponents of each.
Table 1. Qubit Technologies and Main Proponents
Technology
Main Proponents
Superconducting qubits (called transmons by some) are realized by using a microwave signal to put a resistance-free current into a superposition state. This technology has fast gate times and the advantage of building on proven technology: superconducting circuits are based on the well-known complementary metal-oxide-semiconductor (CMOS) technology used in digital computers. But superconducting qubits decohere quickly and require more error correction. Superconduction requires cooling to a temperature very close to absolute zero. The technology is considered to be highly scalable.
IBM
Rigetti
D-Wave
Alibaba
Intel
NEC
Quantum Circuits
Oxford Quantum Circuits
Ion-trap QCs work by trapping ions in electric fields and holding them in place. The outermost electron orbiting the nucleus is put in different states and used as a qubit. Ion-trap qubits have longer coherence times and can operate with minor cooling, but they do require a high vacuum. Though the first quantum logic gate was demonstrated in 1995 using trapped atomic ions, at a system level this technology is less mature and requires progress in multiple domains, including vacuum, laser, and optical systems; radio frequency and microwave technology; and coherent electronic controllers
IonQ
Honeywell
Alpine Quantum Technologies
Photonic qubits are photons (light particles) that operate on silicon chip pathways. Such qubits do not require extreme cooling, and silicon chip fabrication techniques are well established, making this technology highly scalable.
Continue reading here:
Quantum Computing In Finance Where We Stand And Where We Could Go - Science 2.0
Outlook on the Quantum Technology Global Market to 2026 – GlobeNewswire
Posted: at 6:23 am
Dublin, May 14, 2021 (GLOBE NEWSWIRE) -- The "Quantum Technology Market by Computing, Communications, Imaging, Security, Sensing, Modeling and Simulation 2021 - 2026" report has been added to ResearchAndMarkets.com's offering.
This report provides a comprehensive analysis of the quantum technology market. It assesses companies/organizations focused on quantum technology including R&D efforts and potential game-changing quantum tech-enabled solutions. The report evaluates the impact of quantum technology upon other major technologies and solution areas including AI, Edge Computing, Blockchain, IoT, and Big Data Analytics. The report provides an analysis of quantum technology investment, R&D, and prototyping by region and within each major country globally.
The report also provides global and regional forecasts as well as the outlook for quantum technology's impact on embedded hardware, software, applications, and services from 2021 to 2026. The report provides conclusions and recommendations for a wide range of industries and commercial beneficiaries including semiconductor companies, communications providers, high-speed computing companies, artificial intelligence vendors, and more.
Select Report Findings:
Much more than only computing, the quantum technology market provides a foundation for improving all digital communications, applications, content, and commerce. In the realm of communications, quantum technology will influence everything from encryption to the way that signals are passed from point A to point B. While currently in the R&D phase, networked quantum information and communications technology (ICT) is anticipated to become a commercial reality that will represent nothing less than a revolution for virtually every aspect of ICT.
However, there will be a need to integrate the ICT supply chain with quantum technologies in a manner that does not attempt to replace every aspect of classical computing but instead leverages a hybrid computational framework. Traditional High-Performance Computing (HPC) will continue to be used for many existing problems for the foreseeable future, while quantum technologies will be used for encrypting communications, signaling, and will be the underlying basis in the future for all commerce transactions. This does not mean that quantum encryption will replace Blockchain, but rather provide improved encryption for blockchain technology.
The quantum technology market will be a substantial enabler of dramatically improved sensing and instrumentation. For example, gravity sensors may be made significantly more precise through quantum sensing. Quantum electromagnetic sensing provides the ability to detect minute differences in the electromagnetic field. This will provide a wide-ranging number of applications, such as within the healthcare arena wherein quantum electromagnetic sensing will provide the ability to provide significantly improved mapping of vital organs. Quantum sensing will also have applications across a wide range of other industries such as transportation wherein there is the potential for substantially improved safety, especially for self-driving vehicles.
Commercial applications for the quantum imaging market are potentially wide-ranging including exploration, monitoring, and safety. For example, gas image processing may detect minute changes that could lead to early detection of tank failure or the presence of toxic chemicals. In concert with quantum sensing, quantum imaging may also help with various public safety-related applications such as search and rescue. Some problems are too difficult to calculate but can be simulated and modeled. Quantum simulations and modeling is an area that involves the use of quantum technology to enable simulators that can model complex systems that are beyond the capabilities of classical HPC. Even the fastest supercomputers today cannot adequately model many problems such as those found in atomic physics, condensed-matter physics, and high-energy physics.
Key Topics Covered:
1.0 Executive Summary
2.0 Introduction
3.0 Quantum Technology and Application Analysis
3.1 Quantum Computing
3.2 Quantum Cryptography Communication
3.3 Quantum Sensing and Imaging
3.4 Quantum Dots Particles
3.5 Quantum Cascade Laser
3.6 Quantum Magnetometer
3.7 Quantum Key Distribution
3.8 Quantum Cloud vs. Hybrid Platform
3.9 Quantum 5G Communication
3.10 Quantum 6G Impact
3.11 Quantum Artificial Intelligence
3.12 Quantum AI Technology
3.13 Quantum IoT Technology
3.14 Quantum Edge Network
3.15 Quantum Blockchain
4.0 Company Analysis
4.1 1QB Information Technologies Inc.
4.2 ABB (Keymile)
4.3 Adtech Optics Inc.
4.4 Airbus Group
4.5 Akela Laser Corporation
4.6 Alibaba Group Holding Limited
4.7 Alpes Lasers SA
4.8 Altairnano
4.9 Amgen Inc.
4.10 Anhui Qasky Science and Technology Limited Liability Company (Qasky)
4.11 Anyon Systems Inc.
4.12 AOSense Inc.
4.13 Apple Inc. (InVisage Technologies)
4.14 Biogen Inc.
4.15 Block Engineering
4.16 Booz Allen Hamilton Inc.
4.17 BT Group
4.18 Cambridge Quantum Computing Ltd.
4.19 Chinese Academy of Sciences
4.20 D-Wave Systems Inc.
4.21 Emerson Electric Corporation
4.22 Fujitsu Ltd.
4.23 Gem Systems
4.24 GeoMetrics Inc.
4.25 Google Inc.
4.26 GWR Instruments Inc.
4.27 Hamamatsu Photonics K.K.
4.28 Hewlett Packard Enterprise
4.29 Honeywell International Inc.
4.30 HP Development Company L.P.
4.31 IBM Corporation
4.32 ID Quantique
4.33 Infineon Technologies
4.34 Intel Corporation
4.35 KETS Quantum Security
4.36 KPN
4.37 LG Display Co. Ltd.
4.38 Lockheed Martin Corporation
4.39 MagiQ Technologies Inc.
4.40 Marine Magnetics
4.41 McAfee LLC
4.42 MicroSemi Corporation
4.43 Microsoft Corporation
4.44 Mirsense
4.45 Mitsubishi Electric Corp.
4.46 M-Squared Lasers Limited
4.47 Muquans
4.48 Nanoco Group PLC
4.49 Nanoplus Nanosystems and Technologies GmbH
4.50 Nanosys Inc.
4.51 NEC Corporation
4.52 Nippon Telegraph and Telephone Corporation
4.53 NN-Labs LLC.
4.54 Nokia Corporation
4.55 Nucrypt
4.56 Ocean NanoTech LLC
4.57 Oki Electric
4.58 Oscilloquartz SA
4.59 OSRAM
4.60 PQ Solutions Limited (Post-Quantum)
4.61 Pranalytica Inc.
4.62 QC Ware Corp.
4.63 QD Laser Co. Inc.
4.64 QinetiQ
4.65 Quantum Circuits Inc.
4.66 Quantum Materials Corp.
4.67 Qubitekk
4.68 Quintessence Labs
4.69 QuSpin
4.70 QxBranch LLC
4.71 Raytheon Company
4.72 Rigetti Computing
4.73 Robert Bosch GmbH
4.74 Samsung Electronics Co. Ltd. (QD Vision Inc.)
4.75 SeQureNet (Telecom ParisTech)
4.76 SK Telecom
4.77 ST Microelectronics
4.78 Texas Instruments
4.79 Thorlabs Inc
4.80 Toshiba Corporation
4.81 Tristan Technologies
4.82 Twinleaf
4.83 Universal Quantum Devices
4.84 Volkswagen AG
4.85 Wavelength Electronics Inc.
4.86 ZTE Corporation
5.0 Quantum Technology Market Analysis and Forecasts 2021 - 2026
5.1 Global Quantum Technology Market 2021 - 2026
5.2 Global Quantum Technology Market by Technology 2021 - 2026
5.3 Quantum Computing Market 2021 - 2026
5.4 Quantum Cryptography Communication Market 2021 - 2026
5.5 Quantum Sensing and Imaging Market 2021 - 2026
5.6 Quantum Dots Market 2021 - 2026
5.7 Quantum Cascade Laser Market 2021 - 2026
5.8 Quantum Magnetometer Market 2021 - 2026
5.9 Quantum Key Distribution Market 2021 - 2026
5.9.1 Global Quantum Key Distribution Market by Technology
5.9.1.1 Global Quantum Key Distribution Market by Infrastructure Type
5.9.2 Global Quantum Key Distribution Market by Industry Vertical
5.9.2.1 Global Quantum Key Distribution (QKD) Market by Government
5.9.2.2 Global Quantum Key Distribution Market by Enterprise/Civilian Industry
5.10 Global Quantum Technology Market by Deployment
5.11 Global Quantum Technology Market by Sector
5.12 Global Quantum Technology Market by Connectivity
5.13 Global Quantum Technology Market by Revenue Source
5.14 Quantum Intelligence Market 2021 - 2026
5.15 Quantum IoT Technology Market 2021 - 2026
5.16 Global Quantum Edge Network Market
5.17 Global Quantum Blockchain Market
5.18 Global Quantum Exascale Computing Market
5.19 Regional Quantum Technology Market 2021 - 2026
5.19.1 Regional Comparison of Global Quantum Technology Market
5.19.2 Global Quantum Technology Market by Region
5.19.2.1 North America Quantum Technology Market by Country
5.19.2.2 Europe Quantum Technology Market by Country
5.19.2.3 Asia Pacific Quantum Technology Market by Country
5.19.2.4 Middle East and Africa Quantum Technology Market by Country
5.19.2.5 Latin America Quantum Technology Market by Country
6.0 Conclusions and Recommendations
For more information about this report visit https://www.researchandmarkets.com/r/pcwigy
Here is the original post:
Outlook on the Quantum Technology Global Market to 2026 – GlobeNewswire
Pathogenic, auto-immune or viral, all diseases are actually epigenetic – The Times of India Blog
Posted: at 6:23 am
Though the coronavirus pandemic is the greatest tragedy the world has witnessed in recent times, it also has a potential to be a blessing in disguise as it can transform the way we understand and treat diseases.
If we look at all the diseases broadly, they can be clubbed into three large groups.
Most of the diseases we laypeople recognize are caused by alien pathogens, be it a bacterium or some other parasite. Malaises like cholera, malaria, amebiasis, and mucormycosis are simple in nature, as they show a clear cause-and-effect relationship between a pathogen and the malaise, and hence the cure is, in general, eradication of the alien pathogen.
The other group is of auto-immune diseases. Phenomena like cancer or allergic reaction belong to this confusing group as here the body itself messes things up. These diseases are tricky to cure as they are by-products of some of the critical internal systems. So, to cure them we either remove such cells physically or try and find a way to kill them selectively or just counter their activities by suppressing them, even at a cost.
The last group is the trickiest, and that is the virus-caused diseases. Though most of them are not too harmful, as a healthy body can deal with them using its native immune system, that is actually the only option: if the immune system can't deal with them, there is no real cure. We have anti-viral drugs, but they are not at all comparable to the level of success that antibiotics or anti-parasitic drugs (sad to say, only temporarily) enjoy.
Coronavirus diseases belong to the last group, and the SARS-CoV-2 pandemic is a specific case where the native immune system of some people is not able to cope with the virus.
In SARS-CoV-2 infection, we have an alien virus pathogen causing auto-immune reaction, making it a bridge between all three groups that we treat differently today and hence it demands that we start re-looking at what we call a disease.
To understand this phenomenon, we have to go to the source of all life, the genetic codes that are enshrined in each living cell. These genetic codes are more like instruction/equations of chemical reactions. They give expression to mindboggling complexity while remaining simple in nature by following simple rules.
Life is a game of these rules and evolution is a game of finding new rules that work. If we put these two together, a disease is a random process of finding new rules that work through genetic interaction between all life forms.
If we try and use a metaphor to understand this better, life is a book, but not a linear one. It has commands with if-then logic.
It is a bit like: if there is a person wearing a blue shirt in the room, read the 467th line of the book, but if there is a person wearing a red shirt, read the 145th line. This just gets worse, as it can even be: if they have a dog, read the 179th, and so on. There is no end to the complexity in the book as it responds to the entire reality, and hence we are now hitting a roadblock in terms of how to use genetics, as it is now understood that there is more of the story outside than inside.
This science of finding relationships about how external factors dictate the way genetic code is read is called epigenetics.
Epigenetics is the quantum physics of biology, telling us yet again that God does play dice.
Thankfully, it is not as bad as it seems. Just as we have probabilistic handles to understand quantum physics, epigenetics is also orderly and has cause-and-effect relationships that we can fathom, or will fathom in the future (just as we hope for quantum physics).
This new understanding of life demands a new way of curing, i.e., ensuring that the book is read coherently and if it is made to stray, bring it back on main story-line.
Keeping the reading of the book on course is obviously a matter of lifestyle corrections that reduce the chances of straying, but the challenge is what to do when it goes off course.
The new healthcare of the future would be all about transcending ideas like parasites or viruses or even auto-immune disorders. They are all part of the epigenetic forces, and hence the cure is best found within rather than without.
Logically speaking, for every disease, the clue is to find the misreading of the code it thrives on and find a way to disrupt it. But there is another twist in the tail/tale, or rather an advantage: each of us is a different version of the book, and hence each of us will respond differently (as we have seen in the case of SARS-CoV-2). It is these different responses that will help us identify clues, if we look for them.
As SARS-CoV-2 is studied a lot, a great start would be to do genetic sequencing of those who have the misfortune of a genetic predisposition to a cytokine storm when the viral information is inserted into their book. The same process must be conducted for each of the specific cases, be it malaria or cancer.
In short, we need to find the page in the book that derails the metabolic process and find people who have that page first. Once we manage to find that, we will be able to manage the healthcare in a more rational way.
This is clearly a job of a couple of millennia (if we are lucky), and SARS-CoV-2 could be God's way of asking us to move one step forward.
Views expressed above are the author's own.
See more here:
Pathogenic, auto-immune or viral, all diseases are actually epigenetic - The Times of India Blog
Researchers confront major hurdle in quantum computing – University of Rochester
Posted: May 9, 2021 at 11:54 am
May 4, 2021
Quantum science has the potential to revolutionize modern technology with more efficient computers, communication, and sensing devices. But challenges remain in achieving these technological goals, especially when it comes to effectively transferring information in quantum systems.
A regular computer consists of billions of transistors, which implement bits. Quantum computers, on the other hand, are based on quantum bits, also known as qubits, which can be made from a single electron.
Unlike ordinary transistors, which can be either 0 (off) or 1 (on), qubits can be both 0 and 1 at the same time. The ability of individual qubits to occupy these so-called superposition states, where they are in multiple states simultaneously, underlies the great potential of quantum computers. Just like ordinary computers, however, quantum computers need a way to transfer quantum information between distant qubits, and that presents a major experimental challenge.
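The superposition described above can be written down directly: a qubit is a pair of complex amplitudes, and measurement yields 0 or 1 with probabilities given by their squared magnitudes. A minimal sketch (an illustrative simulation, not code for any actual quantum device):

```python
import numpy as np

# A single qubit in an equal superposition of |0> and |1>: two complex
# amplitudes whose squared magnitudes give the measurement probabilities.
state = np.array([1, 1], dtype=complex) / np.sqrt(2)

p0, p1 = np.abs(state) ** 2
print(p0, p1)  # both ~0.5: the qubit is "both 0 and 1" until measured

# Repeated measurement collapses the state to 0 or 1 with those probabilities.
rng = np.random.default_rng(0)
samples = rng.choice([0, 1], size=10_000, p=[p0, p1])
print(samples.mean())  # close to 0.5 over many simulated measurements
```

Any single measurement gives a definite 0 or 1; the superposition shows up only in the statistics over many runs, which is exactly why reading out a quantum computation without destroying its information is so delicate.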
In a series of papers published in Nature Communications, researchers at the University of Rochester, including John Nichol, an assistant professor of physics and astronomy, and graduate students Yadav Kandel and Haifeng Qiao, the lead authors of the papers, report major strides in enhancing quantum computing by improving the transfer of information between electrons in quantum systems.
In one paper, the researchers demonstrated a route for transferring information between qubits, called adiabatic quantum state transfer (AQT), for the first time with electron-spin qubits. Unlike most methods of transferring information between qubits, which rely on carefully tuned electric or magnetic-field pulses, AQT isn't as affected by pulse errors and noise.
To envision how AQT works, imagine you are driving your car and want to park it. If you don't hit your brakes at the proper time, the car won't be where you want it, with potentially negative consequences. In this sense, the control pulses (the gas and brake pedals) to the car must be tuned carefully. AQT is different in that it doesn't really matter how long you press the pedals or how hard you press them: the car will always end up in the right spot. As a result, AQT has the potential to improve the transfer of information between qubits, which is essential for quantum networking and error correction.
The researchers demonstrated AQT's effectiveness by exploiting entanglement, one of the basic concepts of quantum physics, in which the properties of one particle affect the properties of another, even when the particles are separated by a large distance. The researchers were able to use AQT to transfer one electron's quantum spin state across a chain of four electrons in semiconductor quantum dots (tiny, nanoscale semiconductors with remarkable properties). This is the longest chain over which a spin state has ever been transferred, tying the record set by the researchers in a previous Nature paper.
"Because AQT is robust against pulse errors and noise, and because of its major potential applications in quantum computing, this demonstration is a key milestone for quantum computing with spin qubits," Nichol says.
In a second paper, the researchers demonstrated another technique of transferring information between qubits, using an exotic state of matter called time crystals. A time crystal is a strange state of matter in which interactions between the particles that make up the crystal can stabilize oscillations of the system in time indefinitely. Imagine a clock that keeps ticking forever; the pendulum of the clock oscillates in time, much like the oscillating time crystal.
By implementing a series of electric-field pulses on electrons, the researchers were able to create a state similar to a time crystal. They found that they could then exploit this state to improve the transfer of an electron's spin state in a chain of semiconductor quantum dots.
"Our work takes the first steps toward showing how strange and exotic states of matter, like time crystals, can potentially be used for quantum information processing applications, such as transferring information between qubits," Nichol says. "We also theoretically show how this scenario can implement other single- and multi-qubit operations that could be used to improve the performance of quantum computers."
Both AQT and time crystals, while different, could be used simultaneously with quantum computing systems to improve performance.
"These two results illustrate the strange and interesting ways that quantum physics allows for information to be sent from one place to another, which is one of the main challenges in constructing viable quantum computers and networks," Nichol says.
Continue reading here:
Researchers confront major hurdle in quantum computing - University of Rochester
Can a Patent Be Valid and Invalid at the Same Time? – Bloomberg Law
Posted: at 11:54 am
In quantum physics, Schrödinger's cat refers to a thought experiment in which a cat is simultaneously both alive and dead. But law is binary: a statement is either true or false; a litigant either wins or loses; and a patent is either valid or invalid. Or is it?
In a fascinating 2019 ruling in Sanofi-Aventis v. Mylan, the federal district court in New Jersey refused, initially, to find a pharmaceutical patent invalid despite a U.S. Patent and Trademark Office finding of invalidity in an inter partes review (IPR) between the same parties.
Sanofi held that collateral estoppel, which permits a ruling from one case to be applied in a separate case, was inappropriate where the standards of proof for invalidity were different (i.e., only a preponderance of the evidence in the PTO, versus clear and convincing evidence in federal courts).
Ultimately, the Sanofi court did accept the invalidity determination, but only after the Federal Circuit Court of Appeals affirmed the PTO's decision. That appellate ruling, however, merely affirmed that the PTO had correctly found invalidity applying the lower standard.
In 2020, the Northern District of California, in Cisco Sys. v. Capella Photonics Inc., did apply collateral estoppel to find patent invalidity where the Federal Circuit had already affirmed an IPR invalidity finding. The Cisco court distinguished Sanofi as having occurred before appellate review of the IPR, citing XY LLC v. Trans Ova Genetics, where the Federal Circuit held that collateral estoppel applies once the appeals court affirms the patent office on invalidity.
Interestingly, the XY case garnered a dissent from Circuit Judge Pauline Newman, who pointed to the different standards in IPRs as a reason not to apply collateral estoppel, even after appellate affirmance of an IPR holding. Newman further noted that Supreme Court jurisprudence has repeatedly held that collateral estoppel is not to be applied automatically.
These decisions reveal tensions as to the collateral estoppel effect of IPR decisions in federal district court litigations.
Modern-day collateral estoppel law derives from a landmark patent case. In Blonder-Tongue Laboratories Inc. v. University of Illinois Foundation, the Supreme Court held that a defendant in federal court may assert patent invalidity based on collateral estoppel if a different defendant invalidated the patent in an earlier federal court case.
Before Blonder-Tongue, patentees could in theory continue to assert a patent against different defendants after an invalidity ruling, and thus a patent could be both effectively valid and invalid.
Nonetheless, as Judge Newman pointed out in her dissent in XY, the Blonder-Tongue case did not require automatic application of collateral estoppel but instead required the trial court to examine whether there were any reasons not to apply estoppel, e.g., if the issues being litigated in the two cases were not the same. And the Supreme Court has explained, in B&B Hardware Inc. v. Hargis Indus., that issues are not identical "if the second action involves application of a different legal standard, even though the factual setting of both suits may be the same."
Citing B&B Hardware, Sanofi held that the different legal standards in IPRs preclude estoppel. The different standards have also led the Federal Circuit to hold, in Novartis AG v. Noven Pharms. Inc., that an IPR "properly may reach a different conclusion [from a prior court ruling finding no invalidity] based on the same evidence."
These rulings seem discomforting. Our jurisprudence is premised upon treating like cases alike. How can a patent be both valid and invalid at the same time (even in the face of the same prior art)?
Indeed, the purpose of collateral estoppel is largely to avoid the anomaly of the same issue being adjudicated differently in different courts. Yet, are they really like cases if different standards are being applied? How can the patent office remove a property right (a patent) using a lower standard of proof and then effectively impose that conclusion on an Article III court that has a higher standard of proof?
If and when these issues reach the Supreme Court, they may provide a vehicle for refining the doctrine of collateral estoppel, as well as for considering whether the IPR's lower standard of proof for eliminating a property right is actually constitutionally permissible, an issue that the court has not specifically addressed when it has upheld IPRs in the face of other constitutional challenges.
For now, we live with the potential for Schrödinger's patents: simultaneously valid and invalid during the time between IPR invalidation and appellate review. Like their quantum-physics feline counterparts, Schrödinger's patents are curious creatures cloaked in uncertainty.
This column does not necessarily reflect the opinion of The Bureau of National Affairs, Inc. or its owners.
Write for Us: Author Guidelines
Jason Lief is a co-founder of Lief Parke LLP in New York where he practices patent and IP litigation and counseling.
Breaking the Laws of Physics: Steering Light to Places It Isn't Supposed to Go – SciTechDaily
Credit: University of Twente
Light sent into a photonic crystal can't go deeper than the so-called Bragg length. Deeper inside the crystal, light of a certain color range simply cannot exist. Still, researchers at the University of Twente, the University of Iowa, and the University of Copenhagen managed to break this law. They steered light into a crystal using a programmed pattern and demonstrated that it reaches places far beyond the Bragg length. They published their findings in Physical Review Letters.
Photonic crystals have a regular pattern of nanopores etched in silicon. They are typically designed to work as a mirror for a certain color range of light. Inside the crystal, light of those colors is forbidden. Even if you could place an atom that emits one of those colors inside the crystal, it would stop emitting light. The so-called Bragg length is the maximum distance light is allowed to travel, according to a well-known physics law.
This property can be used to create perfect mirrors for certain wavelengths, but it also helps improve solar cells. Still, wherever there is a sign that says "forbidden," it is always tempting to go there. That is what the researchers did: they proved that light can penetrate the photonic crystal much deeper than the Bragg length.
They managed to do this by using light that was pre-programmed, and by exploiting the small imperfections that always come with fabricating nanostructures. These imperfections cause light waves to be scattered randomly inside the crystal. The researchers program the light in such a way that every location inside the photonic crystal can be reached. They even demonstrate a bright spot at five times the Bragg length, where light is enhanced 100 times instead of attenuated 100 to 1,000 times.
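The idea of "programming" the incident light can be sketched with a standard wavefront-shaping toy model: phase-conjugating the input against a random transmission matrix to build up constructive interference at a chosen output spot. This is a generic illustration of the technique, not the paper's experiment; the matrix size, seed, and normalization are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 256  # number of controllable input modes (illustrative)

# Toy random transmission matrix linking input modes to output spots,
# standing in for the crystal's random multiple scattering
t = (rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))) / np.sqrt(2 * N)

target = 0  # output spot we want to brighten

# Unshaped input: equal amplitudes, random phases
flat = np.exp(1j * rng.uniform(0, 2 * np.pi, N)) / np.sqrt(N)
# Shaped input: phase-conjugate the row feeding the target spot,
# so every scattered contribution arrives in phase
shaped = np.exp(-1j * np.angle(t[target])) / np.sqrt(N)

I_flat = abs(t[target] @ flat) ** 2
I_shaped = abs(t[target] @ shaped) ** 2
print(I_shaped / I_flat)  # typically a large enhancement, of order N
```

The shaped intensity grows roughly in proportion to the number of controlled modes, which is the same interference budget that lets a programmed wavefront build a bright spot where unshaped light would be exponentially attenuated.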
This remarkable result can be used for creating stable quantum bits for a light-driven quantum computer. The "forbidden" effect can also be employed in miniature on-chip light sources and lasers.
The research was done in the Complex Photonics group of Professor Willem Vos. The group is part of UT's MESA+ Institute. The first author, Ravitej Uppu, who worked in this group earlier on, is now a professor at the University of Iowa. The research collaboration was continued, also together with the University of Copenhagen. It was supported by the Dutch Research Council (NWO) programs "Stirring of Light," "Free-form scattering optics," and "Self-assembled icosahedral quasicrystals with a band gap for visible light," by the Applied Nanophotonics section of the MESA+ Institute, and by the Center for Hybrid Quantum Networks of the Niels Bohr Institute in Copenhagen.
Reference: "Spatially Shaping Waves to Penetrate Deep inside a Forbidden Gap" by Ravitej Uppu, Manashee Adhikary, Cornelis A.M. Harteveld and Willem L. Vos, 27 April 2021, Physical Review Letters. DOI: 10.1103/PhysRevLett.126.177402
Are We on the Brink of a New Age of Scientific Discovery? – SciTechDaily
The centerpiece of the Muon g-2 experiment at Fermilab is a 50-foot-diameter superconducting magnetic storage ring, which sits in its detector hall amidst electronics racks, the muon beamline and other equipment. Credit: Reidar Hahn, Fermilab
In 2001 at the Brookhaven National Laboratory in Upton, New York, a facility used for research in nuclear and high-energy physics, scientists experimenting with a subatomic particle called a muon encountered something unexpected.
To explain the fundamental physical forces at work in the universe and to predict the results of high-energy particle experiments like those conducted at Brookhaven, Fermilab in Illinois, and at CERN's Large Hadron Collider in Geneva, Switzerland, physicists rely on the decades-old theory called the Standard Model, which should explain the precise behavior of muons when they are fired through an intense magnetic field created in a superconducting magnetic storage ring. When the muons in the Brookhaven experiment reacted in a way that differed from these predictions, researchers realized they were on the brink of a discovery that could change science's understanding of how the universe works.
Earlier this month, after a decades-long effort that involved building more powerful sensors and improving researchers' capacity to process 120 terabytes of data (the equivalent of 16 million digital photographs every week), a team of scientists at Fermilab announced the first results of an experiment called Muon g-2, which suggest the Brookhaven find was no fluke and that science is on the brink of an unprecedented discovery.
UVA physics professor Dinko Počanić has been involved in the Muon g-2 experiment for the better part of two decades, and UVA Today spoke with him to learn more about what it means.
Q. What are the findings of the Brookhaven and Fermilab Muon g-2 experiments, and why are they important?
A. So, in the Brookhaven experiment, they did several measurements with positive and negative muons (an unstable, more massive cousin of the electron) under different circumstances, and when they averaged their measurements, they quantified a magnetic anomaly that is characteristic of the muon more precisely than ever before. According to relativistic quantum mechanics, the strength of the muon's magnetic moment (a property it shares with a compass needle or a bar magnet) should be two in appropriate dimensionless units, the same as for an electron. The Standard Model states, however, that it's not two; it's a little bit bigger, and that difference is the magnetic anomaly. The anomaly reflects the coupling of the muon to pretty much all other particles that exist in nature. How is this possible?
The answer is that space itself is not empty; what we think of as a vacuum contains the possibility of the creation of elementary particles, given enough energy. In fact, these potential particles are impatient and are virtually excited, sparking in space for unimaginably short moments in time. And as fleeting as it is, this sparking is sensed by a muon, and it subtly affects the muons properties. Thus, the muon magnetic anomaly provides a sensitive probe of the subatomic contents of the vacuum.
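For concreteness, the "magnetic anomaly" discussed here is defined from the g-factor as a = (g - 2)/2. The snippet below just shows that arithmetic; the g-factor value is an approximate experimental figure included only for illustration, not a number quoted in this interview.

```python
# The magnetic anomaly a is the fractional deviation of g from 2:
#     a = (g - 2) / 2
g_muon = 2.00233184  # approximate measured muon g-factor (illustrative value)
a_muon = (g_muon - 2) / 2
print(a_muon)  # about 1.166e-3, the tiny quantity the experiment measures
```

The whole experimental effort goes into the third-decimal-and-beyond part of g, since that sliver is where the vacuum's "sparking" virtual particles leave their imprint.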
To the enormous frustration of all the practicing physicists of my generation and younger, the Standard Model has been maddeningly impervious to challenges. We know there are things that must exist outside of it because it cannot describe everything that we know about the universe and its evolution. For example, it does not explain the prevalence of matter over antimatter in the universe, and it doesn't say anything about dark matter or many other things, so we know it's incomplete. And we've tried very hard to understand what these things might be, but we haven't found anything concrete yet.
So, with this experiment, we're challenging the Standard Model with increasing levels of precision. If the Standard Model is correct, we should observe an effect that is completely consistent with the model, because it includes all the possible particles that are thought to be present in nature. But if we see a different value for this magnetic anomaly, it signifies that there's actually something else. And that's what we're looking for: this something else.
This experiment tells us that we're on the verge of a discovery.
Q. What part have you been able to play in the experiment?
A. I became a member of this collaboration when we had just started planning for the follow-up to the Brookhaven experiment around 2005, just a couple of years after the Brookhaven experiment finished, when we were looking at the possibility of doing a more precise measurement at Brookhaven. Eventually that idea was abandoned, as it turned out that we could do a much better job at Fermilab, which had better beams, more intense muon beams, and better conditions for the experiment.
So, we proposed that around 2010, and it was approved and funded by U.S. and international funding agencies. An important part was funded by a National Science Foundation Major Research Instrumentation grant that was awarded to a consortium of four universities, and UVA was one of them. We were developing a portion of the instrumentation for the detection of positrons that emerge in decays of positive muons. We finished that work, and it was successful, so my group switched focus to the precise measurements of the magnetic field in the storage ring at Fermilab, a critical part of quantifying the muon magnetic anomaly. My UVA faculty colleague Stefan Baessler has also been working on this problem, and several UVA students and postdocs have been active on the project over the years.
Q. Fermilab has announced that these are just the first results of the experiment. What still needs to happen before well know what this discovery means?
A. It depends on how the results of our analysis of the yet-unanalyzed run segments turn out. The analysis of the first run took about three years. The run was completed in 2018, but I think now that we've ironed out some of the issues in the analysis, it might go a bit faster. So, in about two years it would not be unreasonable to have the next result, which would be quite a bit more precise because it combines runs two and three. Then there will be another run, and we will probably finish taking data in another two years or so. The precise end of measurements is still somewhat uncertain, but I would say that about five years from now, maybe sooner, we should have a very clear picture.
Q. What kind of impact could these experiments have on our everyday lives?
A. One way is in pushing specific technologies to the extreme in solving different aspects of measurement to get the level of precision we need. The impact would likely come in fields like physics, industry and medicine. There will be technical spinoffs, or at least improvements in techniques, but which specific ones will come out of this is difficult to predict. Usually, we push companies to make products that we need that they wouldn't otherwise make, and then a new field opens up for them in terms of applications for those products, and that's what often happens. The World Wide Web was invented, for example, because researchers like us needed to be able to exchange information in an efficient way across great distances, around the world, really, and that's how we have, well, web browsers, Zoom, Amazon and all these types of things today.
The other way we benefit is by educating young scientists, some of whom will continue in scientific and academic careers like mine, while others will go on to different fields of endeavor in society. They will bring with them an expertise in very high-level techniques of measurement and analysis that isn't normally found in many fields.
And then, finally, another outcome is intellectual betterment. One outcome of this work will be to help us better understand the universe we live in.
Q. Could we see more discoveries like this in the near future?
A. Yes, there is a whole class of experiments besides this one that look at highly precise tests of the Standard Model in a number of ways. I'm always reminded of the old adage that if you lose your keys in the street late at night, you are first going to look for them under the street lamp, and that's what we're doing. So everywhere there's a streetlight, we're looking. This is one of those places, and there are several others; well, I would say dozens of others, if you also include searches that are going on for subatomic particles like axions, dark matter candidates, exotic processes like double beta decay, and those kinds of things. One of these days, new things will be found.
We know that the Standard Model is incomplete. It's not wrong, insofar as it goes, but there are things outside of it that it does not incorporate, and we will find them.
Can you really put a price on your college major? – The Boston Globe
Our view of higher ed has been blinded by greed
Re Nicholas Tampio's May 2 Ideas piece, "How much money do English majors make? Don't ask.": Actually, glad you asked. I'm doing fine, thanks, regardless of [my] major. Tampio ends his essay about the ill-conceived proposed College Transparency Act with a reference to what it would produce: "a higher education system in the image of economists and businesspeople." But he never mentions greed.
We have strangled ourselves with greed. Instead of allowing 18-year-olds to develop their intellect, become curious, and build character by taking a range of courses and changing their minds while in the process of finding out who they might be, we are continuing the diabolical process of producing limited, iPhone-distracted young adults with no perspective on the world.
Our higher-ed system has become embarrassing. So we're going to talk college students out of studying history, literature, and philosophy because it won't put money in their pockets? We truly have become a nation of fools.
Laura Duffy
Newton
An English majors cautionary ode to practical-mindedness
Nicholas Tampio is correct in his opposition to the College Transparency Act, an unnecessary government solution to a problem that doesn't exist. If prospective college students want to know which majors pay the most (or the least), they can find that information on Google. Took me 10 seconds.
But Tampio is dead wrong in his assertion that students should follow their passions when choosing a college major. I guess if you want to have roommates for the rest of your life, a major in some obscure humanities field would be a good, passionate choice. But don't forget to buy soap on a rope; it'll be convenient for the times you trundle back and forth to the bathroom that you share with other passionate degree holders.
Dave Rossow
Boylston
The writer is a recovering English major.
Family and its liberal arts majors have done quite well, thank you
Our three liberal arts major children (undergraduates in psychology-sociology, graphic design, and English-humanities) have all gone on to successful careers in business, entrepreneurship, and finance, as have I, another English major. English majors may not learn quantum physics, but they can do what a lot of STEM graduates can't do as effectively: read, write, and analyze. Don't sell the degree short.
Gail Schubert
Roslindale
Some may have the luxury to opine on the point of college
Nicholas Tampio, a professor of political science at Fordham University, reveals himself by saying he's worried about "nudging students and families into viewing college as being primarily about making money." As a first-generation college student who relied on scholarships, loans, and work to attend a great university, I was lucky to have pieced it together to chart my path without even adequate career-counseling advice. This professor might need to sit in on some economics classes to understand that higher ed is, in fact, all about career opportunities for most scholars: not only to inform and motivate students to apply their own skills, but to rationalize the cost of the education itself, with a growing share subsidized by limited taxpayer and endowment resources. This does not exclude any student's ability to learn, analyze, and appreciate life broadly.
Tom Pappas
Dartmouth
Physicist and jazz pianist combines music and science at Rochester – University of Rochester
Posted: May 7, 2021 at 3:58 am
May 6, 2021
As an undergraduate and later graduate student at the University of Rochester, Philippe Lewalle '14, '21 (PhD) has played piano at the College's music and physics department commencement ceremonies.
This year will be different, though: he will also be the one graduating, with a PhD in physics, during spring commencement ceremonies, May 14 to 16 and 20 to 23.
Visit the Class of 2021 site for details about this year's Commencement ceremonies and for a downloadable toolkit of materials to share your support on social media.
The child of parents who are both violinists and academics, Lewalle began playing piano at age 7 and was drawn to Rochester because the University offered the possibility to combine his love of music and interest in science, earning dual degrees in music and physics at the School of Arts & Sciences.
"I came to Rochester in large part because it was feasible to double major in physics and music," he says. "Even though I was working a lot of long days as an undergrad, it always felt refreshing having two very different types of homework on my platter. If I got tired of one, I could always switch to the other."
The summer after his sophomore year, Lewalle had the opportunity to work with Joseph Eberly, the Andrew Carnegie Professor of Physics, conducting research on quantum optics. The research would ultimately set the direction of his graduate work and PhD dissertation.
"That summer definitely shaped the trajectory I took later," Lewalle says. "The research I conducted as an undergrad ended up relating a lot to my PhD work."
As a graduate student, he worked with physics professor Andrew Jordan; his specific research focus was on tracking quantum systems in real time, a process that is intrinsically invasive, changing the system state itself as it is monitored, and on the odd things that happen when such systems are disturbed. The research is important not only for better understanding fundamental quantum mechanics, but also for improving quantum technologies such as quantum computers.
Throughout his time at Rochester, Lewalle continued to play music with a variety of musicians at venues in the city and took advantage of the musical opportunities offered by the Rochester community.
"The pool of talent that comes through Eastman is really motivating and inspiring," Lewalle says. "I have benefited a lot from playing with so many talented musicians throughout my time here, in addition to attending a number of the great performances, whether at Jazz Fest, student recitals, or other concerts that are held on a regular basis at Eastman and venues around the city."
Lewalle's own musical interests include contemporary jazz and its intersections with hip-hop and free improvisation. One project he played in, called Claude Bennington's Fever Dream, involved a synthesis of hip-hop and jazz that "took a jazz rhythm section and instead of horn players, put rappers out front," he says. "We were learning beats and treating them like jazz tunes, improvising on and around them and sometimes venturing freely away from the written material." Recordings from the project are available on Bandcamp and YouTube.
Although many of his music projects have stalled due to the COVID-19 pandemic, Lewalle has continued to work on his own compositions and record and play when possible and safe, even while finishing his physics PhD thesis.
"It's been difficult during COVID because, especially with jazz improv, you really feed on the energy of the crowds and the immediate interactions between musicians," he says. "It's not the same without that."
In July, Lewalle will start a new chapter as he travels across the country to begin a postdoctoral research appointment, studying quantum mechanics in the group of K. Birgitta Whaley at the University of California, Berkeley.
On the marvels of physics | symmetry magazine – Symmetry magazine
Clifford Johnson, a theoretical physicist at the University of Southern California, is an accomplished scientist working on ways to describe the origin and fabric of the universe.
He is also a multitalented science communicator and one of the rare scientists who can boast their own IMDb page.
Johnson's efforts to engage the public with science have spanned blogging, giving public lectures, appearing on television and web shows, writing and illustrating a graphic novel, and acting as a science advisor for television shows and blockbuster films, including Star Trek: Discovery and Avengers: Endgame.
In the spirit of his 2017 popular science book The Dialogues, I hopped on a Zoom call with Johnson for a dialogue of my own. What follows is an edited version of our conversation about how and why he came to study quantum physics, why he decided to create a graphic novel about science, the ups and downs of Hollywood consulting, and why public engagement with science matters.
From a very early age, I was asking questions about how the world works and trying to figure out how things worked by tinkering with old radios and things. Then at some point, I learned that there's a career where you can make a living from that sort of curiosity: being a scientist.
And then some family friend asked me what kind of scientist I wanted to be. I didn't realize there were different kinds. So I found a dictionary and I went through page by page and read the definitions of chemist, biologist, all of the -ists and -ologists. And when I hit physicist, I thought, this is the one, because the entry said that physics underlies all the other sciences, which appealed to me because I wanted to keep my options open.
I got interested in particle physics reading authors such as Paul Davies and Abraham Pais as a teenager. And then in my undergraduate studies at Imperial College, I began to learn about the issues of trying to quantize gravity, which led me to study string theory for my PhD at the University of Southampton. The universe really does seem to be fundamentally quantum mechanical. So, it's a real problem if we don't know quantum mechanically how to understand gravity, spacetime and where the universe comes from.
I've been doing outreach in a way since I was 8 or 10 years old. I was that annoying kid who was always explaining things. In school, people would call me "the professor." Everyone thought they were giving me a hard time, but secretly I thought it was an awesome nickname.
Outreach, for me, is a natural part of being a scientist. Research is all about the story of how things work and where they came from. And what's the point of knowing the story, if you can't also get other people excited about it? If someone wants to know, I'm going to tell them. I got reasonably good at explaining things in a coherent way. Word got around, and I started presenting on radio and TV.
Sometimes, people would get in touch from the media because of something they read on my blog. I co-founded a blog called Cosmic Variance with four other physicists in 2005, and also started a solo blog called Asymptotia in 2006. I'd write about interesting ideas and what was going on in research, as well as my other interests and day-to-day life. Blogging created communities where people would engage in conversation and we'd have great discussions, and then that would encourage us to write more.
It is very frustrating to me that science is often portrayed as a special thing done by a special group of people. It is a special thing, but anyone can be involved, and everyone should be involved. I often say that science should be put back into the culture where it belongs.
Public outreach is important because a lot of people think they wouldn't understand scientific issues, and so they leave it to a small group of people to make decisions. And that's not democratic. We aren't a democracy if people aren't more familiar and comfortable with science and the people who do science.
I agonized over writing a book for the general public for a long time because I didn't think there was any urgency to write one of the standard kinds of books that get written by people in my field. Not that there's anything wrong with those books. But I thought that if we could break out of the narrow mold of how popular science books are supposed to be, we could reach so many more people.
Though I was a comic book fan from a young age, I essentially snuck up on the graphic-novel concept backwards. The ratio between prose and illustration changed as I began to conceptualize what I really wanted to be able to do with the book. The illustration aspect began to eat the prose aspect and became a narrative in its own right. And then I realized it was going to be a graphic novel. Writers often say that you try to create the book that you want to see in the world, so I did, and I even took the time out to teach myself to draw at the level needed to do it.
In all graphic novels, spacetime is created by the reader. When you're looking at a series of comic panels, your mind constructs how space and time come alive on the page. So what better medium to talk about physics, the subject that is about spacetime, than graphic novels? I could take advantage of the medium to illustrate ideas, like arranging panels to swirl into the interior of a black hole and mess up their order to convey how space and time get messed up there.
Yes. The plan is to do a new set of dialogues. Unfortunately, I'm still working on the time machine in the basement so I can manufacture more hours in the day. Sooner or later, I'll get it to work.
Most of the work is not the glamorous, sitting-around-chatting-with-Spielberg kind of thing that people envision. There's no industry standard for science consulting. The work can be anything from a writer getting in touch with me and asking if I'll take a look at a script, or if I'll talk with them about an idea they have. Or the directors call consultants in at the end and ask us to fix something before they start shooting, although by then it's usually too late for a good conversation.
If the science is going to be part of the DNA of the story, then it's best if conversations happen early. The best stuff happens when there's an environment where science can be an inspiration at the writing stage. For the Avengers: Endgame and Infinity War movies, one of the smart things the filmmakers did is that they got in touch early on and then we brainstormed ideas. They did this with other scientists, too, gathering a lot of good material to draw from.
Anywhere from zero to a hundred percent. I have no control over how much. When I give public talks, I talk about the trade-off between how much control you have and the size of the audience you can reach. I have complete control over the content of a public lecture to a few hundred people. I had zero control of what ended up in the final cut of Avengers, with an audience of many millions.
In a few projects I advised on, there were even scenes where I wrote most of the words. I either went over the script and revised the science talk, or the writers left a hole for me to tell them how to say something, and then they used my suggestions verbatim. That's not common, but it happens sometimes.
Overall, the science is more likely to survive all the way to the screen if it's for television, which is more of a writers' medium. In television, the director works for the writers. In film, the writers work for the directors, who may or may not care about the science content.
Season two of the show Agent Carter is a great model of how things between TV writers and science consultants are supposed to work. Entire characters and storylines on the show were invented based on things we brainstormed together in the writers' room. A few times, I sketched an idea of what a machine might look like and they just went away and built the machine for the set!
Another project where I was involved very early on was the first season of National Geographic's series Genius, about the life and work of Einstein. Not only did I teach the writers a lot about relativity, but I helped pick pieces of science that they could unpack thematically for episodes and helped them write scenes so that the science could really be on show.
Maybe most importantly, they took seriously my encouragement to show Einstein discussing his ideas with others around him, to help break the lone-genius mythology that often drives people away from thinking they can be scientists.
Some people get hung up on getting all the facts right, but I'd rather focus on things like representing the scientific process correctly, as opposed to making it seem like magic: representing the thought processes and the people doing those thought processes.
I care about whether the scientists are portrayed like real people with narratives that help you relate to them and understand them. When I'm working with artists and media people creating images of scientists, I encourage them to make those people more real, make them more accessible, show that they're human beings.
I think the most important skill to learn is dealing with interruption and knowing how to put something on hold and then come back to it. I've gotten better at doing a lot of stuff in my head in preparation for that short time I'm going to have where I will be able to sit at my desk and do my physics.
I hope that I am helping to dispel the myth that if you're good at outreach, it means that you're not good at, or not interested in, research at the highest level. That's often used to discourage people from spending time on outreach and engagement, or as an excuse to dismiss people of color or women in the field. The fact that I have been very successful at research and teaching and also science outreach shows that it is possible to be a significant player in both realms.
On the marvels of physics | Symmetry magazine