What’s the point: Ansible, Datadog, Amazon, Lens, Rust, and DeepMind DEVCLASS – DevClass

The team behind Red Hat's IT automation tool Ansible is on track for the 2.10 release on September 22nd and has just finished work on the base component for the upcoming version. Ansible 2.10 is the first to split the Ansible engine, which is made up of some core programs (ansible-galaxy, ansible-test, etc.), a subset of modules and plugins, and some documentation, into a separate ansible-base repository.

The rest of the plugins and modules have been moved into a variety of collections, a format for bundling Ansible artifacts. Collections are developed and updated independently, with select ones bundled with ansible-base to form the final Ansible package. To make sure moved components won't break existing setups, Ansible 2.10 ships with appropriate routing data.

At Datadog's yearly user conference last week, the monitoring company introduced some additions to its portfolio that are well worth a look. One of the most sought-after enhancements seems to be the Datadog mobile app for iOS and Android devices. The application is meant to provide on-call workers with dashboard and alert access. It also lets users check the new Incidents UI, which grants a central overview of the state of all incidents. Other enhancements to the Datadog platform include investigation dashboards and threat intelligence for Security Monitoring, as well as compliance monitoring.

A good eight months after introducing devs interested in quantum computing to its Braket service, AWS has decided it's time to make it generally available. The product aims to support researchers by providing a development environment to explore and build quantum algorithms, test them on quantum circuit simulators, and run them on different quantum hardware technologies. Amazon Braket comes packed with pre-built quantum computing algorithms, though implementing some from scratch is promised to be an option as well, along with simulators for testing and troubleshooting different approaches.
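The core workflow Braket offers (build a circuit, run it on a simulator, inspect measurement counts) can be sketched without the AWS SDK at all. The NumPy toy below prepares and samples a Bell state, which is roughly what a managed circuit simulator does under the hood; the gate names and workflow here only loosely mirror Braket's actual API.

```python
import numpy as np

# Single-qubit Hadamard and the two-qubit CNOT, as matrices.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Start in |00>, apply H on qubit 0, then CNOT -> Bell state (|00>+|11>)/sqrt(2).
state = np.zeros(4)
state[0] = 1.0
state = np.kron(H, np.eye(2)) @ state
state = CNOT @ state

# Sample measurement outcomes, as a circuit simulator would for "shots=1000".
rng = np.random.default_rng(0)
probs = np.abs(state) ** 2
shots = rng.choice(4, size=1000, p=probs)
counts = {f"{k:02b}": int((shots == k).sum()) for k in set(shots)}
print(counts)  # only '00' and '11' appear, in roughly equal numbers
```

The correlated outcomes (never '01' or '10') are exactly the kind of result one checks on a simulator before paying for runs on real quantum hardware.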

Mirantis, recent home of Docker Enterprise, has continued its cloud-native acquisition journey by buying the Kubernetes integrated development environment Lens from its authors. Lens is an MIT-licensed project that launched in March 2020 and runs on macOS, Windows, and Linux. It was originally developed by Kontena, whose team also became part of Mirantis earlier this year. In its announcement, Mirantis promised to keep Lens free and open source and to invest in the tool's future development.

Lovers of the programming language Rust might have started to worry given the string of Mozilla layoffs announced last week. The language team therefore took to Twitter to assure users that Rust isn't in existential danger, promising to share more information on the topic in the coming weeks.

Developers working with just-in-time compiler JAX in their machine learning projects can now add two more helpers to their open-source toolbelt. Optax and Chex both stem from Google's DeepMind team and are meant to support users in properly using JAX, which, funnily enough, is also a Google research project.

Chex includes utilities to instrument, test, and debug JAX code in order to make it more reliable. Meanwhile, Optax was dreamt up to provide simple, well-tested, efficient implementations of gradient processing and optimisation approaches. Both projects can be found on GitHub under an Apache-2.0 license.
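Optax's design centres on pairs of init/update functions that transform gradients before they are applied. A dependency-free toy in that spirit (the names `Transform` and `sgd_momentum` are illustrative, not Optax's actual API) looks like this:

```python
from typing import Callable, NamedTuple

class Transform(NamedTuple):
    # Optax-style pair: init builds optimizer state, update rewrites gradients.
    init: Callable
    update: Callable

def sgd_momentum(lr: float, decay: float = 0.9) -> Transform:
    def init(params):
        return {k: 0.0 for k in params}          # one velocity per parameter
    def update(grads, state):
        new_state = {k: decay * state[k] + g for k, g in grads.items()}
        updates = {k: -lr * v for k, v in new_state.items()}
        return updates, new_state
    return Transform(init, update)

# Minimise f(x) = x^2 (gradient 2x) for a single parameter "x".
opt = sgd_momentum(lr=0.1)
params = {"x": 3.0}
state = opt.init(params)
for _ in range(200):
    grads = {"x": 2.0 * params["x"]}
    updates, state = opt.update(grads, state)
    params = {k: params[k] + updates[k] for k in params}
print(abs(params["x"]) < 0.01)  # True: converged near the minimum of x**2
```

The point of the init/update split is that transformations compose: clipping, momentum, and learning-rate schedules can be chained without the optimizer ever touching the model itself.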

U.S. continues on economic road to recovery under Trump – Boston Herald

In less than two months, Americans will choose a president for the next four years. If your vote is based on which candidate can rebuild our economy, the choice is clear.

Our economy is roaring back from the depths of the pandemic because President Trump's pro-growth economic agenda over the last four years laid the groundwork.

On Sept. 4, the Department of Labor announced that 1.4 million jobs were created since April. The national unemployment rate fell to 8.4%, a 6.3-percentage-point improvement during that period. These results exceeded the expectations of economists and even the most bullish Wall Street analysts. Reflecting confidence in the economy's recovery, the stock markets have traded at record highs since the nationwide economic closures that began in March.

Under Trump, the Republican Senate and the then Republican-controlled House passed the most comprehensive tax cuts and tax reform legislation in a generation. The Tax Cuts and Jobs Act of 2017 reduced the corporate tax rate from 35% to 21%. It also provided valuable incentives for manufacturers and small businesses, including restaurants, to hire more employees, and allowed business owners to write off investments in new equipment and tools for their businesses.

One of the president's earliest directives was to mandate that for every new regulation, two old regulations must be eliminated. In Trumpian style, the president's team actually exceeded his own initial directive and eliminated 22 regulations for every new regulation issued. According to the Council of Economic Advisers, Trump deregulation has reduced the regulatory burden on our economy by nearly $50 billion and helped American families save at least $3,100 each year.

Since the pandemic struck, the president's economic leadership has also been bold and decisive. For example, the Pledge to America's Workers and the White House Initiative on Industries of the Future are centered on jumpstarting high-tech job training and bolstering American dominance in transformational industries such as 5G wireless broadband, quantum computing, and artificial intelligence. These are the sectors that will determine long-term American leadership of the global economy.

But as our nation continues the transition from pandemic to sustained economic recovery, the contrast between Trump's optimistic, pro-worker jobs agenda and former vice president Joe Biden's embrace of indefinite quarantine and economic closure is clear. During the Democratic presidential primary, Biden, who was trailing in enthusiasm among Democratic activists, raced to embrace the Green New Deal championed by Rep. Alexandria Ocasio-Cortez of New York.

Included in the Green New Deal is a fracking ban that would eliminate hundreds of thousands of energy, manufacturing, and construction jobs in Pennsylvania, Ohio, and other states. Biden won't even renounce the Green New Deal's mandate to eliminate U.S. commercial airlines within a decade. This would further devastate already-suffering high-skilled union jobs in the aviation, aerospace manufacturing, and hospitality sectors. According to recent studies, the demise of American aviation alone would cost us 1.6 million jobs and a 1% decline in our gross domestic product.

At the end of the day, actions speak louder than words. Progressives and media naysayers scoffed at the Trump administration's vision for economic growth during the darkest days of the pandemic. Despite the doomsday projections of sustained economic depression, Trump's economic platform of tax cuts, deregulation, and limited government has been rocket fuel for America's coronavirus recovery.

On the flip side, the former vice president would undermine our economy and put American workers back on the ropes.

Joseph Lai served as White House special assistant for legislative affairs from 2017 to 2019.

Meet The Scrappy Space Startup Taking Quantum Security Into Space – Forbes

Loft Orbital is helping take quantum security into space

What do you get when you combine space, lasers, photons, the laws of physics, a Fortune 100 company, the Canadian Space Agency and a scrappy space startup?

The answer, it is hoped, will be a revolution in encrypted communications. Or, at least, the start of one: a mission to test quantum security in space. Why might you want to do that? Let me explain, with the help of a scrappy space startup and a seriously clued-up quantum security boffin.

The Fortune 100 company involved here is Honeywell, the prime contractor for the Canadian Space Agency's Quantum Encryption and Science Satellite mission, QEYSSat. The aim? Quite simply to put space-based quantum key distribution (QKD) to the test. More of that in a moment, but first, let's meet the scrappy space startup.

Loft Orbital is a company that specializes in deploying and operating space infrastructure as a service. Using its Payload Hub technology, Loft Orbital takes a "Yet Another Mission" or YAM approach to payloads, with a hardware and software stack that enables plug-and-play sensors on a standard microsatellite platform.

QEYSSat is, I am informed, the largest contract since Loft Orbital was founded in 2017, by coincidence the same year that the Chinese Academy of Sciences launched a similar QKD program using the Micius satellite.

So, why should you give a rat's behind if it's all been done before? Because, dear reader, QKD is a nascent technology, so every new test program will, almost inevitably, unlock further and valuable information. A few years is a very long time in quantum technology, to bastardize the political idiom.

There are a bunch of differences between the older Micius approach to QKD and the one QEYSSat is taking. For a start, QEYSSat is aiming to be less than 20% the size of the Micius satellite and will leverage commercial technology, hence the involvement of Loft Orbital. Does size matter? You betcha. Reductions in size of that scale should lead to significant savings in both cost and time as far as the next generation of test projects is concerned. Size and mass will also be key, if you'll forgive the pun, as any QKD implementation at scale will demand a large satellite constellation.

Ultimately, if all goes according to plan, QEYSSat could have broad-reaching impacts, as it should prove the capability to deliver QKD over much longer distances than current ground-to-ground tests have managed to date. "This mission will demonstrate game-changing technology with far-reaching implications for how information will be shared and distributed in the future," says Loft Orbital CEO Pierre-Damien Vaujour. "We are honored and thrilled to be supporting it."

Time, I think, to bring in my friendly quantum security expert, mathematician and security researcher, Dr. Mark Carney, who you may remember helped me explain why the math says Person Woman Man Camera TV made such a lousy password. Dr. Carney has a particular interest in quantum key distribution threat modeling, so makes the ideal guide to what we can expect, or not, from the QEYSSat mission.

"There are four ways quantum affects security," Dr. Carney begins, "quantum computers break classical algorithms, post-quantum algorithms try to get around this by using harder math problems in classical crypto, quantum algorithms can be used to accelerate decisions (popular in quantum finance, but nobody in infosec has really looked at what algorithms can help where), and QKD, that uses quantum effects to do cryptography, bypassing the need for 'mathematical crosswords' altogether."

Still with me? Good. Because it gets a little more complicated from this point on.

The algorithms that drive QKD are relatively old; the most popular and well-established, BB84 and E91, work in broadly the same way.
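For the curious, the key-sifting stage shared by BB84-style protocols is easy to sketch classically. This is a toy with no photons, eavesdropper, or error correction (all names are illustrative): Alice picks random bits and bases, Bob picks random bases, and only positions where the bases happen to match contribute to the shared key.

```python
import random

def bb84_sift(n_bits: int, seed: int = 1):
    """Toy BB84 sifting: no eavesdropper, no error correction."""
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n_bits)]
    alice_bases = [rng.choice("XZ") for _ in range(n_bits)]
    bob_bases   = [rng.choice("XZ") for _ in range(n_bits)]
    # Bob's measurement: correct bit if bases match, a random bit otherwise.
    bob_bits = [b if ab == bb else rng.randint(0, 1)
                for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
    # Publicly compare bases, keep only the matching positions.
    key_a = [b for b, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
    key_b = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases) if ab == bb]
    return key_a, key_b

key_a, key_b = bb84_sift(64)
assert key_a == key_b   # sifted keys agree when nobody eavesdrops
print(len(key_a), "shared key bits from 64 photons")  # roughly half survive sifting
```

The quantum part, which this sketch deliberately omits, is that an eavesdropper measuring in the wrong basis disturbs the photon states and shows up as errors in the sifted key.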

"Because regular cryptography goes over regular networks, it is fully error corrected," Dr. Carney says, "the security is in the underlying math. As such, it can be packet-switched without any consequence."

What has all this got to do with QKD in space? I'm getting there, and so is Dr. Carney. "The problem with QKD is that packet switching is somewhere between very very hard and basically impossible," he says, "because unlike the security of classical crypto being in the math, the security of QKD is in the physical photon state."

Time to get your "just accept this at face value" head screwed on: if you observe a photon, the quantum effects you are using disappear, and you may as well just use classical crypto, because it is much better at being transmitted in the clear.

So, if not packet switching, then what? "You need a direct fiber link to do light photon-based QKD between every single endpoint you want to exchange a key with," Dr. Carney explains. One major manufacturer of QKD fiber solutions produces building-to-building link equipment so that the internal security of the network is the only concern of the QKD keys produced. "This is where satellites turn out to be really handy," says Dr. Carney, "send up one satellite, and have a load of users communicate with that, and no need to build dozens or hundreds of fiber links."
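Dr. Carney's point about fiber links is simple combinatorics: n endpoints need n(n-1)/2 dedicated point-to-point links to pair everyone up, whereas one satellite needs only one ground terminal per endpoint. A quick sanity check:

```python
from math import comb

# Dedicated point-to-point fiber links needed vs. one ground terminal each.
for n in (10, 50, 200):
    print(f"{n} endpoints: {comb(n, 2)} fiber links vs {n} ground terminals")
```

At 200 endpoints that is 19,900 fiber links versus 200 ground terminals, which is why a single satellite in the middle changes the economics so dramatically.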

If you have a laser array and a laser receiver, you can send pulses of photons up to satellites and still do QKD, albeit with higher error rates due to atmospheric diffusion of light that cannot be avoided. Dr. Carney will come back to that shortly, I'm sure.

"Another advantage of space is that you don't need fiber repeaters," he says, "and for distances of over 14km, single fiber connections get kind of useless." There are fiber repeater network designs for QKD, but these are not necessarily immune to tampering, so breaking the trust modeling according to Dr. Carney.

"I mentioned error and atmospheric dispersion on uplink before," Dr. Carney reminds me, as much as bad weather doesn't actually affect cloud computing, cloud cover certainly affects QKD! Dispersion on the way down is also an issue, and targeting your downlink comms is also hard."

It turns out that getting the aperture of that link down to a minimum seems like a tough problem. "I don't think the calculations are favorable if your downlink laser disperses over a broad area," Dr. Carney adds, "Eve would just have to plant a small mirror on your fence or carefully park another satellite quietly next to yours," to break the threat model once more.

Dr. Carney is of the opinion that "going into space solves a few problems, but also introduces others." Not least because QKD has a fundamental problem which is hard to solve under any circumstance: all of the security is in the physicality of the system. "One foot wrong," Dr. Carney says, "and you can fail pretty badly very quickly."

As for the Chinese Micius program and what it taught us about QKD in space, the latest I heard was a June 2020 paper published in Nature that demonstrated "entanglement-based QKD between two ground stations separated by 1,120 kilometers at a finite secret-key rate of 0.12 bits per second, without the need for trusted relays." That paper claims the methods used increased the on-the-ground secure distance tenfold and raised the "practical security of QKD to an unprecedented level."

And what of Loft Orbital, which seems to think that this new QKD technology should be available to the private sector, and adopted at scale, in the 2030s? Dr. Carney doesn't have a problem with that as a date for adoption, given that Loft Orbital is demonstrating how microsats are getting ever easier to launch.

"Adopted at scale," he says, "this is I think the kicker. There seem to be a lot of variables in the mix that don't have easy engineering solutions. Unless you are launching a satellite per region and getting decent coverage with superb bandwidth to mitigate issues such as cloud cover, it's hard to see how the cost viability is maintained."

One thing is for sure, this is a move forward, and it will be interesting to see where all this takes us. Especially with "private equity making investments that heretofore were only really of interest and in reach of nation-states," Dr. Carney concludes.

Former Intel exec to be new CEO of Semiconductor Research Corporation – WRAL Tech Wire

DURHAM – A former Intel Corporation executive has been appointed as president and CEO of Semiconductor Research Corporation (SRC), a global semiconductor research consortium based in Durham.

Todd Younkin, currently executive director of SRC's Joint University Microelectronics Program (JUMP), replaces Ken Hansen, who is retiring after leading SRC for the past five years. Younkin will start transitioning to his new role on August 18.

"I am honored to lead SRC, a one-of-a-kind consortium with incredible potential and exceptionally talented people," Younkin said in a statement. "Together, we will deliver on SRC's mission to bring the best minds together to achieve the unimaginable. SRC is well-positioned to meet our commitment to SRC members, employees, and stakeholders by paving the way for the semiconductor industry. Our strong values, unique innovation model, and unflinching commitment to our members are core SRC principles that we will maintain as we move forward."

Todd Younkin

Prior to SRC, Younkin held senior technical positions at Intel Corporation. Among other roles, he was an assignee to IMEC, an international semiconductor research and development hub, where he worked closely within the consortium to help move Extreme Ultraviolet Lithography (EUVL) into commercialization.

He holds a Ph.D. from the California Institute of Technology and a Bachelor of Science from the University of Florida.

"The challenges facing the semiconductor industry today are as exciting and demanding as ever before," said Gil Vandentop, SRC Chairman of the Board, in a statement. "At the same time, AI, 5G+, and Quantum Computing promise to provide unfathomable gains and benefits for humanity. The need for research investments that bring these technology advances to bear is paramount. Todd has demonstrated an ability to bring organizations together, tackle common research causes, and advance technologies into industry. He has a clear vision to take SRC to the next level. I am delighted that Todd has accepted this challenge and will become the next SRC CEO."

Spain Introduces the World’s First Quantum Phase Battery – News – All About Circuits

By now, we're no stranger to the quantum computing hype. When (or rather, if) they are successfully developed and deliver on their promised potential, quantum computers will be able to solve problems that would otherwise require hundreds or thousands of years for today's classical computing technology.

In what could be a massive step for quantum computing, researchers from the University of the Basque Country claim to have developed the world's first quantum phase battery.

Today, batteries are ubiquitous, with lithium-ion batteries being the most common, although alternatives exist. These batteries convert chemical energy into a voltage that can power an electronic circuit.

In contrast, quantum technologies feature circuits based on superconducting materials through which a current can flow without an applied voltage, negating the need for classic chemical batteries. In quantum technologies, the current is induced by a phase difference of the wave function of the quantum circuit, related to the wave nature of matter.

A quantum device that can provide a persistent phase difference can be used as a quantum phase battery and induce supercurrents in a quantum circuit, powering it.
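In Josephson-junction terms (my gloss, not the article's wording), an ordinary superconducting weak link carries a supercurrent set by the phase difference across it, and a phase battery adds a persistent built-in offset \(\varphi_0\):

```latex
% Ordinary Josephson junction: zero supercurrent at zero phase difference.
I = I_c \sin(\varphi)

% Anomalous (\varphi_0) junction, the basis of a phase battery:
% a finite supercurrent can flow even when \varphi = 0.
I = I_c \sin(\varphi - \varphi_0)
```

The device described below is, in effect, hardware that supplies and maintains that \(\varphi_0\) offset for a quantum circuit.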

This is what the researchers set out to achieve: creating such a quantum device, building on an idea first conceived in 2015 by Sebastian Bergeret from the Mesoscopic Physics Group at the Materials Physics Center. Together with Francesco Giazotto and Elia Strambini from the NEST-CNR Institute in Pisa, the group claims to have built the world's first functional quantum phase battery.

Bergeret and Tokatly's idea, in short, involves a combination of superconducting and magnetic materials with an intrinsic relativistic effect known as spin-orbit coupling. Building on this idea, Giazotto and Strambini identified a suitable material combination that allowed them to fabricate their quantum phase battery.

Their quantum phase battery consists of an n-doped indium arsenide (InAs) nanowire, which forms the core of the cell (also known as the pile), while aluminum superconducting leads act as the poles. The battery is charged by applying an external magnetic field, which can then be turned off.

If quantum batteries are ever to be realized, they could bring significant benefits over their chemical cousins. Among other things, quantum batteries could offer vastly better thermodynamic efficiency and ultra-fast charging times, making them perfect for next-gen applications like electric vehicles.

London’s PQShield raises £5.5 million seed to develop security solutions that match the power of quantum computing – Tech.eu

PQShield, a London-based cybersecurity startup that specialises in post-quantum cryptography, has come out of stealth mode with a £5.5 million seed investment from Kindred Capital, Crane Venture Partners, Oxford Sciences Innovation and angel investors including Andre Crawford-Brunt, Deutsche Bank's former global head of equities.

According to the startup, quantum computers pose an unprecedented problem for security, since they will be able to smash through traditional public-key encryption and threaten the security of all sensitive information, past and present. For that reason, the company is developing quantum-secure cryptography: advanced solutions for hardware, software, and communications that resist the quantum threat yet still work with today's technology.

"Whether cars, planes or other connected devices, many of the products designed and sold today are going to be used for decades. Their hardware may be built to last, but right now, their security certainly isn't. Future-proofing is an imperative, just as it is for the banks and agencies that hold so much of our sensitive data," explains founder and CEO Dr. El Kaafarani.

The team, a spin-out from Oxford University, is already working on commercialisation and roll-out. Its System on Chip (SoC) solution, built fully in-house, will be licensed to hardware manufacturers, while a software development kit will enable the creation of secure messaging apps protected by post-quantum algorithms. Bosch is already a customer.

Quantum Computing Technologies Market with Sales, Demand, Consumption and strategies 2025 – Cole of Duty

ORBIS RESEARCH has recently announced its Global Quantum Computing Technologies Market report, with critical analysis of the current state of the industry, demand for the product, the investment environment, and existing competition. The report is a focused study of various market-affecting factors and a comprehensive survey of the industry, covering major aspects such as product types, applications, top regions, growth analysis, market potential, challenges for investors, opportunity assessments, major drivers, and key players.

Request a sample of this report @ https://www.orbisresearch.com/contacts/request-sample/4696468

The Global Quantum Computing Technologies Market report provides a detailed analysis of global market size, regional and country-level market size, segmentation market growth, market share, competitive Landscape, sales analysis, impact of domestic and Global Quantum Computing Technologies Market players, value chain optimization, trade regulations, recent developments, opportunities analysis, strategic market growth analysis, product launches, area marketplace expanding, and technological innovations.

Key vendors/manufacturers in the market:

The major players covered in Quantum Computing Technologies are: Airbus Group, Intel Corporation, Google Quantum AI Lab, Cambridge Quantum Computing, Alibaba Group Holding Limited, IBM, Nokia Bell Labs, Microsoft Quantum Architectures, and Toshiba.

Browse the complete report @ https://www.orbisresearch.com/reports/index/global-quantum-computing-technologies-market-2020-by-company-regions-type-and-application-forecast-to-2025

Competitive Landscape and Global Quantum Computing Technologies Market Share Analysis

The competitive landscape provides details by vendor, including company overview, total revenue (financials), market potential, global presence, Quantum Computing Technologies sales and revenue generated, market share, price, production sites and facilities, SWOT analysis, and product launches. For the period 2015-2020, this study provides the Quantum Computing Technologies sales, revenue, and market share for each player covered in the report.

Global Quantum Computing Technologies Market By Type:

By Type, the Quantum Computing Technologies market has been segmented into: Software, Hardware.

Global Quantum Computing Technologies Market By Application:

By Application, Quantum Computing Technologies has been segmented into: Government, Business, High-Tech, Banking & Securities, Manufacturing & Logistics, Insurance, and Other.

Regions and Countries Level Analysis

Regional analysis is another highly comprehensive part of the research and analysis study of the Global Quantum Computing Technologies Market presented in the report. This section sheds light on the sales growth of different regional and country-level Quantum Computing Technologies markets. For the historical and forecast period 2015 to 2025, it provides detailed and accurate country-wise volume analysis and region-wise market size analysis of the global Quantum Computing Technologies market.

The report offers an in-depth assessment of the growth and other aspects of the Quantum Computing Technologies market in important countries (regions), including: North America (United States, Canada and Mexico), Europe (Germany, France, UK, Russia and Italy), Asia-Pacific (China, Japan, Korea, India, Southeast Asia and Australia), South America (Brazil, Argentina, Colombia), and the Middle East and Africa (Saudi Arabia, UAE, Egypt, Nigeria and South Africa).

Make an enquiry before buying this report @ https://www.orbisresearch.com/contacts/enquiry-before-buying/4696468

About Us :

Orbis Research (orbisresearch.com) is a single point of aid for all your market research requirements. We have a vast database of reports from leading publishers and authors across the globe. We specialize in delivering customized reports per the requirements of our clients. We have complete information about our publishers and hence are sure about the accuracy of the industries and verticals of their specialization. This helps our clients map their needs, and we produce the perfect required market research study for them.

Contact Us :

Hector Costello
Senior Manager, Client Engagements
4144N Central Expressway, Suite 600, Dallas, Texas 75204, U.S.A.
Phone No.: +1 (972)-362-8199; +91 895 659 5155

First master’s thesis in Quantum Computing defended at the University of Tartu – Baltic Times

On Tuesday, 2 June, Mykhailo Nitsenko, a student at the University of Tartu Institute of Computer Science, defended his thesis "Quantum Circuit Fusion in the Presence of Quantum Noise on NISQ Devices", the first master's thesis defended in the field of quantum computing at the University of Tartu.

In his thesis, supervised by Dirk Oliver Theis and Dominique Unruh, Mykhailo Nitsenko studied a concept called circuit fusion, which proposes to reduce stochastic noise in estimating the expectation values of measurements at the end of quantum computations. But near-term quantum computing devices are also subject to quantum noise (such as decoherence), and circuit fusion aggravates that problem.

Mykhailo Nitsenko ran thousands of experiments on IBM's cloud quantum computers and used Fourier analysis techniques to quantify and visualise noise and the resulting information loss.

According to Mykhailo Nitsenko, before he enrolled at the University of Tartu he had a strong opinion that quantum computing was an abstract idea that we would never be able to use or even implement. "I just could not imagine how it is even possible to do computations on things without directly observing them. The quantum computing class showed me how it is done, and it became apparent to me that it is something I want to dedicate my academic efforts to," said Nitsenko.

"If you don't want to wait for fault-tolerant quantum computers, you may endeavour to use the noisy quantum computing devices that can be built already now. In that case, researching the effects of quantum noise on computations becomes important: these effects must be mitigated," said Dirk Oliver Theis, Associate Professor of Theoretical Computer Science at the University of Tartu Institute of Computer Science. Theis added that he had expected the mathematics Mykhailo Nitsenko implemented in his thesis to help us understand some aspects of quantum noise, which can be devastating to quantum computations, rendering the result pure gibberish.

In near-term quantum computing, one tries to run quantum circuits which are just short enough so that the correct output can somehow be reconstructed from the distorted measurement results. But quantum noise affects the results of computations on near-term quantum computers in complicated ways. "In the mathematical approach based on Fourier analysis that Nitsenko implemented, some effects were predictable, such as a decrease in the amplitudes due to decoherence. What was surprising was that the low frequencies of the quantum noise showed distinct patterns. In future research, this might be exploited to mitigate the effect of quantum noise on the computation," said Theis.
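The Fourier-analysis idea can be illustrated with a deliberately simple model (my construction, not the thesis's actual setup): the expectation value of a one-parameter circuit traces a cosine in the parameter, decoherence damps its amplitude, and shot noise adds jitter. An FFT over a parameter sweep makes the amplitude loss directly visible.

```python
import numpy as np

rng = np.random.default_rng(42)
thetas = np.linspace(0, 2 * np.pi, 64, endpoint=False)

ideal = np.cos(thetas)        # noiseless expectation values E(theta)
damping = 0.6                 # toy decoherence: shrinks the amplitude
noisy = damping * np.cos(thetas) + rng.normal(0, 0.05, thetas.size)  # + shot noise

def fourier_amplitudes(signal):
    """Magnitude of each frequency component, normalised so that cos -> 1.0."""
    return 2 * np.abs(np.fft.rfft(signal)) / signal.size

# The signal lives at frequency 1; decoherence shows up as a smaller peak there.
print(round(fourier_amplitudes(ideal)[1], 6))   # 1.0
print(round(fourier_amplitudes(noisy)[1], 2))   # approximately 0.6, the damping factor
```

The damping factor survives the shot noise because the noise spreads across all frequencies, while the signal is concentrated in one bin; the thesis's surprise was that real quantum noise is not flat but structured at low frequencies.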

This year, the Information Technology Foundation for Education (HITSA) granted funding to the University of Tartu Institute of Physics to continue and increase the training and research in the field of quantum computing at the university. With the support of this funding, new interdisciplinary courses focusing on quantum programming will be created.

This Is the First Universal Language for Quantum Computers – Popular Mechanics

A quantum computing startup called Quantum Machines has released a new programming language called QUA. The language runs on the startup's proprietary Quantum Orchestration Platform.

Quantum Machines says its goal is to complete the stack that includes quantum computing at the very bottom-most level. Yes, those physical interactions between quantum bits (qubits) are what set quantum computers apart from traditional hardware, but you still need the rest of the hardware that will turn physical interactions into something that will run software.

And, of course, you need the software, too. That's where QUA comes in.

"The transition from having just specific circuits (physical circuits for specific algorithms) to the stage at which the system is programmable is the dramatic point," CEO Itamar Sivan told TechCrunch. "Basically, you have a software abstraction layer and then you get to the era of software, and everything accelerated."

The language Quantum Machines describes in its materials isn't what you think of when you imagine programming, unless you're a machine language coder. What's machine language? That's the lowest possible level of code, where the instructions aren't in natural or human language but are instead tiny bits of direct instruction for the hardware itself.

Coder Ben Eater made a great video that walks you through a sample program written in C, which is a higher and more abstract language, and how that information translates all the way down into machine code. (Essentially, everything gets much messier and much less readable to the human eye.)
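Python offers a convenient way to peek at the same layering from the comfort of an interpreter: the `dis` module prints the bytecode instructions behind ordinary source code. Bytecode is an analogy to machine code rather than the real thing, but the flavour is the same.

```python
import dis

def add(a, b):
    return a + b

# Show the interpreter's low-level instruction listing for add().
# Exact opcode names vary by Python version (e.g. BINARY_ADD vs BINARY_OP).
dis.dis(add)
```

Even for a one-line function, the listing exposes explicit load and operate steps that the high-level syntax hides, which is exactly the gap a quantum "machine code" like QUA's target layer has to fill.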


Machine code is a reminder that, on a fundamental level, everything inside your computer is passing nano-Morse code back and forth to produce everything you see on the screen, as well as all the behind-the-scenes routines and coordination. Since quantum computers introduce a brand-new paradigm for hardware itself, there's an opening for a new machine code.
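Python's standard-library dis module gives a quick, hands-on version of this layering. It shows bytecode rather than true machine code, but the translation from readable source to a stream of terse low-level instructions is analogous:

```python
import dis

def add(a, b):
    return a + b

# Print the low-level instruction stream the Python interpreter
# actually executes for this one-line function.
dis.dis(add)
```

Each printed row is a single primitive operation (load a value, add, return), the same shape of output Ben Eater walks through for compiled C, just one level higher in the stack.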

Quantum Machines seems to want to build the entire quantum system, from the hardware to all the software that controls and exploits it. And if that sounds overly proprietary, or like some unfair way to develop new technology, we have some bad news for you about the home-PC wars of the 1980s, or the market share Microsoft Windows still holds among operating systems.

By offering a package deal with something for everyone while quantum computing isn't even a twinkle in the eye of the average consumer, Quantum Machines could be making inroads that will keep it ahead for decades. A universal language, indeed.

"QUA is, we believe, the first candidate to become what we define as the quantum computing software abstraction layer," Sivan told TechCrunch. In 20 years, we might look back on QUA the way today's users view DOS.


Originally posted here:
This Is the First Universal Language for Quantum Computers - Popular Mechanics

Quantum Physicist Invents Code to Achieve the Impossible – Interesting Engineering

A physicist at the University of Sydney has achieved something that many researchers previously thought was impossible. He has developed a type of error-correcting code for quantum computers that will free up more hardware.

His solution also delivers an approach that will allow companies to build better quantum microchips. Dr. Benjamin Brown from the School of Physics achieved this impressive feat by applying a three-dimensional code to a two-dimensional framework.

"The trick is to use time as the third dimension. I'm using two physical dimensions and adding in time as the third dimension," Brown said in a statement. "This opens up possibilities we didn't have before."

"It's a bit like knitting," he added. "Each row is like a one-dimensional line. You knit row after row of wool and, over time, this produces a two-dimensional panel of material."

Quantum computing is riddled with errors. As such, one of the biggest obstacles scientists face before they can build machines large enough to solve useful problems is reducing these errors.

"Because quantum information is so fragile, it produces a lot of errors," said Brown.

Getting rid of these errors entirely is impossible. Instead, researchers are seeking to engineer a new error-tolerant system where useful processing operations outweigh error-correcting ones. This is exactly what Brown achieved.

"My approach to suppressing errors is to use a code that operates across the surface of the architecture in two dimensions. The effect of this is to free up a lot of the hardware from error correction and allow it to get on with the useful stuff," Brown explained.
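Brown's fault-tolerant scheme is far more sophisticated than anything that fits in a snippet, but the basic trade-off it attacks, spending extra hardware on redundancy to catch errors, shows up even in the simplest classical analogue, a three-bit repetition code (a toy illustration, not the paper's method):

```python
def encode(bit):
    # Spend three physical bits to protect one logical bit.
    return [bit, bit, bit]

def flip(bits, i):
    # Model a single physical error on bit i.
    out = list(bits)
    out[i] ^= 1
    return out

def decode(bits):
    # Majority vote corrects any single bit flip.
    return int(sum(bits) >= 2)

codeword = encode(1)
corrupted = flip(codeword, 0)   # introduce one error
print(decode(corrupted))        # still recovers 1
```

The three-to-one redundancy is exactly the kind of overhead Brown's approach aims to shrink: hardware tied up in error correction is hardware unavailable for "the useful stuff".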

The result is an approach that could change quantum computing forever.

"This result establishes a new option for performing fault-tolerant gates, which has the potential to greatly reduce overhead and bring practical quantum computing closer," said Dr. Naomi Nickerson, Director of Quantum Architecture at PsiQuantum in Palo Alto, California, who is not connected to the research.

Read the original here:
Quantum Physicist Invents Code to Achieve the Impossible - Interesting Engineering

The University of New Mexico Becomes IBM Q Hub’s First University Member – UNM Newsroom

Q Hub membership and new faculty hire will build on existing quantum expertise and investments

Under the direction of Michael Devetsikiotis, chair of the Department of Electrical and Computer Engineering (ECE), The University of New Mexico recently joined the IBM Q Hub at North Carolina State University as its first university member.

The NC State IBM Q Hub is a cloud-based quantum computing hub, one of six worldwide and the first in North America to be part of the global IBM Q Network. This global network links national laboratories, tech startups, Fortune 500 companies, and research universities, providing access to IBM's largest quantum computing systems.

Michael Devetsikiotis, chair, Department of Electrical and Computer Engineering

Mainstream computer processors inside our laptops, desktops, and smartphones manipulate bits: information that can only exist as either a 1 or a 0. In other words, the computers we are used to function through programming that dictates a series of commands with choices restricted to yes/no or if this, then that. Quantum computers, on the other hand, process quantum bits, or qubits, which are not restricted to a binary choice. Through physics concepts such as quantum entanglement, quantum computers can choose if this, then that, or both. This allows them to process information more quickly, and in unique ways, compared with conventional computers.

Access to systems such as IBM's newly announced 53-qubit processor (as well as several 20-qubit machines) is just one of the many benefits of UNM's participation in the IBM Q Hub when it comes to data analysis and algorithm development for quantum hardware. Quantum knowledge will only grow with time, and the IBM Q Hub will provide unique training and research opportunities for UNM faculty and student researchers for years to come.

Quantum computer developed by IBM Research in Zürich, Switzerland.

How did this partnership come to be? Two years ago, a sort of call to arms was sent out among UNM quantum experts, saying now was the time for big ideas because federal support for quantum research was gaining traction. Devetsikiotis' vision was to create a quantum ecosystem, one that could unite the foundational quantum research in physics at UNM's Center for Quantum Information and Control (CQuIC) with new quantum computing and engineering initiatives for solving big real-world mathematical problems.

"At first, I thought [quantum] was something for physicists," explains Devetsikiotis. "But I realized it's a great opportunity for the ECE department to develop real engineering solutions to these real-world problems."

CQuIC is the foundation of UNM's long-standing involvement in quantum research, which led to participation in the National Quantum Initiative (NQI), passed by Congress in 2018 to support multidisciplinary research and training in quantum information science. UNM has been a pioneer in quantum information science since the field emerged 25 years ago, as CQuIC Director Ivan Deutsch knows first-hand.

"This is a very vibrant time in our field, moving from physics to broader activities," says Deutsch, "and [Devetsikiotis] has seen this as a real growth area, connecting engineering with the existing strengths we have in the CQuIC."

With strategic support from the Office of the Vice President for Research, Devetsikiotis secured National Science Foundation funding to support a Quantum Computing & Information Science (QCIS) faculty fellow. The faculty member will join the Department of Electrical and Computer Engineering with the goal of uniting well-established quantum research in physics with new quantum education and research initiatives in engineering. This includes membership in CQuIC and implementation of the IBM Q Hub program, as well as a partnership with Los Alamos National Lab for a Quantum Computing Summer School to develop new curricula, educational materials, and mentorship of next-generation quantum computing and information scientists.

IBM Q Hub at North Carolina State University.

As part of the Q Hub at NC State, UNM gains access to IBM's largest quantum computing systems for commercial use cases and fundamental research. It also allows existing quantum courses to be restructured to be more hands-on and interdisciplinary than they have been in the past, as well as the creation of new courses, a new master's degree program in QCIS, and a new university-wide Ph.D. concentration in QCIS that can be added to several departments, including ECE, Computer Science, Physics and Astronomy, and Chemistry.

"There have been a lot of challenges," Devetsikiotis says, "but there has also been a lot of good timing, and thankfully the University has provided support for us. UNM has solidified our seat at the quantum table and can now bring in the industrial side."

Continued here:
The University of New Mexico Becomes IBM Q Hub's First University Member - UNM Newsroom

1QBit and Canadian health care providers team up to empower front-line clinicians with Health Canada’s first approved AI tool for radiology in the…

Health and technology providers have joined forces to deploy XrAI, a machine learning tool that acts as a co-pilot for clinicians to increase accuracy in detecting lung abnormalities associated with diseases such as COVID-19 infection, pneumonia, tuberculosis, and lung cancer.

VANCOUVER, May 7, 2020 /CNW/ - 1QBit, a global leader in advanced computing and software development, and its partners representing health authorities from East to West, have received funding from the Digital Technology Supercluster to accelerate the clinical deployment of XrAI, the first radiology AI (artificial intelligence) tool to be certified as a Class III Medical Device by Health Canada.

XrAI (pronounced "X-ray") is a machine learning, clinical-decision support tool that improves the accuracy and consistency of chest X-ray interpretation. This tool supports medical teams by identifying lung abnormalities on chest radiographs within the teams' existing clinical workflow, requiring little to no further training. Its analysis capabilities empower clinicians with this information so that they can more effectively manage patients with COVID-19 infections or other respiratory complications, such as SARS, pneumonia, and tuberculosis.

"As a physician, I recognize that trust is the currency with which health systems operate. So we designed XrAI to act as a trusted co-pilot that helps doctors and nurses on the front lines. The tool identifies a lung abnormality and displays this information in terms of a confidence level. This is intuitive to busy clinicians, as it reflects a familiar way in which a radiologist would share their opinion," said Dr. Deepak Kaura, Chief Medical Officer of 1QBit. "We were so impressed by how quickly the Saskatchewan Health Authority mobilized to conduct the clinical trial for XrAI, which had actually been planned for a later date. Equally impressive was Health Canada, whose team was detail oriented, responding diligently and acting effectively to grant us approval."

XrAI received certification as a Class III Medical Device by Health Canada last month, based on rigorous review and the results of a single-blind, randomized control clinical trial. 1QBit trained the algorithm on 250,000 cases taken from more than 500,000 anonymized radiograph images from Canadian health organizations, and open and subscription-based datasets. The data covered a broad spectrum of diseases, across geographically and demographically diverse populations, while the tool's features were designed with input from a broad cross section of physicians and other health care professionals.

"Many physicians recognize the value of machine learning applied to our field. However, we are not willing to sacrifice the scientific rigour upon which medicine and our profession has been built. XrAI is one of the first AI tools that I have seen that has been built and validated with a randomized control trial across multiple physician groups," said Dr. Paul Babyn, Physician Executive of the Saskatchewan Health Authority. "The trust that 1QBit's tool has garnered as a result of its rigorous approach is what I believe has led to such a prompt and positive response from the medical community."

The ability to get XrAI into the hands of clinicians is being accelerated by funding from the Digital Technology Supercluster through its COVID-19 Program. This award is contributing to the implementation costs for partnering health care authorities to deploy the software across their clinical systems, which span hospitals and clinics in British Columbia, Saskatchewan, and Ontario. Microsoft is also providing support for 1QBit as they implement XrAI with their partners.

"XrAI is yet another example of the Supercluster's 'all hands on deck' approach to overcoming the challenges presented by COVID-19. By collaborating closely with health authorities, 1QBit has allowed us to expedite this critical technology to get into the hands of practitioners across the country and contribute to what we expect may be a turning point in the speed at which we identify abnormalities and treat those infected with COVID-19," said Sue Paish, CEO of the Digital Technology Supercluster.

Early on, 1QBit engaged health authorities, front-line health care workers, and technology providers to ensure the roll-out of its technology would be led by physicians. 1QBit's partners include the Saskatchewan Health Authority, the Fraser Health Authority, the First Nations Health Authority, Trillium Health Partners, the Vancouver Coastal Health Authority, the University of British Columbia's Faculty of Medicine as well as The Red Cross. Trans-national implementation of XrAI is now underway and will provide a comprehensive and inclusive elevation of care from West to East, including First Nations, the north, and rural communities, as well as urban centres.

1QBit is continuing to partner with new clinicians and health organizations interested in arming their teams with XrAI to enhance quality of care, and to improve the efficiency of health resources during the current COVID-19 pandemic and beyond.

About 1QBit:

1QBit is a global leader in advanced computing and software development. Founded in 2012, 1QBit builds hardware-agnostic software and partners with companies taking on computationally exhaustive problems in advanced materials, life sciences, energy, and finance. Trusted by Fortune 500 companies and top research institutions internationally, 1QBit is seen as an industry leader in quantum computing, machine learning, software development and hardware optimization. Headquartered in Vancouver, Canada, the company employs over 120 mathematicians, computer scientists, physicists, chemists, software developers, physicians, biomedical experts, and quantum computing specialists. 1QBit develops novel solutions to computational problems along the full stack of classical and quantum computing, from hardware innovations to commercial application development.

About Digital Technology Supercluster:

The Digital Technology Supercluster is led by global companies like Canfor, MDA, Microsoft, Telus, Teck Resources Limited, Mosaic Forest Management, LifeLabs, and Terramera, and tech industry leaders such as D-Wave Systems, Finger Food Advanced Technology Group, and LlamaZOO. Members also include BC's post-secondary institutions, including the Emily Carr University of Art + Design, the British Columbia Institute of Technology, the University of British Columbia, and Simon Fraser University. A full list of members can be found here.

About the COVID-19 Program:

The COVID-19 Program funds projects that contribute to improving the health and safety of Canadians, supporting Canada's ability to address issues created by the COVID-19 pandemic. In addition, these projects will build the expertise and capacity needed to address and anticipate issues that may arise in future health crises. More information can be found here.

SOURCE 1QBit

For further information: For media requests related to 1QBit and XrAI, please contact Amanda Downs at [emailprotected] or at +1 (778) 425-4434; For media requests related to the Digital Technology Supercluster, please contact Elysa Darling at [emailprotected] or at +1 (587) 890-9833.


View post:
1QBit and Canadian health care providers team up to empower front-line clinicians with Health Canada's first approved AI tool for radiology in the...

Quantum computing will (eventually) help us discover vaccines in days – VentureBeat

The coronavirus is proving that we have to move faster in identifying and mitigating epidemics before they become pandemics because, in today's global world, viruses spread much faster, further, and more frequently than ever before.

If COVID-19 has taught us anything, it's that while our ability to identify and treat pandemics has improved greatly since the outbreak of the Spanish Flu in 1918, there is still a lot of room for improvement. Over the past few decades, we've taken huge strides to improve quick detection capabilities. It took a mere 12 days to map the outer spike protein of the COVID-19 virus using new techniques. In the 1980s, a similar structural analysis for HIV took four years.

But developing a cure or vaccine still takes a long time and involves such high costs that big pharma doesn't always have an incentive to try.

Drug discovery entrepreneur Prof. Noor Shaker posited that "Whenever a disease is identified, a new journey into the chemical space starts, seeking a medicine that could become useful in contending with the disease. The journey takes approximately 15 years and costs $2.6 billion, and starts with a process to filter millions of molecules to identify the promising hundreds with high potential to become medicines. Around 99% of selected leads fail later in the process due to inaccurate prediction of behavior and the limited pool from which they were sampled."

Prof. Shaker highlights one of the main problems with our current drug discovery process: the development of pharmaceuticals is highly empirical. Molecules are made and then tested, without being able to accurately predict performance beforehand. The testing process itself is long, tedious, and cumbersome, and may not predict future complications that will surface only when the molecule is deployed at scale, further eroding the cost/benefit ratio of the field. And while AI/ML tools are already being developed and implemented to optimize certain processes, there's a limit to their efficiency at key tasks in the process.

Ideally, a great way to cut down the time and cost would be to transfer the discovery and testing from the expensive and time-inefficient laboratory process (in-vitro) we utilize today, to computer simulations (in-silico). Databases of molecules are already available to us today. If we had infinite computing power we could simply scan these databases and calculate whether each molecule could serve as a cure or vaccine to the COVID-19 virus. We would simply input our factors into the simulation and screen the chemical space for a solution to our problem.

In principle, this is possible. After all, chemical structures can be measured, and the laws of physics governing chemistry are well known. However, as the great British physicist Paul Dirac observed: "The underlying physical laws necessary for the mathematical theory of a large part of physics and the whole of chemistry are thus completely known, and the difficulty is only that the exact application of these laws leads to equations much too complicated to be soluble."

In other words, we simply don't have the computing power to solve the equations, and if we stick to classical computers we never will.

This is a bit of a simplification, but the fundamental problem of chemistry is to figure out where electrons sit inside a molecule and calculate the total energy of such a configuration. With this data, one could calculate the properties of a molecule and predict its behavior. Accurate calculations of these properties will allow the screening of molecular databases for compounds that exhibit particular functions, such as a drug molecule that is able to attach to the coronavirus spike and attack it. Essentially, if we could use a computer to accurately calculate the properties of a molecule and predict its behavior in a given situation, it would speed up the process of identifying a cure and improve its efficiency.

Why are quantum computers much better than classical computers at simulating molecules?

Electrons spread out over the molecule in a strongly correlated fashion, and the characteristics of each electron depend greatly on those of its neighbors. These quantum correlations (or entanglement) are at the heart of the quantum theory and make simulating electrons with a classical computer very tricky.

The electrons of the COVID-19 virus, for example, must be treated in general as being part of a single entity having many degrees of freedom, and the description of this ensemble cannot be divided into the sum of its individual, distinguishable electrons. The electrons, due to their strong correlations, have lost their individuality and must be treated as a whole. So to solve the equations, you need to take into account all of the electrons simultaneously. Although classical computers can in principle simulate such molecules, every multi-electron configuration must be stored in memory separately.

Let's say you have a molecule with only 10 electrons (forget the rest of the atom for now), and each electron can be in two different positions within the molecule. Essentially, you have 2^10 = 1024 different configurations to keep track of, rather than just 10 electrons, which would have been the case if the electrons were individual, distinguishable entities. You'd need 1024 classical bits to store the state of this molecule. Quantum computers, on the other hand, have quantum bits (qubits), which can be made to strongly correlate with one another in the same way electrons within molecules do. So in principle, you would need only about 10 such qubits to represent the strongly correlated electrons in this model system.

The exponentially large parameter space of electron configurations in molecules is exactly the space qubits naturally occupy. Thus, qubits are much better adapted to the simulation of quantum phenomena. This scaling difference between classical and quantum computation gets very big very quickly. For instance, simulating penicillin, a molecule with 41 atoms (and many more electrons), would require 10^86 classical bits, or more bits than the number of atoms in the universe. With a quantum computer, you would only need about 286 qubits. This is still far more qubits than we have today, but certainly a more reasonable and achievable number.

The COVID-19 virus outer spike protein, for comparison, contains many thousands of atoms and is thus completely intractable for classical computation. The size of proteins makes them intractable to classical simulation with any degree of accuracy, even on today's most powerful supercomputers. Chemists and pharma companies do simulate molecules with supercomputers (albeit not ones as large as proteins), but they must resort to very rough molecular models that don't capture the details a full simulation would, leading to large errors in estimation.
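The arithmetic behind these figures is easy to check. The sketch below simply restates the article's numbers, with classical cost growing as 2^n while qubit count grows only as n:

```python
import math

def classical_bits(n_sites):
    # One stored configuration per basis state: 2**n of them.
    return 2 ** n_sites

# Ten two-position electrons -> 1024 classical configurations,
# versus roughly 10 qubits.
print(classical_bits(10))                  # 1024

# The penicillin comparison: ~286 qubits corresponds to a
# classical state space of about 10**86 configurations.
print(math.floor(math.log10(2 ** 286)))    # 86
```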

It might take several decades until a sufficiently large quantum computer capable of simulating molecules as large as proteins will emerge. But when such a computer is available, it will mean a complete revolution in the way the pharma and the chemical industries operate.

The holy grail, end-to-end in-silico drug discovery, involves evaluating and breaking down the entire chemical structures of the virus and the cure.

The continued development of quantum computers, if successful, will allow for end-to-end in-silico drug discovery and the discovery of procedures to fabricate the drug. Several decades from now, with the right technology in place, we could move the entire process into a computer simulation, allowing us to reach results with amazing speed. Computer simulations could eliminate 99.9% of false leads in a fraction of the time it now takes with in-vitro methods. With the appearance of a new epidemic, scientists could identify and develop a potential vaccine/drug in a matter of days.

The bottleneck for drug development would then move from drug discovery to the human testing phases including toxicity and other safety tests. Eventually, even these last stage tests could potentially be expedited with the help of a large scale quantum computer, but that would require an even greater level of quantum computing than described here. Tests at this level would require a quantum computer with enough power to contain a simulation of the human body (or part thereof) that will screen candidate compounds and simulate their impact on the human body.

Achieving all of these dreams will demand continuous investment in the development of quantum computing as a technology. As Prof. Shohini Ghose said in her 2018 TED Talk: "You cannot build a light bulb by building better and better candles. A light bulb is a different technology based on a deeper scientific understanding." Today's computers are marvels of modern technology and will continue to improve as we move forward. However, we will not be able to solve this task with a more powerful classical computer. It requires new technology, more suited to the task.

(Special thanks to Dr. Ilan Richter, MD, MPH, for verifying the accuracy of the medical details in this article.)

Ramon Szmuk is a Quantum Hardware Engineer at Quantum Machines.

See the original post:
Quantum computing will (eventually) help us discover vaccines in days - VentureBeat

The Force is With Physicist Andy Howell as He Discusses Star Trek Science With Cast and Crew – Noozhawk

In the most recent episode of his YouTube series Science vs. Cinema, UC Santa Barbara physicist Andy Howell takes on Star Trek: Picard, exploring how the CBS offering's presentation of supernovae and quantum computing stacks up against real-world science.

For Howell, the series, which reviews the scientific accuracy and portrayal of scientists in Hollywood's top sci-fi films, is as much an excuse to dive into exciting scientific concepts and cutting-edge research as anything else.

"Science fiction writers are fond of grappling with deep philosophical questions," he said. "I was really excited to see that UCSB researchers were thinking about some of the same things in a more grounded way."

For the Star Trek episode, Howell spoke with series creators Alex Kurtzman and Michael Chabon, as well as a number of cast members, including Patrick Stewart. Joining him to discuss quantum science and consciousness were John Martinis, a quantum expert at UCSB and chief scientist of the Google quantum computing hardware group, and fellow UCSB physics professor Matthew Fisher. Fisher's group is studying whether quantum mechanics plays a role in the brain, a topic taken up in the new Star Trek series.

Howell also talked supernovae and viticulture with friend and colleague Brian Schmidt, vice chancellor of the Australian National University. Schmidt won the 2011 Nobel Prize in Physics for helping to discover that the expansion of the universe is accelerating.

"We started Science vs. Cinema to use movies as a jumping-off point to talk science," Howell said. "Star Trek: Picard seemed like the perfect fit. Star Trek has a huge cultural impact and was even one of the things that made me want to study astronomy."

Previous episodes of Science vs. Cinema have separated fact from fiction in films such as Star Wars, The Current War, Ad Astra, Arrival and The Martian. The success of prior episodes enabled Howell to get early access to the show and interview the cast and crew.

"What most people think about scientific subjects probably isn't what they learned in a university class, but what they saw in a movie," Howell remarked. "That makes movies an ideal springboard for introducing scientific concepts. And while I can only reach dozens of students at a time in a classroom, I can reach millions on TV or the internet."


Continue reading here:
The Force is With Physicist Andy Howell as He Discusses Star Trek Science With Cast and Crew - Noozhawk

When quantum computing and AI collide – Raconteur

Machine-learning and quantum computing are two technologies that have incredible potential in their own right. Now researchers are bringing them together. The main goal is to achieve a so-called quantum advantage, where complex algorithms can be calculated significantly faster than with the best classical computer. This would be a game-changer in the field of AI.

Such a breakthrough could lead to new drug discoveries and advances in chemistry, as well as better data science, weather predictions, and natural-language processing. "We could be as little as three years away from achieving a quantum advantage in AI if the largest players in the quantum computing space meet their goals," says Ilyas Khan, chief executive of Cambridge Quantum Computing.

This comes after Google announced late last year that it had achieved "quantum supremacy", claiming its quantum computer had cracked a problem that would take even the fastest conventional machine thousands of years to solve.

"Developing quantum machine-learning algorithms could allow us to solve complex problems much more quickly. To realise the full potential of quantum computing for AI, we need to increase the number of qubits that make up these systems," says Dr Jay Gambetta, vice president of quantum computing at IBM Research.

Quantum devices exploit the strange properties of quantum mechanics to speed up calculations. Classical computers store data in bits, as zeros or ones. Quantum computers use qubits, which can exist in two different states simultaneously. This gives them more computational firepower; we're talking up to a million times faster than some classical computers on certain tasks.

And when you add a single qubit, you double the quantum computer's processing power. "To meet Moore's Law [under which the number of transistors on a computer chip doubles about every two years while the cost falls], you would need to add only a single qubit every year," says Peter Chapman, chief executive of IonQ.

"Our goal is to double the number of qubits every year. We expect quantum computers to be able to routinely solve problems that supercomputers cannot, within two years."

Already, industrial behemoths such as IBM, Honeywell, Google, Microsoft, and Amazon are active in the quantum computing sector. Their investments will have a major impact on accelerating developments.

"We expect algorithm development to accelerate considerably. The quantum community has recognised economic opportunities in solving complex optimisation problems that permeate many aspects of the business world. These range from 'how do you assemble a Boeing 777 with millions of parts in the correct order?' to challenges in resource distribution," explains Dr David Awschalom, professor of quantum information at the University of Chicago.


Many of the computational tasks that underlie machine-learning, currently used for everything from image recognition to spam detection, have the right form to allow a quantum speed-up. Not only would this lead to faster calculations and more resource-efficient algorithms, it could also allow AI to tackle problems that are currently unfeasible because of their complexity and size.

Quantum computers aren't a panacea for all humankind's informatic problems. They are best suited to very specific tasks where there are a huge number of variables and permutations, such as calculating the best delivery route for rubbish trucks or the optimal path through traffic congestion. Mitsubishi in Japan and Volkswagen in Germany have deployed quantum computing with AI to explore solutions to these issues.

There will come a time when quantum AI could be used to help us with meaningful tasks, from industrial scheduling to logistics. Financial optimisation for portfolio management could also be routinely handled by quantum computers.

"This sounds like it might have limited use, but it turns out that many business problems can be expressed as an optimisation problem. This includes machine-learning problems," says Chapman.

Within a few short years we will enter the start of the quantum era. It's important for people to be excited about quantum computing; it allows government funding to increase and aids in recruitment. We need to continue to push the technology and also to support early adopters to explore how they can apply quantum computing to their businesses.

However, it's still early days. The next decade is a more accurate time frame for seeing quantum computing and AI coalesce and really make a difference. The need to scale to larger and more complex problems with real-world impact is one area of innovation, as is building quantum computers with greater precision and performance.

"The limitation of quantum technology, particularly when it comes to AI, is summarised by the term decoherence. This is caused by vibrations, changes in temperature, noise and interfacing with the external environment. It causes computers to lose their quantum state and prevents them from completing computational tasks in a timely manner, or at all," says Khan.

The industry's immediate priority has shifted from sheer processing power, measured in qubits, to performance, better measured by quantum volume. Rightly so, the industry is channelling its energy into reducing errors to break down this major barrier and unlock the true power of machine-learning.

Over time, it is the ease of access to these computers that will lead to impactful business applications and the development of successful quantum machine-learning. IBM has opened its quantum computers to anyone via the cloud since 2016 for testing ideas. In the process it has fostered a vibrant community of more than 200,000 users from over 100 organisations.

"The more developers and companies that get involved, first in solving optimisation problems related to AI and then over time in building quantum machine-learning and AI development, the sooner we'll see even more scalable and robust applications with business value," explains Murray Thom, vice president of software at D-Wave Systems.

"Most importantly, we need a greater number of smart people identifying and developing applications. That way we will be able to overcome limitations much faster, and expand the tools and platform so they are easier to use. Bringing in more startups and forward-thinking enterprise organisations to step into quantum computing and identify potential applications for their fields is also crucial."

Source: When quantum computing and AI collide - Raconteur

RAND report finds that, like fusion power and Half Life 3, quantum computing is still 15 years away – The Register

Quantum computers pose an "urgent but manageable" threat to the security of modern communications systems, according to a report published Thursday by the influential US RAND Corporation.

The non-profit think tank's report, "Securing Communications in the Quantum Computing Age: Managing the Risks to Encryption," urges the US government to act quickly because quantum code-breaking could be a thing in, say, 12-15 years.

"If adequate implementation of new security measures has not taken place by the time capable quantum computers are developed, it may become impossible to ensure secure authentication and communication privacy without major, disruptive changes," said Michael Vermeer, a RAND scientist and lead author of the report, in a statement.

Experts in the field of quantum computing like University of Texas at Austin computer scientist Scott Aaronson have proposed an even hazier timeline.

Noting that the quantum computers built by Google and IBM have been in the neighborhood of 50 to 100 quantum bits (qubits), and that running Shor's algorithm to break public-key RSA cryptosystems would probably take several thousand logical qubits (meaning millions of physical qubits, due to error correction), Aaronson recently opined: "I don't think anyone is close to that, and we have no idea how long it will take."

But other boffins, like University of Chicago computer science professor Diana Franklin, have suggested Shor's algorithm might be a possibility in a decade and a half.

So even though quantum computing poses a theoretical threat to most current public-key cryptography (and less risk to lattice-based, symmetric, private-key, post-quantum, and quantum cryptography), there's not much consensus about how and when this threat might manifest itself.

Nonetheless, the National Institute of Standards and Technology, the US government agency overseeing tech standards, has been pushing the development of quantum-resistant cryptography since at least 2016. Last year it winnowed a list of proposed post-quantum crypto (PQC) algorithms down to a field of 26 contenders.

The RAND report anticipates quantum computers capable of crypto-cracking will be functional by 2033, with the caveat that experts propose dates both before and after that. PQC algorithm standards should gel within the next five years, with adoption not expected until the mid-to-late 2030s, or later.

But the amount of time required for the US and the rest of the world to fully implement those protocols to mitigate the risk of quantum crypto cracking may take longer still. Note that the US government is still running COBOL applications on ancient mainframes.

"If adequate implementation of PQC has not taken place by the time capable quantum computers are developed, it may become impossible to ensure secure authentication and communication privacy without major, disruptive changes to our infrastructure," the report says.

RAND's report further notes that consumer lack of awareness and indifference to the issue means there will be no civic demand for change.

Hence, the report urges federal leadership to protect consumers, perhaps unaware that Congress is considering the EARN IT Act, which critics characterize as an "all-out assault on encryption."

"If we act in time with appropriate policies, risk reduction measures, and a collective urgency to prepare for the threat, then we have an opportunity for a future communications infrastructure that is as safe as or more safe than the current status quo, despite overlapping cyber threats from conventional and quantum computers," the report concludes.

It's worth recalling that a 2017 National Academies of Sciences, Engineering, and Medicine report, "Global Health and the Future Role of the United States," urged the US to maintain its focus on global health security and to prepare for infectious disease threats.

That was the same year nonprofit PATH issued a pandemic prevention report urging the US government to "maintain its leadership position backed up by the necessary resources to ensure continued vigilance against emerging pandemic threats, both at home and abroad."

The federal government's reaction to COVID-19 is a testament to the impact of reports from external organizations. We can only hope that the threat of crypto-cracking quantum computers elicits a response that's at least as vigorous.

Source: RAND report finds that, like fusion power and Half Life 3, quantum computing is still 15 years away - The Register

Alex Garland on ‘Devs,’ free will and quantum computing – Engadget

Garland views Amaya as a typical Silicon Valley success story. In the world of Devs, it's the first company that manages to mass produce quantum computers, allowing them to corner that market. (Think of what happened to search engines after Google debuted.) Quantum computing has been positioned as a potentially revolutionary technology for things like healthcare and encryption, since it can tackle complex scenarios and data sets more effectively than traditional binary computers. Instead of just processing inputs one at a time, a quantum machine would theoretically be able to tackle an input in multiple states, or superpositions, at once.

By mastering this technology, Amaya unlocks a completely new view of reality: The world is a system that can be decoded and predicted. It proves to them that the world is deterministic. Our choices don't matter; we're all just moving along predetermined paths until the end of time. Garland is quick to point out that you don't need anything high-tech to start asking questions about determinism. Indeed, it's something that's been explored since Plato's allegory of the cave.

"What I did think, though, was that if a quantum computer was as good at modeling quantum reality as it might be, then it would be able to prove in a definitive way whether we lived in a deterministic state," Garland said. "[Proving that] would completely change the way we look at ourselves, the way we look at society, the way society functions, the way relationships unfold and develop. And it would change the world in some ways, but then it would restructure itself quickly."

The sheer difficulty of coming up with something -- anything -- that's truly spontaneous and isn't causally related to something else in the universe is the strongest argument in favor of determinism. And it's something Garland aligns with personally -- though that doesn't change how he perceives the world.

"Whether or not you or I have free will, both of us could identify lots of things that we care about," he said. "There are lots of things that we enjoy or don't enjoy. Or things that we're scared of, or we anticipate. And all of that remains. It's not remotely affected by whether we've got free will or not. What might be affected is, I think, our capacity to be forgiving in some respects. And so, certain kinds of anti-social or criminal behavior, you would start to think about in terms of rehabilitation, rather than punishment. Because then, in a way, there's no point punishing someone for something they didn't decide to do."

Source: Alex Garland on 'Devs,' free will and quantum computing - Engadget

Quantum Computing Market to Witness Robust Expansion by 2024: Intel Corporation, Google Inc., Evolutionq Inc Cole Reports – Cole of Duty

Quantum Computing Market Competitive Insights 2020 is a professional and in-depth study of the Quantum Computing industry, with a focus on profit margin analysis, market value chain analysis, market entry strategies, recent developments and their impact on the market, a roadmap of the Quantum Computing market, opportunities, challenges, SWOT analysis, PESTEL analysis, and market estimates, size, and forecasts for product segments from 2020 to 2024. An in-depth analysis of newer growth tactics employed by the market-leading companies shows the global competitive scale of this market sector. The industry growth outlook is captured by tracking players' ongoing process improvements and optimal investment strategies.

Get Sample Copy of Quantum Computing Report 2020: http://www.researchreportsinc.com/report-sample/593583

The research report studies the market's mergers and acquisitions, geographic scope, company profiles, contracts, new product launches, competitive situation and trends, key players' market shares (2020), and growth prospects during the forecast period. The Quantum Computing market report provides detailed data to guide key market players in making important business decisions. The report focuses on the key aspects of the market to ensure maximum benefit and growth potential for our readers, and our extensive analysis will help them achieve this much more efficiently.

The Major Companies Covered In This Report:

Intel Corporation, Google Inc., Evolutionq Inc, Magiq Technologies Inc., Nippon Telegraph And Telephone Corporation (NTT), QC Ware Corp., Accenture, Hitachi Ltd, QxBranch, LLC, Rigetti Computing, International Business Machines Corporation (IBM), 1QB Information Technologies Inc., Hewlett Packard Enterprise (HP), D-Wave Systems Inc., Northrop Grumman Corporation, Station Q Microsoft Corporation, Cambridge Quantum Computing Ltd, Quantum Circuits, Inc, Fujitsu, University Landscape, Toshiba Corporation

The Quantum Computing report covers the following Types:

On the basis of applications, the market covers:

Grab Your Report at an Impressive Discount @ http://www.researchreportsinc.com/check-discount/593583

Why You Should Buy This Report?

The report offers effective guidelines and recommendations for vendors to secure a position of strength in the Quantum Computing industry. Newly arrived players can increase their growth potential considerably, and the market's current leaders can sustain their dominance for longer, by using our report. The Quantum Computing Market Report covers the key geographies and market landscapes alongside product value, revenue, volume, production, supply, demand, market growth rate, trends, and more. The report also provides Porter's Five Forces analysis, investment feasibility analysis, and investment return analysis.

Major Points Covered in The Report:

Source: Quantum Computing Market to Witness Robust Expansion by 2024: Intel Corporation, Google Inc., Evolutionq Inc Cole Reports - Cole of Duty

Technology alliances will help shape our post-pandemic future – C4ISRNet

There's no question the post-corona world will be very different. How it will look depends on the actions the world's leaders take. Decisions made in the coming months will determine whether we see a renewed commitment to a rules-based international order, or a fragmented world increasingly dominated by authoritarianism. Whoever steps up to lead will drive the outcome.

China seeks the mantle of global leadership. Beijing is exploiting the global leadership vacuum, the fissures between the United States and its allies, and the growing strain on European unity. The Chinese Communist Party has aggressively pushed a narrative of acting swiftly and decisively to contain the virus, building goodwill through "mask diplomacy," and sowing doubts about the virus's origin to deflect blame for the magnitude of the crisis and to rewrite history. Even though the results so far are mixed, the absence of the United States on the global stage gives Beijing good momentum.

Before the pandemic, the world's democracies already faced their gravest challenge in decades: the shift of economic power to illiberal states. By late 2019, autocratic regimes accounted for a larger share of global GDP than democracies for the first time since 1900. As former U.K. foreign secretary David Miliband recently observed, liberal democracy is in retreat. How the United States and like-minded partners respond post-pandemic will determine whether that trend holds.

There is urgency to act: the problem is now even more acute. The countries that figure out how to quickly restart and rebuild their economies post-pandemic will set the course for the 21st century. Nor is economic heft the only concern: political power and military might go hand in hand with economic dominance.

At the center of this geostrategic and economic competition are the technologies (artificial intelligence, quantum computing, biotechnology, and 5G) that will be the backbone of the 21st-century economy. Leadership and ongoing innovation in these areas will confer critical economic, political, and military power, and the opportunity to shape global norms and values. The pre-crisis trajectory of waning clout in technology development, standards-setting, and proliferation posed an unacceptable and avoidable challenge to the interests of the world's leading liberal-democratic states.

The current crisis accentuates this even more: it lays bare the need to rethink and restructure global supply chains; the imperative of ensuring telecommunication networks are secure, robust, and resilient; the ability to surge production of critical materiel; and the need to deter and counteract destructive disinformation. This is difficult and costly, and it is best done in concert.

Bold action is needed to set a new course that enhances the ability of the world's democracies to out-compete increasingly capable illiberal states. The growing clout of authoritarian regimes is not rooted in better strategy or more effective statecraft. Rather, it lies in the fractious and complacent nature of the world's democracies and leading technology powers.

In response, a new multilateral effort (an alliance framework) is needed to reverse these trends. The world's technology and democracy leaders (the G7 members and countries like Australia, the Netherlands, and South Korea) should join forces to tackle matters of technology policy. The purpose of this initiative is three-fold: one, regain the initiative in the global technology competition through strengthened cooperation between like-minded countries; two, protect and preserve key areas of competitive technological advantage; and three, promote collective norms and values around the use of emerging technologies.


Such cooperation is vital to effectively deal with the hardest geopolitical issues that increasingly center on technology, from competing economically to building deterrence to combating disinformation. This group should not be an exclusive club: it should also work with countries like Finland and Sweden to align policies on telecommunications; Estonia, Israel, and New Zealand for cyber issues; and states around the world to craft efforts to counter the proliferation of Chinese surveillance technology and offer sound alternatives to infrastructure development, raw material extraction, and loans from China that erode their sovereignty.

The spectrum of scale and ambition this alliance can tackle is broad. Better information sharing would yield benefits on matters like investment screening, counterespionage, and fighting disinformation. Investments in new semiconductor fabs could create more secure and diverse supply chains. A concerted effort to promote open architecture in 5G could usher in a paradigm shift for an entire industry. Collaboration will also be essential to avoiding another pandemic calamity.

Similar ideas are percolating among current and former government leaders in capitals such as Tokyo, Berlin, London, and Washington, with thought leaders like Jared Cohen and Anja Manuel, and in think tanks around the world. The task at hand is to collate these ideas, find the common ground, and devise an executable plan. This requires tackling issues like organizational structure, governance, and institutionalization. It also requires making sure that stakeholders from government, industry, and civil society from around the world provide input to make the alliance framework realistic and successful.

No one country can expect to achieve its full potential by going it alone, not even the United States. An alliance framework for technology policy is the best way to ensure that the world's democracies can effectively compete economically, politically, and militarily in the 21st century. The links between the world's leading democracies remain strong despite the challenges of the current crisis. These relationships are an enduring and critical advantage that no autocratic country can match. It is time to capitalize on these strengths, retake the initiative, and shape the post-corona world.

Martijn Rasser is a senior fellow at the Center for a New American Security.

Source: Technology alliances will help shape our post-pandemic future - C4ISRNet

Making Sense of the Science and Philosophy of Devs – The Ringer

Let me welcome you the same way Stewart welcomes Forest in Episode 7 of the Hulu miniseries Devs: with a lengthy, unattributed quote.

We may regard the present state of the universe as the effect of its past and the cause of its future. An intellect which at any given moment knew all of the forces that animate nature and the mutual positions of the beings that compose it, if this intellect were vast enough to submit the data to analysis, could condense into a single formula the movement of the greatest bodies of the universe and that of the lightest atom; for such an intellect nothing could be uncertain and the future, just like the past, would be present before its eyes.

It's a passage that sounds as if it could have come from Forest himself. But it's not from Forest, or Katie, or even (as Katie might guess, based on her response to Stewart's Philip Larkin quote) Shakespeare. It's from the French scholar and scientist Pierre-Simon Laplace, who wrote the idea down at the end of the Age of Enlightenment, in 1814. When Laplace imagined an omniscient intellect (which has come to be called Laplace's demon), he wasn't even saying something original: other thinkers beat him to the idea of a deterministic, perfectly predictable universe by decades and centuries (or maybe millennia).

All of which is to say that despite the futuristic setting and high-tech trappings of Devs (the eight-part Alex Garland opus that will reach its finale next week), the series' central tension is about as old as the abacus. But there's a reason the debate about determinism and free will keeps recurring: it's an existential question at the heart of human behavior. Devs doesn't answer it in a dramatically different way than the great minds of history have, but it does wrap up ancient, brain-breaking quandaries in a compelling (and occasionally kind of confusing) package. Garland has admitted as much, acknowledging, "None of the ideas contained here are really my ideas, and it's not that I am presenting my own insightful take. It's more I'm saying some very interesting people have come up with some very interesting ideas. Here they are in the form of a story."

Devs is a watchable blend of a few engaging ingredients. It's a spy thriller that pits Russian agents against ex-CIA operatives. It's a cautionary sci-fi polemic about a potentially limitless technology and the hubris of big tech. Like Garland's previous directorial efforts, Annihilation and Ex Machina, it's also a striking aesthetic experience: a blend of brutalist compounds, sleek lines, lush nature, and an exciting, unsettling soundtrack. Most of all, though, it's a meditation on age-old philosophical conundrums, served with a garnish of science. Garland has cited scientists and philosophers as inspirations for the series, so to unravel the riddles of Devs, I sought out some experts whose day jobs deal with the dilemmas Lily and Co. confront in fiction: a computer science professor who specializes in quantum computing, and several professors of philosophy.

There are many questions about Devs that we won't be able to answer. How high is Kenton's health care premium? Is it distracting to work in a lab lit by a perpetually pulsing, unearthly golden glow? How do Devs programmers get any work done when they could be watching the world's most riveting reality TV? Devs doesn't disclose all of its inner workings, but by the end of Episode 7, it's pulled back the curtain almost as far as it can. The main mystery of the early episodes (what does Devs do?) is essentially solved for the viewer long before Lily learns everything via Katie's parable of the pen in Episode 6. As the series proceeds, the spy stuff starts to seem incidental, and the characters' motivations become clear. All that remains to be settled is the small matter of the intractable puzzles that have flummoxed philosophers for ages.

Here's what we know. Forest (Nick Offerman) is a tech genius obsessed with one goal: being reunited with his dead daughter, Amaya, who was killed in a car crash while her mother was driving and talking to Forest on the phone. (He'd probably blame himself for the accident if he believed in free will.) He doesn't disguise the fact that he hasn't moved on from Amaya emotionally: he names his company after her, uses her face for its logo, and, in case those tributes were too subtle, installs a giant statue of her at corporate HQ. (As a metaphor for the way Amaya continues to loom over his life, the statue is overly obvious, but at least it looks cool.) Together with a team of handpicked developers, Forest secretly constructs a quantum computer so powerful that, by the end of the penultimate episode, it can perfectly predict the future and reverse-project the past, allowing the denizens of Devs to tune in to any bygone event in lifelike clarity. It's Laplace's demon made real, except for the fact that its powers of perception fail past the point at which Lily is seemingly scheduled to do something that the computer can't predict.

I asked Dr. Scott Aaronson, a professor of computer science at the University of Texas at Austin (and the founding director of the school's Quantum Information Center), to assess Devs' depiction of quantum computing. Aaronson's website notes that his research concentrates on the capabilities and limits of quantum computers, so he'd probably be one of Forest's first recruits if Amaya were an actual company. Aaronson, whom I previously consulted about the plausibility of the time travel in Avengers: Endgame, humored me again and watched Devs despite having been burned before by Hollywood's crimes against quantum mechanics. His verdict, unsurprisingly, is that the quantum computing in Devs (like that of Endgame, which cites one of the same physicists, David Deutsch, whom Garland said inspired him) is mostly hand-wavy window dressing.

"A quantum computer is a device that uses a central phenomenon of quantum mechanics (namely, interference of amplitudes) to solve certain problems with dramatically better scaling behavior than any known algorithm running on any existing computer could solve them," Aaronson says. If you're wondering what amplitudes are, you can read Aaronson's explanation in a New York Times op-ed he authored last October, shortly after Google claimed to have achieved a milestone called quantum supremacy: the first use of a quantum computer to make a calculation far faster than any non-quantum computer could. According to Google's calculations, the task that its Sycamore microchip performed in a little more than three minutes would have taken 100,000 of the swiftest existing conventional computers 10,000 years to complete. That's a pretty impressive shortcut, and we're still only at the dawn of the quantum computing age.
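The "interference of amplitudes" Aaronson names can be seen in a few lines of arithmetic (a minimal sketch of my own, not from the article): applying a Hadamard gate to a qubit in state |0> puts it in an equal superposition, and applying it again returns |0> exactly, because the two paths leading to |1> carry opposite amplitudes and cancel.

```python
# Minimal illustration (mine) of amplitude interference: H applied
# twice to |0> gives back |0>, because the |1> amplitudes cancel.
import math

# The Hadamard gate as a 2x2 real matrix.
H = [[1 / math.sqrt(2), 1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

def apply(gate, state):
    """Multiply a 2x2 gate by a 2-component amplitude vector."""
    return [sum(gate[i][j] * state[j] for j in range(2)) for i in range(2)]

state = [1.0, 0.0]       # |0>
state = apply(H, state)  # equal superposition: amplitudes (~0.707, ~0.707)
state = apply(H, state)  # interference: the |1> contributions cancel
print(state)             # ~[1.0, 0.0], i.e. back to |0>
```

A classical probabilistic bit cannot do this: mixing twice only mixes further, whereas amplitudes can be negative and cancel, which is the resource quantum algorithms exploit.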

However, that stat comes with a caveat: quantum computers aren't better across the board than conventional computers. "The applications where a quantum computer dramatically outperforms classical computers are relatively few and specialized," Aaronson says. "As far as we know today, they'd help a lot with prediction problems only in cases where the predictions heavily involve quantum-mechanical behavior." Potential applications of quantum computers include predicting the rate of a chemical reaction, factoring huge numbers and possibly cracking the encryption that currently protects the internet (using Shor's algorithm, which is briefly mentioned on Devs), and solving optimization and machine learning problems. "Notice that reconstructing what Christ looked like on the cross is not on this list," Aaronson says.
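Shor's algorithm, mentioned above, factors a number N by finding the period of modular exponentiation; only that period-finding step needs a quantum computer, while the rest is classical number theory. A hedged sketch of that classical reduction (my illustration, not from the article; the brute-force period search below is exactly the part that is exponentially slow without a quantum computer):

```python
# Sketch (mine) of the classical reduction in Shor's algorithm.
# The quantum speed-up applies only to find_period; everything else
# is ordinary arithmetic.
from math import gcd

def find_period(a: int, n: int) -> int:
    """Smallest r > 0 with a**r % n == 1 (brute force stands in for
    the quantum subroutine)."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n: int, a: int) -> int:
    """Given a base a coprime to n, try to extract a nontrivial factor
    of n from the period; returns 0 if this base fails."""
    r = find_period(a, n)
    if r % 2:                     # need an even period
        return 0
    f = gcd(a ** (r // 2) - 1, n)
    return f if 1 < f < n else 0

# Textbook example: factor 15 using base 7 (period 4).
print(shor_factor(15, 7))  # prints 3
```

Scaling this to the 2048-bit moduli used by RSA is what drives the "several thousand logical qubits" estimate Aaronson gives earlier in this roundup.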

In other words, the objective Forest is trying to achieve doesn't necessarily lie within the quantum computing wheelhouse. "To whatever extent computers can help forecast plausible scenarios for the past or future at all (as we already have them do for, e.g., weather forecasting), it's not at all clear to what extent a quantum computer even helps; one might simply want more powerful classical computers," Aaronson says.

Then there's the problem that goes beyond the question of quantum vs. conventional: either kind of computer would require data on which to base its calculations, and the data set that the predictions and retrodictions in Devs would demand is inconceivably detailed. "I doubt that reconstructing the remote past is really a computational problem at all, in the sense that even the most powerful science-fiction supercomputer still couldn't give you reliable answers if it lacked the appropriate input data," Aaronson says, adding, "As far as we know today, the best that any computer (classical or quantum) could possibly do, even in principle, with any data we could possibly collect, is to forecast a range of possible futures, and a range of possible pasts. The data that it would need to declare one of them the real future or the real past simply wouldn't be accessible to humankind, but rather would be lost in microscopic puffs of air, radiation flying away from the earth into space, etc."

In light of the unimaginably high hurdle of gathering enough data in the present to reconstruct what someone looked or sounded like during a distant, data-free age, Forest comes out looking like a ridiculously demanding boss. We get it, dude: You miss Amaya. But how about patting your employees on the back for pulling off the impossible? "The idea that chaos, the butterfly effect, sensitive dependence on initial conditions, exponential error growth, etc. mean that you run your simulation 2000 years into the past and you end up with only a blurry, staticky image of Jesus on the cross rather than a clear image, has to be, like, the wildest understatement in the history of understatements," Aaronson says. As for the future, he adds, "Predicting the weather three weeks from now might be forever impossible."
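Sensitive dependence on initial conditions is easy to see in a toy model. The sketch below (a standard textbook illustration, not anything from the show) iterates the chaotic logistic map from two starting points that differ by one part in a billion; the gap between the two trajectories grows roughly exponentially until they are completely uncorrelated, which is why any tiny uncertainty in the input data dooms a long-range simulation.

```python
def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map x -> r*x*(1-x) from x0; return the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

# Two starting points differing by 1e-9 (illustrative values).
a = logistic_trajectory(0.400000000)
b = logistic_trajectory(0.400000001)

# The initial gap of 1e-9 roughly doubles each step until the
# trajectories bear no resemblance to each other.
for step in (0, 10, 25, 50):
    print(f"step {step:2d}: gap = {abs(a[step] - b[step]):.3e}")
```

At r = 4 the map is fully chaotic, so after a few dozen steps the two runs are as different as two random numbers in [0, 1]; no amount of extra computing power fixes this, only more precise initial data would.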

On top of all that, Aaronson says, "The Devs headquarters is sure a hell of a lot fancier (and cleaner) than any quantum computing lab that I've ever visited." (Does Kenton vacuum between torture sessions?) "At least the computer more or less looks like a quantum computer."

OK, so maybe I didn't need to cajole a quantum computing savant into watching several hours of television to confirm that there's no way we can watch cavepeople paint. Garland isn't guilty of any science sins that previous storytellers haven't committed many times. Whenever Aaronson has advised scriptwriters, they've only asked him to tell them which sciencey words would make their preexisting implausible stories sound somewhat feasible. "It's probably incredibly rare that writers would let the actual possibilities and limits of a technology drive their story," he says.

Although the show name-checks real interpretations of quantum mechanics (Penrose, pilot wave, many-worlds), it doesn't deeply engage with them. The pilot wave interpretation holds that only one future is real, whereas many-worlds asserts that a vast number of futures are all equally real. But neither one would allow for the possibility of perfectly predicting the future, considering the difficulty of accounting for every variable. Garland is seemingly aware of how far-fetched his story is, because on multiple occasions, characters like Lily, Lyndon, and Stewart voice the audience's unspoken disbelief, stating that something or other isn't possible. Whenever they do, Katie or Forest is there to tell them that it is. Which, well, fine: Like Laplace's demon, Devs is intended as more of a thought experiment than a realistic scenario. As Katie says during her blue pill-red pill dialogue with Lily, "Go with it."

We might as well go along with Garland, because any scientific liberties he takes are in service of the series's deeper ideas. As Aaronson says, "My opinion is that the show isn't really talking about quantum computing at all; it's just using it as a fancy-sounding buzzword. Really it's talking about the far more ancient questions of determinism vs. indeterminism and predictability vs. unpredictability." He concludes, "The plot of this series is one that would've been totally, 100 percent familiar to the ancient Greeks; just swap out the quantum computer for the Delphic Oracle." Aaronson, who says he sort of likes Devs in spite of its quantum technobabble, would know: He wrote a book called Quantum Computing Since Democritus.

Speaking of Democritus, let's consult a few philosophers on the topic of free will. One of the most mind-bending aspects of Devs' adherence to hard determinism (the theory that human behavior is wholly dictated by outside factors) is its insistence that characters can't change their behavior even if they've seen the computer's prediction of what they're about to do. As Forest asks Katie, "What if one minute into the future we see you fold your arms, and you say, 'Fuck the future. I'm a magician. My magic breaks tram lines. I'm not going to fold my arms.' You put your hands in your pockets, and you keep them there until the clock runs out."

It seems as if she should be able to do what she wants with her hands, but Katie quickly shuts him down. "Cause precedes effect," she says. "Effect leads to cause. The future is fixed in exactly the same way as the past. The tram lines are real." Of course, Katie could be wrong: A character could defy the computer's prediction in the finale. (Perhaps that's the mysterious unforeseeable event.) But we've already seen some characters fail to exit the tram. In an Episode 7 scene (which, as Aaronson notes, is highly reminiscent of the VHS scene in Spaceballs), we see multiple members of the Devs team repeat the same statements that they've just heard the computer predict they would make a split second earlier. They can't help but make the prediction come true. Similarly, Lily ends up at Devs at the end of Episode 7, despite resolving not to.

Putting aside the implausibility of a perfect prediction existing at all, does it make sense that these characters couldn't deviate from their predicted course? Yes, according to five professors of philosophy I surveyed. Keep in mind what Garland has cited as a common criticism of his work: that "the ideas I talk about are sophomoric because they're the kinds of things that people talk about when they're getting stoned in their dorm rooms." We're about to enter the stoned zone.

"In this story, [the characters] are in a totally deterministic universe," says Ben Lennertz, an assistant professor of philosophy at Colgate University. "In particular, the watching of the video of the future itself has been determined by the original state of the universe and the laws. It's not as if things were going along and the person was going to cross their arms, but then a non-deterministic miracle occurred and they were shown a video of what they were going to do. The watching of the video and the person's reaction is part of the same progression as the scene the video is of." In essence, the computer would have already predicted its own predictions, as well as every character's reaction to them. Everything that happens was always part of the plan.

Ohio Wesleyan University's Erin Flynn echoes that interpretation. "The people in those scenes do what they do not despite being informed that they will do it, but (in part) because they have been informed that they will do it," Flynn says. (Think of Katie telling Lyndon that he's about to balance on the bridge railing.) "This is not to say they will be compelled to conform, only that their knowledge presumably forms an important part of the causal conditions leading to their actions. When the computer sees the future, the computer sees that what they will do is necessitated in part by this knowledge. The computer would presumably have made different predictions had people never heard them."

Furthermore, adds David Landy of San Francisco State University, the fact that we see something happen one way doesn't mean that it couldn't have happened otherwise. "Suppose we know that some guy is going to fold his arms," Landy says. "Does it follow that he lacks the ability to not fold his arms? Well, no, because what we usually mean by 'has the ability to not fold his arms' is that if things had gone differently, he wouldn't have folded his arms. But by stipulating at the start that he is going to fold his arms, we also stipulate that things aren't going to go differently. But it can remain true that if they did go differently, he would not have folded his arms. So, he might have that ability, even if we know he is not going to exercise it."

If your head has started spinning, you can see why the Greeks didn't settle this stuff long before Garland got to it. And if it still seems strange that Forest seemingly can't put his hands in his pockets, well, what doesn't seem strange in the world of Devs? "We should expect weird things to happen when we are talking about a very weird situation," Landy says. "That is, we are used to people reliably doing what they want to do. But we have become used to that by making observations in a certain environment: one without time travel or omniscient computers. Introducing those things changes the environment, so we shouldn't be surprised if our usual inferences no longer hold."

Here's where we really might want to mime a marijuana hit. Neal Tognazzini of Western Washington University points out that one could conceivably appear to predict the future by tapping into a future that already exists. "Many philosophers reject determinism but nevertheless accept that there are truths about what will happen in the future, because they accept a view in the philosophy of time called eternalism, which is (roughly) the block universe idea: past, present, and future are all parts of reality," Tognazzini says. This theory says that the past and the future exist some temporal distance from the present; we just haven't yet learned to travel between them. Thus, Tognazzini continues, "You can accept eternalism about time without accepting determinism, because the first is just a view about whether the future is real whereas the second is a view about how the future is connected to the past (i.e., whether there are tram lines)."

According to that school of thought, the future isn't what has to happen; it's simply what will happen. If we somehow got a glimpse of our futures from the present, it might appear as if our paths were fixed. But those futures actually would have been shaped by our freely chosen actions in the interim. As Tognazzini says, "It's a fate of our own making, which is just to say, no fate at all."

If we accept that the members of Devs know what they're doing, though, then the computer's predictions are deterministic, and the past does dictate the future. That's disturbing, because it seemingly strips us of our agency. But, Tognazzini says, "Even then, it's still the case that what we do now helps to shape that future. We still make a difference to what the future looks like, even if it's the only difference we could have made, given the tram lines we happen to be on. Determinism isn't like some force that operates independently of what we want, making us marionettes. If it's true, then it would apply equally to our mental lives as well, so that the future that comes about might well be exactly the future we wanted."

This is akin to the compatibilist position espoused by David Hume, which seeks to reconcile the seemingly conflicting concepts of determinism and free will. As our final philosopher, Georgetown University's William Blattner, says, "If determinism is to be plausible, it must find a way to save the appearances, in this case, explain why we feel like we're choosing, even if at some level the choice is an illusion." The compatibilist perspective concedes that there may be only one possible future, but, Flynn says, "insists that there is a difference between being causally determined (necessitated) to act and being forced or compelled to act. As long as one who has seen their future does not do what has been predicted because they were forced to do it (against their will, so to speak), then they will still have done it freely."

In the finale, we'll find out whether the computer's predictions are as flawless and inviolable as Katie claims. We'll also likely learn one of Devs' most closely kept secrets: what Forest intends to do with his perfect model of Amaya. The show hasn't hinted that the computer can resurrect the dead in any physical fashion, so unless Forest is content to see his simulated daughter on a screen, he may try to enter the simulation himself. In Episode 7, Devs seemed to set the stage for such a step; as Stewart said, "That's the reality right there. It's not even a clone of reality. The box contains everything."

Would a simulated Forest, united with his simulated daughter, be happier inside the simulation than he was in real life, assuming he's aware he's inside the simulation? The philosopher Robert Nozick explored a similar question with his hypothetical "experience machine," which would stimulate our brains in such a way that we could supply as much pleasure as we wanted, in any form. It sounds like a nice place to visit, and yet most of us wouldn't want to live there. That reluctance to enter the experience machine permanently seems to suggest that we see some value in an authentic connection to reality, however unpleasurable. "Thinking I'm hanging out with my family and friends is just different from actually hanging out with my family and friends," Tognazzini says. "And since I think relationships are key to happiness, I'm skeptical that we could be happy in a simulation."

If reality were painful enough, though, the relief from that pain might be worth the sacrifice. "Suppose, for instance, that the real world had become nearly uninhabitable or otherwise full of misery," Flynn says. "It seems to me that life in a simulation might be experienced as a sanctuary. Perhaps one's experience there would be tinged with sadness for the lost world, but I'm not sure knowing it's a simulation would necessarily keep one from being happy in it." Forest still seems miserable about Amaya IRL, so for him, that trade-off might make sense.

What's more, if real life is totally deterministic, then Forest may not draw a distinction between life inside and outside of his quantum computer. "If freedom is a critical component of fulfillment, then it's hard to see how we could be fulfilled in a simulation," Blattner says. But for Forest, freedom isn't an option anywhere. "Something about the situation seems sad, maybe pathetic, maybe even tragic," Flynn says. "But if the world is a true simulation in the manner described, why not just understand it as the ability to visit another real world in which his daughter exists?"

Those who subscribe to the simulation hypothesis believe that what we think of as real life (including my experience of writing this sentence and your experience of reading it) is itself a simulation created by some higher order of being. In our world, it may seem dubious that such a sophisticated creation could exist (or that anything or anyone would care to create it). But in Forest's world, a simulation just as sophisticated as real life already exists inside Devs, which means that what Forest perceives as real life could be someone else's simulation. If he's possibly stuck inside a simulation either way, he might as well choose the one with Amaya (if he has a choice at all).

Garland chose to tell this story on TV because on the big screen, he said, it would have been "slightly too truncated." On the small screen, it's probably slightly too long: Because we've known more than Lily all along, what she's learned in later episodes has rehashed old info for us. Then again, Devs has felt familiar from the start. If Laplace got a pass for recycling Cicero and Leibniz, we'll give Garland a pass for channeling Laplace. What's one more presentation of a puzzle that's had humans flummoxed forever?

Making Sense of the Science and Philosophy of Devs - The Ringer