Another view: What is driving the Justice Department’s decision to go after Google? – Amarillo.com

Bloomberg

For months, President Donald Trump's Justice Department has hinted that it intends to crack down on Silicon Valley. It recently took a big step closer as senior antitrust officials met with their state counterparts to plot out a case against Alphabet Inc.'s Google. What they plan to argue isn't quite clear yet. But as the final months of Trump's first term wind down, and an election draws near, some exceptional skepticism is in order.

One reason for caution is that Attorney General William Barr has not exactly been a disinterested enforcer of competition law. Quite the opposite: In recent testimony, a senior Justice Department whistle-blower described how Barr pressured antitrust prosecutors to harass automakers (and others) for transparently political reasons.

Now, according to news reports, Barr has taken an unusual interest in the Google case. Why? In a recent interview with Fox News, he intimated that he hopes to use antitrust law to punish tech companies for censoring conservative viewpoints, a frequent preoccupation of Trump's. Never mind that this accusation is false, and that tech companies would be entirely within their rights to so discriminate if they chose. The whole thing has nothing to do with antitrust.

Perhaps Barr was musing idly, and perhaps the department has more legitimate objections in mind. But even under more traditional theories of competition law, Google makes an odd target.

A case might feasibly be made against its dominance of the online advertising market, for instance. Combined with Facebook, Google took in about 60% of digital ad spending last year. Yet there's no law against building a good product. And with pressure rising from Amazon and other contenders, online ad rates have fallen by more than 40% over the past decade. That doesn't look like a market lacking in competition.

Nor could anyone credibly argue that Google has harmed consumers, the standard traditionally applied in antitrust analysis. To the contrary, it gives them (among other things) access to limitless email, a smartphone operating system, innovative mapping software and a search engine that ranks among the greatest inventions of the last century, all for free. Its targeted advertising has been a boon to businesses big and small. That's to say nothing of its work on driverless cars, quantum computing or esoteric life-extension technologies.

Even an otherwise blameless company shouldn't get a pass for anticompetitive behavior, of course. And some allege that Google has unfairly privileged its own products, pursued harmful mergers and engaged in other dubious conduct. If the Justice Department imposed targeted remedies for such violations after a transparent investigation, it would be entirely appropriate.

Yet the Trump administration has suggested nothing of the sort publicly. If its track record is any guide, this case is more likely to amount to a political attack with a belabored legal rationale attached. Even if its motives are pure, the administration should be wary: Government intervention in a market where no obvious harm has been caused to consumers and in pursuit of vague or unrelated objectives is a recipe for disaster.

The fact is, for all the criticism leveled at tech companies, they employ hundreds of thousands of people, create immensely useful products, propel what growth the American economy still enjoys and are among the most trusted brands going. They need to follow the rules like everyone else. But abusing antitrust law to clobber them for electoral gain won't end well for anyone.


Standard Chartered and Universities Space Research Association Join Forces on Quantum Computing – HPCwire

LONDON and MOUNTAIN VIEW, Calif., July 13, 2020 – Standard Chartered Bank and Universities Space Research Association (USRA) have signed a Collaborative Research Agreement to partner on quantum computing research and the development of quantum computing applications.

In finance, the most promising use cases with real-world applications include quantum machine learning models (generating synthetic data and data anonymization) and discriminative models (building strong classifiers and predictors) with multiple potential uses such as credit scoring and generating trading signals. As quantum computing technology matures, clients should benefit from higher quality services such as faster execution, better risk management and the development of new financial products.

Kahina Van Dyke, Global Head of Digital Channels and Client Data Analytics at Standard Chartered, said: "Similar to other major technological advancements, quantum computing is set to bring widespread benefits as well as disrupt many existing business processes. This is why it's important for companies to future-proof themselves by adopting this new technology from an early stage. The partnership with USRA gives us access to world-class academic researchers and provides us with a unique opportunity to explore a wide range of models and algorithms with the potential to establish quantum advantage for real-world use cases."

Bernie Seery, Senior VP of Technology at USRA, noted that "this partnership with the private sector enables a diversity of research through a competitively selected portfolio of quantum computing research projects involving academic institutions and non-profits, growing an ecosystem for quantum artificial intelligence that has already involved over 150 researchers from more than 40 organizations that produced over 50 peer-reviewed publications over the last seven years."

Alex Manson, Global Head of SC Ventures, Standard Chartered's innovation, fintech investment and ventures arm, stated: "The world is currently in the process of identifying commercial use cases where quantum computer capabilities will surpass classical computers. We have a conviction that some of these use cases will transform the way we manage risks in financial services, for example by simulating portfolios and exponentially speeding up the generation of market data. We will work with USRA to identify such use cases in financial services, with a view to implementing them within our bank, as well as potentially offering this service to other market participants over time."

Mark Johnson, Vice President, Processor Design, Development and Quantum Products at D-Wave, said: "Quantum computing research and development are poised to have a profound impact on the industries responsible for solving today's most complex problems. That's why researchers and businesses alike are looking to quantum computing today to start demonstrating tangible value. We're proud to work with USRA and Standard Chartered Bank as they improve global access to quantum systems and undertake essential research and development."

At USRA's Research Institute for Advanced Computer Science, Dr. Davide Venturelli, Associate Director for Quantum Computing, notes that quantum annealing "is implementing a powerful approach to computing, featuring unique advantages with respect to other traditional and novel approaches, that should be studied, theoretically and experimentally, to advance the state of the art of computing technologies for the benefit of nearly all disciplines."

Standard Chartered's team, led by Dr. Alexei Kondratyev, Global Head of Data Science and Innovation, and USRA have collaborated in quantum computing research since 2017. An earlier success in investigating the quantum annealing approach to computational problems in portfolio optimisation use cases led to this strategic partnership, where USRA will continue to support fundamental academic research in quantum physics and artificial intelligence and Standard Chartered will focus on future commercial applications.

In 2012, USRA partnered with NASA to found the Quantum Artificial Intelligence Laboratory (QuAIL): the space agency's hub to evaluate the near-term impact of quantum technologies. With QuAIL, the USRA team has investigated the physics, the engineering and the performance of multiple generations of quantum annealing processors built by D-Wave Systems, and has participated in U.S. government research programs that looked into the application of quantum annealing to combinatorial optimization, aviation, earth science and machine learning. NASA Ames Research Center is currently hosting a D-Wave 2000Q annealing system that will be made available for free for research by U.S. universities, thanks to the support of this partnership.

Standard Chartered and USRA intend to develop this initial collaboration beyond quantum annealing to all unconventional computing systems that could provide an advantage to applications of interest, such as gate-model noisy-intermediate scale quantum (NISQ) processors and Coherent Ising machines.

About USRA

Founded in 1969, under the auspices of the National Academy of Sciences at the request of the U.S. Government, the Universities Space Research Association (USRA) is a nonprofit corporation chartered to advance space-related science, technology and engineering. USRA operates scientific institutes and facilities, and conducts other major research and educational programs, under Federal funding. USRA engages the university community and employs in-house scientific leadership, innovative research and development, and project management expertise. RIACS is a USRA department for research in fundamental and applied information sciences, leading projects on quantum computing funded by NASA, DARPA, the U.S. Air Force and NSF. More info at: https://riacs.usra.edu/quantum/ and www.usra.edu.

About Standard Chartered

We are a leading international banking group, with a presence in 59 of the world's most dynamic markets, and serving clients in a further 85. Our purpose is to drive commerce and prosperity through our unique diversity, and our heritage and values are expressed in our brand promise, "Here for good". Standard Chartered PLC is listed on the London and Hong Kong stock exchanges as well as the Bombay and National Stock Exchanges in India. For more stories and expert opinions please visit Insights at sc.com.

Source: Universities Space Research Association


The Honeywell transition: From quantum computing to making masks – WRAL Tech Wire

CHARLOTTE – Honeywell no longer sells its iconic home thermostats, but it's still in the business of making control systems for buildings and aircraft.

That's put the 114-year-old conglomerate in a tough spot as workplaces have gone vacant and flights have been grounded in response to the coronavirus pandemic.

Darius Adamczyk, who became CEO in 2017, spoke with The Associated Press about how the business is adjusting to the pandemic, diverting resources to build personal protective equipment and continuing a quest for a powerful quantum computer that works by trapping ions. The interview has been edited for length and clarity.

Q: How is the crisis affecting some of your core business segments, especially aerospace?

A: The air transport segment obviously is impacted the most because its tied to air travel and production of new aircraft. Business aviation is depressed as well. The third segment, which has been fairly resilient, is defense and space. We expect to see growth in that segment even this year.

Q: You've had to do layoffs?

A: Unfortunately, we've had to take some cost actions. It's a bit more drastic in aerospace and our (performance materials) business and much less so in some of the other businesses. Some of the actions we've taken have been to do temporary things. We've created a $10 million fund for employees who are financially impacted by COVID. We extended sick leave for a lot of our hourly employees. Taking care of our employees is the No. 1 priority and making sure that they're healthy and safe, but also protecting the business long-term because the economic conditions are severe. Some of the levels of fall-off here in Q2 are much more dramatic than we saw in the 2008/2009 recession.

Q: How did Honeywell get into building a quantum computer?

A: One of the bigger challenges in making a quantum computer work is the ability to really control the computer itself. The way we kind of came into this play is we've had the controls expertise, but we didn't have so much trapped ion expertise.

Q: How does your approach differ from what Google and IBM have been trying to do?

A: I don't know exactly technically what they're doing. Some of these things are very proprietary and very secret. But we're very confident in terms of the public announcements and what we've been able to learn from some of the publicly available information that we, in fact, have the most powerful quantum computer in the world. It's going to get better and better by an order of magnitude every year.

Q: How'd you go about re-purposing factories in Rhode Island and Arizona to make respiratory masks?

A: We very quickly mobilized a couple of facilities that we weren't fully utilizing. Something that would normally take us nine months took us literally four to five weeks to create. We've gone from zero production to having two fully functioning facilities, making about 20 million masks a month.

Q: President Trump didn't wear a mask while visiting Honeywell's Arizona factory in May. Did he talk to you about whether he should wear a mask?

A: No.

Q: What did he talk about?

A: He was very kind in his comments about the kind of contribution Honeywell has made, not just today, this crisis, but really in other times of crisis, such as in World War II and some of the other technologies that weve provided in the past. So I think it was certainly nice to hear.


Opinion | Dance of the synchronized quantum particles – Livemint

Three of our gang, you see, were women. On our second morning, all three found their periods had kicked in. They were so charmed and amused by this that they forgot any possible cramps or migraines. This was, they told us ignorant men, "menstrual synchrony" – the tendency for women who live together to begin menstruating on the same day every month. In 1971, a psychologist called Martha McClintock studied 180 women in a college dormitory. Menstrual synchrony, she concluded then, was real.

Now, this really didn't apply that weekend in NYC, because these ladies had only spent one day together. Besides, more recent research has questioned McClintock's findings. Even so, those long-ago NYC days came back to me after reading about some even more recent research, at IIT Kanpur. Not about menstruation, but about synchronization, and in the quantum world.

What's synchronization? Imagine an individual – a bird, a pendulum – doing a particular motion over and over again. The bird is flapping its wings as it flies, the pendulum is swinging back and forth. Imagine several such individuals near each other, all doing the same motion – several birds flying together in a flock, several pendulums swinging while hanging from a beam. When they start out, the birds are flapping to their own individual rhythms, the pendulums going in different directions. But then something beautiful happens: these individual motions synchronize. The birds flap in perfect coordination, so the flock moves as one marvellous whole. The pendulums swing in harmony.
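This mutual pull toward a common rhythm is often captured with the Kuramoto model of coupled oscillators. The column names no model, so the Python sketch below is purely illustrative: each oscillator drifts at its own natural frequency while being nudged toward every other oscillator's phase, and a coherence measure rises from near zero toward one as synchrony emerges.

```python
import numpy as np

# Illustrative sketch of the Kuramoto model (my own example, not from
# the column): n oscillators, each with its own natural rhythm, that
# nudge one another toward a shared phase.
rng = np.random.default_rng(0)
n = 50                                   # oscillators (birds, pendulums...)
omega = rng.normal(0.0, 0.5, n)          # each one's natural frequency
theta = rng.uniform(0.0, 2 * np.pi, n)   # each one's starting phase
K, dt = 2.0, 0.05                        # coupling strength, time step

def coherence(theta):
    # ~0 when phases are scattered, 1 when all are perfectly in sync
    return abs(np.exp(1j * theta).mean())

r_start = coherence(theta)
for _ in range(2000):
    # pull[i] = sum over j of sin(theta_j - theta_i): each oscillator
    # is tugged toward the phases of all the others
    pull = np.sin(theta[None, :] - theta[:, None]).sum(axis=1)
    theta += (omega + (K / n) * pull) * dt
r_end = coherence(theta)
```

With this coupling strength, `r_end` comes out far larger than `r_start`: the scattered starting phases lock into a common beat, mirroring the flocks and pendulums described above.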

In fact, synchronization was first observed in pendulums. In 1665, the great Dutch scientist Christiaan Huygens attached two pendulum clocks to a heavy beam. Soon after, the two pendulums were in lockstep.

Similarly, fireflies are known to break into spontaneous synchrony. When there are just one or a few, they light up at different times – a pleasant enough sight, but nothing to write home about. But there are spots in the coastal mangroves of Malaysia and Indonesia where whole hosts of the little insects congregate every evening and suddenly, synchrony happens. They switch on and off in perfect unison, putting on a light show like none you've seen.

There are, yes, other examples. At a concert, the audience will tend to applaud in sync. The reason we only ever see one side of the Moon is that the orbital and rotational periods of the Moon have, over time, synchronized with each other. Your heart beats because the thousands of "pacemaker" cells it contains pulse in synchrony. Some years ago, a bridge of a new and radical design was built over the Thames in London. When it was opened, people swarmed onto it on foot. It quickly started swaying disconcertingly from side to side – enough, in turn, to force the pedestrians to walk in a certain awkward way just to keep their footing. On video, you'll see hundreds of people on the bridge, all walking awkwardly but in step.

In his book Sync: The Emerging Science of Spontaneous Order, the mathematician Steven Strogatz writes: "At the heart of the universe is a steady, insistent beat: the sound of cycles in sync. It pervades nature at every scale from the nucleus to the cosmos." He goes on to observe that this tendency for synchronization "does not depend on intelligence, or life, or natural selection. It springs from the deepest source of all: the laws of physics". And that's where IIT Kanpur comes in.

In 2018, a team of Swiss researchers looked at the possibility of synchronization at the lower end of that scale that Strogatz mentions, or in some ways even off that end of the scale. Do the most elementary, fundamental particles known to physicists exhibit the same tendency to synchronize as somewhat larger objects such as starlings and pendulums and the moon? We're talking about electrons and neutrons, particles that occupy the so-called "quantum" world. Can we get them to synchronize?

They concluded that the smallest quantum particles actually cannot be synchronized. These exhibit a "spin" – a form of angular momentum, in a sense the degree to which the particle is rotating – of 1/2 (half). But there are ways in which such "spin-half" particles can combine to form a "spin-1" system, and the Swiss team predicted that these combinations are the smallest quantum systems that can be synchronized.

So, a physics research group at IIT Kanpur decided to test this prediction. These are guys, I should tell you, who are thoroughly accustomed to working with atoms: One day in 2016, their professor, Dr Saikat Ghosh, took me into their darkened lab and pointed to a small red glow visible in the middle of their apparatus. "That's a group of atoms," he said with a grin, and then tweaked some settings and the glow dropped out of sight. The point? They are able to manipulate atoms. On another visit, they underlined this particular skill by showing me their work with graphene, a sheet of carbon that is – get this – one atom thick.

So, after the Swiss prediction, Ghosh and his students took a million atoms of rubidium – a soft, silvery metal – and cooled them nearly to what's known as "absolute zero", or -273° Celsius. Could they get these atoms to show synchrony?

Let's be clear about what they were dealing with, though. The usual objects that synchronize – pendulums, birds – are called "oscillators" because they are in some regular, rhythmic motion. Strictly, it is that motion of the oscillators that synchronizes. But we're dealing here with objects we can see, which means the rules of "classical" physics apply. Quantum objects like atoms behave differently. In fact, Ghosh told me that spin-1 atoms are not really oscillating in the same sense as pendulums and starlings in flight. Still, with that caveat in place, there are ways in which we can abstract their motion and treat them as oscillators.

In their experiment, the IIT team shot pulses of light at the group of rubidium atoms. Light is made up of photons, which are like minuscule bundles of energy. When they hit an atom, they "flip" its spin. Embodied in that flip is the photon's quantum information; in a real way, the photons are actually stored in these flipped atoms. This happens with such precision that you can later flip the atoms back and release the photons, thus "retrieving" the stored light. In fact, with this storage and retrieval behaviour, the atoms are like memory cells, and this is part of the mechanism of quantum computing. (See my column from October 2018, "Catch a quantum computer and pin it down".)
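As a loose analogy for that store-and-retrieve cycle (a deliberate simplification of my own, not the IIT team's actual protocol), one can model the "flip" as a reversible operation on a single two-level spin: applying it once writes the state, applying it again recovers the state exactly.

```python
import numpy as np

# Toy sketch only: real collective atomic memories involve a million
# atoms and travelling light fields. This just shows why flipping
# twice returns the stored quantum state intact (the flip operator X
# is its own inverse: X @ X is the identity).
down = np.array([1, 0], dtype=complex)
up = np.array([0, 1], dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)   # the "flip"

incoming = (down + 1j * up) / np.sqrt(2)  # some quantum state carried by light
stored = X @ incoming                     # "write": the photon flips the spin
retrieved = X @ stored                    # "read": flip back, light released
```

Because the flip undoes itself, `retrieved` equals `incoming`: nothing of the stored information is lost, which is what makes the atoms usable as memory cells.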

But when the atoms are flipped and they store these photons, something else happens to them. When the light is retrieved, the IIT team found, it displays "interference fringes" – a characteristic pattern of light and shadow (similar in concept to what causes stripes on tigers and zebras, or patterns in the sand on a beach). From this fringe pattern, the scientists can reconstruct the quantum state the atoms were in – and voilà, there's synchrony.

Did each individual atom synchronize to the light and since all one million atoms did so, is that how they are synchronized with each other as well? That's to be tested still, but it's a good way to think of what happened. Again, take fireflies. In one experiment, a single flashing LED bulb was placed in a forest. When the fireflies appeared, they quickly synchronized to the flashing bulb, and therefore to each other. As Dr Ghosh commented: "two fireflies synchronizing is interesting, but an entire forest filled with fireflies lighting up in sync reveals new emergent patterns."

There are implications in all this for, among other things, quantum computing. The IIT team's paper remarks: "[The] synchronization of spin-1 systems can provide insights in open quantum systems and find applications in synchronized quantum networks." ("Observation of quantum phase synchronization in spin-1 atoms", by Arif Warsi Laskar, Pratik Adhikary, Suprodip Mondal, Parag Katiyar, Sai Vinjanampathy and Saikat Ghosh, published 3 June 2020.)

There will be other applications too. But over 350 years after Christiaan Huygens stumbled on "classical" synchronization, the IIT team has shown for the first time that this strangely satisfying behaviour happens in the quantum world too. No wonder their paper was chosen recently for special mention in the premier physics journal, Physical Review Letters.

A round of applause for the IIT folks, please. I know it will happen in synchrony.

Once a computer scientist, Dilip D'Souza now lives in Mumbai and writes for his dinners. His Twitter handle is @DeathEndsFun



Quantum Software Market 2020: Potential Growth, Challenges, and Know the Companies List Could Potentially Benefit or Lose out From the Impact of…

The latest Quantum Software Market report evaluates the impact of the Covid-19 outbreak on the industry, covering potential opportunities and challenges, drivers and risks, and market growth forecasts based on different scenarios. The Global Quantum Software Industry Market Report is a professional and in-depth research report on the world's major regional markets.

This Quantum Software Market report will help business leaders devise better field-tested strategies and make informed decisions to improve profitability.

An exclusive sample report on the Quantum Software Market is available at https://inforgrowth.com/sample-request/6358191/quantum-software-market

Top players listed in the Quantum Software Market report are Origin Quantum Computing Technology, D-Wave, IBM, Microsoft, Intel, Google, Ion.

The Quantum Software market report provides a detailed analysis of global market size, regional and country-level market size, segmentation market growth, market share, competitive landscape, sales analysis, the impact of domestic and global market players, value chain optimization, trade regulations, recent developments, opportunities analysis, strategic market growth analysis, product launches, area marketplace expansion, and technological innovations.

Market Segmentations: Global Quantum Software market competition by top manufacturers, with production, price, revenue (value) and market share for each manufacturer.

Based on type, the report is split into System Software and Application Software.

Based on the end users/applications, this report focuses on the status and outlook for major applications/end users, consumption (sales), market share and growth rate for each application, including Big Data Analysis, Biochemical Manufacturing, and Machine Learning.

Download the Sample ToC to understand the CORONA Virus/COVID19 impact and be smart in redefining business strategies. https://inforgrowth.com/CovidImpact-Request/6358191/quantum-software-market

The report introduces Quantum Software basic information including definition, classification, application, industry chain structure, industry overview, policy analysis, and news analysis. Insightful predictions for the Quantum Software Market for the coming few years have also been included in the report.

In the end, the Quantum Software report provides details of competitive developments such as expansions, agreements, new product launches, and acquisitions. Forecasting, regional demand and supply factors, investment, market dynamics (including the technical scenario, consumer behavior, and end-use industry trends), capacity, and spending were also taken into consideration.

Important key questions answered in the Quantum Software market report:

Get a special discount of up to 50%: https://inforgrowth.com/discount/6358191/quantum-software-market

FOR ALL YOUR RESEARCH NEEDS, REACH OUT TO US AT:
Address: 6400 Village Pkwy, Suite 104, Dublin, CA 94568, USA
Contact Name: Rohan S.
Email: [emailprotected]
Phone: +1-909-329-2808
UK: +44 (203) 743 1898
Website: http://www.inforgrowth.com


The UK's Huawei Decision: Why the West is Losing the Tech Race – Chatham House

The UK's decision to ban its mobile providers from buying new Huawei 5G equipment after December 2020, and to remove all the company's 5G kit from their networks by 2027, is a blow to Huawei and China, but it is one battle in a long war that the West is currently losing.

5G's significance for the next generation of technology is indisputable and so is its critical role in helping countries achieve digital transformation and economic success. Not only does it offer faster and better connection speeds and greater capacity, it also transforms the way people interact with online services. And it will allow industry to automate and optimize processes that are not possible today.

Due to its transformative importance, what is in essence a technological issue has turned into a contest over global technological leadership that extends beyond the US-China rivalry and has created tensions between the US and its long-time allies. Yet 5G is just one key technology in a more expansive landscape that will underpin the future of the world's critical infrastructure, including in areas such as quantum computing, biotechnology, artificial intelligence, the internet of things and big data.

To achieve technological leadership in these domains requires governments to invest in a long-term, strategic and agile vision that is able to encompass the interdependencies between these areas and then leverage the resulting technological advances for economic progress. It also requires governments working with each other and with the private sector to support research and development and to create companies with leading-edge technologies that can compete globally.

China understands this and has a national and international vision to establish itself as a technological superpower. Re-balancing from a hub of labour-intensive manufacturing to a global innovation powerhouse is the absolute priority of the ruling Chinese Communist Party.

In the earlier part of this journey, commercial espionage and IP theft of western R&D were at the heart of the Chinese way of competing. Now, Beijing is cultivating national champions that can drive China's technological innovation, with the goal of using domestic suppliers to reduce reliance on foreign technology at home as well as extending its international outreach.

In the 5G area, Beijing has introduced domestically the so-called New Infrastructure Investments Fund, which earmarks special loans to boost 5G technology applications in medical devices, electric vehicles and communication platforms. This Fund constitutes a major part of the stimulus package for China's post-COVID economic recovery.

Apart from 5G, China's recent launch of a second state-funded semiconductor development fund valued at $29 billion, following an earlier $20 billion fund for the same purpose, shows the extent to which state financial resources are being utilized in China's quest to become technologically self-sufficient.

It is too early to know if the Chinese government's industrial policies will eventually achieve the technological self-sufficiency Beijing has long desired. But its growing national capabilities have stoked serious concerns across the West and led to the current US administration's determined effort to dismantle Chinese high-tech companies.

China's approach to macroeconomic management diverges significantly from that of the US and other market economies, particularly in its policy towards driving innovation. Due to the legacy of a state-planned economy, China is certain that simply relying on market forces is insufficient.

While Beijing financially supports government-controlled technological enterprises, Washington takes a laissez-faire, light-touch approach to the business sector. The US believes that a politicized process of distributing public money is inherently susceptible to rent-seeking and corruption, and gets in the way of competitive innovation. In line with most liberal economists, many Western governments believe the government should refrain from market intervention. For its part, Beijing stresses a state-dominated economy as a necessary precondition both to the future growth of the Chinese economy and to the legitimization of one-party rule.

If the pro-market economists' view is correct, the US should have little to fear from Chinese industrial innovation policy in the long term. Let Beijing waste money and distort resource allocation, while Washington follows its private sector-led principles, confident that this approach will produce a more competitive economy in the long run.

But one area that should concern the US and that illustrates the Chinese vision for global technological dominance is technical standard setting. Technical standards determine how technologies work with each other, enabling their interoperability around the world, meaning they can function irrespective of where they are being used.

The Chinese leadership has long understood the relationship between technical standards and economic power. Standards help to monetize technological innovation and research and can help shape new technologies. China has therefore been playing an increasingly active role in international standards organizations to legitimize Chinese technologies, whereas the US, which historically has been highly influential in this area, has not been participating as much or as effectively.

China has also been using its Belt and Road Initiative (BRI) as an opportunity to internationalize the distribution of its standards to countries signed up to the BRI. The so-called Digital Silk Road, which has been described as China's most important global governance initiative, acts as a route to accelerate this process. Later this year, China is expected to launch its new China Standards 2035 plan, which aims to shape how the next generation of technologies will work together.

China's preferred model and its recent actions have given Western leaders much to worry about. But standing up to China's growing global influence in high technology and re-establishing the West's desired technological edge will take much more than achieving a common front on excluding China from their 5G networks. It requires a long-term vision built on the power of competitive markets, backed by solid investment in the next generation of technology. This will require, in turn, much greater cooperation between Western governments and between them and their private sectors.

And, whilst recent protective steps taken in Washington and other Western capitals may slow down China's trailblazing in the technology sphere, they will only hasten China's determination to become tech self-sufficient in the long term. This will increase the probability of a splintered internet, which will have negative repercussions for all.

See the original post here:
The UK's Huawei Decision: Why the West is Losing the Tech Race - Chatham House

The Brookings glossary of AI and emerging technologies – Brookings Institution

Algorithms:

According to author Pedro Domingos, algorithms are "a sequence of instructions telling a computer what to do."[1] These software-based coding rules started with simple and routine tasks, but now have advanced into more complex formulations, such as providing driving instructions for autonomous vehicles, identifying possible malignancies in X-rays and CT scans, and assigning students to public schools. Algorithms are widely used in finance, retail, communications, national defense, and many other areas.
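Domingos's definition can be made concrete with a few lines of code (an illustrative sketch, not taken from the glossary):

```python
def find_max(numbers):
    """A tiny algorithm: a sequence of instructions telling the
    computer how to find the largest number in a list."""
    best = numbers[0]          # start with the first value
    for n in numbers[1:]:      # examine each remaining value
        if n > best:           # keep the larger of the two
            best = n
    return best
```

Calling `find_max([3, 1, 4, 1, 5])` simply walks those instructions in order; more elaborate algorithms differ in scale, not in kind.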

Artificial intelligence (AI):

Indian engineers Shukla Shubhendu and Jaiswal Vijay define AI as "machines that respond to stimulation consistent with traditional responses from humans, given the human capacity for contemplation, judgment, and intention."[2] This definition emphasizes several qualities that separate AI from mechanical devices or traditional computer software, specifically intentionality, intelligence, and adaptability. AI-based computer systems can learn from data, text, or images and make intentional and intelligent decisions based on that analysis.

Augmented reality (AR):

Augmented reality puts people in realistic situations that are augmented by computer-generated video, audio, or sensory information. This kind of system allows people to interact with actual and artificial features, be monitored for their reactions, or be trained on the best ways to deal with various stimuli.

Big data:

Extremely large data sets that are statistically analyzed to gain detailed insights. The data can involve billions of records and require substantial computer-processing power. Data sets are sometimes linked together to see how patterns in one domain affect other areas. Data can be structured into fixed fields or unstructured as free-flowing information. The analysis of big data sets can reveal patterns, trends, or underlying relationships that were not previously apparent to researchers.

Chatbots:

Automated tools for answering human questions. Chatbots are being used in retail, finance, government agencies, nonprofits, and other organizations to respond to frequently asked questions or routine inquiries.

Cloud computing:

Data storage and processing used to take place on personal computers or local servers controlled by individual users. In recent years, however, storage and processing have migrated to digital servers hosted at data centers operated by internet platforms, and people can store information and process data without being in close proximity to the data center. Cloud computing offers convenience, reliability, and the ability to scale applications quickly.

Computer vision (CV):

Computers that develop knowledge based on digital pictures or videos.[3] For example, cameras in automated retail outlets that are connected to CV systems can observe what products shoppers picked up, identify the specific items and their prices, and charge consumers' credit cards or mobile payment systems without involving a cash register or sales clerk. CV also is being deployed to analyze satellite images, human faces, and video imagery.

Connected vehicles:

Cars, trucks, and buses that communicate directly with one another and with highway infrastructure. This capacity speeds navigation, raises human safety, and takes advantage of the experiences of other vehicles on the road to improve the driving experience.

Data analytics:

The analysis of data to gather substantive insights. Researchers use statistical techniques to find trends or patterns in the data, which give them a better understanding of a range of different topics. Data analytic approaches are used in many businesses and organizations to track day-to-day activities and improve operational efficiency.

Data mining:

Techniques that analyze large amounts of information to gain insights, spot trends, or uncover substantive patterns. These approaches are used to help businesses and organizations improve their processes or identify associations that shed light on relevant questions.

Deepfakes:

Digital images and audio that are artificially altered or manipulated by AI and/or deep learning to make someone do or say something he or she did not actually do or say. Pictures or videos can be edited to put someone in a compromising position or to have someone make a controversial statement, even though the person did not actually do or say what is shown. Increasingly, it is becoming difficult to distinguish artificially manufactured material from actual videos and images.

Deep learning:

A subset of machine learning that relies on neural networks with many layers of neurons. In so doing, deep learning employs statistics to spot underlying trends or data patterns and applies that knowledge to other layers of analysis. Some have labeled this as a way to "learn by example" and a technique that "perform[s] classification tasks directly from images, text, or sound" and then applies that knowledge independently.[4] Deep learning requires extensive computing power and labeled data, and is used in medical research, automated vehicles, electronics, and manufacturing, among other areas.

Digital sovereignty:

The speed, scope, and timing of technology innovation today is often decided not by government officials but by coders, software designers, and corporate executives. Digital sovereigns set the rules of the road and terms of service for consumers. What they decide, directly or indirectly, has far-reaching consequences for those using their software or platform. The power of business decisionmakers raises important governance questions regarding who should decide on matters affecting society as a whole and the role that policymakers, consumers, and ethicists should play in digital innovation.

Distributed collaboration:

Connecting frontline people with others who have differing skills and getting them to work together to solve problems. Distributed collaboration differs from current governance paradigms that emphasize hierarchical, top-down decisionmaking by those who do not always have relevant knowledge about the issues being addressed. The new model takes advantage of the fact that a range of skills are needed to resolve technology issues, and those skills are located in different subject areas and organizational parts. Rather than keeping AI expertise in isolation, distributed collaboration brings together software and product designers, engineers, ethicists, social scientists, and policymakers to draw on their respective expertise and integrate their knowledge to solve pressing problems.

Dual-use technologies:

Many technologies can be used in a good or ill manner. The very same facial recognition system could be used to find missing children or provide a means for mass surveillance. It is not the technology per se that raises ethical issues but how the technology is put to use. The dual-use nature of technologies makes regulation difficult because it raises the question of how to gain the benefits of technology innovation while avoiding its detrimental features.

Facial recognition:

A technology for identifying specific people based on pictures or videos. It operates by analyzing features such as the structure of the face, the distance between the eyes, and the angles between a person's eyes, nose, and mouth. It is controversial because of worries about privacy invasion, malicious applications, or abuse by government or corporate entities. In addition, there have been well-documented biases by race and gender with some facial recognition algorithms.

5G networks:

These are fifth-generation wireless telecommunications networks that have been deployed in major cities and feature faster speeds and enhanced capabilities for transmitting data and images. As such, 5G networks enable new digital products and services, such as video streaming, autonomous vehicles, and automated factories and homes that require fast broadband.

Hyperwar:

High-tech military situations in which robots, sensors, AI, and autonomous systems play important roles and command decisions have to unfold at speeds heretofore unseen in warfare. Because of the acceleration of the pace and scope of conflict, countries will have to conduct simultaneous operations in every warfare domain, and national leaders will need to accelerate technology innovation to build a safe and stable future.[5]

Machine learning (ML):

According to Dorian Pyle and Cristina San Jose of the McKinsey Quarterly, machine learning is "based on algorithms that can learn from data without relying on rules-based programming."[6] ML represents a way to classify data, pictures, text, or objects without detailed instruction and to learn in the process so that new pictures or objects can be accurately identified based on that learned information. ML furthermore can be used to estimate continuous variables (such as estimating home sales prices) or to play games. Many of its insights come by examining prior data and learning how to improve understanding.

Natural language processing (NLP):

The analysis of textual information to make sense of its meaning and intentions. NLP software can take a large amount of text and see how words are linked together to assess positive or negative sentiment, relationships, associations, and meaning. For example, researchers can study medical records to see which patient symptoms appear to be most related to particular illnesses.

Neural networks:

Researchers use computer software to "perform some task by analyzing training examples" and by grouping data based on common similarities.[7] Similar to the neural nodes of a brain, neural networks learn in layers and build complex concepts out of simpler ones. They break up tasks, identify objects at a number of different levels, and apply that knowledge to other activities. These kinds of systems allow computers to learn and adapt to changing circumstances, similar to the way a brain functions. Deep learning and many of the most prominent recent applications of machine learning operate through neural networks (e.g., driverless cars, deepfakes, and AlphaGo game playing).

Quantum computing:

Quantum computers have tremendous capacity for storing and processing information because their storage processes are not in the form of a zero or one, as is the case with traditional computers. Rather, they take advantage of superposition (the fact that electrons can be in two places at once) to create quantum bits that store multiple values in each point.[8] That capability dramatically increases storage capacity and decreases processing times, thereby improving the scope of data, textual, or image analysis.

Singularity:

Futurist Ray Kurzweil describes a singularity as "a machine-based superintelligence [that is] greater than human intelligence."[9] It combines advanced computing power with artificial intelligence, machine learning, and data analytics to create super-powered entities. There are extensive (and unresolved) debates regarding whether humanity will face a computing singularity in the next 50, 100, or 250 years.

Social credit systems:

The ubiquity of people's online activities enables technology that tracks behavior and rates people based on their online actions. As an illustration, some organizations have piloted systems that compile data on social media activities, personal infractions, and behaviors such as paying taxes on time. They use that data to rate people for creditworthiness, travel, school enrollment, and government positions.[10] These systems are problematic from an ethical standpoint because they lack transparency and can be used to penalize political opponents.

Supervised and unsupervised learning:

According to Science magazine, supervised learning is a type of machine learning in which "the algorithm compares its outputs with the correct outputs during training. In unsupervised learning, the algorithm merely looks for patterns in a set of data."[11] Supervised learning allows ML and AI to improve information processing and become more accurate.

Techlash:

The backlash against emerging technologies that has developed among many individuals. People worry about a host of problems related to technology innovation, such as privacy invasions, mass surveillance, widening income inequality, and possible job losses. Figuring out how to assuage understandable human fears is a major societal challenge going forward.

Virtual reality (VR):

Virtual reality uses headsets equipped with projection visors to put people in realistic-seeming situations that are completely generated by computers. People can see, hear, and experience many types of environments and interact with them. By simulating actual settings, VR can train people how to deal with various situations, vary the features that are observed, and monitor how people respond to differing stimuli.

[1] Pedro Domingos, The Master Algorithm: How the Quest for the Ultimate Learning Machine Will Remake Our World (New York: Basic Books, 2018).

[2] Shukla Shubhendu and Jaiswal Vijay, "Applicability of Artificial Intelligence in Different Fields of Life," International Journal of Scientific Engineering and Research, vol. 1, no. 1 (September 2013), pp. 28-35.

[3] Jason Brownlee, A Gentle Introduction to Computer Vision, Machine Learning Mastery, July 5, 2019.

[4] MathWorks, "What Is Deep Learning?," undated.

[5] John R. Allen and Amir Husain, Hyperwar and Shifts in Global Power in the AI Century, in Amir Husain and others, Hyperwar: Conflict and Competition in the AI Century (Austin, TX: SparkCognition Press, 2018), p. 15.

[6] Dorian Pyle and Cristina San Jose, "An Executive's Guide to Machine Learning," McKinsey Quarterly, June 2015.

[7] Larry Hardesty, Explained: Neural Networks, MIT News, April 14, 2017.

[8] Cade Metz, In Quantum Computing Race, Yale Professors Battle Tech Giants, New York Times, November 14, 2017, p. B3.

[9] Quoted in Tom Wheeler, From Gutenberg to Google: The History of Our Future (Brookings, 2019), p. 226. Also see Ray Kurzweil, The Singularity Is Near: When Humans Transcend Biology (London: Penguin Books, 2006).

[10] Jack Karsten and Darrell M. West, "China's Social Credit System Spreads to More Daily Transactions," TechTank (blog), Brookings, June 18, 2018.

[11] Matthew Hutson, AI Glossary: Artificial Intelligence, in So Many Words, Science, July 7, 2017.

Continued here:
The Brookings glossary of AI and emerging technologies - Brookings Institution

Topological Quantum Computing Market Growth By Manufacturers, Type And Application, Forecast To 2026 – 3rd Watch News

New Jersey, United States,- Market Research Intellect sheds light on the market scope, potential, and performance perspective of the Global Topological Quantum Computing Market by carrying out an extensive market analysis. Pivotal market aspects like market trends, the shift in customer preferences, fluctuating consumption, cost volatility, the product range available in the market, growth rate, drivers and constraints, financial standing, and challenges existing in the market are comprehensively evaluated to deduce their impact on the growth of the market in the coming years. The report also gives an industry-wide competitive analysis, highlighting the different market segments, individual market share of leading players, and the contemporary market scenario and the most vital elements to study while assessing the global Topological Quantum Computing market.

The research study includes the latest updates about the COVID-19 impact on the Topological Quantum Computing sector. The outbreak has broadly influenced the global economic landscape. The report contains a complete breakdown of the current situation in the ever-evolving business sector and estimates the aftereffects of the outbreak on the overall economy.

Leading Topological Quantum Computing manufacturers/companies operating at both regional and global levels:


The Topological Quantum Computing market report provides successfully marked contemplated policy changes, favorable circumstances, industry news, developments, and trends. This information can help readers fortify their market position. It compiles information gathered from secondary sources, including press releases, the web, magazines, and journals, presented as numbers, tables, pie charts, and graphs. The information is verified and validated through primary interviews and questionnaires. The data on growth and trends focuses on new technologies, market capacities, raw materials, the CAPEX cycle, and the dynamic structure of the Topological Quantum Computing market.

This study analyzes the growth of Topological Quantum Computing based on past, present, and forecast data and will render complete information about the Topological Quantum Computing industry to the market-leading industry players that will guide the direction of the Topological Quantum Computing market through the forecast period. All of these players are analyzed in detail so as to get details concerning their recent announcements and partnerships, product/services, and investment strategies, among others.

Sales Forecast:

The report contains historical revenue and volume figures that back information about market capacity, and it helps to evaluate forecast numbers for key areas in the Topological Quantum Computing market. Additionally, it includes the share of each segment of the Topological Quantum Computing market, giving methodical information about the types and applications of the market.

Reasons for Buying Topological Quantum Computing Market Report

This report gives a forward-looking prospect of various factors driving or restraining market growth.

It presents a detailed analysis of changing competitive dynamics and puts you ahead of competitors.

It gives a six-year forecast evaluated on the basis of how the market is predicted to grow.

It assists in making informed business decisions by performing a pin-point analysis of market segments and by having complete insights of the Topological Quantum Computing market.

This report helps the readers understand key product segments and their future.


In the end, the Topological Quantum Computing market is analyzed for revenue, sales, price, and gross margin. These points are examined for companies, types, applications, and regions.

To summarize, the global Topological Quantum Computing market report studies the contemporary market to forecast the growth prospects, challenges, opportunities, risks, threats, and the trends observed in the market that can either propel or curtail the growth rate of the industry. The market factors impacting the global sector also include provincial trade policies, international trade disputes, entry barriers, and other regulatory restrictions.

About Us:

Market Research Intellect provides syndicated and customized research reports to clients from various industries and organizations with the aim of delivering functional expertise. We provide reports for all industries including Energy, Technology, Manufacturing and Construction, Chemicals and Materials, Food and Beverage, and more. These reports deliver an in-depth study of the market with industry analysis, the market value for regions and countries, and trends that are pertinent to the industry.

Contact Us:

Mr. Steven Fernandes

Market Research Intellect

New Jersey ( USA )

Tel: +1-650-781-4080


Read more here:
Topological Quantum Computing Market Growth By Manufacturers, Type And Application, Forecast To 2026 - 3rd Watch News

The biggest flipping challenge in quantum computing – Science Magazine

By Adrian Cho, Jul. 9, 2020, 2:00 PM

In October 2019, researchers at Google announced to great fanfare that their embryonic quantum computer had solved a problem that would overwhelm the best supercomputers. Some said the milestone, known as quantum supremacy, marked the dawn of the age of quantum computing. However, Greg Kuperberg, a mathematician at the University of California, Davis, who specializes in quantum computing, wasn't so impressed. He had expected Google to aim for a goal that is less flashy but, he says, far more important.

Whether it's calculating your taxes or making Mario jump a canyon, your computer works its magic by manipulating long strings of bits that can be set to 0 or 1. In contrast, a quantum computer employs quantum bits, or qubits, that can be both 0 and 1 at the same time, the equivalent of you sitting at both ends of your couch at once. Embodied in ions, photons, or tiny superconducting circuits, such two-way states give a quantum computer its power. But they're also fragile, and the slightest interaction with their surroundings can distort them. So scientists must learn to correct such errors, and Kuperberg had expected Google to take a key step toward that goal. "I consider it a more relevant benchmark," he says.

If some experts question the significance of Google's quantum supremacy experiment, all stress the importance of quantum error correction. "It is really the difference between a $100 million, 10,000-qubit quantum computer being a random noise generator or the most powerful computer in the world," says Chad Rigetti, a physicist and co-founder of Rigetti Computing. And all agree with Kuperberg on the first step: spreading the information ordinarily encoded in a single jittery qubit among many of them in a way that maintains the information even as noise rattles the underlying qubits. "You're trying to build a ship that remains the same ship, even as every plank in it rots and has to be replaced," explains Scott Aaronson, a computer scientist at the University of Texas, Austin.

The early leaders in quantum computing (Google, Rigetti, and IBM) have all trained their sights on that target. "That's very explicitly the next big milestone," says Hartmut Neven, who leads Google's Quantum Artificial Intelligence lab. Jay Gambetta, who leads IBM's quantum computing efforts, says, "In the next couple of years, you'll see a series of results that will come out from us to deal with error correction."

Physicists have begun to test their theoretical schemes in small experiments, but the challenge is grand. To demonstrate quantum supremacy, Google scientists had to wrangle 53 qubits. To encode the data in a single qubit with sufficient fidelity, they may need to master 1000 of them.

The quest for quantum computers took off in 1994 when Peter Shor, a mathematician at the Massachusetts Institute of Technology, showed that such a machine (then hypothetical) should be able to quickly factor huge numbers. Shor's algorithm represents the possible factorizations of a number as quantum waves that can slosh simultaneously through the computer's qubits, thanks to the qubits' two-way states. The waves interfere so that the wrong factorizations cancel one another and the right one pops out. A machine running Shor's algorithm could, among other things, crack the encryption systems that now secure internet communications, which rely on the fact that searching for the factors of a huge number overwhelms any ordinary computer.

However, Shor assumed each qubit would maintain its state so the quantum waves could slosh around as long as necessary. Real qubits are far less stable. Google, IBM, and Rigetti use qubits made of tiny resonating circuits of superconducting metal etched into microchips, which so far have proved easier to control and integrate into circuits than other types of qubits. Each circuit has two distinct energy states, which can denote 0 or 1. By plying a circuit with microwaves, researchers can ease it into either state or any combination of the two, say, 30% 0 and 70% 1. But those in-between states will fuzz out, or decohere, in a fraction of a second. Even before that happens, noise can jostle the state and alter it, potentially derailing a calculation.
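In a state-vector picture (a minimal NumPy sketch, not Google's control software), the "30% 0 and 70% 1" state is just a pair of amplitudes whose squared magnitudes give the measurement probabilities:

```python
import numpy as np

# Amplitudes for |0> and |1>; measurement probabilities are their
# squared magnitudes, so "30% 0 and 70% 1" means sqrt(0.3), sqrt(0.7).
state = np.array([np.sqrt(0.3), np.sqrt(0.7)])

probs = np.abs(state) ** 2            # probabilities of reading 0 and 1
assert np.isclose(probs.sum(), 1.0)   # a valid qubit state is normalized
```

Decoherence, in this picture, is the environment scrambling those amplitudes before a computation can finish.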

Whereas an ordinary bit must be either 0 or 1, a qubit can be in any combination of 0 and 1 at the same time. Those two parts of the state mesh in a way described by an abstract angle, or phase. So the qubit's state is like a point on a globe whose latitude reveals how much the qubit is 0 and how much it is 1, and whose longitude indicates the phase. Noise can jostle the qubit in two basic ways that knock the point around the globe.

[Figure: A bit-flip error exchanges 0 and 1, flipping the qubit in latitude. A phase-flip error pushes the qubit's state halfway around the sphere in longitude.]

C. Bickel/Science

Such noise nearly drowned out the signal in Google's quantum supremacy experiment. Researchers began by setting the 53 qubits to encode all possible outputs, which ranged from zero to 2^53. They implemented a set of randomly chosen interactions among the qubits that in repeated trials made some outputs more likely than others. Given the complexity of the interactions, a supercomputer would need thousands of years to calculate the pattern of outputs, the researchers said. So by measuring it, the quantum computer did something that no ordinary computer could match. But the pattern was barely distinguishable from the random flipping of qubits caused by noise. "Their demonstration is 99% noise and only 1% signal," Kuperberg says.

To realize their ultimate dreams, developers want qubits that are as reliable as the bits in an ordinary computer. "You want to have a qubit that stays coherent until you switch off the machine," Neven says.

Scientists' approach of spreading the information of one qubit (a logical qubit) among many physical ones traces its roots to the early days of ordinary computers in the 1950s. The bits of early computers consisted of vacuum tubes or mechanical relays, which were prone to flip unexpectedly. To overcome the problem, famed mathematician John von Neumann pioneered the field of error correction.

Von Neumann's approach relied on redundancy. Suppose a computer makes three copies of each bit. Then, even if one of the three flips, the majority of the bits will preserve the correct setting. The computer can find and fix the flipped bit by comparing the bits in pairs, in so-called parity checks. If the first and third bits match, but the first and second and the second and third differ, then most likely the second bit flipped, and the computer can flip it back. Greater redundancy means greater ability to correct errors. Ironically, the transistors, etched into microchips, that modern computers use to encode their bits are so reliable that error correction isn't much used.
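Von Neumann's three-copy scheme and its two parity checks are easy to sketch in code (a minimal illustration of the classical idea, not of any quantum hardware):

```python
def encode(bit):
    # Redundancy: keep three copies of the bit.
    return [bit, bit, bit]

def parity_checks(bits):
    # XOR of each pair: 0 if the pair matches, 1 if it differs.
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])

def correct(bits):
    # The pair of parities pinpoints which single bit flipped.
    s01, s12 = parity_checks(bits)
    fixed = list(bits)
    if s01 and not s12:
        fixed[0] ^= 1   # first bit disagrees with the other two
    elif s01 and s12:
        fixed[1] ^= 1   # middle bit disagrees
    elif s12:
        fixed[2] ^= 1   # last bit disagrees
    return fixed

noisy = encode(1)
noisy[2] ^= 1                      # noise flips the third copy
assert correct(noisy) == [1, 1, 1]
```

Note that the parities locate the error without saying whether the encoded bit was a 0 or a 1, which is exactly the property the quantum version needs to preserve.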

But a quantum computer will depend on it, at least if it's made of superconducting qubits. (Qubits made of individual ions suffer less from noise, but are harder to integrate.) Unfortunately for developers, quantum mechanics itself makes their task much harder by depriving them of their simplest error-correcting tool, copying. In quantum mechanics, a no-cloning theorem says it's not possible to copy the state of one qubit onto another without altering the state of the first one. "This means that it's not possible to directly translate our classical error correction codes to quantum error correction codes," says Joschka Roffe, a theorist at the University of Sheffield.

In a conventional computer, a bit is a switch that can be set to either 0 or 1. To protect a bit, a computer can copy it. If noise then flips a copy, the machine can find the error by making parity measurements: comparing pairs of bits to see whether they're the same or different.

[Figure: A bit is protected by copying it. When noise flips one copy, parity measurements locate the flipped bit so it can be corrected.]

C. Bickel/Science

Even worse, quantum mechanics requires researchers to find errors blindfolded. Although a qubit can have a state that is both 0 and 1 at the same time, according to quantum theory, experimenters can't measure that two-way state without collapsing it into either 0 or 1. Checking a state obliterates it. "The simplest [classical error] correction is that you look at all the bits to see what's gone wrong," Kuperberg says. "But if it's qubits, then you have to find the error without looking."

Those hurdles may sound insurmountable, but quantum mechanics points to a potential solution. Researchers cannot copy a qubit's state, but they can extend it to other qubits using a mysterious quantum connection called entanglement.

How the entangling is done shows just how subtle quantum computing is. Prodded with microwaves, the original qubit interacts with another that must start in the 0 state through a controlled-not (CNOT) operation. The CNOT will change the state of the second qubit if the state of the first is 1 and leave it unchanged if the first qubit is 0. However, the maneuver doesn't actually measure the first qubit and collapse its state. Instead, it maintains the both-ways state of the first qubit while both changing and not changing the second qubit at the same time. It leaves the two qubits in a state in which, simultaneously, they are both 0 and both 1.

If the original qubit is in, for example, a 30% 0 and 70% 1 state, physicists can link it to other qubits to make a chain of, say, three qubits that share an entangled state that's 30% "all three are 0" and 70% "all three are 1." That state is distinct from three copies of the original qubit. In fact, none of the three entangled qubits in the string possesses a well-defined quantum state of its own. But now, the three qubits are completely correlated: If you measure the first one and it collapses to 1, then the other two must also instantly collapse to 1. If the first collapses to 0, the others must also. That correlation is the essence of entanglement.
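The CNOT maneuver can be checked numerically. Below is a NumPy sketch (an illustration under textbook matrix conventions, not production quantum code); note that the amplitudes are the square roots of 0.3 and 0.7, since probabilities are squared amplitudes:

```python
import numpy as np

q = np.array([np.sqrt(0.3), np.sqrt(0.7)])   # original qubit: 30% 0, 70% 1
zero = np.array([1.0, 0.0])                  # second qubit starts in |0>

# CNOT: flip the second (target) qubit exactly when the first (control) is 1.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

joint = np.kron(q, zero)     # joint state over basis |00>, |01>, |10>, |11>
entangled = CNOT @ joint

# All the weight now sits on |00> and |11>: 30% "both 0," 70% "both 1."
probs = entangled ** 2
```

The result is not two copies of `q`; it is a single shared state in which neither qubit alone has a definite value, which is the correlation described above.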

With that bigger entangled state, scientists can now keep an eye out for errors. To do that, they entangle still other ancillary qubits with the chain of three, one with the first and second qubits in the string and another with the second and third. They then use measurements on the ancillas to make the quantum mechanical equivalent of parity checks. For example, without breaking the entanglement, noise can flip any one of the three coding qubits so that its 0 and 1 parts get switched, changing the latent correlations among all three. If researchers set things up right, they can make stabilizer measurements on the ancillary qubits to probe those correlations.

Although measuring the ancillary qubits collapses their states, it leaves the coding qubits unperturbed. "These are specially designed parity measurements that don't collapse the information encoded in the logical state," Roffe says. For example, if the measurement shows the first ancilla is 0, it reveals only that the first and second coding qubits must be in the same state, but not which state that is. If the ancilla is 1, then the measurement reveals only that the coding qubits must be in opposite states. If researchers can find a flipped qubit more quickly than the qubits tend to fuzz out, they can use microwaves to flip it back to its original state and restore its coherence.
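A toy simulation makes the key point concrete: both branches of the superposition give the same pair of parities, so reading the parities locates the error without revealing the encoded state (an illustrative sketch of the three-qubit bit-flip code, not real hardware control code):

```python
import numpy as np

# Logical qubit spread over three physical qubits: a|000> + b|111>,
# with |a|^2 = 0.3 and |b|^2 = 0.7.
state = np.zeros(8)
state[0b000] = np.sqrt(0.3)
state[0b111] = np.sqrt(0.7)

def x_error(psi, k):
    # Bit-flip (X) error on qubit k (0 = leftmost of the three).
    out = np.zeros_like(psi)
    for i, amp in enumerate(psi):
        out[i ^ (1 << (2 - k))] = amp
    return out

def syndrome(psi):
    # Parities of qubit pairs (0,1) and (1,2). Every nonzero branch of
    # the superposition yields the SAME parities, so measuring them
    # does not collapse the logical information.
    i = int(np.flatnonzero(psi)[0])
    b = [(i >> 2) & 1, (i >> 1) & 1, i & 1]
    return (b[0] ^ b[1], b[1] ^ b[2])

noisy = x_error(state, 1)                          # noise flips the middle qubit
culprit = {(1, 0): 0, (1, 1): 1, (0, 1): 2}[syndrome(noisy)]
repaired = x_error(noisy, culprit)                 # flip it back
assert np.allclose(repaired, state)                # coherence restored
```

The syndrome (1, 1) fingers the middle qubit whether the logical state is mostly 0 or mostly 1, which is exactly what the ancilla measurements accomplish on real hardware.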

The rules of quantum mechanics make it impossible to watch for errors by copying and measuring qubits (top). Instead, physicists want to spread the qubits state to other qubits through entanglement (middle) and monitor those to detect errors; then nudge an errant bit back to the correct state (bottom).

Bigger is better: Instead of trying to copy the state of a qubit, physicists can enlarge it by entangling the qubit with others, resulting in a single state that corresponds to the same point on a sphere.

Copying: Not so fast! Quantum mechanics does not allow the state of one qubit to be copied onto others.

Lost identity: In the entangled condition, none of the three qubits has a well-defined quantum state of its own.

Gentle correctives: Now, if noise flips one of the qubits, physicists can detect the change without actually measuring the state. They entangle pairs of the main qubits with other ancillary qubits whose state can be measured, and will be 0 if the correlation between a pair remains the same and 1 if the correlation is flipped. Microwaves can then unflip the qubit and restore the initial entangled state.

C. Bickel/Science

That's just the basic idea. The state of a qubit is more complex than just a combination of 0 and 1. It also depends on exactly how those two parts mesh, which, in turn, depends on an abstract angle called the phase. The phase can range from 0° to 360° and is key to the wavelike interference effects that give a quantum computer its power. Quantum mechanically, any error in a qubit's state can be thought of as some combination of a bit-flip error that swaps 0 and 1 and a phase-flip error that changes the phase by 180°.
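That decomposition can be checked directly. In this small NumPy sketch (our own illustration, not the article's), an arbitrary single-qubit error matrix is expanded in the basis built from the identity, the bit flip X, the phase flip Z, and their product:

```python
import numpy as np

I = np.eye(2)
X = np.array([[0, 1], [1, 0]])    # bit flip: swaps |0> and |1>
Z = np.array([[1, 0], [0, -1]])   # phase flip: shifts the phase by 180 degrees
XZ = X @ Z                        # bit flip and phase flip at once

# A small, arbitrary coherent error: a slight rotation about a tilted axis.
theta = 0.1
E = np.cos(theta) * I - 1j * np.sin(theta) * (X + Z) / np.sqrt(2)

# Expand E in the basis {I, X, Z, XZ}: the coefficient of each operator P
# is Tr(P^dagger E) / 2, because the four operators are orthogonal under
# the trace inner product.
for name, P in [("I", I), ("X", X), ("Z", Z), ("XZ", XZ)]:
    c = np.trace(P.conj().T @ E) / 2
    print(f"{name}: {c:.4f}")
```

Any single-qubit error whatsoever decomposes this way (the combined flip XZ equals the Pauli Y operator up to a phase), which is why correcting bit flips and phase flips separately suffices to correct arbitrary errors.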

To correct both types, researchers can expand into another dimension, literally. Whereas a string of three entangled qubits, with two ancillas woven between them, is the smallest array that can detect and correct a bit-flip error, a three-by-three grid of qubits, with eight interspersed ancillas, is the simplest one that can detect and correct both bit-flip and phase-flip errors. The logical qubit now resides in an entangled state of the nine qubits (be thankful you don't have to write it out mathematically!). Stabilizer measurements along one dimension of the grid check for bit-flip errors, while slightly different stabilizer measurements along the other dimension check for phase-flip errors.
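The eight ancillas correspond to eight stabilizer checks, and those checks must all be mutually commuting so they can be measured together without disturbing one another. One standard nine-qubit layout of this kind is Shor's code; the following sketch (our own, using the usual binary representation of Pauli operators) lists its six bit-flip checks and two phase-flip checks and verifies that every pair commutes:

```python
import itertools

def pauli(xs=(), zs=()):
    """A Pauli operator on nine qubits, recorded as (X-support, Z-support)."""
    return (frozenset(xs), frozenset(zs))

# Six Z-type checks compare neighboring qubits within each row of the
# 3-by-3 grid (they catch bit flips); two X-type checks compare whole rows
# (they catch phase flips). Eight checks in all, one per ancilla.
stabilizers = (
    [pauli(zs=(r * 3 + c, r * 3 + c + 1)) for r in range(3) for c in range(2)]
    + [pauli(xs=range(0, 6)), pauli(xs=range(3, 9))]
)

def commute(p, q):
    """Two Paulis commute iff their symplectic overlap is even."""
    (x1, z1), (x2, z2) = p, q
    return (len(x1 & z2) + len(z1 & x2)) % 2 == 0

assert all(commute(p, q) for p, q in itertools.combinations(stabilizers, 2))
print("all", len(stabilizers), "stabilizer checks commute")
```

Each Z-type check overlaps each X-type check on an even number of qubits (two or zero), which is exactly the commutation condition; that evenness is what lets all eight parity measurements run side by side, cycle after cycle.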

Schemes for pushing into two dimensions vary, depending on the geometric arrangement of the qubits and the details of the stabilizer measurements. Nevertheless, researchers' road to error correction is now clear: Encode a single logical qubit in a grid of physical qubits and show that the fidelity of the logical qubit gets better as the size of the grid increases.

Experimenters have already made a start. For example, in a Nature Physics study published on 8 June, Andreas Wallraff at ETH Zurich and colleagues demonstrated that they could detect, but not correct, errors in a logical qubit encoded in a square of four qubits with three ancillary qubits.

But experimenters face a daunting challenge. Manipulating individual qubits can introduce errors, and unless that error rate falls below a certain level, entangling more qubits with the original one only adds more noise to the system, says Maika Takita, a physicist at IBM. "To demonstrate anything you have to get below that threshold," she says. The ancillary qubits and other error-correction machinery add even more noise, and once those effects are included, the necessary error threshold plummets further. To make the scheme work, physicists must lower their error rate to less than 1%. "When I heard we achieved a 3% error rate, I thought that was great," Takita says. "Now, it needs to be much lower."

Error correction also requires twiddling with qubits repeatedly. That makes the process more demanding than quantum supremacy, which involved measuring all the qubits just once, says Marissa Giustina, a physicist with Google. "Error correction requires you to measure and measure and measure over and over again in a cycle, and that has to be done quickly and reliably," she says.

Although a handful of qubits would suffice to demonstrate the principle of quantum error correction, in practice physicists will have to control huge numbers of them. To run Shor's algorithm well enough to factor, say, a number 1000 bits long (roughly the size used in some internet encryption schemes), they'll need to maintain logical qubits with a part-in-1-billion error rate. That may require entangling a grid of 1000 physical qubits to safeguard a single logical qubit, researchers say, a prospect that will take generations of bigger and better quantum computing chips.
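A rough back-of-envelope calculation (our own, not the article's) shows where numbers of that size come from. It assumes a commonly quoted rule of thumb, in which the logical error rate of a distance-d code falls as (p/p_th)^((d+1)/2) below threshold, and a surface-code-style patch that uses about 2d² physical qubits:

```python
# Assumed rule of thumb: below threshold, the logical error rate of a
# distance-d code falls as p_logical ~ (p / p_th) ** ((d + 1) / 2), and a
# surface-code-style patch of distance d uses roughly 2 * d**2 physical
# qubits, counting data qubits and interspersed ancillas.
p, p_th = 1e-3, 1e-2          # physical error rate at one-tenth the threshold

for d in range(3, 19, 2):     # code distances are odd
    p_logical = (p / p_th) ** ((d + 1) / 2)
    print(f"d = {d:2d}: ~{2 * d * d:3d} physical qubits, p_logical ~ {p_logical:.0e}")
```

By distance 17, about 578 data-plus-ancilla qubits suppress the logical error rate to roughly one part in a billion; with routing and readout overhead on top, that is the order-1000 figure the article quotes.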

Ironically, overcoming that challenge would put developers back where they were 20 years ago, when they were just setting out to make pairs of physical qubits interact to perform the various logical operations, or gates, needed for computation. Once scientists have begun to master error correction, they'll have to repeat nearly every development so far in quantum computing with the more robust but highly complex logical qubits. "People say that error correction is the next step in quantum computing; it's the next 25 steps," Giustina quips.

Retracing those steps won't be easy. It's not just that any logical gate currently involving two qubits will require thousands of them. Worse, another theorem of quantum mechanics states that, no matter what scheme researchers use, not all of the logical gates can be easily translated from individual physical qubits to diffuse logical ones.

Researchers think they can sidestep that problem if they can initialize all the qubits in their computer in particular "magic states" that, more or less, do half the work of the problematic gates. Unfortunately, still more qubits may be needed to produce those magic states. "If you want to perform something like Shor's algorithm, probably 90% of the qubits would have to be dedicated to preparing these magic states," Roffe says. So a full-fledged quantum computer, with 1000 logical qubits, might end up containing many millions of physical qubits.
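The arithmetic behind "many millions" is straightforward if the article's round numbers are taken at face value (the 10% compute fraction here is simply Roffe's 90% estimate turned around):

```python
# Hypothetical tally using the article's round numbers: 1000 logical qubits,
# ~1000 physical qubits per logical qubit, and ~90% of the machine devoted
# to preparing magic states rather than computing.
logical_qubits = 1_000
physical_per_logical = 1_000
compute_fraction = 0.10                       # the other 90% makes magic states

compute_qubits = logical_qubits * physical_per_logical     # 1,000,000
total_qubits = round(compute_qubits / compute_fraction)    # ~10,000,000
print(f"roughly {total_qubits:,} physical qubits in all")
```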

Google has a plan to build just such a machine within 10 years. At first blush, that sounds preposterous. Superconducting qubits need to be cooled to near absolute zero, in a device called a cryostat that fills a small room. A million-qubit machine conjures visions of a thousand cryostats in a huge factory. But Google researchers think they can keep their device compact. "I don't want to tip my hand, but we believe we figured this out," Neven says.

Others are taking different tacks. Google's scheme would require 1000 physical qubits to encode a single logical qubit because its chip allows only neighboring qubits to interact. If more distant qubits could be made to interact, too, the number of physical qubits could be much smaller, Gambetta says. "If I can achieve that, then these ridiculously scary numbers for the overhead of error correction can come crashing down," he says. So IBM researchers are exploring a scheme with more distant interconnections among the qubits.

Nobody is willing to predict how long it will take researchers to master error correction. But it is time to turn to the problem in earnest, Rigetti says. "Thus far, substantially all the researchers who would identify themselves as error correction researchers are theorists," he says. "We have to make this an empirical field with real feedback on real data generated with real machines." Quantum supremacy is so 2019. In quantum computing, error correction is the next hot thing.

See the article here:
The biggest flipping challenge in quantum computing - Science Magazine

London’s PQShield raises 5.5 million seed to develop security solutions that match the power of quantum computing – Tech.eu

PQShield, a London-based cybersecurity startup that specialises in post-quantum cryptography, has come out of stealth mode with a 5.5 million seed investment from Kindred Capital, Crane Venture Partners, Oxford Sciences Innovation and angel investors including Andre Crawford-Brunt, Deutsche Bank's former global head of equities.

According to the startup, quantum computers pose an unprecedented problem for security, since they will be able to smash through traditional public-key encryption and threaten the security of all sensitive information, past and present. For that reason, the company is developing quantum-secure cryptography: advanced solutions for hardware, software and communications that resist the quantum threat yet still work with today's technology.

"Whether cars, planes or other connected devices, many of the products designed and sold today are going to be used for decades. Their hardware may be built to last, but right now, their security certainly isn't. Future-proofing is an imperative, just as it is for the banks and agencies that hold so much of our sensitive data," explains founder and CEO Dr. El Kaafarani.

The team, a spin-out from Oxford University, is already working on commercialisation and roll-out. Its System on Chip (SoC) solution, built fully in-house, will be licensed to hardware manufacturers, while a software development kit will enable the creation of secure messaging apps protected by post-quantum algorithms. Bosch is already a customer.

Read the original post:
London's PQShield raises 5.5 million seed to develop security solutions that match the power of quantum computing - Tech.eu