Honeywell Achieves Breakthrough That Will Enable The World's Most Powerful Quantum Computer #47655 – New Kerala

The company also announced it has made strategic investments in two leading quantum computing software providers and will work together to develop quantum computing algorithms with JPMorgan Chase. Together, these announcements demonstrate significant technological and commercial progress for quantum computing and change the dynamics in the quantum computing industry.

Within the next three months, Honeywell will bring to market the world's most powerful quantum computer in terms of quantum volume, a measure of quantum capability that goes beyond the number of qubits. Quantum volume measures computational ability, indicating the relative complexity of a problem that can be solved by a quantum computer. When released, Honeywell's quantum computer will have a quantum volume of at least 64, twice that of the next alternative in the industry.
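The relationship between quantum volume and circuit size can be sketched in a few lines. As a rough guide, the benchmark defines QV as 2^n for the largest n at which a machine reliably runs random n-qubit, depth-n circuits (passing a "heavy output" test more than two-thirds of the time); the helper names below are illustrative, not part of any vendor's API.

```python
# Toy helpers relating quantum volume to square-circuit size.
# QV = 2**n, where n is the largest width/depth of random square
# circuits the device passes; constant details of the heavy-output
# test are omitted here.
import math

def quantum_volume(largest_passing_width: int) -> int:
    """QV = 2**n for the largest passing n-qubit, depth-n circuit."""
    return 2 ** largest_passing_width

def qubits_implied(qv: int) -> int:
    """Minimum circuit width/depth implied by a reported QV."""
    return int(math.log2(qv))

print(quantum_volume(6))   # a QV-64 machine passes 6-qubit, depth-6 circuits
print(qubits_implied(64))  # -> 6
```

Under this definition, the promised order-of-magnitude annual increase in quantum volume corresponds to adding roughly three to four qubits' worth of usable circuit size per year, since each additional passing qubit doubles QV.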

In a scientific paper that will be posted to the online repository arXiv later today and is available now on Honeywell's website, Honeywell has demonstrated its quantum charge coupled device (QCCD) architecture, a major technical breakthrough in accelerating quantum capability. The company also announced it is on a trajectory to increase its computer's quantum volume by an order of magnitude each year for the next five years.

This breakthrough in quantum volume results from Honeywell's solution having the highest-quality, fully-connected qubits with the lowest error rates.

"Building quantum computers capable of solving deeper, more complex problems is not just a simple matter of increasing the number of qubits," said Paul Smith-Goodson, analyst-in-residence for quantum computing at Moor Insights & Strategy. "Quantum volume is a powerful tool that should be adopted as an interim benchmarking tool by other gate-based quantum computer companies."

Honeywell Chairman and Chief Executive Officer Darius Adamczyk said companies should start now to determine their strategy to leverage or mitigate the many business changes that are likely to result from new quantum computing technology.

"Quantum computing will enable us to tackle complex scientific and business challenges, driving step-change improvements in computational power, operating costs and speed," Adamczyk said. "Materials companies will explore new molecular structures. Transportation companies will optimize logistics. Financial institutions will need faster and more precise software applications. Pharmaceutical companies will accelerate the discovery of new drugs. Honeywell is striving to influence how quantum computing evolves and to create opportunities for our customers to benefit from this powerful new technology."

To accelerate the development of quantum computing and explore practical applications for its customers, Honeywell Ventures, the strategic venture capital arm of Honeywell, has made investments in two leading quantum software and algorithm providers: Cambridge Quantum Computing (CQC) and Zapata Computing. Both Zapata and CQC complement Honeywell's own quantum computing capabilities by bringing a wealth of cross-vertical market algorithm and software expertise. CQC has strong expertise in quantum software, specifically a quantum development platform and enterprise applications in the areas of chemistry, machine learning and augmented cybersecurity. Zapata creates enterprise-grade, quantum-enabled software for a variety of industries and use cases, allowing users to build quantum workflows and execute them freely across a range of quantum and classical devices.

Honeywell also announced that it will collaborate with JPMorgan Chase, a global financial services firm, to develop quantum algorithms using Honeywell's computer.

"Honeywell's unique quantum computer, along with the ecosystem Honeywell has developed around it, will enable us to get closer to tackling major and growing business challenges in the financial services industry," said Dr. Marco Pistoia, managing director and research lead for the Future Lab for Applied Research & Engineering (FLARE), JPMorgan Chase.

Honeywell first announced its quantum computing capabilities in late 2018, although the company had been working on the technical foundations for its quantum computer for a decade prior to that. In late 2019, Honeywell announced a partnership with Microsoft to provide cloud access to Honeywell's quantum computer through Microsoft Azure Quantum services.

Honeywell's quantum computer uses trapped-ion technology, which leverages numerous, individual, charged atoms (ions) to hold quantum information. Honeywell's system applies electromagnetic fields to hold (trap) each ion so it can be manipulated and encoded using laser pulses.

Honeywell's trapped-ion qubits can be uniformly generated, with errors that are better understood than those of alternative qubit technologies that do not directly use atoms. These high-performance operations require deep experience across multiple disciplines, including atomic physics, optics, cryogenics, lasers, magnetics, ultra-high vacuum, and precision control systems. Honeywell has a decades-long legacy of expertise in these technologies.

Today, Honeywell has a cross-disciplinary team of more than 100 scientists, engineers, and software developers dedicated to advancing quantum volume and addressing real enterprise problems across industries.

Honeywell (www.honeywell.com) is a Fortune 100 technology company that delivers industry-specific solutions that include aerospace products and services; control technologies for buildings and industry; and performance materials globally. Our technologies help aircraft, buildings, manufacturing plants, supply chains, and workers become more connected to make our world smarter, safer, and more sustainable. For more news and information on Honeywell, please visit http://www.honeywell.com/newsroom.


What Is Quantum Computing? The Complete WIRED Guide

First, accepted explanations of the subatomic world turned out to be incomplete. Electrons and other particles didn't just neatly carom around like Newtonian billiard balls, for example. Sometimes they acted like waves instead. Quantum mechanics emerged to explain such quirks, but introduced troubling questions of its own. To take just one brow-wrinkling example, this new math implied that physical properties of the subatomic world, like the position of an electron, didn't really exist until they were observed.

Quantum Leaps

1980

Physicist Paul Benioff suggests quantum mechanics could be used for computation.

1981

Nobel-winning physicist Richard Feynman, at Caltech, coins the term "quantum computer."

1985

Physicist David Deutsch, at Oxford, maps out how a quantum computer would operate, a blueprint that underpins the nascent industry of today.

1994

Mathematician Peter Shor, at Bell Labs, writes an algorithm that could tap a quantum computers power to break widely used forms of encryption.

2007

D-Wave, a Canadian startup, announces a quantum computing chip it says can solve Sudoku puzzles, triggering years of debate over whether the companys technology really works.

2013

Google teams up with NASA to fund a lab to try out D-Waves hardware.

2014

Google hires the professor behind some of the best quantum computer hardware yet to lead its new quantum hardware lab.

2016

IBM puts some of its prototype quantum processors on the internet for anyone to experiment with, saying programmers need to get ready to write quantum code.

2017

Startup Rigetti opens its own quantum computer fabrication facility to build prototype hardware and compete with Google and IBM.

If you find that baffling, you're in good company. A year before winning a Nobel for his contributions to quantum theory, Caltech's Richard Feynman remarked that nobody understands quantum mechanics. The way we experience the world just isn't compatible. But some people grasped it well enough to redefine our understanding of the universe. And in the 1980s a few of them, including Feynman, began to wonder if quantum phenomena like subatomic particles' "don't look and I don't exist" trick could be used to process information. The basic theory or blueprint for quantum computers that took shape in the '80s and '90s still guides Google and others working on the technology.

Before we belly flop into the murky shallows of quantum computing 101, we should refresh our understanding of regular old computers. As you know, smartwatches, iPhones, and the world's fastest supercomputer all basically do the same thing: they perform calculations by encoding information as digital bits, aka 0s and 1s. A computer might flip the voltage in a circuit on and off to represent 1s and 0s, for example.

Quantum computers do calculations using bits, too. After all, we want them to plug into our existing data and computers. But quantum bits, or qubits, have unique and powerful properties that allow a group of them to do much more than an equivalent number of conventional bits.

Qubits can be built in various ways, but they all represent digital 0s and 1s using the quantum properties of something that can be controlled electronically. Popular examples (at least among a very select slice of humanity) include superconducting circuits, or individual atoms levitated inside electromagnetic fields. The magic power of quantum computing is that this arrangement lets qubits do more than just flip between 0 and 1. Treat them right and they can flip into a mysterious extra mode called a superposition.

You may have heard that a qubit in superposition is both 0 and 1 at the same time. That's not quite true and also not quite false; there's just no equivalent in Homo sapiens' humdrum classical reality. If you have a yearning to truly grok it, you must make a mathematical odyssey WIRED cannot equip you for. But in the simplified (and dare we say perfect) world of this explainer, the important thing to know is that the math of a superposition describes the probability of discovering either a 0 or 1 when a qubit is read out, an operation that crashes it out of a quantum superposition into classical reality. A quantum computer can use a collection of qubits in superpositions to play with different possible paths through a calculation. If done correctly, the pointers to incorrect paths cancel out, leaving the correct answer when the qubits are read out as 0s and 1s.
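The readout behavior described above can be sketched as a toy simulation. A single qubit's state is a pair of complex amplitudes (a, b) with |a|² + |b|² = 1, and measuring it yields 0 with probability |a|². The function names here are made up for illustration; this is not any vendor's API.

```python
# Minimal single-qubit readout sketch: sample a classical bit from
# the superposition a|0> + b|1> using the Born rule.
import math
import random

def measure(a: complex, b: complex) -> int:
    """Return 0 with probability |a|^2, else 1 (state must be normalized)."""
    p0 = abs(a) ** 2
    assert math.isclose(p0 + abs(b) ** 2, 1.0, abs_tol=1e-9), "not normalized"
    return 0 if random.random() < p0 else 1

# Equal superposition: a = b = 1/sqrt(2); roughly half the readouts are 0.
amp = 1 / math.sqrt(2)
samples = [measure(amp, amp) for _ in range(10_000)]
print(sum(samples) / len(samples))  # close to 0.5
```

Each call to `measure` models the "crash" into classical reality: one run gives one bit, and only the statistics over many runs reveal the underlying amplitudes.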

Jargon for the Quantum Qurious

What's a qubit?

A device that uses quantum mechanical effects to represent 0s and 1s of digital data, similar to the bits in a conventional computer.

What's a superposition?

It's the trick that makes quantum computers tick, and makes qubits more powerful than ordinary bits. A superposition is an intuition-defying mathematical combination of both 0 and 1. Quantum algorithms can use a group of qubits in a superposition to shortcut through calculations.

What's quantum entanglement?

A quantum effect so unintuitive that Einstein dubbed it "spooky action at a distance." When two qubits in a superposition are entangled, certain operations on one have instant effects on the other, a process that helps quantum algorithms be more powerful than conventional ones.

What's quantum speedup?

The holy grail of quantum computing: a measure of how much faster a quantum computer could crack a problem than a conventional computer could. Quantum computers aren't well-suited to all kinds of problems, but for some they offer an exponential speedup, meaning their advantage over a conventional computer grows explosively with the size of the input problem.

For some problems that are very time consuming for conventional computers, this allows a quantum computer to find a solution in far fewer steps than a conventional computer would need. Grover's algorithm, a famous quantum search algorithm, could find you in a phone book with 100 million names with just 10,000 operations. If a classical search algorithm just spooled through all the listings to find you, it would require 50 million operations, on average. For Grover's and some other quantum algorithms, the bigger the initial problem (or phone book), the further behind a conventional computer is left in the digital dust.
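The phone-book numbers above follow directly from the scaling laws: a classical scan of N unsorted listings needs about N/2 lookups on average, while Grover's algorithm needs on the order of √N quantum operations (constant factors like the usual π/4 are omitted here). A quick sketch of just that arithmetic:

```python
# Classical vs. Grover search scaling for an unsorted list of n items.
import math

def classical_average_lookups(n: int) -> int:
    """Linear scan finds the target after n/2 lookups on average."""
    return n // 2

def grover_operations(n: int) -> int:
    """Grover's algorithm needs on the order of sqrt(n) iterations
    (constant factors omitted)."""
    return math.isqrt(n)

n = 100_000_000  # the 100-million-name phone book
print(classical_average_lookups(n))  # 50,000,000
print(grover_operations(n))          # 10,000
```

Doubling the phone book doubles the classical cost but multiplies the quantum cost by only √2, which is why the gap widens with problem size.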

The reason we dont have useful quantum computers today is that qubits are extremely finicky. The quantum effects they must control are very delicate, and stray heat or noise can flip 0s and 1s, or wipe out a crucial superposition. Qubits have to be carefully shielded, and operated at very cold temperatures, sometimes only fractions of a degree above absolute zero. Most plans for quantum computing depend on using a sizable chunk of a quantum processors power to correct its own errors, caused by misfiring qubits.

Recent excitement about quantum computing stems from progress in making qubits less flaky. That's giving researchers the confidence to start bundling the devices into larger groups. Startup Rigetti Computing recently announced it has built a processor with 128 qubits made with aluminum circuits that are super-cooled to make them superconducting. Google and IBM have announced their own chips with 72 and 50 qubits, respectively. That's still far fewer than would be needed to do useful work with a quantum computer (it would probably require at least thousands), but as recently as 2016 those companies' best chips had qubits only in the single digits. After tantalizing computer scientists for 30 years, practical quantum computing may not exactly be close, but it has begun to feel a lot closer.

What the Future Holds for Quantum Computing

Some large companies and governments have started treating quantum computing research like a race, though perhaps fittingly it's one where both the distance to the finish line and the prize for getting there are unknown.

Google, IBM, Intel, and Microsoft have all expanded their teams working on the technology, with a growing swarm of startups such as Rigetti in hot pursuit. China and the European Union have each launched new programs measured in the billions of dollars to stimulate quantum R&D. And in the US, the Trump White House has created a new committee to coordinate government work on quantum information science. Several bills were introduced to Congress in 2018 proposing new funding for quantum research, totaling upwards of $1.3 billion. It's not quite clear what the first killer apps of quantum computing will be, or when they will appear. But there's a sense that whoever is first to make these machines useful will gain big economic and national security advantages.


QUANTUM COMPUTING : Management’s Discussion and Analysis of Financial Condition and Results of Operations, (form 10-Q) – marketscreener.com

This quarterly report on Form 10-Q and other reports filed by Quantum Computing, Inc. (the "Company," "we," "our," and "us") from time to time with the U.S. Securities and Exchange Commission (the "SEC") contain or may contain forward-looking statements and information that are based upon beliefs of, and information currently available to, the Company's management, as well as estimates and assumptions made by the Company's management. Readers are cautioned not to place undue reliance on these forward-looking statements, which are only predictions and speak only as of the date hereof. When used in the filings, the words "anticipate," "believe," "estimate," "expect," "future," "intend," "plan," or the negative of these terms and similar expressions as they relate to the Company or the Company's management identify forward-looking statements. Such statements reflect the current view of the Company with respect to future events and are subject to risks, uncertainties, assumptions, and other factors, including the risks contained in the "Risk Factors" section of the Company's Annual Report on Form 10-K for the fiscal year ended December 31, 2019, relating to the Company's industry, the Company's operations and results of operations, and any businesses that the Company may acquire. Should one or more of these risks or uncertainties materialize, or should the underlying assumptions prove incorrect, actual results may differ significantly from those anticipated, believed, estimated, expected, intended, or planned. Although the Company believes that the expectations reflected in the forward-looking statements are reasonable, the Company cannot guarantee future results, levels of activity, performance, or achievements.

Except as required by applicable law, including the securities laws of the United States, the Company does not intend to update any of the forward-looking statements to conform these statements to actual results.

Our financial statements are prepared in accordance with accounting principles generally accepted in the United States ("GAAP"). These accounting principles require us to make certain estimates, judgments and assumptions. We believe that the estimates, judgments and assumptions upon which we rely are reasonable based upon information available to us at the time that these estimates, judgments and assumptions are made. These estimates, judgments and assumptions can affect the reported amounts of assets and liabilities as of the date of the financial statements as well as the reported amounts of revenues and expenses during the periods presented. Our financial statements would be affected to the extent there are material differences between these estimates and actual results. In many cases, the accounting treatment of a particular transaction is specifically dictated by GAAP and does not require management's judgment in its application. There are also areas in which management's judgment in selecting any available alternative would not produce a materially different result. The following discussion should be read in conjunction with our financial statements and notes thereto appearing elsewhere in this report.

Overview

At the present time, we are a development-stage company with limited operations. The Company is currently developing "quantum ready" software applications and solutions for companies that want to leverage the promise of quantum computing. We believe the quantum computer holds the potential to disrupt several global industries. Independent of when quantum computing delivers compelling performance advantage over classic computing, the software tools and applications to accelerate real-world problems must be developed to deliver quantum computing's full promise.

We specialize in quantum computer-ready software applications, analytics, and tools, with a mission to deliver differentiated performance using non-quantum processors in the near term. We are leveraging our collective expertise in finance, computing, mathematics and physics to develop a suite of quantum software applications that may enable global industries to utilize quantum computers, quantum annealers and digital simulators to improve their processes, profitability, and security. We primarily focus on the quadratic unconstrained binary optimization (QUBO) formulation, which is equivalent to the Ising model implemented by hardware annealers, both non-quantum from Fujitsu and others and quantum from D-Wave Systems, and also mappable to gate-model quantum processors. We have built a software stack that maps and optimizes problems in the QUBO form and then solves them powerfully on cloud-based processors. Our software is designed to be capable of running on both classic computers and on annealers such as D-Wave's quantum processor. We are also building applications and analytics that deliver the power of our software stack to high-value discrete optimization problems posed by financial, bio/pharma, and cybersecurity analysts. The advantages our software delivers can be faster time-to-solution to the same results, more-optimal solutions, or multiple solutions.
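A QUBO instance asks for the binary vector x that minimizes xᵀQx over x in {0,1}ⁿ. A tiny brute-force solver (illustrative only; real annealers and the Company's software stack handle far larger instances by very different means) shows the shape of the formulation:

```python
# Brute-force QUBO solver: exhaustively minimize
# sum_{i,j} Q[i][j] * x[i] * x[j] over binary vectors x.
from itertools import product

def solve_qubo(Q):
    """Return (best_x, best_value) for a small QUBO matrix Q."""
    n = len(Q)
    best_x, best_val = None, float("inf")
    for x in product((0, 1), repeat=n):
        val = sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))
        if val < best_val:
            best_x, best_val = x, val
    return best_x, best_val

# Example: the diagonal rewards selecting each variable; the off-diagonal
# term penalizes selecting x0 and x1 together.
Q = [[-1, 2, 0],
     [0, -1, 0],
     [0, 0, -1]]
print(solve_qubo(Q))  # -> ((0, 1, 1), -2)
```

Exhaustive search is exponential in n, which is exactly why these problems get mapped to annealers and, eventually, gate-model quantum processors.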

Products and Products in Development

The Company is currently working on software products to address community detection (analysis for pharmaceutical applications and epidemiology), optimization of job shop scheduling, logistics, and dynamic route optimization for transportation systems. The Company is continuing to seek out difficult problems for which our technology may provide improvement over existing solutions.

We are continuing to develop software to address two classes of financial optimization problems: asset allocation and yield curve trades. For asset allocation, our target clients are the asset allocation departments of large funds, who we envision using our application to improve their allocation of capital into various asset classes.

Three Months Ended June 30, 2020 vs. June 30, 2019

Gross margin for the three months ended June 30, 2020 was $0, as compared with $0 for the comparable prior-year period. There was no gross margin because the Company has not yet commenced marketing and selling products or services.

Six Months Ended June 30, 2020 vs. June 30, 2019

Gross margin for the six months ended June 30, 2020 was $0, as compared with $0 for the comparable prior-year period. There was no gross margin because the Company has not yet commenced marketing and selling products or services.

Liquidity and Capital Resources

The following table summarizes total current assets, liabilities and working capital at June 30, 2020, compared to December 31, 2019:

Off Balance Sheet Arrangements

Critical Accounting Policies and Estimates

We have identified the accounting policies below as critical to our business operations and the understanding of our results of operations.

The Company's policy is to present bank balances under cash and cash equivalents, which, at times, may exceed federally insured limits. The Company has not experienced any losses in such accounts.

Net loss per share is based on the weighted average number of common shares and common share equivalents outstanding during the period.
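The weighted-average calculation works by weighting each block of shares by the fraction of the period it was outstanding, then dividing net loss by that average. A generic sketch with made-up numbers, not the Company's actual figures:

```python
# Weighted-average share count and basic loss per share.
# All figures below are illustrative, not taken from any filing.

def weighted_average_shares(blocks, period_days):
    """blocks: list of (share_count, days_outstanding) tuples."""
    return sum(n * d for n, d in blocks) / period_days

def loss_per_share(net_loss, blocks, period_days=365):
    """Basic per-share figure: net loss over weighted-average shares."""
    return net_loss / weighted_average_shares(blocks, period_days)

# 1,000,000 shares outstanding all year, plus 200,000 issued with
# 100 days left in the period.
blocks = [(1_000_000, 365), (200_000, 100)]
print(round(weighted_average_shares(blocks, 365)))  # 1054795
```

The same mechanics apply to common share equivalents, which are simply added as further weighted blocks when they are dilutive.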

Edgar Online, source Glimpses


NIST Works on the Industries of the Future in Buildings from the Past – Nextgov

The president's budget request for fiscal 2021 proposed $738 million to fund the National Institute of Standards and Technology, a dramatic reduction from the more than $1 billion in enacted funds allocated for the agency this fiscal year.

The House Science, Space and Technology Committee's Research and Technology Subcommittee on Wednesday held a hearing to home in on NIST's reauthorization, but instead of focusing on relevant budget considerations, lawmakers had other plans.

"We're disappointed by the president's destructive budget request, which proposes over a 30% cut to NIST programs," Subcommittee Chairwoman Rep. Haley Stevens, D-Mich., said at the top of the hearing. "But today, I don't want to dwell on a proposal that we know Congress is going to reject ... today I would like this committee to focus on improving NIST and getting the agency the tools it needs to do better, to do its job."

Per Stevens' suggestion, Under Secretary of Commerce for Standards and Technology and NIST Director Walter Copan reflected on some of the agency's dire needs and offered updates and his view on a range of its ongoing programs and efforts.

NIST's Facilities Are in Bad Shape

President Trump's budget proposal for fiscal 2021 requests only $60 million in funds for facility construction, which is down from the $118 million enacted for fiscal 2020 and comes at a time when the agency's workspaces need upgrades.

"Indeed the condition of NIST facilities are challenging," Copan explained. "Over 55% of NIST's facilities are considered in poor to critical condition per [Commerce Department] standards, and so it does provide some significant challenges for us."

Some of the agency's decades-old facilities and infrastructure are deteriorating, and Copan added that he'd recently heard NIST's deferred maintenance backlog has hit more than $775 million. If lawmakers or the public venture out to visit some of the agency's facilities, "you'll see the good, the bad, and the embarrassingly bad," he said. Those conditions are a testament to the resilience and the commitment of NIST's people, that they can work in sometimes challenging, outdated environments, Copan said.

The director noted that there have already been some creative solutions proposed to address the issue, including the development of a federal capital revolving fund. The agency is also looking creatively at the combination of maintenance with lease options for some of its facilities, in hopes that it can then move more rapidly by having its officials cycle out of laboratories to launch rebuilding and renovation processes.

"It's one of my top priorities as the NIST director to have our NIST people work in 21st-century facilities that we can be proud of and that enable the important work of NIST for the nation," Copan said.

Advancing Efforts in Artificial Intelligence and Quantum Computing

The president's budget request placed a sharp focus on industries of the future, which will be powered by many emerging technologies, particularly quantum computing and AI.

During the hearing and in his written testimony, Copan highlighted some of NIST's work in both areas. The agency has helped shape an entire generation of quantum science over the last century, and a significant portion of quantum scientists from around the globe have trained at the agency's facilities. Some of NIST's more recent quantum achievements include supporting the development of a quantum logic clock and helping steer advancements in quantum simulation. Following a recent mandate from the Trump administration, the agency is also in the midst of instituting the Quantum Economic Development Consortium, or QEDC, which aims to advance industry collaboration to expand the nation's leadership in quantum research and development.

"Looking forward, over the coming years NIST will focus a portion of its quantum research portfolio on the grand challenge of quantum networking," Copan's written testimony said. "Serving as the basis for secure and highly efficient quantum information transmission that links together multiple quantum devices and sensors, quantum networks will be a key element in the long-term evolution of quantum technologies."

Though there were cuts across many areas, the president's budget request also proposed a doubling of NIST's funding in artificial intelligence, and Copan said the technology is already broadly applied across all of the agency's laboratories to help improve productivity.

Going forward and with increased funding, he laid out some of the agency's top priorities, noting that "there's much work to be done in developing tools to provide insights into artificial intelligence programs, and there is also important work to be done in standardization, so that the United States can lead the world in the application of [AI] in a trustworthy and ethical manner."

Standardization to Help the U.S. Lead in 5G

Rep. Frank Lucas, R-Okla., asked Copan to weigh in on the moves China is making across the fifth-generation wireless technology landscape, and the moves the U.S. needs to make to lead, not just compete, in that specific area.

"We have entered in the United States, as we know, a hyper-competitive environment with China as a lead in activities related to standardization," Copan responded.

The director said that officials see, in some ways, that the standardization process has been weaponized, and that the free-market economy represented by the United States now needs to lead in more effective coordination internally and incentivize industry to participate in the standards process. Though U.S. officials have already seen those rules of fair play bent or indeed broken by other players, NIST and others need to help improve information sharing across American standards-focused stakeholders, which could, in turn, accelerate adoption of the emerging technology.

"We want the best technologies in the world to win and we want the United States to continue to be the leader in not only delivering those technologies, but securing the intellectual properties behind them and translating those into market value," he said.


What Is Quantum Computing? The Next Era of Computational …

When you first stumble across the term quantum computer, you might pass it off as some far-flung science fiction concept rather than a serious current news item.

But with the phrase being thrown around with increasing frequency, it's understandable to wonder exactly what quantum computers are, and just as understandable to be at a loss as to where to dive in. Here's the rundown on what quantum computers are, why there's so much buzz around them, and what they might mean for you.

All computing relies on bits, the smallest unit of information that is encoded as an on state or an off state, more commonly referred to as a 1 or a 0, in some physical medium or another.

Most of the time, a bit takes the physical form of an electrical signal traveling over the circuits in the computer's motherboard. By stringing multiple bits together, we can represent more complex and useful things like text, music, and more.
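Stringing bits together to encode richer data, as described above, is easy to see for text: in ASCII, each character is a number stored as an eight-bit pattern.

```python
# Show the 8-bit ASCII pattern behind a character.
def char_to_bits(c: str) -> str:
    """Return the character's code point as an 8-bit binary string."""
    return format(ord(c), "08b")

print(char_to_bits("A"))  # 01000001  (65 in decimal)
print(char_to_bits("a"))  # 01100001  (97 in decimal)
```

A word, a song, or a photo is just many such patterns laid end to end, which is why everything a classical computer does reduces to manipulating 0s and 1s.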

The two key differences between quantum bits and classical bits (from the computers we use today) are the physical form the bits take and, correspondingly, the nature of data encoded in them. The electrical bits of a classical computer can only exist in one state at a time, either 1 or 0.

Quantum bits (or qubits) are made of subatomic particles, namely individual photons or electrons. Because these subatomic particles conform more to the rules of quantum mechanics than classical mechanics, they exhibit the bizarre properties of quantum particles. The most salient of these properties for computer scientists is superposition. This is the idea that a particle can exist in multiple states simultaneously, at least until that state is measured and collapses into a single state. By harnessing this superposition property, computer scientists can make qubits encode a 1 and a 0 at the same time.

The other quantum mechanical quirk that makes quantum computers tick is entanglement, a linking of two quantum particles or, in this case, two qubits. When the two particles are entangled, the change in state of one particle will alter the state of its partner in a predictable way, which comes in handy when it comes time to get a quantum computer to calculate the answer to the problem you feed it.
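The perfect correlation entanglement produces can be sketched with the simplest entangled state, a Bell pair, where (under the standard amplitudes of 1/√2 on |00⟩ and |11⟩ and zero elsewhere) the two readouts always agree. This is a minimal simulation of just that measurement statistics assumption, not a general quantum simulator:

```python
# Sampling both qubits of the Bell state (|00> + |11>)/sqrt(2):
# outcomes 00 and 11 each occur with probability 1/2; 01 and 10 never do.
import random

def measure_bell_pair() -> tuple[int, int]:
    """Return the pair of classical bits read out from an entangled Bell pair."""
    return (0, 0) if random.random() < 0.5 else (1, 1)

outcomes = [measure_bell_pair() for _ in range(1_000)]
print(all(a == b for a, b in outcomes))  # True: readouts are perfectly correlated
```

Each qubit alone looks like a fair coin flip; the entanglement only shows up in the joint statistics, which is the "predictable effect on the partner" the paragraph describes.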

A quantum computer's qubits start in their 1-and-0 hybrid state as the computer initially starts crunching through a problem. When the solution is found, the qubits in superposition collapse to the correct orientation of stable 1s and 0s for returning the solution.

Aside from the fact that they are far beyond the reach of all but the most elite research teams (and will likely stay that way for a while), most of us don't have much use for quantum computers. They don't offer any real advantage over classical computers for the kinds of tasks we do most of the time.

However, even the most formidable classical supercomputers have a hard time cracking certain problems due to their inherent computational complexity. This is because some calculations can only be achieved by brute force, guessing until the answer is found. They end up with so many possible solutions that it would take thousands of years for all the world's supercomputers combined to find the correct one.

The superposition property exhibited by qubits can allow quantum computers to cut this guessing time down precipitously. Classical computing's laborious trial-and-error computations can only ever make one guess at a time, while the dual 1-and-0 state of a quantum computer's qubits lets it make multiple guesses at the same time.

So, what kind of problems require all this time-consuming guesswork calculation? One example is simulating atomic structures, especially when they interact chemically with those of other atoms. With a quantum computer powering the atomic modeling, researchers in material science could create new compounds for use in engineering and manufacturing. Quantum computers are well suited to simulating similarly intricate systems like economic market forces, astrophysical dynamics, or genetic mutation patterns in organisms, to name only a few.

Amidst all these generally inoffensive applications of this emerging technology, though, there are also some uses of quantum computers that raise serious concerns. By far the most frequently cited harm is the potential for quantum computers to break some of the strongest encryption algorithms currently in use.

In the hands of an aggressive foreign adversary, quantum computers could compromise a broad swath of otherwise secure internet traffic, leaving sensitive communications susceptible to widespread surveillance. Work is underway to mature encryption schemes based on calculations that remain hard even for quantum computers, but these are not yet ready for prime time or widely adopted.

A little over a decade ago, actual fabrication of quantum computers was barely underway. Starting in the 2010s, though, development of functioning prototype quantum computers took off. A number of companies had assembled working quantum computers as of a few years ago, with IBM going so far as to let researchers and hobbyists run their own programs on its hardware via the cloud.

Despite the strides that companies like IBM have undoubtedly made to build functioning prototypes, quantum computers are still in their infancy. The quantum computers that research teams have constructed so far require a lot of overhead for error correction. For every qubit that actually performs a calculation, several dozen others exist to compensate for its mistakes. The aggregate of all these qubits makes up what is called a logical qubit.
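The idea of many physical qubits backing one logical qubit has a simple classical analogue: the repetition code, where one logical bit is stored as several physical copies and a majority vote corrects occasional flips. Real quantum codes (such as the surface code) are far more involved, but the sketch below captures the overhead-for-reliability trade-off.

```python
# Classical analogy for error-correction overhead: one logical bit is
# encoded as several physical copies; a majority vote recovers it even
# if a minority of the copies get flipped by noise.
def encode(bit, copies=3):
    return [bit] * copies

def decode(physical):
    return 1 if sum(physical) > len(physical) / 2 else 0

codeword = encode(1)
codeword[0] ^= 1          # a single bit-flip error corrupts one copy
print(decode(codeword))   # majority vote still recovers 1
```

Tripling the storage buys tolerance of one flip; quantum error correction pays a much steeper ratio of physical to logical qubits, which is exactly the inefficiency described above.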

Long story short, industry and academic titans have gotten quantum computers to work, but they do so very inefficiently.

Fierce competition between quantum computer researchers is still raging, between big and small players alike. Among those who have working quantum computers are the traditionally dominant tech companies one would expect: IBM, Intel, Microsoft, and Google.

As exacting and costly a venture as creating a quantum computer is, a surprising number of smaller companies and even startups are rising to the challenge.

The comparatively lean D-Wave Systems has spurred many advances in the field and proved it was not out of contention by answering Google's momentous announcement with news of a huge deal with Los Alamos National Laboratory. Smaller competitors like Rigetti Computing are also in the running to establish themselves as quantum computing innovators.

Depending on whom you ask, you'll get a different frontrunner for the most powerful quantum computer. Google certainly made its case recently with its achievement of quantum supremacy, a metric that Google itself more or less devised. Quantum supremacy is the point at which a quantum computer is first able to outperform a classical computer at some computation. Google's 54-qubit Sycamore prototype broke that barrier by zipping through a problem in just under three-and-a-half minutes that would take the mightiest classical supercomputer 10,000 years to churn through.

Not to be outdone, D-Wave boasts that the devices it will soon be supplying to Los Alamos weigh in at 5,000 qubits apiece, although it should be noted that the quality of D-Wave's qubits has been called into question before. IBM hasn't made the same kind of splash as Google and D-Wave in the last couple of years, but it shouldn't be counted out yet either, especially considering its track record of slow and steady accomplishments.

Put simply, the race for the world's most powerful quantum computer is as wide open as it ever was.

The short answer is not really, at least for the near-term future. Quantum computers require an immense amount of equipment and finely tuned environments to operate. The leading architecture requires cooling to within a fraction of a degree of absolute zero, meaning these machines are nowhere near practical for ordinary consumers to own.

But as the explosion of cloud computing has proven, you don't need to own a specialized computer to harness its capabilities. As mentioned above, IBM is already offering daring technophiles the chance to run programs on a small subset of its Q System One's qubits. In time, IBM and its competitors will likely sell compute time on more robust quantum computers to those interested in applying them to otherwise inscrutable problems.

But if you aren't researching the kinds of exceptionally tricky problems that quantum computers aim to solve, you probably won't interact with them much. In fact, quantum computers are in some cases worse at the sort of tasks we use computers for every day, purely because quantum computers are so hyper-specialized. Unless you are an academic running the kind of modeling where quantum computing thrives, you'll likely never get your hands on one, and never need to.

See more here:
What Is Quantum Computing? The Next Era of Computational ...

Top 10 breakthrough technologies of 2020 – TechRepublic

Between tiny AI and unhackable internet, this decade's tech trends will revolutionize the business world.

MIT Technology Review unveiled its top 10 breakthrough technology predictions on Wednesday. The trends, which include hype-inducing tech like quantum computing and unhackable internet, are expected to become realities in the next decade, changing the enterprise and the world.


While many of the trends have a more scientific background, most can also apply to business, said David Rotman, editor at MIT Technology Review.

"Even though some of these sound science-y or research-y, all really do have important implications and business impacts. [For example], unhackable internet," Rotman said. "It's early, but we can all see why that would be a big deal.

"Digital money will change how we do commerce; satellite mega constellations will potentially change how we do communications and the price of communications," Rotman added.

The methodology behind determining the breakthrough technologies focused on what writers, editors, and journalists have been reporting on in the past year. All of the technologies are still being developed and improved in labs, Rotman said.

The MIT Technology Review outlined the following 10 most exciting technologies being created and deployed in the next 10 years.

One of the most exciting technologies of the bunch, according to Rotman, quantum supremacy indicates that quantum computers are not only becoming a reality but are growing even more advanced.

Murmurs of quantum computer development have floated around the enterprise. The technology is able to process massive computational problems faster than any supercomputer.

While this form of computing hasn't been widely used yet, it will not only be usable by 2030, but possibly reach quantum supremacy, MIT found.

"Quantum supremacy is the point where a quantum computer can do something that a classical conventional computer cannot do or take hundreds of years for a classical computer to do," Rotman said.

The technology is now getting to the point where people can test it in their businesses and try different applications, and it will become more popular in the coming years, Rotman said.

Quantum computers are especially useful for massive scheduling or logistical problems, which can be particularly useful in large corporations with many moving parts, he added.

"Satellites have become so small and relatively cheap that people are sending up whole clusters of these satellites," Rotman said. "It's going to have an enormous impact on communication and all the things that we rely on satellites for."

These satellites could be able to cover the entire globe with high-speed internet. Applications of satellite mega-constellation use are currently being tested by companies including SpaceX, OneWeb, Amazon, and Telesat, according to the report.

Another interesting, and surprising, technology in the study concerned tiny AI. The surprising nature of this comes with how quickly AI is growing, Rotman said.

Starting in the present day, AI will become even more functional, independently running on phones and wearables. This ability would prevent devices from needing the cloud to use AI-driven features, Rotman said.

"It's not just a first step, but it would be an important step in speeding up the search for new drugs," Rotman said.

Scientists have used AI to find drug-like compounds with specific desirable characteristics. In the next three to five years, new drugs might be commercialized at a lower cost than the roughly $2.5 billion it currently takes to bring a new drug to market, the report found.

Researchers are now able to detect climate change's role in extreme weather conditions. With this discovery, scientists can help people better prepare for severe weather, according to the report.

In less than five years, researchers will find drugs that treat ailments based on the body's natural aging process, the report found. Potentially, diseases including cancer, heart disease and dementia could be treated by slowing age.

Within five years, the internet could be unhackable, the report found.

Researchers are using quantum encryption to try and make an unhackable internet, which is particularly important as data privacy concerns heighten, Rotman said.

Digital money, also known as cryptocurrency, will become more widely used in 2020. However, the rise of this money will also have major impacts on financial privacy, as the need for an intermediary becomes less necessary, according to the report.

Occupying three trends on the list, medicine is proving to potentially be a huge area for innovation. Currently, doctors and researchers are designing novel drugs to treat unique genetic mutations. These specialized drugs could cure some ailments that were previously incurable, the report found.

Differential privacy is a technique currently being used by the US government in collecting data for the 2020 census. The US Census Bureau has trouble keeping the data it collects private, but this technique helps to anonymize the data, an approach other countries may also adopt, according to the report.

For more, check out Forget quantum supremacy: This quantum-computing milestone could be just as important on ZDNet.



See the rest here:
Top 10 breakthrough technologies of 2020 - TechRepublic

New Intel chip could accelerate the advent of quantum computing – RedShark News

The marathon to achieve the promise of quantum computers has edged a few steps forward as Intel unveils a new chip capable, it believes, of accelerating the process.

Called Horse Ridge, and named after one of the coldest places in Oregon, the system-on-chip can control a total of 128 qubits (quantum bits), more than double the number Intel heralded in its Tangle Lake test chip in early 2018.

While companies like IBM and Microsoft have been leapfrogging each other with systems capable of handling ever greater numbers of qubits, the breakthrough in this case appears to be the ability to build more efficient quantum computers by allowing one chip to handle more tasks. It is therefore a step toward moving quantum computing out of the lab and into real commercial viability.

Applying quantum computing to practical problems hinges on the ability to scale, and control, thousands of qubits at the same time with high levels of fidelity. Intel suggests Horse Ridge greatly simplifies current complex electronics required to operate a quantum system.

To recap why this is important, let's take it as read that quantum computing has the potential to tackle problems conventional computers can't by leveraging a phenomenon of quantum physics: qubits can exist in multiple states simultaneously. As a result, they are able to conduct a large number of calculations at the same time.

This can dramatically speed up complex problem-solving from years to a matter of minutes. But in order for these qubits to do their jobs, hundreds of connective wires have to be strung into and out of the cryogenic refrigerator where quantum computing occurs (at temperatures colder than deep space).

The extensive control cabling for each qubit drastically hinders the ability to control the hundreds or thousands of qubits that will be required to demonstrate quantum practicality in the lab, not to mention the millions of qubits that will be required for a commercially viable quantum solution in the real world.

Researchers outlined the capability of Horse Ridge in a paper presented at the 2020 International Solid-State Circuits Conference in San Francisco and co-written by collaborators at Dutch institute QuTech.

The integrated SoC design is described as being implemented using Intel's 22nm FFL (FinFET Low Power) CMOS technology and integrates four radio-frequency channels into a single device. Each channel is able to control up to 32 qubits by leveraging frequency multiplexing, a technique that divides the total available bandwidth into a series of non-overlapping frequency bands, each of which is used to carry a separate signal.

With these four channels, Horse Ridge can potentially control up to 128 qubits with a single device, substantially reducing the number of cables and the rack instrumentation previously required.
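The multiplexing arithmetic can be sketched as follows. The band edges and frequency range here are illustrative values chosen for the example, not Intel's published figures; only the 4-channel, 32-qubit-per-channel structure comes from the article.

```python
# Sketch of frequency multiplexing as described: each RF channel's
# bandwidth is split into non-overlapping bands, one band per qubit.
def assign_bands(low_hz, high_hz, n_qubits):
    width = (high_hz - low_hz) / n_qubits
    return [(low_hz + i * width, low_hz + (i + 1) * width)
            for i in range(n_qubits)]

channels = 4
qubits_per_channel = 32
bands = assign_bands(low_hz=1e9, high_hz=2e9, n_qubits=qubits_per_channel)

print(channels * qubits_per_channel)  # 128 qubits from one device
print(len(bands))                     # 32 non-overlapping bands
```

Because each band carries a separate control signal, one cable into the cryostat replaces the 32 that would otherwise be needed per channel, which is the cabling reduction the article describes.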

The paper goes on to argue that increases in qubit count trigger other issues that challenge the capacity and operation of the quantum system. One such potential impact is a decline in qubit fidelity and performance. In developing Horse Ridge, Intel optimised the multiplexing technology that enables the system to scale and reduce errors from crosstalk among qubits.

"While developing control systems isn't, evidently, as hype-worthy as the increase in qubit count has been, it is a necessity," says Jim Clarke, director of quantum hardware at Intel Labs. "Horse Ridge could take quantum practicality to the finish line much faster than is currently possible. By systematically working to scale to the thousands of qubits required for quantum practicality, we're continuing to make steady progress toward making commercially viable quantum computing a reality in our future."

Intels own research suggests it will most likely take at least thousands of qubits working reliably together before the first practical problems can be solved via quantum computing. Other estimates suggest it will require at least one million qubits.

Intel is exploring silicon spin qubits, which have the potential to operate at temperatures as high as 1 kelvin. This research paves the way for integrating silicon spin qubit devices and the cryogenic controls of Horse Ridge to create a solution that delivers the qubits and controls in one package.

Quantum computer applications are thought to include drug development (high on the world's list of priorities just now), logistics optimisation (that is, finding the most efficient of many possible travel routes), and natural disaster prediction.

Read the rest here:
New Intel chip could accelerate the advent of quantum computing - RedShark News

Please check your data: A self-driving car dataset failed to label hundreds of pedestrians, thousands of vehicles – The Register

Roundup It's a long weekend in the US, though sadly not in Blighty. So, for those of you starting your week, here's some bite-sized machine-learning news, beyond what we've recently covered, if that's your jam.

Check your training data: A popular dataset for training self-driving vehicles, including an open-source autonomous car system, failed to correctly label hundreds of pedestrians and thousands of vehicles.

Brad Dwyer, founder of Roboflow, a startup focused on building data science tools, discovered the errors when he started digging into the dataset compiled by Udacity, an online education platform.

"I first noticed images that were missing annotations," Dwyer told The Register. "That led me to dig in deeper and check some of the other images. I found so many errors I ended up going through all 15,000 images because I didn't want to re-share a dataset that had such obvious errors."

After flicking through each image, he found that 33 per cent of them contained mistakes. Thousands of vehicles, hundreds of pedestrians, and dozens of cyclists were not labelled. Some of the bounding boxes around objects were duplicated or needlessly oversized too.
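An audit in the spirit of Dwyer's check can be sketched as a single pass over the annotations, flagging images with no labels and bounding boxes that are duplicated or degenerate. The in-memory format below is hypothetical and chosen for illustration; it is not Udacity's actual schema.

```python
# Flag images whose annotation lists are empty or whose bounding
# boxes are degenerate (zero width or height).
def audit(annotations):
    """annotations: dict mapping image name -> list of (x1, y1, x2, y2)."""
    problems = []
    for image, boxes in annotations.items():
        if not boxes:
            problems.append((image, "no labels"))
        for x1, y1, x2, y2 in boxes:
            if x2 <= x1 or y2 <= y1:
                problems.append((image, "degenerate box"))
    return problems

sample = {
    "frame_0001.jpg": [(10, 10, 50, 60)],
    "frame_0002.jpg": [],                  # missing annotations
    "frame_0003.jpg": [(30, 30, 30, 80)],  # zero-width box
}
print(audit(sample))
```

Cheap sanity checks like this, run before training, are exactly what would have caught the unlabelled pedestrians and vehicles in the Udacity set.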

Training an autonomous car on such an incomplete dataset could potentially be dangerous. The collection was pulled together to make it easier for engineers to collaborate and build a self-driving car. Thankfully, a project to develop such a system using this information seems to have died down since it launched more than three years ago.

"Udacity created this dataset years ago as a tool purely for educational purposes, back when self-driving car datasets were very hard to come by, and those learning the skills needed to develop a career in this field lacked adequate training resources," a Udacity spokesperson told El Reg.

"At the time it was helpful to the researchers and engineers who were transitioning into the autonomous vehicle community. In the intervening years, companies like Waymo, nuTonomy, and Voyage have published newer, better datasets intended for real-world scenarios. As a result, our project hasn't been active for three years."

"We make no representations that the dataset is fully labeled or complete. Any attempts to show this educational data set as an actual dataset are both misleading and unhelpful. Udacity's self-driving car currently operates for educational purposes only on a closed test track. Our car has not operated on public streets for several years, so our car poses no risk to the public."

Roboflow has since corrected the errors on the dataset, and issued an improved version.

Standing up to patent trolls works: Mycroft AI, a startup building an open-source voice-controlled assistant for Linux-based devices, was sued for allegedly infringing a couple of patents, as we reported earlier this month.

Mycroft's CEO Joshua Montgomery spoke to The Register about his strong suspicions that he was being targeted by a so-called patent troll. His biz was told by a lawyer representing the patents' owner to cough up a license fee, and when Montgomery ignored the request, a patent-infringement lawsuit was filed against his company.

The mysterious patent owner, Voice Tech Corp, turned out to be a brand-new company in Texas, USA, and its address was someone's bungalow, according to court filings. All of that fueled the growing speculation that, yes, Voice Tech Corp was probably a patent troll.

Now, after facing sufficient resistance from Mycroft, Voice Tech Corp has dropped its case. Montgomery had threatened to fight the lawsuit all the way to get Voice Tech Corp's patents invalidated so that no other startup would have to face the same problem.

More Clearview drama: The controversial facial-recognition outfit that admitted to harvesting more than three billion publicly shared photos from social media sites is back in the news again.

The American Civil Liberties Union (ACLU) revealed it is trying to get Clearview to remove the claim from its marketing that its facial recognition code was verified using a methodology used by the ACLU. The rights warriors said they had no involvement in the product and do not endorse it. In fact, the union is pretty much against everything Clearview is doing.

Clearview boasts that its technology is 99 per cent accurate following numerous tests. Buzzfeed News, however, reckons it is nowhere near that good. The upstart previously said its algorithms helped police in New York City catch a terrorist planning to plant fake bombs on the subway; the NYPD denied using Clearview's software.

Google, YouTube, Twitter, and Facebook have sent Clearview cease-and-desist letters demanding the startup stop scraping images off their platforms and delete those already in its database. In a bizarre interview, Clearview's CEO fought back, saying he believed that since all the photos were public, his stateside company had a First Amendment right to the information. Er, yeah, right.

Public funding for AI, 5G: President Donald Trump has vowed to spend more of US taxpayers' money on the research and development of emergent technologies, such as AI, quantum computing, and 5G, than traditional sciences.

The Budget prioritizes accelerating AI solutions, according to a proposal, subject to congressional approval, published this week. Along with quantum information sciences, advanced manufacturing, biotechnology, and 5G research and development (R&D), these technologies will be at the forefront of shaping future economies.

The Budget proposes large increases for key industries, including doubling AI and quantum information sciences R&D by 2022 as part of an all-of-Government approach to ensure the United States leads the world in these areas well into the future.

Trump pledged to spend $142.2bn in R&D for the next fiscal year, nine per cent less than this year. While AI and quantum computing are favored, there's less federal funding for general research and development for the other sciences.

The Department of Energy, the National Science Foundation, the National Institutes of Health, and others will see cuts. The DOE's Advanced Research Projects Agency-Energy (ARPA-E) will be particularly hard hit: not only does the proposed budget effectively eliminate the agency, it must also pay back $311m to the treasury.

You can read more about the proposed budget for the fiscal year of 2021, here.

CEO of AI startup steps down over allegations: The CEO of Clinc, a small artificial-intelligence outfit spun out of the University of Michigan, has resigned following claims he sexually harassed employees and customers.

Jason Mars, an assistant professor of computer science at the university, was accused of physically accosting clients, making lewd comments about female employees and interns, and hiring a prostitute during a work trip.

In an email to employees at Clinc, first reported by The Verge, Mars said the allegations against him were "rife with embellishments and fabrications." He did, however, admit to drinking too much and partying with staff in a way that's "not becoming of a CEO."


More:
Please check your data: A self-driving car dataset failed to label hundreds of pedestrians, thousands of vehicles - The Register

Quantum Internet Workshop Begins Mapping the Future of Quantum Communications – Quantaneo, the Quantum Computing Source

Building on the efforts of the Chicago Quantum Exchange at the University of Chicago, Argonne and Fermi National Laboratories, and LiQuIDNet (Long Island Quantum Distribution Network) at Brookhaven National Laboratory and Stony Brook University, the event was organized by Brookhaven. The technical program committee was co-chaired by Kerstin Kleese Van Dam, director of the Computational Science Initiative at Brookhaven, and Inder Monga, director of ESnet at Lawrence Berkeley National Lab.

"The dollars we have put into quantum information science have increased by about fivefold over the last three years," Dabbar told the New York Times on February 10, after the Trump Administration announced a new budget proposal that includes significant funding for quantum information science, including the quantum Internet.

In parallel with the growing interest and investment in creating viable quantum computing technologies, researchers believe that a quantum Internet could have a profound impact on a number of application areas critical to science, national security, and industry. Application areas include upscaling of quantum computing by helping connect distributed quantum computers, quantum sensing through a network of quantum telescopes, quantum metrology, and secure communications.

Toward this end, the workshop explored the specific research and engineering advances needed to build a quantum Internet in the near term, along with what is needed to move from today's limited local network experiments to a viable, secure quantum Internet.

"This meeting was a great first step in identifying what will be needed to create a quantum Internet," said Monga, noting that ESnet engineers have been helping Brookhaven and Stony Brook researchers build the fiber infrastructure to test some of the initial devices and techniques that are expected to play a key role in enabling long-distance quantum communications. "The group was very engaged and is looking to define a blueprint. They identified a clear research roadmap with many grand challenges and are cautiously optimistic on the timeframe to accomplish that vision."

Berkeley Lab's Thomas Schenkel was the Lab's point of contact for the workshop, a co-organizer, and co-chair of the quantum networking control hardware breakout session. ESnet's Michael Blodgett also attended the workshop.

Read the original:
Quantum Internet Workshop Begins Mapping the Future of Quantum Communications - Quantaneo, the Quantum Computing Source

For the tech world, New Hampshire is anyone’s race – Politico

With help from John Hendel, Cristiano Lima, Leah Nylen and Katy Murphy

Editor's Note: This edition of Morning Tech is published weekdays at 10 a.m. POLITICO Pro Technology subscribers hold exclusive early access to the newsletter each morning at 6 a.m. Learn more about POLITICO Pro's comprehensive policy intelligence coverage, policy tools and services at politicopro.com.


If Sanders wins in New Hampshire: If the polls hold true, the tech world may see a ton more heat from the Vermont senator, who has long been critical of tech giants' market power and labor practices.

Trump's 2021 funding requests: President Donald Trump's 2021 budget proposal would give big funding boosts to artificial intelligence and quantum computing, as well as the Commerce Department's NTIA and the Justice Department's antitrust division, but not to the FTC or FCC.

Bipartisanship at risk?: House Judiciary's Republican leaders say recent comments from the Democratic chairman about Silicon Valley giants threaten the panel's tech antitrust probe, a rare point of bipartisanship in a hotly divided Congress.

IT'S TUESDAY, AND ALL EYES ARE ON THE FIRST PRESIDENTIAL PRIMARY OF 2020: NEW HAMPSHIRE. WELCOME TO MORNING TECH! I'm your host, Alexandra Levine.

Got a news tip? Write Alex at alevine@politico.com or @Ali_Lev. An event for our calendar? Send details to techcalendar@politicopro.com. Anything else? Full team info below. And don't forget: add @MorningTech and @PoliticoPro on Twitter.

WHAT NEW HAMPSHIRE MEANS FOR TECH: A week after winning the most votes in Iowa, Sen. Bernie Sanders (I-Vt.) is polling first in New Hampshire, with Pete Buttigieg a close second. (Further behind, and mostly neck-and-neck, are Elizabeth Warren, Joe Biden and Amy Klobuchar.) What could this mean for the tech world? Just about anything.

But if the Vermont senator prevails in tonight's Democratic presidential primary, we can expect to hear more of his usual anti-Amazon commentary (Sanders has repeatedly criticized Amazon's labor practices and complained that the online giant pays zero in taxes); more break-up-big-tech talk (Sanders has said he would absolutely look to break up tech companies like Amazon, Google and Facebook); and more attacks on corporate power and influence (he has proposed taxing tech giants based on how big a gap exists between the salaries of their CEOs and their mid-level employees).

Several prime tech policy issues are also fair game: Sanders' criminal justice reform plan includes a ban on law enforcement's use of facial recognition technology, and he has spoken out about tech's legal liability shield, Section 230, debates that are playing out (often, with fireworks) at the federal level. (Further reading in POLITICO Magazine: Is it Bernie's Party Now?)

Plus: Could New Hampshire be the next Iowa? State and local election officials running this primary without apps (voters will cast their ballots on paper, which in some cases will be counted by hand) say no. POLITICO's Eric Geller provides the bird's-eye view.

Here's everything you need to know about the 2020 race in New Hampshire.

BUDGET DISPATCH: HUGE JUMP FOR DOJ ANTITRUST, NO BIG CHANGES FOR FCC AND FTC: The White House on Monday rolled out its fiscal year 2021 funding requests, including a proposed 71 percent bump in congressional spending on the Justice Department's antitrust division, an increase that, as Leah reports, is another indicator that the agency is serious about its pending investigations into tech giants like Google and Facebook. (It would also allow the agency to hire 87 additional staffers.)

In contrast, the FCC and FTC aren't requesting any big changes in their funding or staffing. The FCC is seeking $343 million, up 1.2 percent from its 2020 funding level, while the FTC is asking for a little over $330 million, which is about $800,000 less than its current funding. The FCC noted it's on track to move to its new Washington headquarters in June, while FTC Commissioner Rebecca Slaughter, a Democrat, objected to the request for her agency, saying in a statement that it does not accurately reflect the funding the FTC needs to protect consumers and promote competition.

Artificial intelligence and quantum computing would also receive big funding boosts under the budget proposal, Nancy reports. So would the Commerce Department's NTIA, to help prepare the agency for 5G and other technological changes, as John reported for Pros.

IS THE BIPARTISAN TECH ANTITRUST PROBE IN JEOPARDY? The House Judiciary Committee's investigation into competition in the tech sector, which garnered rare bipartisan momentum in a hotly divided Congress, could now be in trouble. On Monday night, the committee's Republican leaders criticized Democratic Chairman Jerry Nadler's recent remarks railing against the power of Silicon Valley giants, writing in a letter that Nadler's comments "have jeopardized" the panel's "ability to perform bipartisan work." Spokespeople for Nadler did not offer comment. A Cicilline spokesperson declined comment.

The dust-up marks the first major sign of fracturing between House Judiciary Republicans and Democrats over their bipartisan investigation into possible anti-competitive conduct in the tech industry, a probe widely seen as one of Silicon Valley's biggest threats on Capitol Hill, Cristiano reports in a new dispatch. The dispute could threaten the push to advance bipartisan antitrust legislation in the House, something House Judiciary antitrust Chairman David Cicilline (D-R.I.) has said the committee plans to do early this year.

T-MOBILE-SPRINT WIN T-Mobile and Sprint can merge, a federal judge is expected to rule today, rejecting a challenge by California, New York and other state attorneys general, Leah reports. U.S. District Judge Victor Marrero is expected to release his hotly anticipated decision on the $26.2 billion telecom megadeal later this morning.

FCC'S FUTURE-OF-WORK FOCUS: Amazon, AT&T, Walmart, LinkedIn and Postmates are among the tech companies expected at a future-of-work event today that Democratic FCC Commissioner Geoffrey Starks is hosting at the agency's headquarters.

The public roundtable will address the same kinds of issues that several Democratic presidential candidates have raised, such as concerns about AI's effect on labor economies. "Issues of #5G, #InternetInequality, automation & education are colliding in ways that will impact all Americans," Starks wrote on Twitter. "Eager to host this important policy discussion!"

CCPA UPDATE: GET ME REWRITE! California Attorney General Xavier Becerra on Monday published a business-friendly tweak to his proposed Privacy Act regulations, a change that his office said had been inadvertently omitted from a revised draft unveiled on Friday.

Only businesses that collect, sell or share the information of at least 10 million Californians per year (that's about 1 in 4 residents) would have to report annual statistics about CCPA requests and how quickly they responded to privacy-minded consumers, under the change. That threshold was originally 4 million.

The update will come as a relief to companies that no longer need to pull back the curtain on their Privacy Act responsiveness. It's also good news for procrastinators, as the new deadline for submitting comments on the AG's rules was pushed back a day to Feb. 25.

TECH QUOTE DU JOUR Senate Judiciary antitrust Chairman Mike Lee (R-Utah) offered colorful praise on Monday for Sen. Josh Hawley's (R-Mo.) proposal to have the Justice Department absorb the FTC, a plan aimed in part at addressing concerns over the FTC's enforcement of antitrust standards in the technology sector.

"Having two federal agencies in charge of enforcing antitrust law makes as much sense as having two popes," Lee told MT in an emailed statement. "This is an issue we've had hearings on in the Judiciary Committee and I think Sen. Hawley has identified a productive and constitutionally sound way forward." (Hawley's proposal swiftly drew pushback from one industry group, NetChoice, which said it would make "political abuse more likely.")

The state of play: Some Republicans in the GOP-led Senate now want to reduce the number of regulators overseeing competition in the digital marketplace. A small contingent of House Democrats wants to create a new federal enforcer to police online privacy. But a vast majority of the discussions happening on Capitol Hill around those issues have so far focused on ways to empower the FTC, not downgrade it.

Mike Hopkins, chairman of Sony Pictures Television, is joining Amazon as a senior vice president overseeing Amazon's Prime video platform and movie and television studios.

AB 5 blow: Uber and Postmates on Monday lost the first round in their challenge to California's new worker classification law, POLITICO reports.

Uber IPO fallout: As tax season begins, some of Uber's earliest employees are realizing they had little idea how their stock grants worked and are now grappling with the fallout on their tax bills after last May's disappointing IPO, Protocol reports.

JEDI latest: Amazon wants Trump and Defense Secretary Mark Esper to testify in its lawsuit against the Pentagon over the award of the multibillion-dollar JEDI cloud computing contract to Microsoft, POLITICO reports.

ICYMI: Federal prosecutors announced charges Monday against four Chinese intelligence officers for hacking the credit-reporting giant Equifax in one of the largest data breaches in history, POLITICO reports.

Facebook ad tracker: New Hampshire saw more than $1 million in Facebook spending in the month leading up to today's presidential primary, Zach Montellaro reports for Pros.

Can privacy be a piece of cake?: A privacy app called Jumbo presents a startling contrast to the maze of privacy controls presented by companies like Facebook, Twitter and Google, Protocol reports; here's how it works, and how it plans to turn a buck.

Virus watch: Following Amazon's lead, Sony and NTT are pulling out of this month's Mobile World Congress in Barcelona as a precaution during the coronavirus outbreak, Reuters reports.

In profile: Zapata Computing, a startup that creates software for quantum computers while avoiding, as much as possible, actually using a quantum machine, Protocol reports.

Out today: Alexis Wichowski, New York City's deputy chief technology director and a professor at Columbia's School of International and Public Affairs, is out today with The Information Trade: How Big Tech Conquers Countries, Challenges Our Rights, and Transforms Our World, a book published by HarperCollins.

Tips, comments, suggestions? Send them along via email to our team: Bob King (bking@politico.com, @bkingdc), Mike Farrell (mfarrell@politico.com, @mikebfarrell), Nancy Scola (nscola@politico.com, @nancyscola), Steven Overly (soverly@politico.com, @stevenoverly), John Hendel (jhendel@politico.com, @JohnHendel), Cristiano Lima (clima@politico.com, @viaCristiano), Alexandra S. Levine (alevine@politico.com, @Ali_Lev), and Leah Nylen (lnylen@politico.com, @leah_nylen).

TTYL.

Read the original:
For the tech world, New Hampshire is anyone's race - Politico

Daily AI Roundup: The Coolest Things on Earth Today – AiThority

Today's Daily AI Roundup covers the latest Artificial Intelligence announcements on AI capabilities, AI mobility products, Robotic Service, Technology from IBM, Comscore, Arista Networks, Cisco, Atos and DJI.

IBM and Delta Air Lines announced that the global airline is embarking on a multi-year collaborative effort with IBM including joining the IBM Q Network to explore the potential capabilities of quantum computing to transform experiences for customers and employees.

New research from Comscore, a trusted partner for planning, transacting and evaluating media across platforms, found that for the 4th month in a row, Toyota RAV4 was the most shopped new vehicle model market wide.

Arista Networks announced the acquisition of Big Switch Networks, a network monitoring and SDN (Software Defined Networking) company. Arista Networks provides a complete and visionary cloud networking suite, with rich capabilities in all critical areas of the campus, data center and public cloud.

Cisco announced that it has joined Facebook's Express Wi-Fi Technology Partner Program to close the digital divide and enable more people around the world to get connected to a faster, better internet.

Atos, a global leader in digital transformation, announced that it has expanded its collaboration with Microsoft to jointly address the fast-growing SAP HANA market, targeting the most demanding customers, many of whom are running mission-critical SAP workloads.

Talk of drones might have circled around military uses lately, but drones are actually being used to do good around the world. With its #DronesforGood campaign, DJI wishes to make people aware of the many ways in which drones can make our lives better and help keep us safe.


Quantum networking projected to be $5.5 billion market in 2025 – TechRepublic

Several companies are working to advance the technology, according to a new report.

The market for quantum networking is projected to reach $5.5 billion by 2025, according to a new report from Inside Quantum Technology (IQT).

While all computing systems rely on the ability to store and manipulate information in individual bits, quantum computers "leverage quantum mechanical phenomena to manipulate information" and to do so requires the use of quantum bits, or qubits, according to IBM.

SEE: Quantum computing: An insider's guide (TechRepublic)

Quantum computing is often seen as a panacea for problems that today's computers are not equipped to handle.

"For problems above a certain size and complexity, we don't have enough computational power on earth to tackle them,'' IBM said. This requires a new kind of computing, and this is where quantum comes in.

IQT says that quantum networking revenue comes primarily from quantum key distribution (QKD), quantum cloud computing, and quantum sensor networks. Eventually, these strands will merge into a Quantum Internet, the report said.

Cloud access to quantum computers is core to the business models of many leading quantum computer companies, such as IBM, Microsoft and Rigetti, as well as several leading academic institutions, according to the report.

Microsoft, for instance, designed a special programming language for quantum computers, called Q#, and released a Quantum Development Kit to help programmers create new applications, according to CBInsights.

One of Google's quantum computing projects involves working with NASA to apply the tech's optimization abilities to space travel.

The Quantum Internet network will have the same "geographical breadth of coverage as today's internet," the IQT report stated.

It will provide a powerful platform for communications among quantum computers and other quantum devices, the report said.

It will also enable a quantum version of the Internet of Things. "Finally, quantum networks can be the most secure networks ever built, completely invulnerable if constructed properly," the report said.

The report, "Quantum Networks: A Ten-Year Forecast and Opportunity Analysis," forecasts demand for quantum network equipment, software and services in both volume and value terms.

"The time has come when the rapidly developing quantum technology industry needs to quantify the opportunities coming out of quantum networking," said Lawrence Gasman, president of Inside Quantum Technology, in a statement.

Quantum Key Distribution (QKD) adds unbreakable coding of key distribution to public key encryption, making it virtually invulnerable, according to the report.
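
As a rough illustration (not from the report): the value of key distribution is that once two parties share random secret bits, even a simple XOR one-time pad becomes information-theoretically secure; the hard part QKD solves is delivering that key to both ends. The message and key bytes below are invented for the sketch.

```python
# Sketch: once a shared random key has been distributed (the service QKD
# provides), one-time-pad encryption reduces to a byte-wise XOR.
def xor_bytes(data: bytes, key: bytes) -> bytes:
    """XOR each byte of `data` with the corresponding key byte."""
    assert len(key) >= len(data), "one-time pad key must cover the message"
    return bytes(d ^ k for d, k in zip(data, key))

message = b"wire $1M"                  # illustrative plaintext
key = bytes([0x5A, 0x13, 0x77, 0x2E,  # illustrative key material,
             0x81, 0x09, 0xC4, 0x3D])  # notionally delivered via QKD
ciphertext = xor_bytes(message, key)
assert xor_bytes(ciphertext, key) == message  # XOR is its own inverse
```

The security rests entirely on the key being random, secret, and used once, which is exactly the property QKD is meant to guarantee.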

QKD is the first significant revenue source to come from the emerging Quantum Internet and will create almost $150 million in revenue in 2020, the report said.

QKD's early success is due to potential users (big financial and government organizations) having an immediate need for 100% secure encryption, the IQT report stated.

By 2025, IQT projects that revenue from "quantum clouds" will exceed $2 billion.

Although some large research and government organizations are buying quantum computers for on-premise use, the high cost of the machines coupled with the immaturity of the technology means that the majority of quantum users are accessing quantum through clouds, the report explained.

Quantum sensor networks promise enhanced navigation and positioning and more sensitive medical imaging modalities, among other use cases, the report said.

"This is a very diverse area in terms of both the range of applications and the maturity of the technology."

However, by 2025 revenue from quantum sensors is expected to reach about $1.2 billion.



IBM And University Of Tokyo Launch Quantum Computing Initiative For Japan – E3zine.com

IBM and the University of Tokyo announced an agreement to partner to advance quantum computing and make it practical for the benefit of industry, science and society.

IBM and the University of Tokyo will form the Japan IBM Quantum Partnership, a broad national partnership framework in which other universities, industry, and government can engage. The partnership will have three tracks of engagement: one focused on the development of quantum applications with industry; another on quantum computing system technology development; and the third focused on advancing the state of quantum science and education.

Under the agreement, an IBM Q System One, owned and operated by IBM, will be installed in an IBM facility in Japan. It will be the first installation of its kind in the region and only the third in the world, following the United States and Germany. The Q System One will be used to advance research in quantum algorithms, applications and software, with the goal of developing the first practical applications of quantum computing.

IBM and the University of Tokyo will also create a first-of-a-kind quantum system technology center for the development of hardware components and technologies that will be used in next generation quantum computers. The center will include a laboratory facility to develop and test novel hardware components for quantum computing, including advanced cryogenic and microwave test capabilities.

IBM and the University of Tokyo will also directly collaborate on foundational research topics important to the advancement of quantum computing, and establish a collaboration space on the University campus to engage students, faculty, and industry researchers with seminars, workshops, and events.

Developed by researchers and engineers from IBM Research and Systems, the IBM Q System One is optimized for the quality, stability, reliability, and reproducibility of multi-qubit operations. IBM established the IBM Q Network, a community of Fortune 500 companies, startups, academic institutions and research labs working with IBM to advance quantum computing and explore practical applications for business and science.

Advances in quantum computing could open the door to future scientific discoveries such as new medicines and materials, improvements in the optimization of supply chains, and new ways to model financial data to better manage and reduce risk.

The University of Tokyo will lead the Japan IBM Quantum Partnership and bring academic excellence from universities and prominent research associations together with large-scale industry, small and medium enterprises, startups as well as industrial associations from diverse market sectors. A high priority will be placed on building quantum programming as well as application and technology development skills and expertise.


We're approaching the limits of computer power – we need new programmers now – The Guardian

Way back in the 1960s, Gordon Moore, the co-founder of Intel, observed that the number of transistors that could be fitted on a silicon chip was doubling every two years. Since the transistor count is related to processing power, that meant that computing power was effectively doubling every two years. Thus was born Moore's law, which for most people working in the computer industry (or at any rate those younger than 40) has provided the kind of bedrock certainty that Newton's laws of motion did for mechanical engineers.
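
As a back-of-the-envelope illustration (the baseline transistor count here is invented, not one of Moore's actual figures), a doubling every two years compounds like this:

```python
# Illustrative only: project a transistor count under Moore's law,
# assuming one doubling per two-year period from a made-up baseline.
def project_transistors(baseline: int, years: int, period_years: int = 2) -> int:
    """Transistor count after `years` years, doubling every `period_years`."""
    return baseline * 2 ** (years // period_years)

# A hypothetical 1-million-transistor chip, projected 20 years out:
print(project_transistors(1_000_000, 20))  # 1024000000 (a 1,024x increase)
```

Ten doublings in twenty years already multiplies the count by more than a thousand, which is why the trend dominated hardware planning for decades.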

There is, however, one difference. Moore's law is just a statement of an empirical correlation observed over a particular period in history, and we are reaching the limits of its application. In 2010, Moore himself predicted that the laws of physics would call a halt to the exponential increases. "In terms of size of transistor," he said, "you can see that we're approaching the size of atoms, which is a fundamental barrier, but it'll be two or three generations before we get that far but that's as far out as we've ever been able to see. We have another 10 to 20 years before we reach a fundamental limit."

We've now reached 2020 and so the certainty that we will always have sufficiently powerful computing hardware for our expanding needs is beginning to look complacent. Since this has been obvious for decades to those in the business, there's been lots of research into ingenious ways of packing more computing power into machines, for example using multi-core architectures in which a CPU has two or more separate processing units, called cores, in the hope of postponing the awful day when the silicon chip finally runs out of road. (The new Apple Mac Pro, for example, is powered by a 28-core Intel Xeon processor.) And of course there is also a good deal of frenzied research into quantum computing, which could, in principle, be an epochal development.

But computing involves a combination of hardware and software and one of the predictable consequences of Moore's law is that it made programmers lazier. Writing software is a craft and some people are better at it than others. They write code that is more elegant and, more importantly, leaner, so that it executes faster. In the early days, when the hardware was relatively primitive, craftsmanship really mattered. When Bill Gates was a lad, for example, he wrote a Basic interpreter for one of the earliest microcomputers, the TRS-80. Because the machine had only a tiny read-only memory, Gates had to fit it into just 16 kilobytes. He wrote it in assembly language to increase efficiency and save space; there's a legend that for years afterwards he could recite the entire program by heart.

There are thousands of stories like this from the early days of computing. But as Moore's law took hold, the need to write lean, parsimonious code gradually disappeared and incentives changed. Programming became industrialised as software engineering. The construction of sprawling software ecosystems such as operating systems and commercial applications required large teams of developers; these then spawned associated bureaucracies of project managers and executives. Large software projects morphed into the kind of death march memorably chronicled in Fred Brooks's celebrated book, The Mythical Man-Month, which was published in 1975 and has never been out of print, for the very good reason that it's still relevant. And in the process, software became bloated and often inefficient.

But this didn't matter because the hardware was always delivering the computing power that concealed the bloatware problem. Conscientious programmers were often infuriated by this. "The only consequence of the powerful hardware I see," wrote one, "is that programmers write more and more bloated software on it. They become lazier, because the hardware is fast they do not try to learn algorithms nor to optimise their code: this is crazy!"

It is. In a lecture in 1997, Nathan Myhrvold, who was once Bill Gates's chief technology officer, set out his Four Laws of Software. 1: software is like a gas: it expands to fill its container. 2: software grows until it is limited by Moore's law. 3: software growth makes Moore's law possible: people buy new hardware because the software requires it. And, finally, 4: software is only limited by human ambition and expectation.

As Moore's law reaches the end of its dominion, Myhrvold's laws suggest that we basically have only two options. Either we moderate our ambitions or we go back to writing leaner, more efficient code. In other words, back to the future.

What just happened? Writer and researcher Dan Wang has a remarkable review of the year in technology on his blog, including an informed, detached perspective on the prospects for Chinese domination of new tech.

Algorithm says no: There's a provocative essay by Cory Doctorow on the LA Review of Books blog on the innate conservatism of machine learning.

Fall of the big beasts: "How to lose a monopoly: Microsoft, IBM and antitrust" is a terrific long-view essay about company survival and change by Benedict Evans on his blog.


HPC In 2020: Acquisitions And Mergers As The New Normal – The Next Platform

After a decade of vendor consolidation that saw some of the world's biggest IT firms acquire first-class HPC providers such as SGI, Cray, and Sun Microsystems, as well as smaller players like Penguin Computing, WhamCloud, Appro, and Isilon, it is natural to wonder who is next. Or maybe, more to the point, who is left?

As it turns out, there are still plenty of companies, large and small, that can fill critical holes in the product portfolios of HPC providers, or those who want to be HPC players. These niche acquisitions will be especially important to these same providers as they expand into HPC-adjacent markets such as artificial intelligence, data analytics and edge computing.

One company that can play into all of these markets is FPGA-maker Xilinx. Since Intel acquired Altera in 2015, Xilinx is the only standalone company of any size that makes reconfigurable logic devices. Given that, the natural buyer for Xilinx would be AMD, Intel's arch-nemesis. AMD, of course, already has a highly competitive lineup of CPUs and GPUs to challenge its much larger rival, and the addition of an FPGA portfolio would open a third front. It would also provide AMD entry into a whole array of new application markets where FPGAs operate: ASIC prototyping, IoT, embedded aerospace/automotive, 5G communications, AI inference, database acceleration, and computational storage, to name a few.

The only problem is Xilinx's current market cap of around $25 billion, about half the current market cap of AMD. And if you're wondering about AMD's piggy bank, the chipmaker has $1.2 billion cash on hand as of September 2019. Which means any deal would probably take the form of a merger rather than a straight acquisition. There's nothing wrong with that, but a merger is a more complex decision and has greater ramifications for both parties. That's why the rumors of a Xilinx acquisition have tended to center on larger semiconductor manufacturers that might be looking to diversify their offerings, like Broadcom or Qualcomm. Those acquisitions wouldn't offer the HPC and AI technology synergies that AMD could provide, but they would likely be easier to execute.

Another area that continues to be ripe for acquisitions is the storage market. In HPC, Panasas and DataDirect Networks stand alone (well, stand together) as the two HPC specialists left in the market. And of those two, the more modest-sized Panasas would be easier to swallow. But most HPC OEMs, including the biggies like Hewlett Packard Enterprise, Dell Technologies, and Lenovo, already have their own HPC storage and file system offerings of one sort or another, although Lenovo is probably most deficient in this regard. For what it's worth, though, Panasas, which has been around since 1999, has never attracted the kind of suitor willing to fold the company's rather specialized parallel file system technologies into its own product portfolio. In all honesty, we don't expect that to change.

The real storage action in the coming years in HPC, as well as in the enterprise and the cloud, is going to be in the software defined space, where companies like WekaIO, VAST Data, Excelero, and DataCore Software have built products that can virtualize all sorts of hardware. That's because the way storage is being used and deployed in the datacenter these days is being transformed by cheaper capacity (disks) and cheaper IOPS (NVM-Express and other SSD devices), the availability of cloud storage, and the inverse trends of disaggregation and hyperconvergence.

As we noted last July: While there are plenty of NAS and SAN appliances being sold into the enterprise to support legacy applications, modern storage tends to be either disaggregated, with compute and storage broken free of each other at the hardware level but glued together on the fly with software to look local, or hyperconverged, with the compute and block storage virtualized and running on the same physical server clusters and atop the same server virtualization hypervisors.

Any of the aforementioned SDS companies, along with others, may find themselves courted by OEMs and storage-makers, and even cloud providers. DDN has been busy in that regard, having acquired software-defined storage maker Nexenta in May 2019. We expect to see more such deals in the coming years. Besides DDN, other storage companies like NetApp should be looking hard at bringing more SDS in-house. The big cloud providers (Amazon, Microsoft, Google, and so on) will also be making some big investments in SDS technologies, even if they're not buying such companies outright.

One market that is nowhere near the consolidation stage is quantum computing. However, that doesn't mean companies won't be looking to acquire some promising startups in this area, even at this early stage. While major tech firms such as IBM, Google, Intel, Fujitsu, Microsoft, and Baidu have already invested a lot in in-house development and are busy selecting technology partners, other companies have taken a more wait-and-see approach.

In the latter category, one company that particularly stands out is HPE. In this case, the company is more focused on near-term R&D, like memristors or other memory-centric technologies. While there may be some logic in letting other companies spend their money figuring out the most promising approaches for quantum computing, and then swooping in to copy (or buy) whatever technology is most viable, there is also the risk of being left behind. That's something HPE cannot afford.

That said, HPE has recently invested in IonQ, a promising quantum computing startup that has built a workable prototype using ion trap technology. The investment was provided via Pathfinder, HPE's investment arm. In an internal blog post on the subject penned by Abhishek Shukla, managing director of global venture investments, and Ray Beausoleil, Senior Fellow of large-scale integrated photonics, the authors extol the virtues of IonQ's technical approach:

"IonQ's technology has already surpassed all other quantum computers now available, demonstrating the largest number of usable qubits in the market. Its gate fidelity, which measures the accuracy of logical operations, is greater than 98 percent for both one-qubit and two-qubit operations, meaning it can handle longer calculations than other commercial quantum computers. We believe IonQ's qubits and methodology are of such high quality, they will be able to scale to 100 qubits (and 10,000 gate operations) without needing any error correction."

As far as we can tell, HPE has no plans to acquire the company (and it shares investment in the startup with other companies, including Amazon, Google, and Samsung). But if HPE is truly convinced IonQ is the path forward, it would make sense to pull the acquisition trigger sooner rather than later.

We have no illusions that any of this comes to pass in 2020, or ever. As logical as the deals we have suggested seem to us, the world of acquisitions and mergers is a lot more mysterious and counterintuitive than we'd like to admit (cases in point: Intel buying Whamcloud, or essentially buying Cloudera through such heavy investment). More certain is the fact that these deals will continue to reshape the HPC vendor landscape in the coming decade as companies go after new markets and consolidate their hold on old ones. If anything, the number of businesses bought and sold will increase as high performance computing, driven by AI and analytics, extends into more application domains. Or, as the Greeks put it more succinctly, the only constant is change.


Quantum Computers Finally Beat Supercomputers in 2019 – Discover Magazine

In his 2013 book, Schrödinger's Killer App, Louisiana State University theoretical physicist Jonathan Dowling predicted what he called "super exponential growth." He was right. Back in May, during Google's Quantum Spring Symposium, computer engineer Hartmut Neven reported the company's quantum computing chip had been gaining power at breakneck speed.

The subtext: We are venturing into an age of quantum supremacy, the point at which quantum computers outperform the best classical supercomputers in solving a well-defined problem.

Engineers test the accuracy of quantum computing chips by using them to solve a problem, and then verifying the work with a classical machine. But in early 2019, that process became problematic, reported Neven, who runs Google's Quantum Artificial Intelligence Lab. Google's quantum chip was improving so quickly that his group had to commandeer increasingly large computers, and then clusters of computers, to check its work. It's become clear that eventually, they'll run out of machines.

Case in point: Google announced in October that its 53-qubit quantum processor had needed only 200 seconds to complete a problem that would have required 10,000 years on a supercomputer.

Neven's group observed a double exponential growth rate in the chip's computing power over a few months. Plain old exponential growth is already really fast: it means that from one step to the next, the value of something multiplies. Bacterial growth can be exponential if the number of organisms doubles during an observed time interval. So can the computing power of classical computers under Moore's law, the idea that it doubles roughly every year or two. But under double exponential growth, the exponents have exponents. That makes a world of difference: instead of a progression from 2 to 4 to 8 to 16 to 32 bacteria, for example, a double-exponentially growing colony in the same time would grow from 2 to 4 to 16 to 256 to 65,536.
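
The two progressions in that example are easy to reproduce; this small Python sketch simply restates the arithmetic:

```python
# Exponential growth: the value doubles at each step (2^n).
# Double exponential growth: the exponent itself doubles each step (2^(2^n)).
exponential = [2 ** n for n in range(1, 6)]
double_exponential = [2 ** (2 ** n) for n in range(5)]

print(exponential)         # [2, 4, 8, 16, 32]
print(double_exponential)  # [2, 4, 16, 256, 65536]
```

After just five steps the double-exponential sequence is already more than two thousand times larger than the exponential one.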

Neven credits the growth rate to two factors: the predicted way that quantum computers improve on the computational power of classical ones, and quick improvement of quantum chips themselves. Some began referring to this growth rate as Neven's law. Some theorists say such growth was unavoidable.

We talked to Dowling (who suggests a more fitting moniker: the Dowling-Neven Law) about double exponential growth, his prediction and his underappreciated Beer Theory of Quantum Mechanics.

Q: You saw double exponential growth on the horizon long before it showed up in a lab. How?

A: Anytime there's a new technology, if it is worthwhile, eventually it kicks into exponential growth in something. We see this with the internet; we saw this with classical computers. You eventually hit a point where all of the engineers figure out how to make this work, miniaturize it and then you suddenly run into exponential growth in terms of the hardware. If it doesn't happen, that hardware falls off the face of the Earth as a nonviable technology.

Q: So you werent surprised to see Googles chip improving so quickly?

A: I'm only surprised that it happened earlier than I expected. In my book, I said within the next 50 to 80 years. I guessed a little too conservatively.

Q: Youre a theoretical physicist. Are you typically conservative in your predictions?

A: People say I'm fracking nuts when I publish this stuff. I like to think that I'm the crazy guy that always makes the least conservative prediction. I thought this was far-out wacky stuff, and I was making the most outrageous prediction. That's why it's taking everybody by surprise. Nobody expected double exponential growth in processing power to happen this soon.

Q: Given that quantum chips are getting so fast, can I buy my own quantum computer now?

A: Most of the people think the quantum computer is a solved problem. That we can just wait, and Google will sell you one that can do whatever you want. But no. We're in the [prototype] era. The number of qubits is doubling every six months, but the qubits are not perfect. They fail a lot and have imperfections and so forth. But Intel and Google and IBM aren't going to wait for perfect qubits. The people who made the [first computers] didn't say, "We're going to stop making bigger computers until we figure out how to make perfect vacuum tubes."

Q: Whats the big deal about doing problems with quantum mechanics instead of classical physics?

A: If you have 32 qubits, it's like you have 2^32 parallel universes that are working on parts of your computation. Or like you have a parallel processor with 2^32 processors. But you only pay the electric bill in our universe.
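
The arithmetic behind that answer: an n-qubit register is described by 2^n complex amplitudes, which is what a classical simulator has to track. A quick sketch (the 53-qubit figure refers to the Google chip mentioned earlier):

```python
# State-space size of an n-qubit register: 2^n complex amplitudes,
# which is what a classical simulation must store and update.
def amplitudes(n_qubits: int) -> int:
    return 2 ** n_qubits

print(amplitudes(32))  # 4294967296 (about 4.3 billion)
print(amplitudes(53))  # 9007199254740992 (about 9.0e15)
```

That 2^n scaling is why checking a quantum chip's work with clusters of classical machines quickly becomes infeasible.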

Q: Quantum mechanics gets really difficult, really fast. How do you deal with that?

A: Everybody has their own interpretation of quantum mechanics. Mine is the Many Beers Interpretation of Quantum Mechanics. With no beer, quantum mechanics doesn't make any sense. After one, two or three beers, it makes perfect sense. But once you get to six or 10, it doesn't make any sense again. I'm on my first bottle, so I'm in the zone.

[This story originally appeared in print as "The Rules of the Road to Quantum Supremacy."]

Read more from the original source:

Quantum Computers Finally Beat Supercomputers in 2019 - Discover Magazine

January 9th: France will unveil its quantum strategy. What can we expect from this report? – Quantaneo, the Quantum Computing Source

It is eagerly awaited! The "Forteza" report, named after its rapporteur, Paula Forteza, Member of Parliament for La République en Marche (the party of current President Emmanuel Macron), should finally be officially revealed on January 9th. The three rapporteurs are Paula Forteza, Member of Parliament for French Latin America and the Caribbean; Jean-Paul Herteman, former CEO of Safran; and Iordanis Kerenidis, researcher at the CNRS. Announced last April, this report was initially due at the end of August, then in November, then... No doubt the complex agenda, between the social movements in France and the MP's active participation in the Parisian election campaign of Cédric Villani, mathematician and dissident of La République en Marche, shook up the timetable. In any case, it is thus finally on January 9th that this report, entitled "Quantum: the technological shift that France will not miss", will be unveiled.

"Entrusted by the Prime Minister in April 2019, the mission on quantum technologies ends with the submission of the report by the three rapporteurs Paula Forteza, Jean-Paul Herteman, and Iordanis Kerenidis. Fifty proposals and recommendations are thus detailed in order to strengthen France's role and international position on these complex but highly strategic technologies. The in-depth work carried out over the last few months, fueled by numerous consultations with scientific experts in the field, has led the rapporteurs to the conclusion that France's success in this field will be achieved by making quantum technologies more accessible and more attractive. This is one of the sine qua non conditions for the success of the French strategy", explains the French National Congress in the invitation to the official presentation ceremony of the report.

The presentation, by the three rapporteurs, will be made in the presence of the ministers for the armed forces, the economy and finance, and higher education and research. The presence of the Minister of the Armed Forces, as well as the co-signature of the report by the former head of Safran, already indicates that military applications will be one of the main areas of proposals, and possibly of funding, just as is the case in the United States, China and Russia.

Of course, the report will go into detail about the role of research, and of the CNRS, in advances in quantum computing and communication. Of course, the excellent work of French researchers, in collaboration with their European peers, will be highlighted. And of course, France's excellence in these fields will be explained. France is a pioneer in this field, but the important question is precisely what the next steps will be. The National Assembly indicates that this report will present 50 "proposals and recommendations". Are we to conclude that it will be just a list of proposals? Or will we know how to move from advice to action?

These are our pending questions:

- The United States is announcing an investment of USD 1.2 billion, China perhaps USD 10 billion, Great Britain about 1 billion euros, while Amazon's R&D budget alone is USD 18 billion. How can a country like France position itself against the scale of these investments? In short, is the amount of funds allocated to this research and development in line with the ambitions?

- Mastering quantum technologies is becoming a geopolitical issue between the United States and China. Should Europe master its own technologies so as not to depend on these two major powers? On the other hand, is this not the return of a quantum "Plan calcul" from the 1960s? How can we avoid repeating the same mistakes?

- Cecilia Bonefeld-Dahl, Managing Director of DigitalEurope, recently wrote that Europe risks being deprived of the use of quantum technologies if it does not develop them itself. Christophe Jurzcak, the head of Quantonation, stated that it is not certain that France will have access to quantum technologies if it does not develop them itself. Is this realistic? Do we have the resources?

- French companies currently invest very little in quantum computing research. With the exception of Airbus, the main corporate initiatives we know of are in Canada, Australia, Spain, Germany, etc. Should we also help companies to embrace these technologies, or should we only finance research and development by universities and startup founders? Is there a support component for companies, so that technologies are not simply developed in France and sold elsewhere, but France becomes the leading market for local developments?

See you on January 9th on Decideo for more details and our objective analysis of the content of this document.

View post:
January 9th: France will unveil its quantum strategy. What can we expect from this report? - Quantaneo, the Quantum Computing Source

Year 2019 in Science: History of Humans, Ebola Treatment and Quantum Computing – NewsClick

Image Courtesy: Smithsonian Magazine. Image depicts some of the skull caps excavated from Ngandong.

In the development of science, what should matter most are the findings that help humanity, that have the potential to open up new paradigms, or that change our understanding of the past and open our eyes to the future. The year 2019 witnessed several such findings in the science world.

HUMAN HISTORY THROUGH GENETICS

Tracing human history has also been advanced through genetics research. The year 2019 witnessed several breakthroughs about human history based on analyses of ancient DNA found in fossils and other sources.

One such important finding is a claim about the origin of modern humans: that anatomically modern humans first appeared in the southern part of Africa. A wetland that covered present-day Botswana, Namibia and Zimbabwe was where the first humans lived some 200,000 years ago. Eventually, humans migrated out of this region. How was the study conducted? Researchers gathered blood samples from 200 living people in groups whose DNA is poorly known, including foragers and hunter-gatherers in Namibia and South Africa. The authors analyzed mitochondrial DNA (mtDNA), a type of DNA inherited only from mothers, and compared it to mtDNA in databases from more than 1,000 other Africans, mostly from southern Africa. The researchers then sorted out how all the samples were related to each other on a family tree. The data reveal that one mtDNA lineage found in the Khoisan speakers, L0, is the oldest known mtDNA lineage in living people. The work also tightens the date of origin of L0 to about 200,000 years ago.

Another very important and interesting finding in this field is that Homo erectus, the closest ancestor of modern humans, made its last stand on the island of Java, Indonesia. The team of scientists estimated that the species survived at a place known as Ngandong, near the Solo River, based on the dating of animal fossils from a bone bed where Homo erectus skull caps and leg bones had been found earlier. Scientists used to believe that Homo erectus migrated out of Africa into Asia some two million years ago, and that this early human ancestor became extinct around 400,000 years ago. But the new findings indicate that the species continued to exist in Ngandong until about 117,000 to 108,000 years ago.

So far, everything known about the Denisovans, the mysterious archaic human species, was confined to the Denisova Cave in the Altai Mountains in Siberia, because remnants of this ancient species had been discovered only in fossils from that cave. But a recent report published in Nature on the discovery of a Denisovan jawbone in a cave on the Tibetan Plateau has revealed many interesting facts about these archaic humans. The fossil has been found to be 160,000 years old, with a powerful jaw and unusually large teeth resembling those of the most primitive Neanderthals. Protein analysis of the fossil revealed that it is closer to the Siberian Denisovans.

Image Courtesy: dawn.com

QUANTUM COMPUTING AND SUPREMACY

Image Courtesy: Quantum magazine.

Computer scientists nowadays are concentrating on going far beyond the speed that the present generation of computing can achieve. The principles of quantum mechanics are now being incorporated into next-generation computing. There have been some advances, but the issue in this realm that has sparked controversy is Google's claim to have achieved quantum supremacy.

Sycamore, Google's 53-qubit computer, solved a problem in 200 seconds that would have taken even a supercomputer 10,000 years. It is in fact a first step: it has shown that a quantum computer can do a functional computation and that quantum computing does indeed solve a special class of problems much faster than conventional computers.

On the other hand, IBM researchers have countered that Google hadn't done anything special. This clash highlights the intense commercial interest in quantum computing.

NATURE, CLIMATE AND AMAZON FOREST

Image Courtesy: NASA Earth Observatory.

Man-made climate change has already reached a critical state. Climate research has shown how crossing critical thresholds would bring irreversible changes to the global climate and an accompanying disaster for humanity.

In the year 2019 too, the world witnessed many devastations in the form of storms, floods and wildfires.

Apart from the extreme weather events that climate change is driving, nature itself is in its most perilous state ever, and the reason is man-made environmental destruction.

The global report submitted by the Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES) reviewed some 15,000 scientific papers and other sources of data on trends in biodiversity and its ability to provide people with everything from food and fiber to clean water and air.

The report notes that of the 8 million known species of animals and plants, almost 1 million are under threat of extinction, including more than 40% of amphibian species and almost a third of marine mammals.

The month of August witnessed unprecedented wildfires in the Amazon rainforest, the biggest rainforest in the world. The fires were so large-scale that the smoke covered nearby cities with dark clouds. Brazil's National Institute for Space Research (INPE) recorded over 72,000 fires this year, an increase of about 80% from last year. More worrisome is the fact that more than 9,000 of these fires took place in the last week alone.

The fires engulfed several large Amazon states in northwestern Brazil. NASA noted on August 11 that the fires were large enough to be spotted from space.

The main reason for the Amazon fires is wide-scale deforestation driven by policy changes made by the Bolsonaro regime. Many parts of the forest, even its deeper reaches, have been opened up for companies to set up business ventures. This has led to massive deforestation.

NEW DIMENSION TO THE TREATMENT OF EBOLA

Image Courtesy: UN News.

In the past, there were no drugs that could cure Ebola.

However, two of four experimental treatments trialed in the Democratic Republic of Congo were found to be highly effective in saving patients' lives. The new treatment method used a combination of existing drugs and newly developed ones. Named the PALM trial, the new approach uses monoclonal antibodies and antiviral agents.

Monoclonal antibodies are antibodies made by identical immune cells that are all clones of a unique parent cell. They bind to specific cells or proteins; the objective is for this treatment to stimulate the patient's immune system to attack those cells.

KILOGRAM REDEFINED

Image courtesy: phys.org

The kilogram, the unit of mass, used to be defined by a hunk of metal in France. This hunk of metal, also known as the International Prototype Kilogram or Big K, is a platinum-iridium alloy with a mass of 1 kilogram, housed at the International Bureau of Weights and Measures in France since 1889. The IPK has many copies around the world, which are used to calibrate scales so that the whole world follows a standard system of measurement.

But the definition of the kilogram is no longer the same. On World Metrology Day this year, the way the kilogram had been defined for more than a century was changed completely. Now the kilogram is defined using the Planck constant, something that does not change.
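Concretely, the revision fixes the numerical value of the Planck constant at exactly 6.62607015 × 10⁻³⁴ J·s; with the second and the metre already fixed by the caesium frequency and the speed of light, the kilogram follows. A small illustrative sketch (ours, using the exact SI defining constants):

```python
# 2019 SI defining constants (exact by definition)
h = 6.62607015e-34   # Planck constant, J*s  (J*s = kg*m^2/s)
c = 299_792_458      # speed of light, m/s

# Via E = m*c**2 and E = h*f, radiation of frequency f has a mass
# equivalent of m = h*f / c**2.  The frequency whose quantum has a mass
# equivalent of exactly 1 kg is therefore fixed once h and c are fixed:
f_one_kg = c**2 / h
print(f"{f_one_kg:.4e} Hz")
```

In practice the kilogram is realized with a Kibble balance rather than literal photons, but the principle is the same: mass is now tied to an invariant of nature instead of an artifact.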

Follow this link:
Year 2019 in Science: History of Humans, Ebola Treatment and Quantum Computing - NewsClick

Information teleported between two computer chips for the first time – New Atlas

Scientists at the University of Bristol and the Technical University of Denmark have achieved quantum teleportation between two computer chips for the first time. The team managed to send information from one chip to another instantly without them being physically or electronically connected, in a feat that opens the door for quantum computers and quantum internet.

This kind of teleportation is made possible by a phenomenon called quantum entanglement, where two particles become so entwined with each other that they can communicate over long distances. Changing the properties of one particle will cause the other to instantly change too, no matter how much space separates the two of them. In essence, information is being teleported between them.

Hypothetically, there's no limit to the distance over which quantum teleportation can operate, and that raises some strange implications that puzzled even Einstein himself. Our current understanding of physics says that nothing can travel faster than the speed of light, and yet, with quantum teleportation, information appears to break that speed limit. Einstein dubbed it "spooky action at a distance."

Harnessing this phenomenon could clearly be beneficial, and the new study helps bring that closer to reality. The team generated pairs of entangled photons on the chips, and then made a quantum measurement of one. This observation changes the state of the photon, and those changes are then instantly applied to the partner photon in the other chip.

"We were able to demonstrate a high-quality entanglement link across two chips in the lab, where photons on either chip share a single quantum state," says Dan Llewellyn, co-author of the study. "Each chip was then fully programmed to perform a range of demonstrations which utilize the entanglement. The flagship demonstration was a two-chip teleportation experiment, whereby the individual quantum state of a particle is transmitted across the two chips after a quantum measurement is performed. This measurement utilizes the strange behavior of quantum physics, which simultaneously collapses the entanglement link and transfers the particle state to another particle already on the receiver chip."
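The procedure described in that quote is the textbook one-qubit teleportation protocol, which is small enough to simulate classically. The sketch below (our illustration, not a model of the Bristol experiment) teleports an arbitrary qubit state using an entangled pair, a Bell-basis measurement, and the measurement-dependent corrections:

```python
import numpy as np

# Single-qubit gates
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

def kron_all(ops):
    out = ops[0]
    for o in ops[1:]:
        out = np.kron(out, o)
    return out

def gate_on(g, q, n=3):
    """Apply single-qubit gate g to qubit q of an n-qubit register."""
    ops = [I] * n
    ops[q] = g
    return kron_all(ops)

def cnot(c, t, n=3):
    """CNOT with control c and target t, built from projectors."""
    P0 = np.diag([1, 0]).astype(complex)
    P1 = np.diag([0, 1]).astype(complex)
    ops0 = [I] * n; ops0[c] = P0
    ops1 = [I] * n; ops1[c] = P1; ops1[t] = X
    return kron_all(ops0) + kron_all(ops1)

# Qubit 0 holds the message state; qubits 1 (sender) and 2 (receiver)
psi = np.array([0.6, 0.8j], dtype=complex)
state = np.kron(psi, np.kron([1, 0], [1, 0])).astype(complex)

# Entangle qubits 1 and 2 into a Bell pair
state = cnot(1, 2) @ (gate_on(H, 1) @ state)

# Bell-basis measurement of qubits 0 and 1: basis change, then measure
state = gate_on(H, 0) @ (cnot(0, 1) @ state)
rows = state.reshape(4, 2)              # rows: outcomes (m0, m1); cols: qubit 2
p = (np.abs(rows) ** 2).sum(axis=1)     # probability of each outcome
k = np.random.default_rng(0).choice(4, p=p)
m0, m1 = k >> 1, k & 1
bob = rows[k] / np.sqrt(p[k])           # receiver's post-measurement state

# Corrections conditioned on the two classical measurement bits
if m1:
    bob = X @ bob
if m0:
    bob = Z @ bob

print(np.allclose(bob, psi))            # True: the receiver now holds |psi>
```

Note that the two classical bits (m0, m1) must still be sent over an ordinary channel before the corrections can be applied, which is why teleportation does not actually transmit usable information faster than light.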

The team reported a teleportation success rate of 91 percent, and managed to perform some other functions that will be important for quantum computing. That includes entanglement swapping (where states can be passed, via a mediator, between particles that have never directly interacted), and entangling as many as four photons together.

Information has been teleported over much longer distances before: first across a room, then 25 km (15.5 mi), then 100 km (62 mi), and eventually over 1,200 km (746 mi) via satellite. It's also been done between different parts of a single computer chip before, but teleporting between two different chips is a major breakthrough for quantum computing.

The research was published in the journal Nature Physics.

Source: University of Bristol

View post:

Information teleported between two computer chips for the first time - New Atlas

How This Breakthrough Makes Silicon-Based Qubit Chips The Future of Quantum Computing – Analytics India Magazine

Quantum computing has come a long way since its introduction in the 1980s. Researchers have always been on the lookout for better ways to enhance quantum computing systems, whether by making them cheaper or by making present quantum computers last longer. Alongside the latest advancements in superconducting qubits, a new way of improving silicon quantum computing has come to light: using silicon spin qubits for better communication.

Until now, communication between different qubits was relatively slow: messages had to be passed from one qubit to the next to reach another qubit a relatively large distance away.

Now, researchers at Princeton University have demonstrated two silicon quantum computing components, known as silicon spin qubits, interacting across a relatively large distance. The study was presented in the journal Nature on December 25, 2019.

Silicon spin qubits give quantum hardware the ability to interact and transmit messages across a certain distance, which provides the hardware with new capabilities. By transmitting signals over a distance, multiple quantum bits can be arranged in two-dimensional grids that can perform more complex calculations than existing quantum computing hardware can. This study will help qubits communicate better, not only on a chip but also from one chip to another, which will have a massive impact on speed.

These computers require as many qubits as possible, communicating effectively with each other, to take full advantage of quantum computing's capabilities. The quantum computers used by Google and IBM contain around 50 qubits and make use of superconducting circuits. Many researchers believe that silicon-based qubit chips are the future of quantum computing in the long run.

The quantum state of silicon spin qubits lasts longer than that of superconducting qubits, which is one of their significant advantages. In addition to lasting longer, silicon, which has many applications in everyday computers, is cheaper, another advantage over superconducting qubits, which cost a great deal of money: a single qubit costs around $10,000, and that's before research and development costs are considered. With these costs in mind, the hardware alone for a universal quantum computer would come to at least $10bn.
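Taken at face value, the two cost figures in the preceding paragraph imply a machine of roughly a million qubits (both numbers are the article's estimates, not official pricing):

```python
cost_per_qubit = 10_000            # USD, per-qubit figure quoted above
hardware_budget = 10_000_000_000   # USD, quoted floor for a universal machine

implied_qubits = hardware_budget // cost_per_qubit
print(f"{implied_qubits:,} qubits")   # 1,000,000 qubits
```

That back-of-the-envelope scale is one reason cheaper, mass-manufacturable silicon qubits are attractive.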

But silicon spin qubits have their own challenges, partly because they are incredibly small; and by small we mean made from a single electron. This is a huge obstacle when it comes to establishing interconnects between multiple qubits while building a large-scale computer.

To counter the problem of interconnecting these extremely small silicon spin qubits, the Princeton team connected them with a wire, similar to the fibre-optic cables that deliver internet to homes, which carries light. This wire carries a photon that picks up a message from one qubit and transmits it to the next. To put this in perspective: if the qubits communicate while placed half a centimetre apart, proportionally it is as if they were around 750 miles away from each other.

The next step for the study was to get the qubits and photons to communicate in the same language, by tuning both the qubits and the photon to the same frequency. Where previously the device's architecture allowed tuning only one qubit to one photon at a time, the team succeeded in tuning both qubits independently of each other while still coupling them to the photon.

"You have to balance the qubit energies on both sides of the chip with the photon energy to make all three elements talk to each other," says Felix Borjans, a graduate student and first author of the study, of what he describes as the challenging part of the work.

The researchers demonstrated entanglement of electron spins in silicon separated by distances larger than the device housing them. This is a significant development when it comes to wiring these qubits and laying them out in silicon-based quantum microchips.

The communication between distant silicon-based qubit devices builds on earlier work by the Petta research group: a 2010 paper showing how to trap a single electron in quantum wells; a 2012 paper in Nature on the transfer of quantum information from electron spins; a 2016 paper in Science demonstrating the ability to transmit information from a silicon-based charge qubit to a photon; a 2017 paper in Science on nearest-neighbour exchange of information between qubits; and a 2018 paper in Nature showing that a silicon spin qubit can exchange information with a photon.

This demonstration of interactions between two silicon spin qubits is essential for the further development of quantum technology, enabling technologies like modular quantum computers and quantum networks. The team employed silicon and germanium, which are widely available in the market.


Sameer is an aspiring Content Writer. Occasionally writes poems, loves food and is head over heels with Basketball.

More here:

How This Breakthrough Makes Silicon-Based Qubit Chips The Future of Quantum Computing - Analytics India Magazine