SerenityOS: Remarkable project with its own JS-capable web browser – The Register

SerenityOS, which started out as a one-man project in 2018, has now got to the point where its creator proudly announced that its web browser passes the Acid3 browser test.

This is a remarkable achievement for a relatively small, hobbyist project; SerenityOS lists hundreds of contributors on its GitHub page. Acid3 is relatively old now; it dates back to 2008. However, it was and is quite demanding, testing JavaScript, the Document Object Model, and more.

Its creator and lead developer, Andreas Kling, started the open-source project as a distraction while coping with substance withdrawal, a growing issue in the tech industry even before COVID-19. Its name is a tribute to the "Serenity Prayer" used in several 12-step programs. Last year, though, he was able to quit his job to work on it full time.

Kling is no rookie: he's worked at both Nokia and Apple, and as he puts it, "There's a pretty good chance you're reading this in a browser that's slightly faster because of me. ;)"

Writing a browser completely from scratch is a substantial task in itself, but supporting the DOM and JavaScript is astonishing. That's still beyond some niche browsers such as NetSurf or Dillo, and just getting a modern browser at all is a challenge for other niche OSes.

The SerenityOS desktop showing a browser window, a terminal, and a couple of demo apps

SerenityOS is not the only project of its kind: there are so many Unix-inspired hobby OSes out there that it's hard even to count them all. There are even multiple different lists of hobby OSes to choose from, and the OSDev community to help you craft your own. Many have got as far as a usable GUI.

What distinguishes SerenityOS is just how far it's got and how well it works. It's a blend of a simple, from-scratch, Unix-like kernel, plus a desktop that's reminiscent of NT 4 before the rot started to set in with Windows 98's Active Desktop.

No, it doesn't have any enterprise relevance. No, it's not ready to replace Linux. No, you can't download a ready-to-install ISO image; you have to compile it yourself, from source. In other words, it hasn't yet reached the significant milestone of self-hosting, an achievement that the Haiku project justly celebrated when it got there.

But if that doesn't put you off, we suggest giving it a try. After all, Linux was at a stage like this once, and it too was "Just For Fun".

Read the rest here:

SerenityOS: Remarkable project with its own JS-capable web browser - The Register

EU/US Say They’ve Agreed To A New Privacy Shield That Doesn’t Seem To Deal With Any Of The Problems Of The Old One – Techdirt

from the lipstick-on-a-dead-pig dept

Last week, the EU and the US announced something important that sounds pretty boring: a new privacy shield agreement. You should know it's important, because in the midst of dealing with everything else, including the Russian invasion of Ukraine, President Biden actually made a public statement with European Commission President Ursula von der Leyen to announce it (in a speech that also included talk about the Russia/Ukraine situation). Here was the key bit:

And I'm proud to announce that we've also reached another major breakthrough in transatlantic data flows. Privacy and security are key elements of my digital agenda. And today, we've agreed to unprecedented protections for data privacy and security for our citizens. This new agreement will enhance the Privacy Shield Framework; promote growth and innovation in Europe and the United States; and help companies, both small and large, compete in the digital economy. Just as we did when we resolved the Boeing-Airbus dispute and lifted the steel and aluminum tariffs, the United States and the EU are finding creative, new approaches to knit our economies and our people closer together, grounded on shared values. This framework underscores our shared commitment to privacy, to data protection, and to the rule of law. And it's going to allow the European Commission to once again authorize transatlantic data flows that help facilitate $7.1 trillion in economic relationships with the EU.

A little history if you don't follow this too closely. For years, the US and the EU had a privacy safe harbor setup, by which US internet companies were allowed to collect some data on EU users by agreeing to live up to certain standards. What this meant in practice was that every US internet company had to hire some random privacy auditor in the EU who would bless you with some sort of compliance statement. It was kind of a boondoggle (and, yes, we had to go through it ourselves).

Back in 2015, privacy advocate/perpetual thorn in the side of companies that collect data, Max Schrems, successfully challenged the legality of this agreement at the EU Court of Justice. What the EUCJ said in scrapping the privacy safe harbor agreement was that the NSA's PRISM program (exposed by Ed Snowden, and involving pressuring US internet companies to cough up information on users) violated the safe harbor.

Suddenly, it became unclear if US internet companies even could continue to collect data from EU users. There was a lot of scrambling, and in early 2016, the EU and the US announced a new privacy safe harbor, with the catchier name Privacy Shield. However, as we noted at the time, considering that the US refused to end the NSA's collection program under Section 702 of the FISA Amendments Act, it didn't seem possible that the new agreement would survive a challenge.

And, indeed, Schrems challenged the Privacy Shield again, and once again, in 2020, the EU courts rejected the Privacy Shield. In that decision, the court continued to call out NSA surveillance, including Executive Order 12333, which, as we've noted, is actually the main source of the NSA's foreign surveillance powers, and (according to some) not subject to Congressional review.

So, now, the US and the EU claim they've come up with a new Privacy Shield framework that will allow the data to flow freely across the Atlantic. But I don't see how that's possible. Because 12333 still exists. And, back in 2018, Congress renewed Section 702 of the FISA Amendments Act. So the two biggest reasons why the EUCJ has rejected these agreements, two giant NSA spying programs, still exist. I don't quite see how any new agreement is going to get around that without significantly modifying the NSA's surveillance program.

Schrems seems, let's say, skeptical.

We already had a purely political deal in 2015 that had no legal basis. From what you hear we could play the same game a third time now. The deal was apparently a symbol that von der Leyen wanted, but does not have support among experts in Brussels, as the US did not move. It is especially appalling that the US has allegedly used the war on Ukraine to push the EU on this economic matter.

The final text will need more time, once this arrives we will analyze it in depth, together with our US legal experts. If it is not in line with EU law, we or another group will likely challenge it. In the end, the Court of Justice will decide a third time. We expect this to be back at the Court within months from a final decision.

It is regrettable that the EU and US have not used this situation to come to a no spy agreement, with baseline guarantees among like-minded democracies. Customers and businesses face more years of legal uncertainty.

While US tech companies have been celebrating the deal, they really shouldn't bother. It's hard to see how this survives another round in court until the NSA has its wings clipped.

Filed Under: eo 12333, eu, executive order 12333, fisa amendments act, joe biden, max schrems, privacy, privacy safe harbor, privacy shield, section 702, surveillance, ursula von der leyen, us

Go here to read the rest:
EU/US Say They've Agreed To A New Privacy Shield That Doesn't Seem To Deal With Any Of The Problems Of The Old One - Techdirt

Marianne Williamson calls on Biden to drop efforts to extradite Assange – The Hill

Former Democratic presidential candidate Marianne Williamson told Hill.TV that the Biden administration should drop its efforts to extradite WikiLeaks founder Julian Assange from the United Kingdom.

Williamson said Assange should not be punished for releasing information on WikiLeaks that provided details on the U.S. war machine.

"What Assange revealed here was torture and rape and murder. What he revealed was up to 15,000 more civilian deaths than we had even known. This is about the U.S. war machine, about the fact that it is a very, very big business. It is very well funded. We are not supposed to question the funding and we are not supposed to question what they do," she said.

Williamson spoke to Hill.TV shortly after Sigurdur Thordarson, a key witness against Assange, admitted to falsifying claims against Assange to gain American immunity. Williamson argued that this new information would destroy the U.S. case against Assange.

"The U.S. government was willing to work with Thordarson to trump up these charges in order to bolster its case, in order to get the British government to let Assange come back," she said. "This is all in order for the United States government to continue its cover-up ... This is not really about Julian Assange."

Williamson has been a vocal supporter of Assange. This past week she tweeted that Assange is being treated so harshly for one reason only: to freeze disclosure and to freeze dissent.

Assange gained media attention after releasing classified documents from the Afghanistan and Iraq wars in 2010, followed by confidential emails during former Secretary of State Hillary Clinton's 2016 presidential campaign.

In 2019, Assange was charged with unlawfully obtaining and disclosing classified documents. He is being held in Belmarsh Prison in the United Kingdom. The U.S. government is attempting to extradite him.

In the Hill.TV interview, Williamson called the Assange case a constraint on journalistic freedom, saying: "They're making it all about Julian Assange in order to freeze any journalistic questioning, any journalistic pushback or challenge to the secretiveness of the U.S. government. That's why we have a free press."

Others have made a similar argument.

In February, the Freedom of the Press Foundation and other human rights organizations, such as the American Civil Liberties Union and Amnesty International USA, signed a letter to then-acting Attorney General Monty Wilkinson urging federal prosecutors to drop their indictment of Assange to protect freedom of the press.

Visit link:

Marianne Williamson calls on Biden to drop efforts to extradite Assange - The Hill

The bittersweet commitment by the wife of JULIAN ASSANGE this week on 60 MINUTES – TV Blackbox

JUST MARRIED

London in spring is a perfect place and time to get married. And so it was for Stella Moris on Wednesday. Like every bride's, her wedding was unforgettable, but it was also a ceremony unlike any other. Her husband is Australian WikiLeaks founder Julian Assange, which meant the happy couple swapped vows in England's toughest maximum-security jail, Belmarsh Prison. As their infant sons Gabriel and Max watched on, Stella and Julian promised to try to lead as normal a life as possible. But as Tara Brown reports, that's a bittersweet commitment, with the groom facing extradition to the United States and the prospect of 175 more years in jail if convicted of espionage.

Reporter: Tara Brown
Producers: Natalie Clancy, Naomi Shivaraman

INESCAPABLE

Accused and then dubiously convicted of spying, Australian academic Kylie Moore-Gilbert endured an ordeal, locked up in brutal Iranian prisons for more than two years, that would have broken most people. It wasn't just the torment of being in solitary confinement for most of that time; she also withstood relentless interrogations as well as other extreme psychological torture. But worst of all, the probable reason she remained imprisoned for so long is just bizarre. For the first time, she explains how one of her captors, a sleazy prison boss, fell for her. As Kylie tells Sarah Abo, it was a frightening attraction that plunged her into an inescapably dangerous love triangle.

Reporter: Sarah Abo
Producer: Garry McNab

See the rest here:

The bittersweet commitment by the wife of JULIAN ASSANGE this week on 60 MINUTES - TV Blackbox

Learn quantum computing: a field guide – IBM Quantum

Quantum theory is a revolutionary advancement in physics and chemistry that emerged in the early twentieth century. It is an elegant mathematical theory able to explain the counterintuitive behavior of subatomic particles, most notably the phenomenon of entanglement. In the late twentieth century it was discovered that quantum theory applies not only to atoms and molecules, but to bits and logic operations in a computer. This realization has brought about a revolution in the science and technology of information processing, making possible kinds of computing and communication hitherto unknown in the Information Age.

Our everyday computers perform calculations and process information using the standard (or classical) model of computation, which dates back to Turing and von Neumann. In this model, all information is reducible to bits, which can take the values of either 0 or 1. Additionally, all processing can be performed via simple logic gates (AND, OR, NOT, XOR, XNOR) acting on one or two bits at a time, or be entirely described by NAND (or NOR). At any point in its computation, a classical computer's state is entirely determined by the states of all its bits, so that a computer with n bits can exist in one of 2^n possible states, ranging from 00...0 to 11...1.
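To make that classical picture concrete, here is a minimal Python sketch (not part of the original guide): it enumerates the 2^n states of a small register and rebuilds the familiar gates from NAND alone, the universality property the paragraph alludes to.

```python
from itertools import product

# A classical n-bit register is always in exactly one of its 2**n states.
n = 3
states = ["".join(bits) for bits in product("01", repeat=n)]
print(len(states), states)        # 8 states: '000' ... '111'

# Every gate is a small truth table, and NAND alone is universal:
def NAND(a, b): return 1 - (a & b)
def NOT(a):     return NAND(a, a)
def AND(a, b):  return NOT(NAND(a, b))
def OR(a, b):   return NAND(NOT(a), NOT(b))
def XOR(a, b):  return OR(AND(a, NOT(b)), AND(NOT(a), b))

print([XOR(a, b) for a, b in product((0, 1), repeat=2)])   # [0, 1, 1, 0]
```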

The power of the quantum computer, meanwhile, lies in its much richer repertoire of states. A quantum computer also has bits, but instead of 0 and 1, its quantum bits, or qubits, can represent a 0, 1, or linear combination of both, which is a property known as superposition. This on its own is no special thing, since a computer whose bits can be intermediate between 0 and 1 is just an analog computer, scarcely more powerful than an ordinary digital computer. However, a quantum computer takes advantage of a special kind of superposition that allows for exponentially many logical states at once, all the states from |00...0⟩ to |11...1⟩. This is a powerful feat, and no classical computer can achieve it.
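Those 2^n amplitudes can be written down directly for a small register. The sketch below (a NumPy illustration, not an IBM Quantum API) builds a three-qubit state vector and applies a Hadamard to every qubit, producing an equal superposition of all eight bitstrings.

```python
import numpy as np

# An n-qubit state is a vector of 2**n complex amplitudes,
# one per classical bitstring |00...0> ... |11...1>.
n = 3
psi = np.zeros(2**n, dtype=complex)
psi[0] = 1.0                                    # start in |000>

# A Hadamard on every qubit yields an equal superposition of all 2**n bitstrings.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
U = H
for _ in range(n - 1):
    U = np.kron(U, H)
psi = U @ psi

print(np.round(psi, 3))                         # eight amplitudes, each 1/sqrt(8)
print(np.isclose(np.sum(np.abs(psi)**2), 1.0))  # True: the amplitudes stay normalized
```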

The vast majority of quantum superpositions, and the ones most useful for quantum computation, are entangled. Entangled states are states of the whole computer that do not correspond to any assignment of digital or analog states of the individual qubits. A quantum computer is therefore significantly more powerful than any one classical computer, whether it be deterministic, probabilistic, or analog.
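The claim that entangled states do not correspond to any assignment of states of the individual qubits can be checked numerically for two qubits: a state factors into per-qubit states exactly when its amplitudes, reshaped into a 2x2 matrix, have rank 1. A small sketch (ours, not from the guide):

```python
import numpy as np

# Product state |+>|0>: it factors into a state for each qubit.
plus = np.array([1, 1]) / np.sqrt(2)
zero = np.array([1, 0])
product_state = np.kron(plus, zero)

# Bell state (|00> + |11>)/sqrt(2): no per-qubit assignment reproduces it.
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)

def schmidt_rank(state):
    # Rank 1 after reshaping means "qubit A state (x) qubit B state";
    # rank 2 means the two-qubit state is entangled.
    return np.linalg.matrix_rank(state.reshape(2, 2))

print(schmidt_rank(product_state))   # 1 -> separable
print(schmidt_rank(bell))            # 2 -> entangled
```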

While today's quantum processors are modest in size, their complexity grows continuously. We believe this is the right time to build and engage a community of new quantum learners, spark further interest in those who are curious, and foster a quantum intuition in the greater community. By making quantum concepts more widely understood, even on a general level, we can more deeply explore all the possibilities quantum computing offers, and more rapidly bring its exciting power to a world whose perspective is limited by classical physics.

With this in mind, we created the IBM Quantum Composer to provide the hands-on opportunity to experiment with operations on a real quantum computing processor. This field guide contains a series of topics to accompany your journey as you create your own experiments, run them in simulation, and execute them on real quantum processors available via IBM Cloud.

If quantum physics sounds challenging to you, you are not alone. But if you think the difficulty lies in hard math, think again. Quantum concepts can, for the most part, be described by undergraduate-level linear algebra, so if you have ever taken a linear algebra course, the math will seem familiar.

The true challenge of quantum physics is internalizing ideas that are counterintuitive to our day-to-day experiences in the physical world, which of course are constrained by classical physics. To comprehend the quantum world, you must build a new intuition for a set of simple but very different (and often surprising) laws.

The counterintuitive principles of quantum physics are:

1. A physical system in a definite state can still behave randomly.

2. Two systems that are too far apart to influence each other can nevertheless behave in ways that, though individually random, are somehow strongly correlated.

Unfortunately, there is no single simple physical principle from which these conclusions follow, and we must guard against attempting to describe quantum concepts in classical terms! The best we can do is to distill quantum mechanics down to a few abstract-sounding mathematical laws, from which all the observed behavior of quantum particles (and qubits in a quantum computer) can be deduced and predicted.
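Those abstract laws are easy to turn into predicted statistics. The toy sampler below (an illustration, not from the guide; it merely reproduces the predicted probabilities and proves nothing about nature) measures both halves of a Bell pair repeatedly: each outcome is individually random, yet the two always agree, which is exactly the pair of counterintuitive principles listed above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Measure both qubits of the Bell state (|00> + |11>)/sqrt(2) many times.
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
probs = np.abs(bell) ** 2                       # P('00'), P('01'), P('10'), P('11')
shots = rng.choice(4, size=10_000, p=probs)
outcomes = [format(int(s), "02b") for s in shots]

a = np.array([int(o[0]) for o in outcomes])     # results on the first qubit
b = np.array([int(o[1]) for o in outcomes])     # results on the second qubit

print(a.mean(), b.mean())    # each ~0.5: individually random (principle 1)
print((a == b).mean())       # 1.0: perfectly correlated      (principle 2)
```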

Keep those two counterintuitive ideas in the back of your mind, let go of your beliefs about how the physical world works, and begin exploring the quantum world!

Originally posted here:
Learn quantum computing: a field guide - IBM Quantum

Researchers Have Achieved Sustained Long-Distance Quantum …


In a major breakthrough for the quest toward quantum internet, a technology that would revolutionize computing in myriad ways, a consortium of well-regarded institutions has announced the first demonstration of sustained, high-fidelity quantum teleportation over long distances.

Led by Caltech, a collaboration between Fermilab, AT&T, Harvard University, NASA's Jet Propulsion Laboratory, and the University of Calgary reports the successful teleportation of qubits, basic units of quantum information, across 22 kilometers of fiber in two testbeds: the Caltech Quantum Network and the Fermilab Quantum Network.

"The team has been working persistently and keeping our heads down in the past few years," said Maria Spiropulu, a particle physicist at Caltech who directs the INQNET research program and co-authored the new paper, in an email.

Though the collaboration knew it had achieved significant results by the spring of 2020, Spiropulu added, they refrained from sharing the news, even informally on social media, until the publication of the full study this week.

"We wanted to push the envelope for this type of research and take important steps on a path to realize both real-life applications for quantum communications and networks and test fundamental physics ideas," said Panagiotis Spentzouris, head of the Quantum Science Program at Fermilab, in an email.

"So, when we finally did it, the team was elated, very proud for achieving these high-quality, record-breaking results," he continued. "And we are very excited that we can move to the next phase, utilizing the know-how and the technologies from this work towards the deployment of quantum networks."

The researchers say their experiment used "off-the-shelf" equipment that is compatible with both existing telecommunications infrastructure and emerging quantum technologies. The results provide "a realistic foundation for a high-fidelity quantum Internet with practical devices," according to the study, released on Tuesday in the journal PRX Quantum.

Quantum teleportation does not involve the actual transfer of matter. Rather, quantum particles are entangled (dependent on each other, even over long distances) and somehow know the property of their other half. From our explainer earlier this year:

In a way, entangled particles behave as if they are aware of how the other particle is behaving. Quantum particles, at any point, are in a quantum state of probabilities, where properties like position, momentum, and spin of the particle are not precisely determined until there is some measurement. For entangled particles, the quantum state of each depends on the quantum state of the other; if one particle is measured and changes state, for example, the other particle's state will change accordingly.

The study aimed to teleport the state of qubits, or "quantum bits," which are the basic units of quantum computing. According to the study, the researchers set up what is basically a compact network with three nodes: Alice, Charlie, and Bob. In this experiment, Alice sends a qubit to Charlie. Bob has an entangled pair of qubits, and also sends one qubit to Charlie, where it interferes with Alice's qubit. Charlie projects Alice's qubit onto an entangled quantum Bell state that transfers the state of Alice's original qubit to Bob's remaining qubit.
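For readers who want to see the bookkeeping, here is a state-vector simulation of the textbook three-qubit teleportation protocol (a NumPy sketch; the helper names `op` and `cnot` are ours, and this is the standard protocol rather than the exact optical setup used in the Caltech and Fermilab testbeds). The unknown qubit's amplitudes reappear on the receiver's qubit after the two measured bits select a correction.

```python
import numpy as np

rng = np.random.default_rng(1)

# Single-qubit gates.
I2 = np.eye(2)
X  = np.array([[0, 1], [1, 0]])
Z  = np.array([[1, 0], [0, -1]])
H  = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

def op(gate, qubit, n=3):
    """Lift a single-qubit gate to an n-qubit operator (qubit 0 = leftmost)."""
    mats = [gate if q == qubit else I2 for q in range(n)]
    full = mats[0]
    for m in mats[1:]:
        full = np.kron(full, m)
    return full

def cnot(control, target, n=3):
    """CNOT built from projectors on the control qubit."""
    P0, P1 = np.diag([1, 0]), np.diag([0, 1])
    return op(P0, control, n) + op(P1, control, n) @ op(X, target, n)

# Qubit 0: the unknown state to teleport.  Qubits 1 and 2: a shared Bell pair.
alpha, beta = 0.6, 0.8
psi   = np.array([alpha, beta])
bell  = np.array([1, 0, 0, 1]) / np.sqrt(2)
state = np.kron(psi, bell)                       # 8 amplitudes

# Sender side: entangle qubit 0 with qubit 1, then rotate qubit 0.
state = cnot(0, 1) @ state
state = op(H, 0) @ state

# Measure qubits 0 and 1 (sample one outcome, collapse the state).
probs     = np.abs(state) ** 2
probs_01  = probs.reshape(4, 2).sum(axis=1)      # marginal over qubit 2
m         = rng.choice(4, p=probs_01)            # m encodes the two classical bits
m0, m1    = m >> 1, m & 1
collapsed = state.reshape(4, 2)[m]
collapsed = collapsed / np.linalg.norm(collapsed)

# Receiver side: apply X and/or Z depending on the two classical bits.
if m1: collapsed = X @ collapsed
if m0: collapsed = Z @ collapsed

print(np.round(collapsed, 6))   # matches [alpha, beta] up to a global phase
```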

The breakthrough is notable for a few reasons. Many previous demonstrations of quantum teleportation have proven to be unstable over long distances. For example, in 2016, researchers at the University of Calgary were able to perform quantum teleportation at a distance of six kilometers. This was the world record at the time and was seen as a major achievement.

The ultimate goal is to create quantum networks that would use entanglement and superposition to vastly increase computing speed, power, and security, relative to classical computers. For example, the U.S. Department of Energy has an ambitious plan to build a quantum network between its National Laboratories.

Any field that relies on computers would be affected by the realization of this technology, though much of the focus of the future potential of quantum networks revolves around cryptography, search algorithms, financial services, and quantum simulations that could model complex phenomena.

Quantum computing has been on the horizon for years, and this study takes us one step closer to realizing it on a practical scale. But don't expect to surf a quantum internet anytime soon.

"People on social media are asking if they should sign up for a quantum internet provider (jokingly, of course)," Spiropulu said. "We need (a lot) more R&D work."

Now that Fermilab, Caltech, and its partners have demonstrated this key step toward these networks, the team plans to further develop quantum information technology by building a metropolitan-scale network, called the Illinois Express Quantum Network, around Chicago.

"There are many fronts that we need to push forward," said Spentzouris, "both in applications of quantum communication and network technologies and in advancing the engineering of the systems. We are already working hard on developing architecture, processes, and protocols for quantum networks and on optimizing along some metrics including rate of communications and range."

Here is the original post:
Researchers Have Achieved Sustained Long-Distance Quantum ...

Google AI Blog: Quantum Supremacy Using a Programmable …

This result is the first experimental challenge against the extended Church-Turing thesis, which states that classical computers can efficiently implement any reasonable model of computation. With the first quantum computation that cannot reasonably be emulated on a classical computer, we have opened up a new realm of computing to be explored.

Today we published the results of this quantum supremacy experiment in the Nature article, "Quantum Supremacy Using a Programmable Superconducting Processor." We developed a new 54-qubit processor, named Sycamore, that is comprised of fast, high-fidelity quantum logic gates, in order to perform the benchmark testing. Our machine performed the target computation in 200 seconds, and from measurements in our experiment we determined that it would take the world's fastest supercomputer 10,000 years to produce a similar output.

Each run of a random quantum circuit on a quantum computer produces a bitstring, for example 0000101. Owing to quantum interference, some bitstrings are much more likely to occur than others when we repeat the experiment many times. However, finding the most likely bitstrings for a random quantum circuit on a classical computer becomes exponentially more difficult as the number of qubits (width) and number of gate cycles (depth) grow.
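As a rough illustration of why that gets hard (a toy sketch with a simplified gate set, not Google's), the brute-force simulator below stores all 2^n amplitudes, samples bitstrings, and shows the interference-skewed distribution. At 5 qubits the state vector has 32 entries; at 54 qubits it would need roughly 1.8 x 10^16, which is the heart of the supremacy argument.

```python
import numpy as np

rng = np.random.default_rng(2)

def random_single_qubit_layer(n):
    """One layer of random single-qubit unitaries (a toy stand-in for Sycamore's gates)."""
    U = np.eye(1)
    for _ in range(n):
        g, _ = np.linalg.qr(rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2)))
        U = np.kron(U, g)
    return U

def cz_layer(n, offset):
    """Controlled-Z on neighboring pairs, shifted by `offset` each cycle."""
    mats = [np.eye(2)] * offset
    q = offset
    while q + 1 < n:
        mats.append(np.diag([1.0, 1.0, 1.0, -1.0]))
        q += 2
    if q < n:
        mats.append(np.eye(2))
    U = mats[0]
    for m in mats[1:]:
        U = np.kron(U, m)
    return U

def sample_random_circuit(n, depth, shots=8):
    # Brute force: the state vector holds 2**n amplitudes, which is exactly
    # why classical simulation cost explodes with width and depth.
    state = np.zeros(2 ** n, dtype=complex)
    state[0] = 1.0
    for d in range(depth):
        state = cz_layer(n, d % 2) @ random_single_qubit_layer(n) @ state
    probs = np.abs(state) ** 2
    return probs, rng.choice(2 ** n, size=shots, p=probs)

probs, samples = sample_random_circuit(n=5, depth=6)
print([format(int(s), "05b") for s in samples])      # bitstrings like '00101'
print(round(probs.max() / probs.mean(), 2))          # interference skews the distribution
```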

The Sycamore Processor
The quantum supremacy experiment was run on a fully programmable 54-qubit processor named Sycamore. It's comprised of a two-dimensional grid where each qubit is connected to four other qubits. As a consequence, the chip has enough connectivity that the qubit states quickly interact throughout the entire processor, making the overall state impossible to emulate efficiently with a classical computer.

The success of the quantum supremacy experiment was due to our improved two-qubit gates with enhanced parallelism that reliably achieve record performance, even when operating many gates simultaneously. We achieved this performance using a new type of control knob that is able to turn off interactions between neighboring qubits. This greatly reduces the errors in such a multi-connected qubit system. We made further performance gains by optimizing the chip design to lower crosstalk, and by developing new control calibrations that avoid qubit defects.

We designed the circuit in a two-dimensional square grid, with each qubit connected to four other qubits. This architecture is also forward compatible for the implementation of quantum error-correction. We see our 54-qubit Sycamore processor as the first in a series of ever more powerful quantum processors.
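A sketch of what "each qubit connected to four other qubits" means as a data structure (a toy coupling map for a plain rectangular grid; the real Sycamore layout differs in detail):

```python
# Build couplers for a rows x cols grid of qubits: every interior qubit
# ends up linked to exactly four neighbors.
def grid_coupling_map(rows, cols):
    edges = []
    for r in range(rows):
        for c in range(cols):
            q = r * cols + c
            if c + 1 < cols:
                edges.append((q, q + 1))       # neighbor to the right
            if r + 1 < rows:
                edges.append((q, q + cols))    # neighbor below
    return edges

edges = grid_coupling_map(6, 9)                # 54 qubits, Sycamore-sized
degree = {}
for a, b in edges:
    degree[a] = degree.get(a, 0) + 1
    degree[b] = degree.get(b, 0) + 1
print(len(edges), max(degree.values()))        # 93 couplers, interior degree 4
```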

Applications
The Sycamore quantum computer is fully programmable and can run general-purpose quantum algorithms. Since achieving quantum supremacy results last spring, our team has already been working on near-term applications, including quantum physics simulation and quantum chemistry, as well as new applications in generative machine learning, among other areas.

We also now have the first widely useful quantum algorithm for computer science applications: certifiable quantum randomness. Randomness is an important resource in computer science, and quantum randomness is the gold standard, especially if the numbers can be self-checked (certified) to come from a quantum computer. Testing of this algorithm is ongoing, and in the coming months we plan to implement it in a prototype that can provide certifiable random numbers.

What's Next?
Our team has two main objectives going forward, both towards finding valuable applications in quantum computing. First, in the future we will make our supremacy-class processors available to collaborators and academic researchers, as well as companies that are interested in developing algorithms and searching for applications for today's NISQ processors. Creative researchers are the most important resource for innovation; now that we have a new computational resource, we hope more researchers will enter the field motivated by trying to invent something useful.

Second, we're investing in our team and technology to build a fault-tolerant quantum computer as quickly as possible. Such a device promises a number of valuable applications. For example, we can envision quantum computing helping to design new materials: lightweight batteries for cars and airplanes, new catalysts that can produce fertilizer more efficiently (a process that today produces over 2% of the world's carbon emissions), and more effective medicines. Achieving the necessary computational capabilities will still require years of hard engineering and scientific work. But we see a path clearly now, and we're eager to move ahead.

Acknowledgements
We'd like to thank our collaborators and contributors: University of California Santa Barbara, NASA Ames Research Center, Oak Ridge National Laboratory, Forschungszentrum Jülich, and many others who helped along the way.

Read the rest here:
Google AI Blog: Quantum Supremacy Using a Programmable ...

Quantum computing has a hype problem – MIT Technology Review

The qubit systems we have today are a tremendous scientific achievement, but they take us no closer to having a quantum computer that can solve a problem that anybody cares about. It is akin to trying to make today's best smartphones using vacuum tubes from the early 1900s. You can put 100 tubes together and establish the principle that if you could somehow get 10 billion of them to work together in a coherent, seamless manner, you could achieve all kinds of miracles. What, however, is missing is the breakthrough of integrated circuits and CPUs leading to smartphones; it took 60 years of very difficult engineering to go from the invention of transistors to the smartphone, with no new physics involved in the process.

There are in fact ideas, and I played some role in developing the theories for these ideas, for bypassing quantum error correction by using far-more-stable qubits, in an approach called topological quantum computing. Microsoft is working on this approach. But it turns out that developing topological quantum-computing hardware is also a huge challenge. It is unclear whether extensive quantum error correction or topological quantum computing (or something else, like a hybrid between the two) will be the eventual winner.

Physicists are smart, as we all know (disclosure: I am a physicist), and some physicists are also very good at coming up with substantive-sounding acronyms that stick. The great difficulty in getting rid of decoherence has led to the impressive acronym NISQ, for "noisy intermediate-scale quantum" computer, capturing the idea that small collections of noisy physical qubits could do something useful and better than a classical computer can. I am not sure what this object is: How noisy? How many qubits? Why is this a computer? What worthy problems can such a NISQ machine solve?

A recent laboratory experiment at Google has observed some predicted aspects of quantum dynamics (dubbed time crystals) using 20 noisy superconducting qubits. The experiment was an impressive showcase of electronic control techniques, but it showed no computing advantage over conventional computers, which can readily simulate time crystals with a similar number of virtual qubits. It also did not reveal anything about the fundamental physics of time crystals. Other NISQ triumphs are recent experiments simulating random quantum circuits, again a highly specialized task of no commercial value whatsoever.

Using NISQ is surely an excellent new fundamental research idea; it could help physics research in fundamental areas such as quantum dynamics. But despite a constant drumbeat of NISQ hype coming from various quantum computing startups, the commercialization potential is far from clear. I have seen vague claims about how NISQ could be used for fast optimization or even for AI training. I am no expert in optimization or AI, but I have asked the experts, and they are equally mystified. I have asked researchers involved in various startups how NISQ would optimize any hard task involving real-world applications, and I interpret their convoluted answers as basically saying that since we do not quite understand how classical machine learning and AI really work, it is possible that NISQ could do this even faster. Maybe, but this is hoping for the best, not technology.

There are proposals to use small-scale quantum computers for drug design, as a way to quickly calculate molecular structure, which is a baffling application given that quantum chemistry is a minuscule part of the whole process. Equally perplexing are claims that near-term quantum computers will help in finance. No technical papers convincingly demonstrate that small quantum computers, let alone NISQ machines, can lead to significant optimization in algorithmic trading or risk evaluation or arbitrage or hedging or targeting and prediction or asset trading or risk profiling. This however has not prevented several investment banks from jumping on the quantum-computing bandwagon.

A real quantum computer will have applications unimaginable today, just as when the first transistor was made in 1947, nobody could foresee how it would ultimately lead to smartphones and laptops. I am all for hope and am a big believer in quantum computing as a potentially disruptive technology, but to claim that it would start producing millions of dollars of profit for real companies selling services or products in the near future is very perplexing to me. How?

Quantum computing is indeed one of the most important developments not only in physics, but in all of science. But entanglement and superposition are not magic wands that we can shake and expect to transform technology in the near future. Quantum mechanics is indeed weird and counterintuitive, but that by itself does not guarantee revenue and profit.

A decade and more ago, I was often asked when I thought a real quantum computer would be built. (It is interesting that I no longer face this question, as quantum-computing hype has apparently convinced people that these systems already exist or are just around the corner.) My unequivocal answer was always that I do not know. Predicting the future of technology is impossible; it happens when it happens. One might try to draw an analogy with the past. It took the aviation industry more than 60 years to go from the Wright brothers to jumbo jets carrying hundreds of passengers thousands of miles. The immediate question is where quantum computing development, as it stands today, should be placed on that timeline. Is it with the Wright brothers in 1903? The first jet planes around 1940? Or maybe we're still way back in the early 16th century, with Leonardo da Vinci's flying machine? I do not know. Neither does anybody else.

Sankar Das Sarma is the director of the Condensed Matter Theory Center at the University of Maryland, College Park.

See more here:
Quantum computing has a hype problem - MIT Technology Review

Q-CTRL Partners with The Paul Scherrer Institute to Support the Scale-Up of Quantum Computers – HPCwire

SYDNEY, March 30, 2022 -- Q-CTRL, a global leader in developing useful quantum technologies, today announced a partnership with The Paul Scherrer Institute (PSI), Switzerland's largest research institute for natural and engineering sciences, to pioneer R&D in the scale-up of quantum computers. The strategic partnership will leverage Q-CTRL and PSI's combined expertise to deliver transformational capabilities to the broader research community.

This partnership builds on the collaboration between PSI and ETH Zurich, one of the world's premier public research universities and a quantum science powerhouse, which together formed the ETH Zurich-PSI Quantum Computing Hub in May 2021 on PSI's campus in Villigen. Both are working to translate groundbreaking quantum computing research into building systems at scale. They've now partnered with Q-CTRL to provide the critical infrastructure software tools for system characterization, AI-based automation, and hardware optimization that are essential for large-scale quantum computing to become reality.

"Q-CTRL's focus on solving the automation and performance challenges in large-scale quantum computing aligns perfectly with the PSI Quantum Computing Hub's mission," said Q-CTRL Founder and CEO Professor Michael J. Biercuk. "We're honored to partner with the exceptional engineers and researchers at PSI to combine their system engineering prowess with infrastructure software to truly move the research field forward."

As PSI seeks to scale up quantum hardware, Q-CTRL's unique expertise in quantum control and AI-based automation makes the company a natural fit to help accelerate the pathway to the first useful quantum computers. Both teams have extensive experience in quantum computing based on trapped ions, including specialized approaches in error correction leveraging the unique properties of trapped ions. Together, PSI and Q-CTRL will aim to solve the critical challenges enabling large-scale, quantum-error-corrected quantum computing to become a reality.

"Q-CTRL's hardware-agnostic, yet hardware-aware tools will be very valuable in finding optimal control solutions that ensure uniform performance across larger qubit arrays," said Dr. Cornelius Hempel, Group head, Ion Trap Quantum Computing, Paul Scherrer Institute. "As we go to larger and larger machines and continuous operation of testbeds, efficient and automated tuneup and calibration procedures become an essential aspect of day-to-day operations; it's just not possible to continue using brute-force approaches at scale. Our team is very excited to leverage the tools the Q-CTRL team has developed in this space."

The computational power of quantum computing is expected to deliver transformational capabilities in applications ranging from drug discovery and enterprise logistics to finance. However, the underlying hardware is extremely unstable and fragile, preventing these machines from reaching their full potential. Q-CTRL is focused on delivering hardware-agnostic and fully automated error-suppressing enterprise software that will enable useful quantum computing for organizations around the world. Its team was recently awarded a US SBIR grant from the Department of Energy focused on quantum computer automation, and this partnership will build on those research developments.

To learn more about Q-CTRL, please visit: q-ctrl.com.

About Q-CTRL

Q-CTRL is building the quantum technology industry by overcoming the fundamental challenge in the field: hardware error and instability. Q-CTRL's quantum control infrastructure software for R&D professionals and quantum computing end users delivers the highest-performance error-correcting and suppressing techniques globally, and provides a unique capability accelerating the pathway to the first useful quantum computers. This foundational technology also applies to a new generation of quantum sensors, and enables Q-CTRL to shape and underpin every application of quantum technology.

Q-CTRL has assembled the world's foremost team of expert quantum-control engineers, providing solutions to many of the most advanced quantum computing and sensing teams globally. Q-CTRL has been an inaugural member of the IBM Quantum Startup network since 2018, and recently announced a partnership with Transport for NSW, delivering its enterprise infrastructure software to transport data scientists exploring quantum computing. Q-CTRL is funded by SquarePeg Capital, Sierra Ventures, Sequoia Capital China, Data Collective, Horizons Ventures, Main Sequence Ventures, In-Q-Tel, Airbus Ventures, and Ridgeline Partners. The company has international headquarters in Sydney, Los Angeles, and Berlin.

About PSI

The Paul Scherrer Institute (PSI) is the largest research institute for natural and engineering sciences in Switzerland, conducting cutting-edge research in three main fields: matter and materials, energy and the environment, and human health. PSI develops, builds and operates complex large research facilities such as the synchrotron Swiss Light Source (SLS), the free-electron X-ray laser SwissFEL and the SINQ neutron source. PSI employs 2,100 people and is primarily financed by the Swiss Confederation. The institution provides access to its large research facilities via a User Service to researchers from universities, other research centers and industry.

Source: Q-CTRL

View post:
Q-CTRL Partners with The Paul Scherrer Institute to Support the Scale-Up of Quantum Computers - HPCwire

D-Wave’s cross platform quantum services are bridge to the future – The Next Web

While I'm convinced 2011 will ultimately go down in history as the year the groundbreaking motion picture Cowboys & Aliens was released, it bears mentioning that it was also the year in which the first commercial quantum computer officially went online.

You can dispute whether Daniel Craig's turn as an alien-fighting gold thief with amnesia is worthy of such high praise, but there's no debating that D-Wave's a bona fide pioneer in the world of quantum computing.

Dubbed the D-Wave One (two years before the Xbox One gaming system came out), the company's first production model was a quantum annealing system designed to attack optimization problems.

Over a decade later, the company is working on the Advantage Two. Not counting prototypes, it'll be the outfit's sixth major quantum computing system.

Advantage Two will be a quantum-annealing system featuring a whopping 7,000 functioning qubits.

For folks who've followed quantum computing news, that 7,000 functioning qubits figure might look like a typo. The largest gate-based model we're aware of is QuEra's 256-qubit neutral atom system.

But D-Waves system uses a different technology.

As Rebel Brown, a marketer whose blog I found at random, explains quite eloquently:

One way to understand the difference between the two types of quantum computer is that the gate model quantum computers require problems to be expressed in terms of quantum gates, and the quantum annealing computer requires problems to be expressed in the language of operations research problems.

But don't just take Brown's word for it. The two kinds of quantum computers are as different as night and day. Where gate-based models are still more research than function, D-Wave's annealing systems have been solving problems for decades.
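To make "the language of operations research problems" concrete, here is a toy formulation of the kind an annealer accepts (an illustrative example, brute-forced in plain Python; D-Wave's actual Ocean tooling is not shown): balancing job durations across two machines, encoded as a quadratic function of binary variables.

```python
from itertools import product

# Toy scheduling problem: assign each job to machine A (x_i = 1) or
# machine B (x_i = 0) so the two loads are as balanced as possible.
# The objective (sum_i d_i * (2*x_i - 1))**2 is quadratic in the binary
# variables: the QUBO/Ising form annealers natively minimize.
durations = [4, 7, 1, 9, 3, 6]

def imbalance(assignment):
    signed = sum(d * (2 * x - 1) for d, x in zip(durations, assignment))
    return signed * signed

# Brute force stands in for the annealer, which searches the same
# energy landscape in hardware.
best = min(product((0, 1), repeat=len(durations)), key=imbalance)
machine_a = [d for d, x in zip(durations, best) if x == 1]
machine_b = [d for d, x in zip(durations, best) if x == 0]
print(machine_a, machine_b, imbalance(best))   # a perfect 15 vs 15 split exists here
```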

As Murray Thom, VP of product management for D-Wave, put it in a recent interview with Neural:

Our focus is 100% on commercial use-cases and bringing value to our customers.

And that means using quantum computers to provide solutions right now. Quantum annealing does that because, as Thom told us, it's really the only way to approach optimization problems.

However, there are more than just optimization problems out there that need solving. Advantage Two should be able to, for example, help medical facilities optimize nurse and physician schedules across massive geographic areas during disasters and outbreaks.

But it won't be as useful as a gate-based quantum computer when it comes to running quantum simulations for challenging problems such as drug discovery.

Ideally, you'd be able to use both. But gate-based systems are experimental at best. Until recently, with the launch of its Clarity Roadmap, D-Wave's been content to be a quantum-annealing company in the streets and a cutting-edge research org in the lab.

That all changed last year when D-Wave unveiled its ambitions to combine gate-based technologies with annealing systems using cloud-based portals and tailored software solutions.

Thom told us that D-Wave is convinced that the time is now. Not just for its own stockholders (the company's in the process of going public) but for the entire industry.

According to Thom:

From 2017-2018 to now there has been this explosion in quantum computing tools and getting people access to them. This next phase is going to be the rapid expansion point.

The quantum computing market is expected to triple in the next three years. While there's certainly room for everyone, not all market shares are created equal.

D-Wave's already secured its position as the front-runner in quantum optimization solutions. The addition of gate-based systems through separate or integrated stacks could potentially provide its customers with the world's only one-stop shop for spooky-action-at-a-distance-as-a-service.

Neural's take: It'll be interesting to see if D-Wave's ambitions and experience can overcome Google's hunger and bankroll or IBM's sheer tenacity when it comes to pressing an advantage in the field.

At the end of the day, a rising tide lifts all vessels. We're probably further away from quantum computing companies competing for clients than we are from useful gate-based systems. For now at least, there's plenty of quantum problems to go around.

Read this article:
D-Wave's cross platform quantum services are bridge to the future - The Next Web