The Prometheus League
Breaking News and Updates
Daily Archives: April 24, 2020
Chess greats face off online, webcams, arbiters to watch moves – The Indian Express
Posted: April 24, 2020 at 2:44 pm
Written by Shivani Naik, Sandip G | Mumbai, New Delhi | Updated: April 24, 2020 5:46:34 pm. Viswanathan Anand, Garry Kasparov and Vladimir Kramnik are set to play in the Online Nations Cup.
A well-lit room, smartly placed roving webcams, a screen-share setting on Skype, an arbiter sitting remotely while accessing the player's computer and alert to any out-of-place ambient sound, and oodles of trust in the game's top respected names: those are the logistics of the Online Nations Cup in chess, perhaps the highest-profile sporting action that'll take place between May 5-10, at multiple venues coinciding with chess's famous residential addresses.
A set of arbiters will also monitor every move (speed of reactions and patterns), vetting them on anti-cheating software to look for engine aids.
Six teams (Russia, the USA, China, India, the best of Europe and the Rest of the World) will go up against each other in a double round robin played in a rapid 25-minute team format, as the cerebral sport mainstreams playing arenas hitherto frequented by amateurs but now headlined by the pros: online chess rooms, entered from home. This was necessitated by a world cornered into restrictive lockdowns forced by the Covid-19 pandemic.
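As a quick sanity check on the format, a double round robin among six teams means every pair meets twice, once with each colour assignment. A short sketch (team labels paraphrased from the report) confirms the fixture count:

```python
from itertools import combinations

# Teams as named in the report; the pairing logic is the standard
# double round robin: every pair meets twice, with sides reversed.
teams = ["Russia", "USA", "China", "India", "Europe", "Rest of the World"]

first_leg = list(combinations(teams, 2))        # each pair once
second_leg = [(b, a) for (a, b) in first_leg]   # return matches
matches = first_leg + second_leg

print(len(matches))  # 2 * C(6, 2) = 30 team matches
```

So the event works out to 30 team matches in total, each of which is itself four individual board games.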
While players and fans are happy that at least chess can stay afloat while global travel is at a standstill and all sporting action has paused, the biggest test for the organisers and those overseeing the competition, fashioned to be like chess's Ryder Cup, will be to ensure that the online setting leaves no doubts about fair play and that cheating is completely ruled out.
"It is an elite event. Only the best players in the world. They would never risk their reputation," says David Llada of FIDE, while adding that the world body will do everything technology permits "to guarantee that conditions for the event cannot be corrupted by anyone participating." With names like Garry Kasparov, Vladimir Kramnik and India's Viswanathan Anand set to be involved in playing and guiding capacities, FIDE is banking on reputations to ensure nothing silly is indulged in.
However, the online setting will keep everyone on their toes, given the sheer scope for manipulation. "The players will be playing from home. They are required to have a couple of webcams, so the arbiters can see their room and their computer screen. These will be rapid games, so there are no toilet breaks, which would pose a challenge. Chess.com also has some anti-cheating systems that are able to track, with a high degree of reliability, when a player is making too many moves that coincide with the engine's recommendations. All things combined, we can guarantee the conditions for the event to be considered safe," Llada adds.
Online chess is not a novelty, and FIDE estimates 16 million games have been played online since the lockdown began. But this will be the biggest event to take place on an online platform. Prof Anantharam, one of India's leading arbiters, who is also on the FIDE panel, says that backroom work for arbiters will now include tracking every move and tallying it with engine databases in forensic ways.
"At this high level, players won't take risks. But as a governing body, we have to take all precautions. There are a couple of checking tools: a fast one, which flags a move for the fair play team within seconds if the software suggests the player's analysis could be engine-aided, and then detailed checks monitoring the next moves in a match. No one is naive enough to think cheating doesn't happen in online chess."
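The "fast check" Anantharam describes boils down to one core signal: how often a player's moves coincide with an engine's first choice. A minimal sketch of that idea (the move lists and the flag threshold are invented for illustration; real detection systems also model player strength, position complexity and move timing):

```python
def engine_match_rate(player_moves, engine_best_moves):
    """Fraction of a player's moves that coincide with the engine's
    top choice in the same positions."""
    if not player_moves:
        return 0.0
    hits = sum(p == e for p, e in zip(player_moves, engine_best_moves))
    return hits / len(player_moves)

# Hypothetical five-move fragment, in algebraic notation.
player = ["e4", "Nf3", "Bb5", "O-O", "Re1"]
engine = ["e4", "Nf3", "Bc4", "O-O", "d4"]

rate = engine_match_rate(player, engine)  # 3 of 5 moves match
flagged = rate > 0.9                      # illustrative flag threshold
```

A sustained match rate near 100% over many non-forced moves is what gets a game escalated to a fair play team for the slower, detailed checks.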
"Just that technology catches it," he says, adding that cash prizes of up to Rs 10,000 were withdrawn after some lower-rung amateurs were found cheating. Both chess.com and the other popular website, lichess.org, have developed anti-cheating provisions now.
Magnus Carlsen, the biggest name in chess, is currently hosting another online tournament, and besides the livestreams and live commentary, the emphasis is on 360-degree monitoring of the rooms via webcams. The sport is not immune to mischief: in face-to-face chess, an IM once used the pretext of a weak bladder to take help from his phone, hidden in a dustbin, to win an Abu Dhabi tournament. The professor, one of the sharpest arbiters in the country, once caught a player in Kochi using some kind of transmissions through a Bluetooth device.
"It'll be best if cameras used for the FIDE event have audio feeds. As an arbiter of online games, one would need to observe players' movements to see if they are checking tools, as well as their computer screens, to make it foolproof," he says. While reiterating that cheating is unlikely at this level, Prof Anantharam maintains that arbiters will need to be on their toes nevertheless. "Someone holding a placard but not in the camera's view is not a stunt anyone will try, but after a GM recently found intrepid new ways of cheating, FIDE won't take any chances. One person will need to keep watching the player," he says.
GM Dibyendu Barua, while stressing that no top players will attempt cheating, adds that safety measures are important so that no one can blame anyone later.
"Keeping the webcam on for the whole duration, a well-lit room and supervision will happen of course," he says, adding that he's heard of mobile devices being used for cheating at lower rungs. "It'll be exciting to watch mixed teams (one woman out of four players is mandatory), and besides watching strong chess countries like China, interest will be high in certain matchups. Just seeing Anand match wits with Kasparov again for the younger generation, or Kasparov captaining against the young Iranian sensation Alireza. Mischief won't be on top of anyone's mind, but we don't want exceptions," he says.
Chennai-based RB Ramesh, a former player and current coach, says that the shorter time controls (25-minute rapid) don't lend themselves to any hanky-panky. "With technology, cheating can be curtailed in blitz games. But if it's longer formats with 300-400 playing, it's tough to monitor. But it's why online tournaments are not rated officially, and FIDE has software like the plagiarism tracker, so to some extent you can mitigate cheating," he says.
Having captained a FIDE age-group team once, he insists that both video and audio feeds need to be live, to erase suspicion of someone standing off camera and suggesting moves. "One can't obviously say definitively that cheating doesn't happen online, just that the chances here are very low," he says.
While FIDE has reacted quickly to the dramatically altered sporting scene globally, concerns of fair play remain at the back of their minds. "50 years ago there was this big USSR vs Rest of the World match that was really massive. Now, the world is different: India, China, they are claiming their place not only on the chess board, but also as superpowers. India is rubbing shoulders with the US, Russia and Europe," Llada says, adding that top players were all a bit worried about having all the over-the-board tournaments cancelled. "But they were relieved to see that chess organizers, including FIDE, reacted very quickly and are now holding all top-level competitions online."
The last words of wisdom are reserved for Viswanathan Anand, who stresses that due caution will need to be taken. "Logistics is easier, but they should do some testing. Like some anti-cheating measures, because you are after all playing at home."
"The idea is you will share your screen with arbiters. In Skype you have a setting, screen-share; you can do that. The arbiter will have access to your computer and know what you are doing. He knows what's happening in the room. The best would be if we didn't have a lockdown; you could arrange for a chess player to go to everyone's house as an arbiter and make sure that nothing is going on. But right now I think every once in a while they will look around the room, and something like that," he explains.
The multiple-time world champion from India offers an insight into the future, saying online chess simplifies a lot of things for the organisers: travel, venue and other minor practical aspects, while hoping online is not forced into becoming the norm. "It's turned out very useful in this situation. Having said that, the whole world hopes that we are not living with Corona for the next 10 years. Hope it would not become necessary to play online all the time. For the moment, it keeps us going."
While stressing that online chess has been around for ages and the world needn't act as if it's a new thing invented now, he reckons trust in fellow players is the key.
"I think it's a question of trust. I think chess players trust each other. If someone is caught, it's the end of his career, so everyone will hope it won't happen. If they are obsessed with checking every single thing, it will get unpleasant very fast," he says of the mild irritants that'll be so crucial to maintaining fair play in the taut-nerved sport.
Superintelligence: Nick Bostrom, Napoleon Ryan …
Posted: at 2:43 pm
"I highly recommend this book" --Bill Gates"Terribly important ... groundbreaking... extraordinary sagacity and clarity, enabling him to combine his wide-ranging knowledge over an impressively broad spectrum of disciplines - engineering, natural sciences, medicine, social sciences and philosophy - into a comprehensible whole... If this book gets the reception that it deserves, it may turn out the most important alarm bell since Rachel Carson's Silent Spring from 1962, or ever."--Olle Haggstrom, Professor of Mathematical Statistics"Nick Bostrom's excellent book "Superintelligence" is the best thing I've seen on this topic. It is well worth a read." --Sam Altman, President of Y Combinator and Co-Chairman of OpenAI"Worth reading.... We need to be super careful with AI. Potentially more dangerous than nukes"--Elon Musk, Founder of SpaceX and Tesla"Nick Bostrom makes a persuasive case that the future impact of AI is perhaps the most important issue the human race has ever faced. Instead of passively drifting, we need to steer a course.Superintelligencecharts the submerged rocks of the future with unprecedented detail. It marks the beginning of a new era."--Stuart Russell, Professor of Computer Science, University of California, Berkley"This superb analysis by one of the world's clearest thinkers tackles one of humanity's greatest challenges: if future superhuman artificial intelligence becomes the biggest event in human history, then how can we ensure that it doesn't become the last?"--Professor Max Tegmark, MIT"Valuable. The implications of introducing a second intelligent species onto Earth are far-reaching enough to deserve hard thinking" --The Economist"There is no doubting the force of [Bostrom's] arguments...the problem is a research challenge worthy of the next generation's best mathematical talent. Human civilisation is at stake." 
--Clive Cookson,Financial Times"Those disposed to dismiss an 'AI takeover' as science fiction may think again after reading this original and well-argued book."--Martin Rees, Past President, Royal Society
"Every intelligent person should read it." --Nils Nilsson, Artificial Intelligence Pioneer, Stanford University
Bostrom is a recipient of the Eugene R. Gannon Award and has been listed on Foreign Policy's Top 100 Global Thinkers list twice. He was included on Prospect's World Thinkers list as the youngest person in the top 15. His writings have been translated into 28 languages, and there have been more than 100 translations and reprints of his works. He is a repeat TED speaker and has done more than 2,000 interviews with television, radio, and print media. As a graduate student he dabbled in stand-up comedy on the London circuit, but he has since reconnected with the doom and gloom of his Swedish roots.
Netflix Wins Rights to Ted Melfis The Starling With Melissa McCarthy and Kevin Kline – TheWrap
Posted: at 2:43 pm
Netflix has won the global rights in an auction to The Starling, a dramedy directed by Ted Melfi that stars Melissa McCarthy, Kevin Kline, Timothy Olyphant, Chris O'Dowd and Daveed Diggs, the streamer announced Monday.
The Starling is described as a heartwarming and comic story about a woman who suffers through a hardship and then becomes obsessed with trying to kill a small starling bird nested in her backyard that harasses and attacks her, something that now represents all her problems. In the process, she forms an unlikely friendship with a quirky psychologist turned veterinarian with his own troubled past.
Melfi is directing the film from a script by Matt Harris, and he'll also produce alongside Kimberly Quinn and Limelight's Dylan Sellers and Chris Parker. The Starling comes from Limelight, Entertainment One (eOne) and Boies Schiller Entertainment. The executive producers are Boies Schiller Entertainment's Zack Schiller and David Boies, and eOne's Jen Gorton and Zev Foreman.
Also co-starring in the film are Skyler Gisondo, Loretta Devine, Laura Harrier, Rosalind Chao and Kimberly Quinn. The Starling is currently in post-production.
McCarthy, who starred in Melfi's St. Vincent back in 2014, has made a dramatic turn in recent years after work in Can You Ever Forgive Me? and The Kitchen. She'll also next appear in the live-action remake of Disney's The Little Mermaid.
The Starling is Melfi's first feature since 2016's Best Picture-nominated Hidden Figures.
CAA Media Finance and UTA Independent negotiated the deal with Netflix.
Deadline first reported news of the deal.
The actress has come a long way since her days playing Sookie
"Go" (1999)
McCarthy made her feature film debut with a supporting role in "Go," directed by Doug Liman.
"Charlie's Angels: Full Throttle" (2000)The actress had a small role as Doris, a woman flirting with Jimmy Bosley at the crime scene.
"Gilmore Girls" (2000-2007)
McCarthy was cast as Sookie St. James, the best friend of Lorelai Gilmore, in the WB television series. The series ended in 2007, and McCarthy was not asked to return for the reboot announced in February.
"Curb Your Enthusiasm" (2004)McCarthy played a saleswoman in an episode titled "The Surrogate," in which Larry David gets a heart monitor and uses the device to get out of uncomfortable situations.
In 2005, McCarthy married Ben Falcone, fellow actor and future "Bridesmaids" co-star (seen here at the 2007 Sundance Film Festival).
"Mike &Molly" (2010-2015)"Mike & Molly" premiered on CBS in 2010 and starred McCarthy and Billy Gardell as a couple who fall in love. The show was cancelled in January 2016.
McCarthy even earned an Oscar nomination for her role in "Bridesmaids," and presented at the 2012 ceremony with co-star Rose Byrne.
"This Is 40" (2012)
With Paul Rudd and Leslie Mann in the leads of Judd Apatow's comedy, McCarthy played a kid's mom who gets in a verbal argument with Rudd's character, Pete, at school.
"Identity Thief" (2013)
The film was a surprise hit at the box office, debuting to $34.5 million and grossing $134.5 million although it received terrible reviews. Jason Bateman starred in the film about a man getting his identity stolen by a woman.
"The Heat" (2013)Directed by Paul Feig, McCarthy teamed up with Sandra Bullock to take down a mobster. The film grossed $230 million globally from a $43 million budget.
"Tammy" (2014)
The film, which received mixed reviews, had McCarthy in the role of a recently-unemployed woman who goes on a road trip with her alcoholic grandmother. The film made $84.5 million domestically.
"The Boss" (2016)
McCarthy stars as a disgraced industry titan who goes to prison for insider trading. She then tries to redeem herself by starting a new empire with brownies.
Google’s Head of Quantum Computing Hardware Resigns – WIRED
Posted: at 2:41 pm
In late October 2019, Google CEO Sundar Pichai likened the latest result from the company's quantum computing hardware lab in Santa Barbara, California, to the Wright brothers' first flight.
One of the lab's prototype processors had achieved quantum supremacy, evocative jargon for the moment a quantum computer harnesses quantum mechanics to do something seemingly impossible for a conventional computer. In a blog post, Pichai said the milestone affirmed his belief that quantum computers might one day tackle problems like climate change, and the CEO also name-checked John Martinis, who had established Google's quantum hardware group in 2014.
Here's what Pichai didn't mention: soon after the team first got its quantum supremacy experiment working a few months earlier, Martinis says, he had been reassigned from a leadership position to an advisory one. Martinis tells WIRED that the change led to disagreements with Hartmut Neven, the longtime leader of Google's quantum project.
Martinis resigned from Google early this month. "Since my professional goal is for someone to build a quantum computer, I think my resignation is the best course of action for everyone," he adds.
A Google spokesman did not dispute this account, and says that the company is grateful for Martinis' contributions and that Neven continues to head the company's quantum project. Parent company Alphabet has a second, smaller quantum computing group at its X Labs research unit. Martinis retains his position as a professor at UC Santa Barbara, which he held throughout his tenure at Google, and says he will continue to work on quantum computing.
Google's quantum computing project was founded in 2006 by Neven, who pioneered Google's image search technology, and initially focused on software. To start, the small group accessed quantum hardware from Canadian startup D-Wave Systems, including in collaboration with NASA.
The project took on greater scale and ambition when Martinis joined in 2014 to establish Google's quantum hardware lab in Santa Barbara, bringing along several members of his university research group. His nearby lab at UC Santa Barbara had produced some of the most prominent work in the field over the past 20 years, helping to demonstrate the potential of using superconducting circuits to build qubits, the building blocks of quantum computers.
Qubits are analogous to the bits of a conventional computer, but in addition to representing 1s and 0s, they can use quantum mechanical effects to attain a third state, dubbed a superposition, something like a combination of both. Qubits in superposition can work through some very complex problems, such as modeling the interactions of atoms and molecules, much more efficiently than conventional computer hardware.
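The superposition idea can be made concrete with a little arithmetic: a qubit state assigns an amplitude to each of 0 and 1, and measurement probabilities are the squared magnitudes of those amplitudes. A textbook sketch (not tied to Google's hardware):

```python
import math

# A qubit state |psi> = a|0> + b|1>, with |a|^2 + |b|^2 = 1.
# A Hadamard gate takes |0> to the equal superposition (|0> + |1>)/sqrt(2).
a = 1 / math.sqrt(2)
b = 1 / math.sqrt(2)

p0 = a * a  # probability of reading out 0
p1 = b * b  # probability of reading out 1

# Normalisation: the two outcomes exhaust all possibilities.
assert abs(p0 + p1 - 1.0) < 1e-12
```

Until measured, the qubit genuinely carries both amplitudes at once, which is what lets n qubits explore 2^n amplitudes in a single state.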
How useful that is depends on the number and reliability of the qubits in your quantum computing processor. So far the best demonstrations have used only tens of qubits, a far cry from the hundreds or thousands of high-quality qubits experts believe will be needed to do useful work in chemistry or other fields. Google's supremacy experiment used 53 qubits working together. They took minutes to crunch through a carefully chosen math problem the company calculated would take a supercomputer on the order of 10,000 years, but the problem does not have a practical application.
Martinis leaves Google as the company and the rivals working on quantum computing face crucial questions about the technology's path. Amazon, IBM, and Microsoft, as well as Google, offer their prototype technology to companies such as Daimler and JP Morgan so they can run experiments. But those processors are not large enough to work on practical problems, and it is not clear how quickly they can be scaled up.
When WIRED visited Google's quantum hardware lab in Santa Barbara last fall, Martinis responded optimistically when asked if his hardware team could see a path to making the technology practical. "I feel we know how to scale up to hundreds and maybe thousands of qubits," he said at the time. Google will now have to do it without him.
Wiring the Quantum Computer of the Future: a Novel Simple Build with Existing Technology – Analytics Insight
Posted: at 2:41 pm
Wiring the Quantum Computer of the Future: a Novel Simple Build with Existing Technology
The basic units of a quantum computer can be rearranged in 2D to solve typical design and operation challenges
Efficient quantum computing is expected to enable advancements that are impossible with classical computers. Scientists from Japan and Sydney have collaborated and proposed a novel two-dimensional design that can be constructed using existing integrated circuit technology. This design solves typical problems facing the current three-dimensional packaging for scaled-up quantum computers, bringing the future one step closer.
Quantum computing is increasingly becoming the focus of scientists in fields such as physics and chemistry, and of industrialists in the pharmaceutical, airplane, and automobile industries. Globally, research labs at companies like Google and IBM are spending extensive resources on improving quantum computers, and with good reason. Quantum computers use the fundamentals of quantum mechanics to process significantly greater amounts of information much faster than classical computers. It is expected that when error-corrected and fault-tolerant quantum computation is achieved, scientific and technological advancement will occur at an unprecedented scale.
But building quantum computers for large-scale computation is proving to be a challenge in terms of their architecture. The basic units of a quantum computer are the quantum bits, or qubits. These are typically atoms, ions, photons, subatomic particles such as electrons, or even larger elements that simultaneously exist in multiple states, making it possible to obtain several potential outcomes rapidly for large volumes of data. The theoretical requirement for quantum computers is that these are arranged in two-dimensional (2D) arrays, where each qubit is both coupled with its nearest neighbor and connected to the necessary external control lines and devices. When the number of qubits in an array is increased, it becomes difficult to reach qubits in the interior of the array from the edge. The need to solve this problem has so far resulted in complex three-dimensional (3D) wiring systems across multiple planes in which many wires intersect, making their construction a significant engineering challenge.
A group of scientists from Tokyo University of Science, Japan, RIKEN Centre for Emergent Matter Science, Japan, and the University of Technology, Sydney, led by Prof Jaw-Shen Tsai, proposes a unique solution to this qubit accessibility problem by modifying the architecture of the qubit array. "Here, we solve this problem and present a modified superconducting micro-architecture that does not require any 3D external line technology and reverts to a completely planar design," they say. This study has been published in the New Journal of Physics.
The scientists began with a qubit square lattice array and stretched out each column in the 2D plane. They then folded each successive column on top of the other, forming a dual one-dimensional array called a bi-linear array. This put all qubits on the edge and simplified the arrangement of the required wiring system. The system is also completely in 2D. In this new architecture, some of the inter-qubit wiring (each qubit is also connected to all adjacent qubits in an array) does overlap, but because these are the only overlaps in the wiring, simple local 3D systems such as airbridges at the points of overlap are enough, and the system overall remains in 2D. As you can imagine, this simplifies its construction considerably.
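The folding step can be illustrated with a toy mapping. This is only a guess at the flavour of the scheme described (the paper's actual layout, couplings and airbridge placement are more involved), but it shows how an n x n lattice can be rearranged into two 1D rows so that every qubit sits on an edge:

```python
def fold_to_bilinear(n):
    """Toy illustration: fold the columns of an n x n qubit lattice
    into two 1-D rows (a 'bi-linear' array) so every qubit sits on
    an edge reachable from outside the array. Illustrative only."""
    top, bottom = [], []
    for col in range(n):
        column = [(row, col) for row in range(n)]
        # Alternate columns onto the two rows, reversing every other
        # column so lattice neighbours stay close after folding.
        if col % 2 == 0:
            top.extend(column)
        else:
            bottom.extend(reversed(column))
    return top, bottom

top, bottom = fold_to_bilinear(4)
# All 16 lattice sites now lie in one of two 1-D rows.
```

In the real circuit, the couplings between original lattice neighbours that end up in opposite rows are exactly the wires that must cross, which is where the local airbridges come in.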
The scientists evaluated the feasibility of this new arrangement through numerical and experimental evaluation in which they tested how much of a signal was retained before and after it passed through an airbridge. Results of both evaluations showed that it is possible to build and run this system using existing technology and without any 3D arrangement.
The scientists' experiments also showed them that their architecture solves several problems that plague 3D structures: they are difficult to construct, there is crosstalk (signal interference between waves transmitted across two wires), and the fragile quantum states of the qubits can degrade. The novel pseudo-2D design reduces the number of times wires cross each other, thereby reducing the crosstalk and consequently increasing the efficiency of the system.
At a time when large labs worldwide are attempting to find ways to build large-scale fault-tolerant quantum computers, the findings of this exciting new study indicate that such computers can be built using existing 2D integrated circuit technology. "The quantum computer is an information device expected to far exceed the capabilities of modern computers," Prof Tsai states. The research journey in this direction has only begun with this study, and Prof Tsai concludes by saying, "We are planning to construct a small-scale circuit to further examine and explore the possibility."
Reference
Title of original paper: Pseudo-2D superconducting quantum computing circuit for the surface code: the proposal and preliminary tests
Journal: New Journal of Physics
DOI: 10.1088/1367-2630/ab7d7d
Tokyo University of Science (TUS) is a well-known and respected university, and the largest science-specialized private research university in Japan, with four campuses in central Tokyo and its suburbs and in Hokkaido. Established in 1881, the university has continually contributed to Japan's development in science through inculcating the love for science in researchers, technicians, and educators.
With a mission of "Creating science and technology for the harmonious development of nature, human beings, and society," TUS has undertaken a wide range of research from basic to applied science. TUS has embraced a multidisciplinary approach to research and undertaken intensive study in some of today's most vital fields. TUS is a meritocracy where the best in science is recognized and nurtured. It is the only private university in Japan that has produced a Nobel Prize winner, and the only private university in Asia to produce Nobel Prize winners within the natural sciences field.
Website:https://www.tus.ac.jp/en/mediarelations/
Dr Jaw-Shen Tsai is currently a Professor at the Tokyo University of Science, Japan. He began research in physics in 1975 and continues to hold interest in areas such as superconductivity, the Josephson effect, quantum physics, coherence, qubits, and artificial atoms. He has 160+ research publications to his credit and serves as the lead author of this paper. He has also won several awards, including Japan's Medal of Honor, the Purple Ribbon Award.
Professor Jaw-Shen Tsai
Department of Physics
Tokyo University of Science
Tsutomu Shimizu
Public Relations Divisions
Tokyo University of Science
Email: mediaoffice@admin.tus.ac.jp
Website: https://www.tus.ac.jp/en/mediarelations/
Quantum Computing Is Hot And Noisy, But Zapata Opens Early Access – Forbes
Posted: at 2:41 pm
Zapata's quantum coders, ready for a hot & noisy ride.
Were on the road to quantum computing. But these massively powerful machines are still in somewhat embryonic prototype stages and we still have several key challenges to overcome before we can start to build more of them.
As a quantum reminder: traditional computers compute on the basis of binary 1s and 0s, so all values and mathematical logic are essentially established from a base of those two values quantum superposition particles (known as qubits) can be 1 or 0, or anywhere in between and the value expressed can be differentiated depending upon what angle the qubit is viewed from so with massively more breadth, we can create a lot more algorithmic logic and computing power.
One of the main challenges associated with building quantum computing machines is the massive heat they generate. Scientists have been working with different semiconducting materials, such as so-called quantum dots, to help overcome the heat challenge. The issue is that qubits are special, qubits are powerful, but qubits are also fragile... and heat is one of their sworn enemies.
Another core challenge is noise.
As computations pass through the quantum gates that make up the quantum circuits in our new super quantum machines, they create a lot of noise and disturbance (think of an engine revving louder as it speeds up). This is why the field has come to define and accept the term Noisy Intermediate-Scale Quantum (NISQ) for near-term quantum applications.
As beautifully clarified by theoretical physicist John Preskill in his 2018 paper: "Noisy Intermediate-Scale Quantum (NISQ) technology will be available in the near future. Quantum computers with 50-100 qubits may be able to perform tasks which surpass the capabilities of today's classical digital computers, but noise in quantum gates will limit the size of quantum circuits that can be executed reliably. Quantum technologists should continue to strive for more accurate quantum gates and, eventually, fully fault-tolerant quantum computing."
The fact that we know about the heat and noise challenges hasn't stopped companies like Strangeworks, D-Wave Systems, ColdQuanta and others (including usual suspects Intel, IBM and Microsoft) forging on with development in the quantum space. Joining that list is Boston-headquartered Zapata Computing, Inc. The company describes itself as "the quantum software company for near-term/NISQ-based quantum applications empowering enterprise teams." Near-term in this case means, well, now: quantum stuff we can actually use on quantum devices of about 100-300 qubits.
Zapata's latest leap in quantum (pun absolutely intended) is an early access program for Orquestra, its platform for quantum-enabled workflows. The company claims to have provided a software- and hardware-interoperable enterprise quantum toolset, i.e., again, quantum tools we can actually use in modern-day enterprise IT departments.
"Using Zapata's unified Quantum Operating Environment, users can build, run and analyze quantum and quantum-inspired workflows. This toolset will empower enterprises and institutions to make their quantum mark on the world, enabling them to develop quantum capabilities and foundational IP today while shoring up derivative IP for tomorrow," says CEO Christopher Savoie. "It is a new computing paradigm, built on a unified enterprise framework that spans quantum and classical programming and hardware tools. With Orquestra, we are accelerating quantum experiments at scale."
Zapata's Early Access Program for Orquestra is aimed at users with backgrounds in software engineering, machine learning, physics, computational chemistry or quantum information theory who are working on the most computationally complex problems.
Orquestra is agnostic across the entire software and hardware stack. It offers an extensible library of open source and Zapata-created components for writing, manipulating and optimizing quantum circuits and running them across quantum computers, quantum simulators and classical computing resources. It comes equipped with a versatile workflow system and Application Programming Interfaces (APIs) to connect all modes of quantum devices.
"We developed Orquestra to scale our own work for our customers and then realized the quantum community needs it, too. Orquestra is the only system for managing quantum workflows," said Zapata CTO Yudong Cao. "The way we design and deploy computing solutions is changing. Orquestra's interoperable nature enables extensible and modular implementations of algorithms and workflows across platforms and unlocks fast, fluid repeatability of experiments at scale."
So we're on a journey. The journey is the road from classical to quantum, and the best advice is to insist on an interoperable vehicle (as Zapata has provided here) and to take a modular, extensible approach. In car-analogy terms, that means breaking your journey into bite-size chunks and making sure you have enough gas for the long haul. The quantum software parallel is obvious enough not to need spelling out.
Even when quantum evolves to become more ubiquitously available, many people think it will still be largely delivered as a cloud computing Quantum-as-a-Service (QaaS) package, but understanding the noisy overheated engine room in the meantime makes for a fascinating movie preview.
Will Quantum Computing Really Change The World? Facts And Myths – Analytics India Magazine
In recent years, some big tech companies like IBM, Microsoft, Intel, or Google have been working in relative silence on something that sounds great: quantum computing. The main problem with this is that it is difficult to know what exactly it is and what it can be useful for.
Some questions can be answered easily. For example, quantum computing is not going to get you more FPS from your graphics card at the moment. Nor will it be as easy as swapping your computer's CPU for a quantum one to make it hyperfast. Quantum computing is fundamentally different from the computing we are used to, but how?
At the beginning of the 20th century, Planck and Einstein proposed that light is not a continuous wave (like the waves in a pond) but is divided into small packages, or quanta. This apparently simple idea served to solve a problem called the ultraviolet catastrophe. But over the years other physicists developed it further and came to surprising conclusions, two of which will interest us here: the superposition of states and entanglement.
To understand why we are interested, let's take a short break and think about how a classical computer works. The basic unit of information is the bit, which has two possible states (1 or 0) and on which we can perform various logical operations (AND, NOT, OR). Putting together n bits we can represent numbers and operate on those numbers, but with limitations: we can only represent up to 2^n different states, and if we want to change x bits we have to perform at least x operations on them: there is no way to magically change them without touching them.
Well, superposition and entanglement relax these limitations: with superposition, n qubits can hold a superposition over all 2^n basis states at once, and entanglement maintains correlations between qubits, in such a way that operations on one qubit necessarily affect the rest.
Superposition, while looking like a blessing at first glance, is also a problem. As Alexander Holevo showed in 1973, even though n qubits can hold far more information than n classical bits, in practice we can only read out one of 2^n values. As we saw in a Genbeta article on the foundations of quantum computing: a qubit is not only worth 1 or 0 like a normal bit; it can be 1 with 80% probability and 0 with 20%. The problem is that when we read it we can only obtain either 1 or 0, and the probabilities each value had are lost, because measuring the qubit modifies it.
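The "1 with 80% and 0 with 20%" behavior is easy to mimic classically. The sketch below (our illustration, not from the article) shows the point: every individual read yields a bare 0 or 1, and only the statistics over many reads reveal the 80/20 split that a single measurement destroys.

```python
import random

def measure(p1):
    """Read a qubit that is '1 with probability p1': the result is
    always a plain 0 or 1, and the underlying amplitudes are gone."""
    return 1 if random.random() < p1 else 0

random.seed(42)
reads = [measure(0.8) for _ in range(100000)]
# Each single read was just 0 or 1; only the average over many
# reads recovers something close to the hidden 0.8.
print(sum(reads) / len(reads))
```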
This discrepancy between the information held by the qubits and what we can read led Benioff and Feynman to argue that a classical computer could not simulate a quantum system without a disproportionate amount of resources, and to propose models for a quantum computer that could perform that simulation.
Those quantum computers would probably be nothing more than a scientific curiosity without the second concept, entanglement, which enabled two quite relevant algorithms: quantum annealing in 1989 and Shor's algorithm in 1994. The first finds minimum values of functions, which, stated like that, does not sound very interesting, but it has applications in artificial intelligence and machine learning, as we discussed in another article. For example, if we manage to encode the error rate of a neural network as a function to which we can apply quantum annealing, that minimum value will tell us how to configure the neural network to be as efficient as possible.
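Quantum annealing itself needs quantum hardware, but its classical cousin, simulated annealing, shows concretely what "finding minimum values of functions" means. A hedged sketch (the test function and cooling schedule are our choices for illustration, not from the article):

```python
import math
import random

def simulated_annealing(f, x0, steps=20000):
    """Classical analogue of annealing: sometimes accept uphill moves,
    with a probability that shrinks as the 'temperature' cools, so the
    walker can escape local minima before settling into a good one."""
    x, fx = x0, f(x0)
    best_x, best_f = x, fx
    for k in range(1, steps + 1):
        t = 1.0 / k                       # cooling schedule
        cand = x + random.gauss(0, 0.5)   # random nearby move
        fc = f(cand)
        # Downhill moves always accepted; uphill ones with shrinking odds.
        if fc < fx or random.random() < math.exp((fx - fc) / t):
            x, fx = cand, fc
            if fc < best_f:
                best_x, best_f = cand, fc
    return best_x, best_f

random.seed(1)
bumpy = lambda x: x * x + 3 * math.sin(5 * x) ** 2  # many local minima
x, fx = simulated_annealing(bumpy, x0=8.0)
print(fx <= bumpy(8.0))  # -> True: at least as good as the start, usually far better
```

A quantum annealer pursues the same goal, but tunnels through barriers between minima instead of hopping over them thermally.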
The second, Shor's algorithm, decomposes a number into its prime factors much more efficiently than any known method on a normal computer. Again, stated like that it doesn't sound interesting. But consider that RSA, one of the most used algorithms to protect and encrypt data on the Internet, rests on the fact that factoring large numbers is slow (adding a bit to the key roughly doubles the time a brute-force attack takes); then things change. A quantum computer with enough qubits would render many encryption systems completely obsolete.
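Shor's algorithm splits into a quantum step, finding the period r of a^x mod N (the part that is exponentially slow classically), and ordinary arithmetic that turns that period into factors. A toy sketch that brute-forces the quantum step (our illustration; the numbers are tiny on purpose):

```python
from math import gcd

def period(a, n):
    """Smallest r > 0 with a^r = 1 (mod n). This is the step Shor's
    algorithm performs on quantum hardware; here we brute-force it."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n, a):
    """Classical post-processing of Shor: turn a period into a factor."""
    if gcd(a, n) != 1:
        return gcd(a, n)        # lucky guess already shares a factor
    r = period(a, n)
    if r % 2 == 1:
        return None             # odd period: retry with another a
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None             # trivial root: retry with another a
    return gcd(y - 1, n)

print(shor_factor(15, 7))  # -> 3 (15 = 3 * 5)
```

On a quantum computer the `period` call takes polynomial time, which is the entire threat to RSA.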
Until now, quantum computing is a field that hasn't seen much real-world application. To give an idea: with the twenty qubits of the commercial quantum computer announced by IBM, we could apply Shor's factorization algorithm only to numbers smaller than 1,048,576 (2^20), which, as you can imagine, is not very impressive.
Still, the field is evolving promisingly. In 1998 the first quantum computer was built (only two qubits, and it needed a nuclear magnetic resonance machine) to solve a toy problem: the so-called Deutsch-Jozsa problem. In 2001 Shor's algorithm was run for the first time. Six years later, in 2007, D-Wave presented its first computer capable of executing quantum annealing with 16 qubits. This year, the same company announced a 2,000-qubit quantum annealing computer. The new IBM computers, meanwhile, although they have fewer qubits, are able to implement generic algorithms, not only quantum annealing. In short, the push is strong, and quantum computing should be increasingly applicable to real problems.
What might those applications be? As mentioned before, the quantum annealing algorithm is very well suited to machine learning problems, which makes the computers that implement it extremely useful, even though that single algorithm is all they can run. If systems capable of, for example, transcribing conversations or identifying objects in images can be translated into a form trainable on quantum computers, the results could be orders of magnitude better than those that already exist. The same algorithm could also be used to find solutions to problems in medicine or chemistry, such as finding optimal treatment plans for a patient or studying the possible structures of complex molecules.
Generic quantum computers, which for now have fewer qubits, can run more algorithms. For example, they could be used to break much of the cryptography in use right now, as discussed earlier (which explains why the NSA wanted a quantum computer). They would also serve as very fast search engines if Grover's search algorithm can be implemented, and for physics and chemistry they could be very useful as efficient simulators of quantum systems.
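Grover's algorithm finds a marked item among N with only about sqrt(N) oracle calls, versus roughly N classically. For tiny sizes the whole thing can be simulated as a classical statevector (our toy illustration; real implementations express the same two reflections as quantum gates):

```python
import math

def grover(n_items, marked, iterations):
    """Toy statevector simulation of Grover search over n_items entries."""
    amp = [1 / math.sqrt(n_items)] * n_items   # uniform superposition
    for _ in range(iterations):
        amp[marked] = -amp[marked]             # oracle: flip marked sign
        mean = sum(amp) / n_items              # diffusion: invert about mean
        amp = [2 * mean - a for a in amp]
    return [a * a for a in amp]                # measurement probabilities

# Optimal iteration count is about (pi/4) * sqrt(N); for N = 16 that is 3.
probs = grover(16, marked=11, iterations=3)
print(round(probs[11], 3))  # -> 0.961: the marked item dominates after 3 steps
```

Three iterations instead of an average of eight classical probes is the square-root speedup in miniature.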
Unfortunately, algorithms and code written for classical computers can't be run on quantum computers to magically gain speed: you need to develop a quantum algorithm (not a trivial thing) and implement it to get that improvement. That, at first, greatly restricts the applications of quantum computers and will be a hurdle to overcome as these systems mature.
However, the main problem facing quantum computing is building the computers themselves. Compared to a normal computer, a quantum computer is an extremely complex machine: it operates at temperatures close to absolute zero (-273°C), the qubits are implemented in superconducting circuits, and the components needed to read and manipulate the qubits are not simple either.
What does a non-quantum "quantum computer" look like? As explained before, the two concepts that make a quantum computer relevant are superposition and entanglement; without them, the speed improvements that quantum algorithms promise cannot exist. If disturbances inside the computer quickly collapse superposed qubits into classical states, or break the entanglement between qubits, what we have is not a quantum computer but an extremely expensive machine that only serves to run a handful of algorithms no better than a normal computer would (and will probably give erroneous results).
Of the two properties, entanglement is the harder one to maintain and to prove exists. The more qubits there are, the easier it is for one of them to disentangle (which explains why increasing the number of qubits is not a trivial task). And it is not enough to build the computer and see correct results come out to claim the qubits are entangled: finding evidence of entanglement is a task in itself, and in fact the lack of such evidence was one of the main criticisms leveled at D-Wave in its beginnings.
A priori, and given the materials quantum computers are currently built from, miniaturization does not seem very feasible. But there is already research into new materials that could be used to create more accessible quantum computers. Who knows whether fifty years from now we will be able to buy quantum CPUs to speed up our computers.
Google’s top quantum computing brain may or may not have quit – Fudzilla
We will know when someone opens his office door
John Martinis, who had established Google's quantum hardware group in 2014, has cleaned out his office, put the cats out and left the building.
Martinis says that a few months after he got Google's now-legendary quantum computing experiment to work, he was reassigned from a leadership position to an advisory one.
Martinis told Wired that the change led to disagreements with Hartmut Neven, the longtime leader of Google's quantum project.
Martinis said he had to go because his professional goal is for someone to build a quantum computer.
Google has not disputed this account, and says that the company is grateful for Martinis' contributions and that Neven continues to head the company's quantum project.
Martinis retains his position as a professor at UC Santa Barbara, which he held throughout his tenure at Google, and says he will continue to work on quantum computing.
To be fair, Google's quantum computing project was founded by Neven, who pioneered Google's image search technology, and got enough cats together.
The project took on greater scale and ambition when Martinis joined in 2014 to establish Google's quantum hardware lab in Santa Barbara, bringing along several members of his university research group. His nearby lab at UC Santa Barbara had produced some of the most prominent work in the field over the past 20 years, helping to demonstrate the potential of using superconducting circuits to build qubits, the building blocks of quantum computers.
Google's ground-breaking supremacy experiment used 53 qubits working together. They took minutes to crunch through a carefully chosen math problem the company calculated would take a supercomputer 10,000 years to work out. It still does not have a practical use, and the cats were said to be bored with the whole thing.
Devs: How the Quantum Computer Works & Mysteries That Remain – Screen Rant
Devs' final episode answered many of fans' questions about the quantum computer at the heart of the Devs project, but mysteries about the computer and the characters' fates remain. The miniseries is an exclusive production between FX and Hulu, as part of the new FX on Hulu banner. Written and directed by Alex Garland, the show dives into some heady existential ideas, using the central mystery of Lily Chan (Sonoya Mizuno) investigating the mysterious death of her boyfriend as a vehicle to explore themes of quantum physics, determinism, and free will - all being manipulated behind the scenes of the nefarious tech company that she works for.
From the very first episode of the series, audiences were keyed into the fact that the tech corporation Amaya was working on a secretive project in their Devs division. The reveal came sooner than expected, as episode 2 confirmed the suspicions of the show's most ardent fans: the Devs team is working on an extremely powerful quantum computer, the purpose of which far exceeds the limitations of the real-world quantum computers being worked on at IBM and Google. Amaya's computer runs a specific set of data and code; more directly, the quantum computer is capable of distilling the universe down to matters of cause and effect, making it essentially able to predict the future.
The quantum computer's reliance on determinism, which focuses on a myopic cause-and-effect-dependent view of reality, has been the center of Devs' intra-character conflicts. Forest (Nick Offerman) fired Lyndon (Cailee Spaeny) because of their disagreement about the Many Worlds Theory and a Determinist understanding of reality, and leading into the finale, audiences were keenly aware of a statement made by Katie (Alison Pill) back in episode 6: the quantum computer can't see past a fixed point, one that involves Lily in the Devs laboratory. Episode 8's reveals answered fans' questions about the computer's functionality, but not all of the explanations hold up.
Throughout the series, Forest and Katie have ascertained that the quantum computer is determinist in theory and that no variations can occur because their reality is set in stone. This falls in line with Forest's personal philosophy and his reasons for clinging to determinism: if everything is predetermined, then he has no personal culpability in the death of Forest's wife and child. This extends to the murder of Sergei (Karl Glusman) and Katie's assistance in Lyndon's unexpected death. But after Lyndon improved the Devs projections by introducing the Many-Worlds Theory, it became clear that Forest and Katie were adhering to the quantum computer's projections not because they had to, but because they wanted to.
However, after Lily arrives at Devs, in episode 8, she sees the future predicted for her by Forest's deterministic projection: on the Devs screen, she shoots Forest in the face, and the bullet pierces the lift's glass, breaking the airtight seal that keeps the lift afloat. Lily plunges to her death. As the scene plays out, however, she tosses away the gun as the lift's doors close, ensuring that she won't follow the same sequence the computer predicted. Her choice breaks the deterministic framework that Forest and Katie have clung to throughout the series, and when Forest is reincarnated in the Devs simulation, he realizes that determinism was a faulty philosophy, a way of looking at the world that fails to fit with the data.
Lily's choice supports two concomitant theories in quantum physics. The Many-Worlds Theory, posited by Hugh Everett, has already been debated throughout the show's run, but since Lily's choice was motivated by her observation of the outcome, the Copenhagen Theory also has merit. As described by Katie's teacher in episode 5, the Copenhagen Theory "suggests that the act of measurement affects the system." Despite Katie scoffing at this theory, Devs' finale offers evidence for both the Copenhagen and Many-Worlds Theories.
There's a popular fan theory regarding the show that originated on Reddit, from user emf1200, that suggests the entire show takes place within a simulation. This comes from the fact that the projection software works by simulating events through the usage of the predictive algorithm: the Devs team isn't technically peering backwards through time; they're reconstructing time and viewing it like a movie. Episode 7 has a scene where Stewart (Stephen McKinley Henderson) shows off the computer's predictive capabilities to a group of employees, and casually mentions how "somewhere in each box, there's another box." This implies that within the simulation the Devs team is watching, there's another version of the Devs team watching another simulation, and so on and so forth. By this logic, there's enough evidence to suggest that the show fans are watching is not the prime universe, but simply a simulation somewhere within a stack of simulations.
Though the finale did not strictly follow this theory - there was indeed a prime reality - when Forest and Lily are reincarnated in the Devs computer, a life that Katie characterizes as "indistinguishable" from reality, they essentially enter the "box within a box." They each become Neo (Keanu Reeves), without superpowers, from The Matrix, knowing that they are in a simulation with the power to exercise free will within each reality.
Up until episode 4, the Devs team was convinced (at Forest's insistence) that the universe operated on the De Broglie-Bohm theory, a deterministic interpretation of quantum physics that suggests events are set in stone as the result of cause and effect. This produced some results, namely the preliminary version of the projection, which could only render hazy, static-filled visions of the past and future. However, in episode 7, Stewart and the rest of the team perfected the quantum computer by switching out Forest's determinist theory with Lyndon's Many-Worlds theory, which Stewart says "is the universe as is." But once the quantum computer operated under the Many-Worlds Theory, why did it fail to predict Lily's decision to throw away the gun? According to Stewart and Lyndon, the multiverse exists, but the predictions made by the project are only of one universe. All Lyndon's algorithm did was clear up the static, not change the nature of the prediction.
But this raises yet another question: why did the deterministic computer projection stop accurately predicting the future at the point of Lily's death, when the actual moment that violated the laws of determinism was her decision to toss the weapon, not when she died? This question brings up a frustrating issue with the show's conclusion. Each adherent to all of the theories about quantum superposition can find evidence to support his/her position, and Devs offers no definitive conclusion. Copenhagen enthusiasts note that Lily's observation affected the outcome, Many-Worlds theorists are pleased with the free will implications of Lily's decision, and Determinists note that despite Lily tossing away the gun, Forest and Lily still died in the same manner the computer predicted.
In the simulation Forest states that he "exercised a little free will" by giving each version of Lily and Forest in the Devs simulation knowledge of other worlds. This effectively deals with the show's recurrent themes about Forest using determinism as a scapegoat for personal accountability, because each version of Forest must reckon with the knowledge that other Forests are living under better or worse outcomes. However, while the show's conclusion holds up thematically, the failure of the quantum computer to accurately predict the final episode's outcome remains a mystery, one that Devs as a whole didn't adequately address.
Chrishaun Baker is a Feature Writer for Screen Rant, with a host of interests ranging from horror movies to video games to superhero films. A soon-to-be graduate of Western Carolina University, he spends his time reading comic books and genre fiction, directing short films, writing screenplays, and getting increasingly frustrated at the state of film discourse in 2020. You can find him discussing movies on Letterboxd or working up a migraine over American politics on Twitter.
The future of quantum computing in the cloud – TechTarget
AWS, Microsoft and other IaaS providers have jumped on the quantum computing bandwagon as they try to get ahead of the curve on this emerging technology.
Developers use quantum computing to encode problems as qubits, which compute multiple combinations of variables at once rather than exploring each possibility discretely. In theory, this could allow researchers to quickly solve problems involving different combinations of variables, such as breaking encryption keys, testing the properties of different chemical compounds or simulating different business models. Researchers have begun to demonstrate real-world examples of how these early quantum computers could be put to use.
However, this technology is still being developed, so experts caution that it could take more than a decade for quantum computing to deliver practical value. In the meantime, there are a few cloud services, such as Amazon Braket and Microsoft Quantum, that aim to get developers up to speed on writing quantum applications.
Quantum computing in the cloud has the potential to disrupt industries in a similar way as other emerging technologies, such as AI and machine learning. But quantum computing is still being established in university classrooms and career paths, said Bob Sutor, vice president of IBM Quantum Ecosystem Development. Similarly, major cloud providers are focusing primarily on education at this early stage.
"The cloud services today are aimed at preparing the industry for the soon-to-arrive day when quantum computers will begin being useful," said Itamar Sivan, co-founder and CEO of Quantum Machines, an orchestration platform for quantum computing.
There's still much to iron out regarding quantum computing and the cloud, but the two technologies appear to be a logical fit, for now.
Cloud-based quantum computing is more difficult to pull off than AI, so the ramp up will be slower and the learning curve steeper, said Martin Reynolds, distinguished vice president of research at Gartner. For starters, quantum computers require highly specialized room conditions that are dramatically different from how cloud providers build and operate their existing data centers.
Reynolds believes practical quantum computers are at least a decade away. The biggest drawback lies in aligning the quantum state of qubits in the computer with a given problem, especially since quantum computers still haven't been proven to solve problems better than traditional computers.
Coders also must learn new math and logic skills to utilize quantum computing. This makes it hard for them since they can't apply traditional digital programming techniques. IT teams need to develop specialized skills to understand how to apply quantum computing in the cloud so they can fine tune the algorithms, as well as the hardware, to make this technology work.
Current limitations aside, the cloud is an ideal way to consume quantum computing, because quantum computing has low I/O but deep computation, Reynolds said. Because cloud vendors have the technological resources and a large pool of users, they will inevitably be some of the first quantum-as-a-service providers and will look for ways to provide the best software development and deployment stacks.
Quantum computing could even supplement general compute and AI services cloud providers currently offer, said Tony Uttley, president of Honeywell Quantum Solutions. In that scenario, the cloud would integrate with classical computing cloud resources in a co-processing environment.
The cloud plays two key roles in quantum computing today, according to Hyoun Park, CEO and principal analyst at Amalgam Insights. The first is to provide an application development and test environment for developers to simulate the use of quantum computers through standard computing resources.
The second is to offer access to the few quantum computers that are currently available, in the way mainframe leasing was common a generation ago. This improves the financial viability of quantum computing, since multiple users can increase machine utilization.
It takes significant computing power to simulate quantum algorithm behavior from a development and testing perspective. For the most part, cloud vendors want to provide an environment to develop quantum algorithms before loading these quantum applications onto dedicated hardware from other providers, which can be quite expensive.
However, classical simulations of quantum algorithms that use large numbers of qubits are not practical. "The issue is that the size of the classical computer needed will grow exponentially with the number of qubits in the machine," said Doug Finke, publisher of the Quantum Computing Report. So, a classical simulation of a 50-qubit quantum computer would require a classical computer with roughly 1 petabyte of memory. This requirement will double with every additional qubit.
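Finke's doubling claim is easy to check: a full statevector simulation stores one complex amplitude per basis state, i.e. 2^n of them for n qubits. The constant depends on precision, an assumption of ours (8 bytes per amplitude gives roughly 9 PB at 50 qubits; one byte per amplitude gives roughly the 1 PB quoted above), but the doubling per extra qubit holds regardless:

```python
def statevector_petabytes(n_qubits, bytes_per_amp=8):
    # 2**n amplitudes, one per basis state; 8 bytes assumes
    # single-precision complex (double it for double precision).
    return 2 ** n_qubits * bytes_per_amp / 1e15

print(round(statevector_petabytes(50), 1))                    # -> 9.0
print(statevector_petabytes(51) / statevector_petabytes(50))  # -> 2.0
```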
"Nobody knows which approach is best, or which materials are best. We're at the Edison light bulb filament stage," said Martin Reynolds, distinguished vice president of research at Gartner.
But classical simulations for problems using a smaller number of qubits are useful, both as a tool to teach quantum algorithms to students and for quantum software engineers to test and debug algorithms with "toy models" of their problem, Finke said. Once they debug their software, they should be able to scale it up to solve larger problems on a real quantum computer.
In terms of putting quantum computing to use, organizations can currently use it to support last-mile optimization, encryption and other computationally challenging issues, Park said. This technology could also aid teams across logistics, cybersecurity, predictive equipment maintenance, weather predictions and more. Researchers can explore multiple combinations of variables in these kinds of problems simultaneously, whereas a traditional computer needs to compute each combination separately.
However, there are some drawbacks to quantum computing in the cloud. Developers should proceed cautiously when experimenting with applications that involve sensitive data, said Finke. To address this, many organizations prefer to install quantum hardware in their own facilities despite the operational hassles, Finke said.
Also, a machine may not be immediately available when a quantum developer wants to submit a job through quantum services on the public cloud. "The machines will have job queues and sometimes there may be several jobs ahead of you when you want to run your own job," Finke said. Some of the vendors have implemented a reservation capability so a user can book a quantum computer for a set time period to eliminate this problem.
IBM was first to market with its Quantum Experience offering, which launched in 2016 and now has over 15 quantum computers connected to the cloud. Over 210,000 registered users have executed more than 70 billion circuits through the IBM Cloud and published over 200 papers based on the system, according to IBM.
IBM also started the Qiskit open source quantum software development platform and has been building an open community around it. According to GitHub statistics, it is currently the leading quantum development environment.
In late 2019, AWS and Microsoft introduced quantum cloud services offered through partners.
Microsoft Quantum provides a quantum algorithm development environment, and from there users can transfer quantum algorithms to Honeywell, IonQ or Quantum Circuits Inc. hardware. Microsoft's Q# language offers a familiar Visual Studio experience for quantum problems, said Michael Morris, CEO of Topcoder, an on-demand digital talent platform.
Currently, this transfer involves the cloud providers installing a high-speed communication link from their data center to the quantum computer facilities, Finke said. This approach has many advantages from a logistics standpoint, because it greatly simplifies maintenance, spare parts, calibration and physical infrastructure.
Amazon Braket similarly provides a quantum development environment and, when generally available, will provide time-based pricing to access D-Wave, IonQ and Rigetti hardware. Amazon says it will add more hardware partners as well. Braket offers a variety of different hardware architecture options through a common high-level programming interface, so users can test out the machines from the various partners and determine which one would work best with their application, Finke said.
Google has done considerable core research on quantum computing in the cloud and is expected to launch a cloud service later this year. Google has been more focused on developing its in-house quantum computing capabilities and hardware rather than providing access to these tools to its cloud users, Park said. In the meantime, developers can test out quantum algorithms locally using Cirq, Google's Python framework for writing quantum programs.
In addition to the larger offerings from the major cloud providers, there are several alternative approaches to implementing quantum computers that are being provided through the cloud.
D-Wave is the furthest along, with a quantum annealer well-suited to many optimization problems. Other alternatives include QuTech, which is working on a cloud offering of its small quantum machine built on its spin qubit technology, and Xanadu, which is developing a quantum machine based on photonic technology.
Researchers are pursuing a variety of approaches to quantum computing -- using electrons, ions or photons -- and it's not yet clear which approaches will pan out for practical applications first.
"Nobody knows which approach is best, or which materials are best. We're at the Edison light bulb filament stage, where Edison reportedly tested thousands of ways to make a carbon filament until he got to one that lasted 1,500 hours," Reynolds said. In the meantime, recent cloud offerings promise to enable developers to start experimenting with these different approaches to get a taste of what's to come.