The Prometheus League
Category Archives: Quantum Computing
No One Gets Quantum Computing, Least Of All America’s National Institute of Standards and Technology – PC Perspective
Posted: August 6, 2022 at 7:47 pm
The only good news about the Supersingular Isogeny Key Encapsulation (SIKE) scheme, designed to be unbreakable by a quantum computer, is that America's National Institute of Standards and Technology subjected it to extra testing: rather than standardizing it outright, NIST made it one of four additional candidates advanced for further evaluation alongside the four post-quantum algorithms it selected in July. As it turns out, two Belgian researchers, Wouter Castryck and Thomas Decru, were able to break a SIKE challenge instance posted by Microsoft in under five minutes using an Intel Xeon CPU E5-2630v2 at 2.60GHz.
Indeed, they did it with a single core, which makes sense for security researchers well aware of the risks of running multithreaded, though why they stuck with a 22nm Ivy Bridge processor almost 10 years old is a fair question. What makes even less sense is that encryption designed to resist quantum computers could be cracked by a conventional piece of silicon well before the heat death of the universe.
This particular piece of post-quantum encryption has four parameter sets, called SIKEp434, SIKEp503, SIKEp610 and SIKEp751. The researchers, who claimed the $50,000 bounty, cracked the SIKEp434 parameter set in about 62 minutes. Two smaller challenge instances, $IKEp182 and $IKEp217, fell in about 4 minutes and 6 minutes respectively. Three other post-quantum candidates were advanced alongside SIKE, so there is some hope that those, at least, will remain useful for now.
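SIKE, like the other NIST candidates, is a key encapsulation mechanism (KEM): a keygen/encapsulate/decapsulate triple that lets two parties agree on a session key. As a rough illustration of that interface only, here is a toy Diffie-Hellman-style KEM in Python; it is not isogeny-based, uses a laughably small group, and must not be used for anything real:

```python
# A toy key-encapsulation mechanism (KEM), illustrating only the
# keygen/encapsulate/decapsulate interface that schemes like SIKE expose.
# Diffie-Hellman over a tiny toy group; NOT secure, purely illustrative.
import hashlib
import secrets

P = 2**127 - 1   # a Mersenne prime; far too small for real use
G = 3

def keygen():
    sk = secrets.randbelow(P - 2) + 1          # private key
    return sk, pow(G, sk, P)                   # (private, public)

def encapsulate(pk):
    eph = secrets.randbelow(P - 2) + 1         # ephemeral secret
    shared = pow(pk, eph, P)
    key = hashlib.sha256(shared.to_bytes(16, "big")).digest()
    return pow(G, eph, P), key                 # (ciphertext, session key)

def decapsulate(sk, ct):
    shared = pow(ct, sk, P)
    return hashlib.sha256(shared.to_bytes(16, "big")).digest()

sk, pk = keygen()
ct, key_sender = encapsulate(pk)
key_receiver = decapsulate(sk, ct)
assert key_sender == key_receiver              # both sides derive the same key
```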
If you would like to read more about quantum computing and encryption, as well as Richelot isogenies and abelian surfaces, read on at The Register.
One of the biggest names in quantum computing could have just cracked open the multibillion-dollar market with a new breakthrough – Fortune
Posted: at 7:47 pm
Quantinuum, the quantum computing company spun out from Honeywell, said this week that it had made a breakthrough in the technology that should help accelerate commercial adoption of quantum computers.
It has to do with real-time correction of errors.
One of the biggest issues with using quantum computers for any practical purpose is that the circuits in a quantum computer are highly susceptible to all kinds of electromagnetic interference, which causes errors in their calculations. These errors must be corrected either in software, often after a calculation has run, or by using other physical parts of the quantum circuitry to check for and correct errors in real time. So far, while scientists have theorized ways of doing this kind of real-time error correction, few of the methods have been demonstrated in practice on a real quantum computer.
The theoretically game-changing potential of quantum computers stems from their ability to harness the strange properties of quantum mechanics. These machines may also drastically shorten the time it takes to run some calculations that can be done on today's supercomputers but take hours or days. To achieve those results, though, ironing out the calculation errors is of utmost importance. In 2019, Google demonstrated that a quantum computer could perform one esoteric calculation in 200 seconds that it estimated would have taken a traditional supercomputer more than 10,000 years. In the future, scientists think quantum computers will help make fertilizer production much more efficient and sustainable, as well as create new kinds of space-age materials.
That's why it could be such a big deal that Quantinuum just said it has demonstrated two methods for doing real-time error correction of the calculations a quantum computer runs.
Tony Uttley, Quantinuum's chief operations officer, says the error-correction demonstration is an important proof point that the company is on track to deliver a quantum advantage for some real-world commercial applications in the next 18 to 24 months. That means businesses will be able to run some calculations (possibly for financial risk or logistics routing) significantly faster, and perhaps with better results, by using quantum computers for at least part of the calculation than they could by using standard computer hardware alone. "This lends tremendous credibility to our road map," Uttley said.
There's a lot of money in Quantinuum's road map. This past February, the firm's majority shareholder, Honeywell, forecast $2 billion in revenue for Quantinuum by 2026. That future could have just drawn nearer.
Uttley says that today there is a wide disparity in the amount of money different companies, even direct competitors in the same industry, are investing in quantum computing expertise and pilot projects. The reason, he says, is that beliefs vary widely on how soon quantum computers will be able to run key business processes faster or better than existing methods on standard computers. Some people think it will happen in the next two years. Others think these nascent machines will only start to realize their business potential a decade from now. Uttley says he hopes this week's error-correction breakthrough will help tip more of Quantinuum's potential customers into the two-year camp.
A $2 billion market opportunity
Honeywell's projection of at least $2 billion in revenue from quantum computing by 2026 was a revision, a year earlier than it had previously forecast. The error-correction breakthrough ought to give Honeywell more confidence in that projection.

Quantinuum is one of the most prominent players in the emerging quantum computer industry, with Honeywell having made a bold and so far successful bet on one particular way of creating a quantum computer. That method is based on using powerful electromagnets to trap and manipulate ions. Others, such as IBM, Google, and Rigetti Computing, have created quantum computers using superconducting materials. Microsoft has been trying to create a variation of this superconducting-based quantum computer using a slightly different technology that would be less prone to errors. Still others are creating quantum computers using lasers and photons. And some companies, such as Intel, have been working on quantum computers where the circuits are built using more conventional semiconductors.
The ability to perform real-time error correction could be a big advantage for Quantinuum and its trapped-ion-based quantum computers as it competes for a commercial edge over competing quantum computer companies. But Uttley points out that besides selling access to its own trapped-ion quantum computers through the cloud, Quantinuum also helps customers run algorithms on IBM's superconducting quantum computers. (IBM is also an investor in Quantinuum.)
Different kinds of algorithms and calculations may be better suited to one kind of quantum computer over another. Trapped ions tend to remain in a quantum state for relatively long periods of time, with the record being an hour. Superconducting circuits, on the other hand, tend to stay in a quantum state for a millisecond or less. But this also means that it takes much longer for a trapped-ion quantum computer to run a calculation than for a superconducting one, Uttley says. He envisions a future of hybrid computing where different parts of an algorithm are run on different machines in the cloud: partially on a traditional computer, partly on a trapped-ion quantum computer, and partly on a superconducting quantum computer.
In a standard computer, information is represented in binary form, either a 0 or a 1, called a bit. Quantum computers use the principles of quantum mechanics to form their circuits, with each unit of the circuit called a qubit. Qubits can represent both 0 and 1 simultaneously, which means that each additional qubit doubles the size of the state a quantum computer can work with. This doubling for every additional qubit is one reason that quantum computers will, in theory, be far more powerful than even today's largest supercomputers. But this is only true if the issue of error correction can be successfully tackled and if scientists can figure out how to link enough qubits together to exceed the power of existing high-performance computing clusters.
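That doubling can be made concrete by counting what a classical machine would need to store to simulate the same state. A short illustrative sketch, assuming 16 bytes per double-precision complex amplitude:

```python
# A classical simulator of n qubits must track 2**n complex amplitudes,
# so every added qubit doubles the memory required.
for n in (1, 2, 10, 30, 50):
    amplitudes = 2 ** n
    gigabytes = amplitudes * 16 / 1e9   # 16 bytes per complex128 amplitude
    print(f"{n:2d} qubits -> {amplitudes:.2e} amplitudes (~{gigabytes:.2e} GB)")
```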
Quantinuum demonstrated two different error-correction methods: one called the five-qubit code and the other called the Steane code. Both methods use multiple physical qubits to represent one logical part of the circuit, with some of those qubits performing the calculation and the others checking for and correcting errors in it. As the names suggest, the five-qubit code uses five qubits, while the Steane code uses seven. Uttley says Quantinuum found that the Steane code worked significantly better than the five-qubit code.
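The five-qubit and Steane codes are beyond a few lines of code, but the underlying bargain, many noisy physical units standing in for one more reliable logical unit, can be sketched with the much simpler classical repetition code. A toy simulation with an invented error rate, not a model of Quantinuum's experiment:

```python
import random

# Toy model of the error-correction bargain: one logical bit stored in n
# physical bits (a repetition code), decoded by majority vote. With
# physical error rate p, the logical error rate falls as n grows.
def logical_error_rate(p, n_physical, trials=100_000):
    failures = 0
    for _ in range(trials):
        flips = sum(random.random() < p for _ in range(n_physical))
        if flips > n_physical // 2:   # majority vote decodes incorrectly
            failures += 1
    return failures / trials

p = 0.05   # invented physical error rate
for n in (1, 3, 5, 7):
    print(f"{n} physical bits -> logical error rate {logical_error_rate(p, n):.5f}")
```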
That may mean it will become the dominant form of error correction, at least for trapped-ion quantum computers, going forward.
D-Wave and DPCM Complete Their Business Combination – Quantum Computing Report
Posted: at 7:47 pm
The companies announced that their SPAC merger has been approved and that D-Wave will become a public company listed on the New York Stock Exchange (NYSE) under the ticker symbols QBTS for the common stock and QBTS WS for the warrants. Members of the company's management will ring the opening bell of the NYSE when trading starts on Monday, August 8. The transaction was first announced in February of this year, and a shareholder vote to approve it occurred earlier this week. Shareholders of DPCM Capital's Class A Common Stock had the right to redeem their shares for a pro rata portion of the funds in the company's trust account. They elected to redeem about 29 million of the 37.5 million total shares, requiring a total payment of $291 million; those funds will therefore not be available to D-Wave for working capital. Additional information about the completion of this business combination is available in the companies' press release and in the Form 8-K they have filed with the Securities and Exchange Commission (SEC).
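A back-of-the-envelope reading of those figures (assuming a uniform pro rata redemption price; actual trust accounting, interest, and fees will shift the numbers slightly):

```python
# Implied economics of the redemption, derived from the figures above.
total_shares    = 37_500_000
redeemed_shares = 29_000_000
total_payout    = 291_000_000

price = total_payout / redeemed_shares                  # ~$10.03 per share
remaining = (total_shares - redeemed_shares) * price    # value left in trust
print(f"~${price:.2f}/share, ~${remaining / 1e6:.0f}M remaining for D-Wave")
```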
August 5, 2022
Explosive growth of faculty, courses and research signal new era for Computer Science at Yale – Yale University
Posted: at 7:47 pm
With numerous new courses, new faculty members, and a wider range of research fields, Computer Science (CS) at Yale is better positioned than ever to take on emerging challenges, and to meet the needs of students, interdisciplinary research on campus, and industry.
The CS department has recently hired nine tenure-track faculty members and four teaching-track lecturers. These hires are in addition to an earlier round of 11 new tenure-track faculty members and two lecturers hired in the last few years. The boost in hiring accomplishes a number of long-term goals, including expanding the department's areas of expertise. And as Computer Science has emerged as the second-most popular major at Yale (just behind economics), it will go a long way toward meeting students' curriculum needs.
"Our new faculty members were chosen for the excellence of their research, as well as for their fields that they represent, all of which have been in high demand by both our students and faculty on campus as well as the industry," said Zhong Shao, the Thomas L. Kempner Professor of Computer Science and department chair. "The range of their expertise addresses some of the most critical challenges that we face today."
SEAS Dean Jeffrey Brock said the new faculty will be critical to realizing the ambitious goals set out in SEAS' Strategic Vision, particularly in the areas of artificial intelligence and robotics, while building in key areas like cybersecurity and distributed computing.
"This exciting cohort of new faculty stands to transform our CS department," Brock said. "During our recruiting season, they sensed Yale's momentum in CS and in engineering, ultimately turning down excellent offers at other top schools to join our faculty. Their presence will allow Yale CS to expand their course offerings, as well as to establish critical mass in core and cutting-edge research areas."
Many of the new faculty members, like Fan Zhang, cited the department's "fast growth in recent years." Others said that they were drawn by the collaborative environment at Yale, especially considering that Yale is ranked at or near the top in numerous research areas. Daniel Rakita, for instance, said he's looking forward to working with the Yale Medical School to see how his lab's robotics research can assist in hospital or home care settings, as well as working with the Wu Tsai Institute on Brain-Machine Interface technologies.
"Many people I spoke with indicated that there are no boundaries between departments at Yale, and interdisciplinary research is not just encouraged here, but is a 'way of life,'" Rakita said. Many of the new faculty have already engaged with key academic leaders around the campus, from medicine, to economics, to quantum computing.
As part of this boost in hiring, the department strategically targeted certain research areas, including artificial intelligence, trustworthy computing, robotics, quantum computing, and modeling.
The nine new tenure-track faculty hires and their areas of research are below.
The four new teaching-track lecturer hires and their areas of research are:
This hiring season marks the first since the changes in structure that made SEAS more independent, granting more faculty lines for growth.
"Our independence and ability to be opportunistic were key elements in our ability to realize this transformational growth of Computer Science at Yale," Brock said. "As CS plays such a critical role in an increasingly broad range of disciplines, the size and breadth of CS is critical to our strategy for SEAS. I'm thrilled to be able to take the first step in realizing that vision for a SEAS that is well integrated within its host University and aligned with its mission."
SEAS became independent from the Faculty of Arts and Sciences in July of 2022.
A curriculum to meet the needs of students and industry
Expanding the department's curriculum has also been in the planning stages for a while, a goal made possible by the recent hires of new faculty and lecturers. Shao said there was a concerted effort to meet the high demand in areas such as artificial intelligence, blockchain, machine learning, introductory programming, and CS courses for non-majors.
"This has been on the to-do list for the department for many years, but we just didn't have the manpower," Shao said. "And finally, with the new faculty hires, we can actually offer these courses."
Ben Fisch, for instance, will be teaching a new course on blockchains for both graduate students and advanced undergraduates in computer science. Tesca Fitzgerald will introduce a new graduate-level seminar on Interactive Robot Learning. And Katerina Sotiraki will teach classes in theoretical and applied cryptography, at both the undergraduate and graduate level. These are just a few of the new courses that will be available.
Responding to industry needs, the department has also added courses focused on what's known as full-stack web programming - that is, the set of skills needed to develop the user-facing interface as well as the code behind a complete web application. One of the department's most popular courses, on software engineering, will now be offered in both semesters of the year instead of one. Both changes, Shao said, are specifically aimed at the needs of industry and students.
"As new challenges emerge, Computer Science at Yale will continue to adapt," Shao said. "We're excited about the future of our department, and these new additions to our faculty and our curriculum are going to be a major part of it."
CXL Brings Datacenter-sized Computing with 3.0 Standard, Thinks Ahead to 4.0 – HPCwire
Posted: at 7:47 pm
A new version of a standard backed by major cloud providers and chip companies could change the way some of the world's largest datacenters and fastest supercomputers are built.
The CXL Consortium on Tuesday announced a new specification called CXL 3.0, also known as Compute Express Link 3.0, that eliminates more chokepoints that slow down computation in enterprise computing and datacenters.
The new spec provides a communication link between chips, memory and storage in systems, and it is twice as fast as its predecessor, CXL 2.0.
CXL 3.0 also has improvements for more fine-grained pooling and sharing of computing resources for applications such as artificial intelligence.
"CXL 3.0 is all about improving bandwidth and capacity, and can better provision and manage computing, memory and storage resources," said Kurt Lender, the co-chair of the CXL marketing work group (and senior ecosystem manager at Intel), in an interview with HPCwire.
Hardware and cloud providers are coalescing around CXL, which has steamrolled other competing interconnects. This week, OpenCAPI, an IBM-backed interconnect standard, merged with the CXL Consortium, following in the footsteps of Gen-Z, which did the same in 2020.
The consortium released the first CXL 1.0 specification in 2019 and quickly followed it up with CXL 2.0, which supports PCIe 5.0, found in a handful of chips such as Intel's Sapphire Rapids and Nvidia's Hopper GPU.
The CXL 3.0 spec is based on PCIe 6.0, which was finalized in January. CXL 3.0 supports a data transfer speed of up to 64 gigatransfers per second, the same as PCIe 6.0.
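For a rough sense of scale, the raw arithmetic for a full-width x16 link works out as follows (illustrative only; FLIT framing, CRC and FEC overhead reduce delivered throughput below this figure):

```python
# Raw per-direction bandwidth of a CXL 3.0 / PCIe 6.0 x16 link.
# Signaling rate only: FLIT framing, CRC and FEC overhead are ignored.
gt_per_s = 64            # gigatransfers per second per lane
lanes = 16               # full-width x16 link

raw_gbit = gt_per_s * lanes        # 1024 Gb/s per direction
print(raw_gbit / 8, "GB/s")        # 128.0 GB/s, before protocol overhead
```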
The CXL interconnect can link up chips, storage and memory that are near and far from each other, and that allows system providers to build datacenters as one giant system, said Nathan Brookwood, principal analyst at Insight 64.
CXL's ability to support the expansion of memory, storage and processing in a disaggregated infrastructure gives the protocol a step up over rival standards, Brookwood said.
Datacenter infrastructures are moving to a decoupled structure to meet the growing processing and bandwidth needs for AI and graphics applications, which require large pools of memory and storage. AI and scientific computing systems also require processors beyond just CPUs, and organizations are installing AI boxes, and in some cases, quantum computers, for more horsepower.
CXL 3.0 improves bandwidth and capacity with better switching and fabric technologies, the CXL Consortium's Lender said.
"CXL 1.1 was sort of in the node, then with 2.0, you can expand a little bit more into the datacenter. And now you can actually go across racks; you can do decomposable or composable systems with the fabric technology that we've brought with CXL 3.0," Lender said.
At the rack level, one can make CPU or memory drawers as separate systems, and improvements in CXL 3.0 provide more flexibility and options in switching resources compared to previous CXL specifications.
Typically, servers have a CPU, memory and I/O, and can be limited in physical expansion. In disaggregated infrastructure, one can instead run a cable to a separate memory tray over the CXL protocol, without relying on the popular DDR bus.
"You can decompose or compose your datacenter as you like it. You have the capability of moving resources from one node to another, and don't have to do as much overprovisioning as we do today, especially with memory," Lender said, adding: "You can grow systems and sort of interconnect them now through this fabric and through CXL."
The CXL 3.0 protocol uses the electricals of the PCI-Express 6.0 protocol, along with its own protocols for I/O and memory. Improvements include support for new processors and endpoints that can take advantage of the added bandwidth. CXL 2.0 allowed only single-level switching; CXL 3.0 adds multi-level switching, which enables larger and more flexible fabric topologies, though each switch hop adds latency on the fabric.
"You can actually start looking at memory like storage: you could have hot memory and cold memory, and so on. You can have different tiering, and applications can take advantage of that," Lender said.
The protocol also accounts for the ever-changing infrastructure of datacenters, providing more flexibility on how system administrators want to aggregate and disaggregate processing units, memory and storage. The new protocol opens more channels and resources for new types of chips that include SmartNICs, FPGAs and IPUs that may require access to more memory and storage resources in datacenters.
"HPC composable systems: you're not bound by a box. HPC loves clusters today. And [with CXL 3.0] now you can do coherent clusters and low latency. The growth and flexibility of those nodes is expanding rapidly," Lender said.
The CXL 3.0 protocol can support up to 4,096 nodes, and has a new concept of memory sharing between different nodes. That is an improvement from a static setup in older CXL protocols, where memory could be sliced and attached to different hosts, but could not be shared once allocated.
"Now we have sharing, where multiple hosts can actually share a segment of memory. Now you can actually look at quick, efficient data movement between hosts if necessary, or if you have an AI-type application that you want to hand data from one CPU or one host to another," Lender said.
The new feature allows peer-to-peer connections between nodes and endpoints in a single domain, effectively walling traffic off so it moves only between the nodes connected to each other. That allows for faster accelerator-to-accelerator or device-to-device data transfer, which is key to building out a coherent system.
"If you think about some of the applications, and then some of the GPUs and different accelerators, they want to pass information quickly, and now they have to go through the CPU. With CXL 3.0, they don't have to go through the CPU this way, but the CPU is coherent, aware of what's going on," Lender said.
The pooling and allocation of memory resources is managed by software called the Fabric Manager. The software can sit anywhere in the system or its hosts to control and allocate memory, but it could ultimately have implications for software developers.
"If you get to the tiering level, and when you start getting all the different latencies in the switching, that's where there will have to be some application awareness and tuning of the application. I think we certainly have that capability today," Lender said.
It could be two to four years before companies start releasing CXL 3.0 products, and the CPUs will need to be aware of CXL 3.0, Lender said. Intel built in support for CXL 1.1 in its Sapphire Rapids chip, which is expected to start shipping in volume later this year. The CXL 3.0 protocol is backward compatible with the older versions of the interconnect standard.
CXL products based on earlier protocols are slowly trickling into the market. SK Hynix this week introduced its first DDR5 DRAM-based CXL memory samples and will start manufacturing CXL memory modules in volume next year. Samsung also introduced CXL DRAM earlier this year.
While products based on the CXL 1.1 and 2.0 protocols are on a two-to-three-year product release cycle, CXL 3.0 products could take a little longer, as the spec takes on a more complex computing environment.
"CXL 3.0 could actually be a little slower because of some of the Fabric Manager, the software work. They're not simple systems; when you start getting into fabrics, people are going to want to do proof of concepts and prove out the technology first. It's going to probably be a three-to-four-year timeframe," Lender said.
Some companies already started work on CXL 3.0 verification IP six to nine months ago and are fine-tuning the tools to the final specification, Lender said.
The CXL Consortium has a board meeting in October to discuss next steps, which could also involve CXL 4.0. The standards organization for PCIe, the PCI Special Interest Group (PCI-SIG), last month announced it was planning PCIe 7.0, which doubles the data transfer speed to 128 gigatransfers per second, twice that of PCIe 6.0.
Lender was cautious about how PCIe 7.0 could potentially fit into a next-generation CXL 4.0. CXL has its own set of I/O, memory and cache protocols.
"CXL sits on the electricals of PCIe, so I can't commit or absolutely guarantee that [CXL 4.0] will run on 7.0. But that's the intent: to use the electricals," Lender said.
In that case, one of the tenets of CXL 4.0 will be to double the bandwidth by going to PCIe 7.0, "but beyond that, everything else will be what we do: more fabric, or different tunings," Lender said.
CXL has been on an accelerated pace, with three specification releases since its formation in 2019. There was once confusion in the industry over the best high-speed, coherent I/O bus, but the focus has now coalesced around CXL.
"Now we have the fabric. There are pieces of Gen-Z and OpenCAPI that aren't even in CXL 3.0, so will we incorporate those? Sure, we'll look at doing that kind of work moving forward," Lender said.
IonQ to Participate in Third Annual North American Conference on Trapped Ions – HPCwire
Posted: August 2, 2022 at 2:44 pm
COLLEGE PARK, Md., Aug. 2, 2022 – IonQ, an industry leader in quantum computing, today announced its participation in the third annual North American Conference on Trapped Ions (NACTI). The event will take place at Duke University on August 1-4, 2022, and brings together dozens of the world's leading quantum scientists and researchers to discuss the latest advancements in the field of quantum.
Participating for the third time at this event, IonQ co-founder and CTO Jungsang Kim will speak on the latest IonQ Aria performance updates, IonQ Forte gate results, and the importance of an industry-wide benchmark based on a collection of real-world algorithms, such as algorithmic qubits (#AQ), that can better represent any quantum computer's performance and utility.
Other topics on the agenda for NACTI include: quantum scaling and architectures, including networking; fabrication and development of new traps; increasing accessibility; control hardware and software for trapped ions; new qub(d)its and gates; quantum computing and simulation employing ion trapping techniques; looking beyond atomic ions; precision measurements and clocks; among others.
About IonQ
IonQ, Inc. is a leader in quantum computing, with a proven track record of innovation and deployment. IonQ's current-generation quantum computer, IonQ Forte, is the latest in a line of cutting-edge systems, including IonQ Aria, a system that boasts industry-leading 20 algorithmic qubits. Along with record performance, IonQ has defined what it believes is the best path forward to scale. IonQ is the only company with its quantum systems available through the cloud on Amazon Braket, Microsoft Azure, and Google Cloud, as well as through direct API access. IonQ was founded in 2015 by Christopher Monroe and Jungsang Kim based on 25 years of pioneering research. To learn more, visit http://www.ionq.com.
Source: IonQ
Phasecraft receives two research grants as part of the Commercialising Quantum Technologies Challenge at UK Research and Innovation – PR Web
Posted: at 2:44 pm
"Were excited to be working with world experts on telecommunications networks at BT, and extending our ongoing partnership with Rigetti, to apply quantum algorithms to optimisation problems," says Phasecraft co-founder, Ashley Montanaro.
BRISTOL, England (PRWEB) August 02, 2022
Today Phasecraft, the quantum algorithms company, announced that it has jointly received two research grants from UK Research and Innovation (UKRI) as part of the Commercialising Quantum Technologies Challenge delivered by Innovate UK.
In collaboration with BT and Rigetti, Phasecraft will lead a grant-funded project focused on the development of near-term quantum computing for solving hard optimisation problems and constraint satisfaction problems. Computational problems in an array of fields, including network design, electronic design automation, logistics, and scheduling, are characterised by the need to find a solution among exponentially many potential candidates. Such problems are therefore exceptionally challenging, yet their applications and commercial potential are vast.
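To see why "exponentially many potential solutions" is the crux, consider MaxCut, a classic toy optimisation problem: every assignment of n nodes to two groups is a candidate, so brute force doubles its work with each added node. A small illustrative sketch (the graph is invented; Phasecraft's actual algorithms are not public):

```python
from itertools import product

# MaxCut on a toy 4-node graph: assign each node to one of two groups so
# that as many edges as possible cross between the groups. Brute force
# must examine 2**n assignments, doubling with every node added.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
n = 4

def cut_size(assignment):
    return sum(assignment[u] != assignment[v] for u, v in edges)

best = max(product([0, 1], repeat=n), key=cut_size)
print(best, cut_size(best))   # (0, 1, 0, 1) cuts 4 of the 5 edges
```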
"Phasecraft's goal is to significantly reduce the timescale for quantum advantage in several critical areas," says Phasecraft co-founder Ashley Montanaro. "We're excited to be working with world experts on telecommunications networks at BT, and extending our ongoing partnership with Rigetti, to apply quantum algorithms to optimisation problems. This project will build on our expertise in key underlying technologies, enabling us to determine whether near-term quantum computing could outperform classical methods in this application domain."
The second grant awarded to Phasecraft supports the development of near-term quantum computing to simulate currently intractable problems in materials modelling for photovoltaics. Phasecraft will lead this project in collaboration with UCL and Oxford PV, a leading company pioneering the commercialisation of perovskite photovoltaics; the award will enable the development of a modelling capability tailored to the real-world needs of the photovoltaics industry.
"Phasecraft has already proven that quantum computers have the potential to revolutionise materials modelling, even before fully scalable, fault-tolerant quantum computers become available," says Phasecraft co-founder Toby Cubitt. "The results we have obtained for battery materials are hugely encouraging and show how our work can really make the difference in critically important areas. We know that photovoltaics has a crucial role to play in the transition to green energy, and we are hugely excited to be the ones making quantum computing part of the green revolution."
Phasecraft's team brings together many of the world's leading quantum scientists and engineers, partnering with the world's leading developers of quantum hardware. The team's research has led to fundamental breakthroughs in quantum science, and Phasecraft is the market leader in quantum IP.
About Phasecraft
Phasecraft is the quantum algorithms company. We're building the mathematical foundations for quantum computing applications that solve real-world problems.
Our team brings together many of the world's leading quantum scientists, including founders Toby Cubitt, Ashley Montanaro, and John Morton, and quantum consultant Andrew Childs.
Through our partnerships with Google, IBM, and Rigetti, we enjoy unprecedented access to today's best quantum computers, which provides us with unique opportunities to develop foundational IP, inform the development of next-generation quantum hardware, and accelerate commercialization of high-value breakthroughs.
We are always looking for talented research scientists and partners interested in joining us on the front lines of quantum computing. To learn more about our scientific research, business partnerships, career opportunities, and fellowships, please visit phasecraft.io.
The Story of IQIM: Institute for Quantum Information and Matter Caltech Magazine – Caltech
Posted: at 2:44 pm
Then, in 2000, Preskill and Kimble received a grant from the National Science Foundation, which they used to form the Institute for Quantum Information (IQI) that same year.
"NSF got a surge of funding for a program they called Information Technology Research, which included a lot of practical things but also sort of a lunatic fringe of blue-sky research. And that's what we were part of," Preskill told AIP. "We had an amazing group of young people in the early 2000s who came through, many of whom are leaders of research in quantum information now, like Patrick Hayden, and Guifré Vidal, and Frank Verstraete, and quite a few others."
Vidal (postdoc 2001–05), now a senior staff research scientist at Google, recalled those early days as a Caltech postdoc during a Heritage Project interview: "John had the vision ... to hire interesting young people for [IQI], then apply a hands-off approach. He's not the type of person who needs to control everything and everyone."
Dave Bacon (BS '97), a former IQI postdoc, remembered IQI as a leading hub for quantum computing research:
"John literally started inviting everybody in the field to come visit. It was like all of quantum computing was flowing through that place, and I was in the main place we'd have the group meetings," he said in a Heritage Project interview. "It felt like everybody would come in and give a talk right outside my office. It was perfect."
Liang Jiang (BS '04), a former IQI postdoc and current professor at the University of Chicago, told Zierler during a Heritage Project interview that weekly meetings were so full of discussion and questions that Preskill had to impose a time limit: "You could only talk for one minute because some group members would get really excited with the results and would talk a lot about their research."
By 2011, advances in quantum computing hardware, such as superconducting circuits and qubits (the quantum mechanical analogue of a classical bit), gave Preskill and Kimble the impetus to apply for more NSF funding as a means to broaden the IQI's scope to include experimental work. They received that funding and, in 2011, changed the institute's name to the Institute for Quantum Information and Matter (IQIM), for which Preskill serves as the Allen V. C. Davis and Lenabelle Davis Leadership Chair of the Institute for Quantum Science and Technology.
Spiros Michalakis, staff researcher and manager of outreach at IQIM, described this name change in a recent Heritage Project interview as a visionary move, one that is still paying off: "We attached 'M,' matter, and it really mattered, because we started to have conversations about how you can implement certain things and how you can convert some of the theories into experiments. I didn't know many physicists, or many people who were part of physics or even mathematical physics, who were not, in one way or another, associated with IQIM. If you look at the roster, even now, for the second iteration of IQIM, the second cycle we have, there's a pretty cool medley of people."
As a sign of quantum computing's progression at Caltech and beyond, the Institute partnered with Amazon to build the AWS Center for Quantum Computing, which opened on campus last year. The goal of the collaboration is to create quantum computers and related technologies that have the potential to revolutionize data security, machine learning, medicine development, sustainability practices, and more.
"It is wonderful to see many of the graduate students and postdocs from the early days of IQIM come back to campus as senior research scientists at the AWS Center for Quantum Computing," Michalakis says. "IQIM brought together theorists and experimentalists with a vision toward a transformative future for all. Amazingly, we are reaping the benefits of that vision already, as the era of quantum information science and engineering unfolds before our eyes at an unprecedented pace. What an exciting time to be alive."
Coding the future | Business | insidetucsonbusiness.com – Inside Tucson Business
Posted: July 31, 2022 at 8:36 pm
At Quantum Quest, an all-girls quantum computing camp, 20 teenage female students recently stood on the precipice of a brand new technology: quantum coding.
"(Scientists) use quantum computers," program manager Gabbie Meis said. "(Quantum computers) actually use quantum mechanics to solve some of the world's largest problems, like things with lots of data or simulations that our classical computers just don't have enough power to do. Instead of our classical computers, quantum computers are actually an entirely different type of machine that is still being developed today."
This kind of computer requires quantum coding, and once programmed it could be used to help solve problems like mitigating the impacts of climate change; transportation mapping, such as figuring out how to remap the entire country of Australia with more efficient roadways; or even biomedical research, such as protein folding for vaccine development or drug discovery.
"Back in 2019, Google ran a problem on their quantum computer that they estimated would take the most powerful supercomputer about 10,000 years to solve," Meis said. "They said they got their (quantum) computers to solve it in less than two days."
During the camp, students learned Qiskit, an open-source (free) software development kit rather than a programming language in its own right; Meis called it a Python-backed library, Python being the programming language it builds on. Qiskit allows the students' classical computers, the kind most of us use at home, to communicate with quantum computers. Ironically, although the students all had their laptops open, the learning was done on dry-erase boards.
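For a flavor of what a first Qiskit lesson looks like, here is a minimal sketch that builds a two-qubit entangled (Bell) state; it assumes a recent Qiskit install and is illustrative, not the camp's actual curriculum:

```python
# A first Qiskit exercise: build a two-qubit entangled (Bell) state
# and inspect the outcome probabilities.
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(2)
qc.h(0)       # Hadamard: put qubit 0 into an equal superposition
qc.cx(0, 1)   # CNOT: entangle qubit 1 with qubit 0

state = Statevector.from_instruction(qc)
print(state.probabilities_dict())   # {'00': 0.5, '11': 0.5}
```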
"Quantum is interdisciplinary, so they're learning the basics in linear algebra," Meis said. "They're learning computer science and how to code in Python, and they're learning quantum physics, all wrapped in this single week."
The Coding School, located in Southern California, has a quantum coding initiative called Qubit by Qubit, named after the most basic unit of information in quantum computing. The initiative seeks to make quantum computing education accessible to students in K-12 because, as it stands right now, according to Meis, students don't usually encounter quantum computing until they are graduate students.
To bring quantum coding to the masses, the school developed the Quantum Quest camp and partners with other organizations to offer it locally. For Tucson, it partnered with the University of Arizona's Office of Societal Impact and the Girl Scouts of Southern Arizona (GSSA).
"When this all came about, it was the perfect marriage between the Coding School, the U of A and the Girl Scouts in trying to bring accessibility to this more advanced part of STEM," said Colleen McDonald, director of staff-supported programs for the GSSA. "As Girl Scouts, we see ourselves as the connector. We want to make sure that all girls have access to it."
The Coding School has been offering this camp for some time (this is its 10th), but it's the first time it has been offered in Tucson. Camp topics ranged from the foundational concepts that make up the quantum world, such as entanglement and qubits, to coding real quantum computers.
It's all new science; these students are at the very foundation of quantum coding, according to Meis, and that is part of why it is so important to offer the camp to young women: one, so they are introduced to quantum computing, and two, so they do not feel alone in their interest in the field.
"This is a hard science, right?" Meis said. "We really want our students to feel that there's a place in this for girls. We're really trying to empower them now while they're still in high school."
"I've worked with girls for two decades doing STEM with them, and one of the biggest things I hear is they think that they're alone in liking STEM, that they don't realize there are other girls who are also willing to push themselves," added Michelle Higgins, the associate director of the Office of Societal Impact.
The lead instructor for this camp is herself an example to these students. Emily Van Milligen is a doctoral student in the UArizona department of physics; her field of study is quantum entanglement and routing protocols. She noticed that not one student fell behind; they all listened.
"They love it," Van Milligen said. "They like the lectures I'm giving, which is exciting because that means they enjoy the content. I'm not doing anything that special."
One student, 18-year-old Sagan Friskey, a future Pima Community College student, spoke enthusiastically about the camp.
"I think it's super interesting to learn about, especially since we're at the very beginning of it becoming a part of something that you can learn about and work with," she said.
Gabriela Malo-Molina, 14, a student at Catalina Foothills High School, said she's never seen anything like this before but could be interested in looking deeper into it.
"I think this is a very special opportunity, and that this field will definitely be more commonly used in the future," she said. "And quantum computing in the future will be very helpful for discoveries, especially in the medical field."
Multiverse Collaborating with Bosch to Optimize Quality, Efficiency, and Performance in an Automotive Electronic Components Manufacturing Plant -…
Posted: at 8:36 pm
Multiverse and Bosch will be working to create a quantum computing model of the machinery and process flow at one of Bosch's manufacturing plants, a technique known as a digital twin. A model of the facility's activities is built inside the computer, enabling simulations and optimizations that can predict how the plant will perform under different scenarios.

The companies will be using customized quantum and quantum-inspired algorithms developed by Multiverse to model an automotive electronic components plant located in Madrid, Spain. They hope to have first results of this pilot implementation by the end of the year, with the goal of finding ways to enhance quality control, improve overall efficiency, minimize waste, and lower energy usage.

Bosch has a total of 240 manufacturing plants comprising over 120,000 machines and 250,000 devices, connected together to provide digital control and sensing that optimize performance. A successful implementation of the digital twin concept could therefore be extended to many more factories and give Bosch a significant productivity advantage in the future. A news release from Multiverse about this collaboration can be accessed on its website.
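Multiverse has not published the algorithms behind this pilot, but quantum-inspired optimization over a digital twin often amounts to annealing-style search over a cost model of the plant. The sketch below is a deliberately tiny classical simulated-annealing example; the jobs, machines, and parameters are all invented for illustration:

```python
import math
import random

random.seed(0)
jobs = [random.randint(1, 10) for _ in range(8)]   # job processing times

def makespan(assignment):
    """Finish time of the busiest of the 3 machines."""
    loads = [0, 0, 0]
    for time, machine in zip(jobs, assignment):
        loads[machine] += time
    return max(loads)

def anneal(steps=5000, t0=5.0):
    current = [random.randrange(3) for _ in jobs]  # random initial schedule
    best, best_cost = current[:], makespan(current)
    for step in range(steps):
        t = t0 * (1 - step / steps) + 1e-9         # linear cooling schedule
        cand = current[:]
        cand[random.randrange(len(jobs))] = random.randrange(3)  # local move
        # Accept improvements always; accept regressions with a probability
        # that shrinks as the "temperature" drops.
        delta = makespan(cand) - makespan(current)
        if delta <= 0 or random.random() < math.exp(-delta / t):
            current = cand
            if makespan(current) < best_cost:
                best, best_cost = current[:], makespan(current)
    return best, best_cost

print(anneal())   # (machine assignment for each job, resulting makespan)
```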
July 30, 2022