
Category Archives: Quantum Computing

$100 Million to Advance Duke Science and Technology Research – Duke Today

Posted: June 9, 2021 at 3:12 am

The Duke Endowment of Charlotte, N.C., is supporting Duke University's efforts to expand its faculty in computation, materials science and the resilience of the body and brain by completing the second phase of a $100 million investment.

This is the largest award Duke University has ever received.

Advancing Science and Technology

Better designs to capture the full potential of carbon-neutral energy. Harnessing the brain's resilience to fight Alzheimer's Disease. Developing cybersecurity tools to defend us from future threats. Read about these and other investments Duke is making in science and technology research and teaching.

The funds form the base of Duke Science and Technology, a faculty-hiring and fund-raising effort designed to elevate excellence in the sciences at Duke. They will be used to accelerate and expand the recruitment of new faculty in science, medicine, technology, engineering and mathematics. The funds will also expand core research strengths that allow Duke faculty to address difficult global challenges and prepare Duke students to be the leaders of the future.

"This extraordinary gift from The Duke Endowment advances our university's position as a destination for exceptional and visionary faculty in a competitive global market," said Duke President Vincent E. Price. "These scholars will accelerate discovery and collaborative research across our campus and around the world. Duke's next century will be one of unbounded intellectual curiosity in which uniquely talented and creative scientists come together in new ways to ask the most difficult questions and try to tackle the most critical challenges of our day."

The first $50 million of The Duke Endowment's historic commitment to support Duke Science and Technology was announced in 2019.

Minor Shaw, chair of the Endowment's Board of Trustees, said The Duke Endowment's founder, James B. Duke, was a visionary leader in business and philanthropy who seized opportunities to experiment and innovate. "Advancements in science and technology will transform our world," Shaw said. "By investing in the next generation of faculty at Duke, we can achieve a better future for us all."

The funding comes at a time when Duke is placing big bets on emerging technologies like quantum computing and addressing global challenges such as climate change and pandemic disease.

"The faculty we are able to recruit thanks to this investment from The Duke Endowment have enormous quality and potential," said Provost Sally Kornbluth, the university's chief academic officer. "We are confident that their work will result in increased impact, elevate Duke to new levels of scientific discovery and improve health outcomes for the citizens of North Carolina and beyond. We want to continue to build on this success."

In the two years since the university announced the first half of this $100 million award, the Duke Endowment's investment has been used to recruit and retain some of the country's leading scholar-scientists in a range of disciplines.

"At Duke, we are redefining what is possible in preventing and treating a range of health conditions, from cancer, brain disorders and infectious diseases to behavioral health issues," said A. Eugene Washington, M.D., chancellor for health affairs and president and chief executive officer of the Duke University Health System. "This generous gift ensures that our exceptional research community will continue to thrive with the very best scientists who value collaboration and interdisciplinarity, and drive bold ideas."

Duke will continue a targeted effort to recruit scientist-scholars at all levels in its strategic areas. The hiring effort is expected to continue over the next few years.

--- --- ---

Based in Charlotte and established in 1924 by industrialist and philanthropist James B. Duke, The Duke Endowment is a private foundation that strengthens communities in North Carolina and South Carolina by nurturing children, promoting health, educating minds and enriching spirits. Since its founding, it has distributed more than $4 billion in grants. The Endowment shares a name with Duke University and Duke Energy, but all are separate organizations.


Looking to the future of quantum cloud computing – Siliconrepublic.com

Posted: June 4, 2021 at 3:51 pm

Trinity College Dublin's Dan Kilper and University of Arizona's Saikat Guha discuss the quantum cloud and how it could be achieved.

Quantum computing has been receiving a lot of attention in recent years as several web-scale providers race towards so-called quantum advantage: the point at which a quantum computer is able to exceed the computing abilities of classical computing.

Large public sector investments worldwide have fuelled research activity within the academic community. The first claim of quantum advantage emerged in 2019 when Google, NASA and Oak Ridge National Laboratory (ORNL) demonstrated a computation that the quantum computer completed in 200 seconds and that the ORNL supercomputer verified up to the point of quantum advantage, estimated to require 10,000 years to complete to the end.

Roadmaps that take quantum computers even further into this regime are advancing steadily. IBM has made quantum computers available for online access for many years now and recently Amazon and Microsoft started cloud services to provide access for users to several different quantum computing platforms. So, what comes next?

The step beyond access to a single quantum computer is access to a network of quantum computers. We are starting to see this emerge from the web or cloud-based quantum computers offered by cloud providers: effectively quantum computing as a service, sometimes referred to as cloud-based quantum computing.

This consists of quantum computers connected by classical networks and exchanging classical information in the form of bits, or digital ones and zeros. When quantum computers are connected in this way, they each can perform separate quantum computations and return the classical results that the user is looking for.

It turns out that with quantum computers, there are other possibilities. Quantum computers perform operations on quantum bits, or qubits. It is possible for two quantum computers to exchange information in the form of qubits instead of classical bits. We refer to networks that transport qubits as quantum networks. If we can connect two or more quantum computers over a quantum network, then they will be able to combine their computations such that they might behave as a single larger quantum computer.

Quantum computing distributed over quantum networks thus has the potential to significantly enhance the computing power of quantum computers. In fact, if we had quantum networks today, many believe that we could immediately build large quantum computers far into the advantage regime simply by connecting many instances of today's quantum computers over a quantum network. With quantum networks built, and interconnected at various scales, we could build a quantum internet. And at the heart of this quantum internet, one would expect to find quantum computing clouds.
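
A back-of-the-envelope illustration of why networking quantum machines is so attractive (the qubit counts below are hypothetical round numbers, not figures from the article): an n-qubit machine works in a state space of 2^n amplitudes, so merging two machines over a quantum network multiplies, rather than adds, their addressable state spaces.

```python
# Illustrative arithmetic only: a quantum computer's state space grows
# as 2**n in the number of qubits n.
n = 50  # qubits per machine (hypothetical round number)

separate = 2 ** n + 2 ** n   # two machines computing independently
combined = 2 ** (2 * n)      # the same machines networked into one device

print(f"two separate {n}-qubit machines: {separate:.2e} amplitudes")
print(f"one networked {2 * n}-qubit machine: {combined:.2e} amplitudes")
print(f"the combined machine is 2**{n - 1} = {combined // separate:.1e}x larger")
```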

At present, scientists and engineers are still working on understanding how to construct such a quantum computing cloud. The key to quantum computing power is the number of qubits in the computer. These are typically micro-circuits or ions kept at cryogenic temperatures, near minus 273 degrees Celsius.

While these machines have been growing steadily in size, it is expected that they will eventually reach a practical size limit, and further computing power is therefore likely to come from network connections across quantum computers within the data centre, very much like today's classical computing data centres. Instead of racks of servers, one would expect rows of cryostats.


Once we start imagining a quantum internet, we quickly realise that there are many software structures that we use in the classical internet that might need some type of analogue in the quantum internet.

Starting with the computers, we will need quantum operating systems and computing languages. This is complicated by the fact that quantum computers are still limited in size and not engineered to run operating systems and programming the way that we do in classical computers. Nevertheless, based on our understanding of how a quantum computer works, researchers have developed operating systems and programming languages that might be used once a quantum computer of sufficient power and functionality is able to run them.

Cloud computing and networking rely on other software technologies such as hypervisors, which manage how a computer is divided up into several virtual machines, and routing protocols to send data over the network. In fact, research is underway to develop each of these for the quantum internet. With quantum computer operating systems still under development, it is difficult to develop a hypervisor to run multiple operating systems on the same quantum computer as a classical hypervisor would.

By understanding the physical architecture of quantum computers, however, one can start to imagine how it might be organised to support different subsets of qubits to effectively run as separate quantum computers, potentially using different physical qubit technologies and employing different sub-architectures, within a single machine.

One important difference between quantum and classical computers and networks is that quantum computers can make use of classical computers to perform many of their functions. In fact, a quantum computer in itself is a tremendous feat of classical system engineering with many complex controls to set up and operate the quantum computations. This is a very different starting point from classical computers.

The same can be said for quantum networks, which have the classical internet to provide control functions to manage the network operations. It is likely that we will rely on classical computers and networks to operate their quantum analogues for some time. Just as a computer motherboard has many other types of electronics other than the microprocessor chip, it is likely that quantum computers will continue to rely on classical processors to do much of the mundane work behind their operation.

With the advent of the quantum internet, it is conceivable that a quantum-signalling-equipped control plane could support certain quantum network functions even more efficiently.

When talking about quantum computers and networks, scientists often refer to fault-tolerant operations. Fault tolerance is a particularly important step toward realising quantum cloud computing. Without fault tolerance, quantum operations are essentially single-shot computations that are initialised and then run to a stopping point that is limited by the accumulation of errors due to quantum memory lifetimes expiring as well as the noise that enters the system with each step in the computation.

Fault tolerance would allow for quantum operations to continue indefinitely with each result of a computation feeding the next. This is essential, for example, to run a computer operating system.
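
A toy calculation makes this "single-shot" limitation concrete. Assuming, purely for illustration, a fixed independent error probability per operation (the rate below is an arbitrary round number, not a figure for any real device), the chance of an error-free run decays exponentially with circuit depth:

```python
# Toy model of error accumulation in an uncorrected quantum computation.
p_error = 0.001  # assumed probability of an error per operation

for depth in (100, 1_000, 10_000, 100_000):
    p_clean = (1 - p_error) ** depth  # chance the whole run stays error-free
    print(f"{depth:>7} operations: P(error-free run) = {p_clean:.3g}")

# P(error-free) heads to zero as depth grows, which is why, without
# fault tolerance, a computation must halt before errors swamp the
# result -- and why error correction is needed to run indefinitely.
```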

In the case of networks, loss and noise limit the distance that qubits can be transported, on the order of 100km today. Fault tolerance through operations such as quantum error correction would allow for quantum networks to extend around the world. This is quite difficult for quantum networks because, unlike classical networks, quantum signals cannot be amplified.

We use amplifiers everywhere in classical networks to boost signals that are reduced due to losses, for example, from traveling down an optical fibre. If we boost a qubit signal with an optical amplifier, we would destroy its quantum properties. Instead, we need to build quantum repeaters to overcome signal losses and noise.


If we can connect two fault-tolerant quantum computers at a distance that is less than the loss limits for the qubits, then the quantum error correction capabilities in the computers can in principle recover the quantum signal. If we build a chain of such quantum computers each passing quantum information to the next, then we can achieve the fault-tolerant quantum network that we need. This chain of computers linking together is reminiscent of the early classical internet when computers were used to route packets through the network. Today we use packet routers instead.

If you look under the hood of a packet router, it is composed of many powerful microprocessors that have replaced the computer routers and are much more efficient at the specific routing tasks involved. Thus, one might imagine a quantum analogue to the packet router, which would be a small purpose-built quantum computer designed for recovering and transmitting qubits through the network. These are what we refer to today as quantum repeaters, and with these quantum repeaters we could build a global quantum internet.
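
A rough numerical sketch of the loss problem these repeaters solve. Standard telecom fibre attenuates light by roughly 0.2 dB/km (a typical textbook figure, used here as an assumption), so single-photon survival falls off exponentially with distance, while a repeater chain only ever has to beat the per-segment loss:

```python
# Single-photon survival through optical fibre, assuming a typical
# attenuation of 0.2 dB/km (illustrative, not a quoted figure).
ALPHA_DB_PER_KM = 0.2

def survival(km: float) -> float:
    """Probability that a photon survives `km` of fibre."""
    return 10 ** (-ALPHA_DB_PER_KM * km / 10)

print(f"direct,  100 km: {survival(100):.1%}")        # ~1%: near today's limit
print(f"direct, 1000 km: {survival(1000):.0e}")       # effectively zero
print(f"per 50 km repeater hop: {survival(50):.0%}")  # each hop is recoverable
```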

Currently there is much work underway to realise a fault-tolerant quantum repeater. Recently a team in the NSF Center for Quantum Networks (CQN) achieved an important milestone in that they were able to use a quantum memory to transmit a qubit beyond its usual loss limit. This is a building block for a quantum repeater. The SFI Connect Centre in Ireland is also working on classical network control systems that can be used to operate a network of such repeaters.

Together we have our sights set on realising the networks that will make up the quantum internet.

By Dan Kilper and Saikat Guha

Dan Kilper is professor of future communication networks at Trinity College Dublin and director of the Science Foundation Ireland (SFI) Connect research centre.

Saikat Guha is director of the NSF-ERC Center for Quantum Networks and professor of optical sciences, electrical and computer engineering, and applied mathematics at the University of Arizona.


Swedish university is behind quantum computing breakthrough – ComputerWeekly.com

Posted: at 3:51 pm

Sweden's Chalmers University of Technology has achieved a quantum computing efficiency breakthrough with a novel type of thermometer capable of simply and rapidly measuring temperatures during quantum calculations.

The discovery adds a more advanced benchmarking tool that will accelerate Chalmers' work in quantum computing development.

The novel thermometer is the latest innovation to emerge from the university's research to develop an advanced quantum computer. The so-called OpenSuperQ project at Chalmers is coordinated with technology research organisation the Wallenberg Centre for Quantum Technology (WACQT), which is the OpenSuperQ project's main technology partner.

WACQT has set the goal of building a quantum computer capable of performing precise calculations by 2030. The technical requirements behind this ambitious target are based on superconducting circuits and developing a quantum computer with at least 100 well-functioning qubits. To realise this ambition, the OpenSuperQ project will require a processor working temperature close to absolute zero, ideally as low as 10 millikelvin (-273.14°C).

Headquartered at Chalmers University's research hub in Gothenburg, the OpenSuperQ project, launched in 2018, is intended to run until 2027. Working alongside the university in Gothenburg, WACQT is also operating support projects being run at the Royal Institute of Technology (Kungliga Tekniska Högskolan) in Stockholm and collaborating universities in Lund, Stockholm, Linköping and Gothenburg.

Pledged capital funding for the WACQT-managed OpenSuperQ project, which has been committed by the Knut and Alice Wallenberg Foundation together with 20 other private corporations in Sweden, currently amounts to SEK1.3bn (€128m). In March, the foundation scaled up its funding commitment to WACQT, doubling its annual budget to SEK80m over the next four years.

The increased funding by the foundation will lead to the expansion of WACQT's QC research team, and the organisation is looking to recruit a further 40 researchers for the OpenSuperQ project in 2021-2022. A new team is to be established to study nanophotonic devices, which can enable the interconnection of several smaller quantum processors into a large quantum computer.

The Wallenberg sphere incorporates 16 public and private foundations operated by various family members. Each year, these foundations allocate about SEK2.5bn to research projects in the fields of technology, natural sciences and medicine in Sweden.

"The OpenSuperQ project aims to take Sweden to the forefront of quantum technologies, including computing, sensing, communications and simulation," said Peter Wallenberg, chairman of the Knut and Alice Wallenberg Foundation.

"Quantum technology has enormous potential, so it is vital that Sweden has the necessary expertise in this area. WACQT has built up a qualified research environment and established collaborations with Swedish industry. It has succeeded in developing qubits with proven problem-solving ability. We can move ahead with great confidence in what WACQT will go on to achieve."

The novel thermometer breakthrough "opens the door to experiments in the dynamic field of quantum thermodynamics," said Simone Gasparinetti, assistant professor at Chalmers' quantum technology laboratory.

"Our thermometer is a superconducting circuit and directly connected to the end of the waveguide being measured," said Gasparinetti. "It is relatively simple and probably the world's fastest and most sensitive thermometer for this particular purpose at the millikelvin scale."

Coaxial cables and waveguides, the structures that guide waveforms and serve as the critical connection to the quantum processor, remain key components in quantum computers. The microwave pulses that travel down the waveguides to the quantum processor are cooled to extremely low temperatures along the way.

For researchers, a fundamental goal is to ensure that these waveguides are not carrying noise due to the thermal motion of electrons on top of the pulses that they send. Precise temperature measurement readings of the electromagnetic fields are needed at the cold end of the microwave waveguides, the point where the controlling pulses are delivered to the computer's qubits.

Working at the lowest possible temperature minimises the risk of introducing errors in the qubits. Until now, researchers have only been able to measure this temperature indirectly, and with relatively long delays. Chalmers University's novel thermometer enables very low temperatures to be measured directly at the receiving end of the waveguide with elevated accuracy and with extremely high time resolution.

The novel thermometer developed at the university provides researchers with a value-added tool to measure the efficiency of systems while identifying possible shortcomings, said Per Delsing, a professor at the department of microtechnology and nanoscience at Chalmers and director of WACQT.

"A certain temperature corresponds to a given number of thermal photons, and that number decreases exponentially with temperature," he said. "If we succeed in lowering the temperature at the end where the waveguide meets the qubit to 10 millikelvin, the risk of errors in our qubits is reduced drastically."
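
Delsing's exponential relationship is the Bose-Einstein occupation of the waveguide mode, n = 1/(exp(hf/kT) - 1). A quick sketch, assuming a representative 6 GHz microwave control frequency (the frequency is an assumption for illustration; the article does not give one):

```python
import math

h = 6.62607e-34   # Planck constant (J*s)
kB = 1.38065e-23  # Boltzmann constant (J/K)
f = 6e9           # control-pulse frequency in Hz (assumed, typical scale)

def mean_thermal_photons(T_kelvin: float) -> float:
    """Bose-Einstein occupation of a mode at frequency f, temperature T."""
    return 1.0 / math.expm1(h * f / (kB * T_kelvin))

for T in (1.0, 0.1, 0.01):
    print(f"T = {T * 1000:5.0f} mK: {mean_thermal_photons(T):.3e} thermal photons")

# Cooling from 1 K to 10 mK cuts the mean photon number from ~3 to
# ~1e-13: the drastic reduction in qubit error risk Delsing describes.
```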

The university's primary role in the OpenSuperQ project is to lead the work on developing the application algorithms that will be executed on the OpenSuperQ quantum computer. It will also support the development of algorithms for quantum chemistry, optimisation and machine learning.

Also, Chalmers will head up efforts to improve quantum coherence in chips with multiple coupled qubits, including device design, process development, fabrication, packaging and testing. It will also conduct research to evaluate the performance of 2-qubit gates and develop advanced qubit control methods to mitigate systematic and incoherent errors to achieve targeted gate fidelities.


Global IT giant to partner with U of C on quantum computing centre – Calgary Herald

Posted: at 3:51 pm


A global IT giant has announced plans to partner with the University of Calgary to create a centre of excellence for quantum computing in the city.

Bangalore-based Mphasis Ltd., a provider of IT outsourcing services, announced Wednesday that it will set up a Canadian headquarters in Calgary. The move is expected to create 500 to 1,000 local jobs within the next two to three years, according to company CEO Nitin Rakesh.

The company will also establish what it dubs the Quantum City Centre of Excellence at the University of Calgary to serve as a hub for companies focused on the commercial development of quantum technologies. Mphasis will be the anchor tenant and will work to draw in other companies working in the field.

Quantum computing uses the principles of quantum physics to solve problems. It is considered to be a huge leap forward from traditional computer technology, and has futuristic applications in the fields of medicine, energy, fintech, logistics and more.


In a virtual news conference Wednesday, Premier Jason Kenney called quantum computing "one of the most promising emerging high-tech sectors." He said the partnership between Mphasis and the University of Calgary will help make Alberta "a destination of choice for investment capital and talent in this growing field."

"The goal is to make Alberta a force to be reckoned with in quantum computing, machine learning and AI, economically but also intellectually," Kenney said. "Post-secondary students will have incredible opportunities to master the most sought-after skills through this venture."

Mphasis also announced its plans to establish Sparkle Calgary, which will offer training in artificial intelligence and automation technology for Albertans seeking a career transition. Rakesh said that, through this platform, Mphasis hopes to help address the skills shortage that currently plagues Alberta's tech sector, while at the same time helping out-of-work Albertans find a place in the new economy.

"There's a ton of data expertise that sits at the heart of the oil and gas industry," Rakesh said. "So can we take that ability to apply data knowledge, data science, and really re-skill (those workers) toward cloud computing . . . That's the vision we want to see."

The University of Calgary has been working for some time to help establish Alberta as a leader in quantum computing research through its Institute for Quantum Science and Technology, a multidisciplinary group of researchers from the areas of computer science, mathematics, chemistry and physics. The U of C is also a member of Quantum Alberta, which aims to accelerate quantum science research, development and commercialization in the province.


U of C president Ed McCauley said Wednesday he hopes that the partnership with Mphasis will lead to the birth of a new wave of startup companies in Calgary, ones that will use cutting-edge technology developed on campus.

"This (quantum) technology will not only create its own industry, but it will fuel advances in others," McCauley said. "Calgary will not only be an energy capital, it will be a quantum capital, too."

The federal government has identified quantum computing as critically important to the future economy. The most recent federal budget includes $360 million for a National Quantum Strategy encompassing funding for research, students and skills development.

Mphasis is the second major Indian IT company in recent months to announce it will set up shop in Calgary. In March, Infosys, a New York Stock Exchange-listed global consulting and IT services firm with more than 249,000 employees worldwide, said it will bring 500 jobs to the city over the next three years as part of the next phase of its Canadian expansion.

Like Mphasis, Infosys has formed partnerships with Calgary's post-secondary institutions to invest jointly in training programs that will help to develop a local technology talent pool.

astephenson@postmedia.com



What is Thermodynamic Computing and Could It Become Important? – HPCwire

Posted: at 3:51 pm

What, exactly, is thermodynamic computing? (Yes, we know everything obeys thermodynamic laws.) A trio of researchers from Microsoft, UC San Diego, and Georgia Tech have written an interesting Viewpoint in the June issue of Communications of the ACM, "A Vision to Compute like Nature: Thermodynamically."

Arguing that traditional computing is approaching hard limits for many familiar reasons, Todd Hylton (UCSD), Thomas Conte (Georgia Tech), and Mark Hill (Microsoft) sketch out this idea that it may be possible to harness thermodynamic computing to solve many currently difficult problem sets and to do so with lower power and better performance.

"Animals, plants, bacteria, and proteins solve problems by spontaneously finding energy-efficient configurations that enable them to thrive in complex, resource-constrained environments. For example, proteins fold naturally into a low-energy state in response to their environment," the researchers write in their paper. "In fact, all matter evolves toward low-energy configurations in accord with the Laws of Thermodynamics. For near-equilibrium systems these ideas are well known and have been used extensively in the analysis of computational efficiency and in machine learning techniques."

There's a nice summary description of the TC notion on a Computing Community Consortium (CCC) blog this week:

"What if we designed computing systems to solve problems through a similar process? The writers envision a thermodynamic computing system (TCS) as a combination of a conventional computing system and novel TC hardware. The conventional computer is a host through which users can access the TC and define a problem for the TC to solve. The TC, on the other hand, is an open thermodynamic system directly connected to real-world input potentials (for example, voltages), which drive the adaptation of its internal organization via the transport of charge through it to relieve those potentials."

In the ACM Viewpoint, the researchers say, "[W]e advocate a new, physically grounded, computational paradigm centered on thermodynamics and an emerging understanding of using thermodynamics to solve problems that we call Thermodynamic Computing or TC. Like quantum computers, TCs are distinguished by their ability to employ the underlying physics of the computing substrate to accomplish a task."

The recent Viewpoint is actually the fruit of a 2019 thermodynamic computing workshop sponsored by CCC and organized by the ACM Viewpoint authors. In many ways, their idea sounds somewhat similar to adiabatic quantum computing (e.g. D-Wave Systems) but without the need to maintain quantum state coherence during computation.

"Among existing computing systems, TC is perhaps most similar to neuromorphic computing, except that it replaces rule-driven adaptation and neuro-biological emulation with thermo-physical evolution," is how the researchers describe TC.

The broad idea to let a system seek thermodynamic equilibrium to compute isn't new and has been steadily advancing, as they note in their paper:

"The idea of using the physics of self-organizing electronic or ionic devices to solve computational problems has shown dramatic progress in recent years. For example, networks of oscillators built from devices exhibiting metal-insulator transitions have been shown to solve computational problems in the NP-hard class. Memristive devices have internal state dynamics driven by complex electronic, ionic, and thermodynamic considerations, which, when integrated into networks, result in large-scale complex dynamics that can be employed in applications such as reservoir computing. Other systems of memristive devices have been shown to implement computational models such as Hopfield networks and to build neural networks capable of unsupervised learning.

"Today we see opportunity to couple these recent experimental results with the new theories of non-equilibrium systems through both existing (for example, Boltzmann Machines) and newer (for example, Thermodynamic Neural Network) model systems."

The researchers say thermodynamic computing approaches are particularly well-suited for searching complex energy landscapes, leveraging both rapid device fluctuations and the ability to search a large space in parallel, and for addressing NP-complete combinatorial optimization problems or sampling many-variable probability distributions.
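
For a feel of what "computing by relaxing toward low energy" means, the closest familiar classical analogue is simulated annealing. The sketch below is that classical analogy only, not the TC hardware the authors propose (a TC would let the physics itself do the relaxing): a Metropolis loop cools a random Ising-style energy landscape, accepting uphill moves with Boltzmann probability.

```python
import math
import random

# Classical analogy to thermodynamic relaxation: simulated annealing on
# a toy Ising-style energy landscape with random couplings.
random.seed(0)
N = 20
J = [[random.choice((-1.0, 1.0)) for _ in range(N)] for _ in range(N)]

def energy(spins):
    """Ising-style energy: low when coupled spins align favourably."""
    return -sum(J[i][j] * spins[i] * spins[j]
                for i in range(N) for j in range(i + 1, N))

spins = [random.choice((-1, 1)) for _ in range(N)]
T = 5.0
while T > 0.01:
    for _ in range(100):
        i = random.randrange(N)
        before = energy(spins)
        spins[i] = -spins[i]              # propose flipping one spin
        dE = energy(spins) - before
        # Metropolis rule: always accept downhill moves; accept uphill
        # moves with Boltzmann probability exp(-dE / T).
        if dE > 0 and random.random() >= math.exp(-dE / T):
            spins[i] = -spins[i]          # reject: undo the flip
    T *= 0.9                              # gradually cool the system

print("final energy found:", energy(spins))
```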

They suggest a three-pronged TC development roadmap. Looking ahead, the authors write:

"At least initially, we expect that TC will enable new computing opportunities rather than replace Classical Computing at what Classical Computing does well (enough), following the disruption path articulated by Christensen. These new opportunities will likely enable orders of magnitude more energy efficiency and the ability to self-organize across scales as an intrinsic part of their operation. These may include self-organizing neuromorphic systems and the simulation of complex physical or biological domains, but the history of technology shows that compelling new applications often emerge after the technology is available."

The viewpoint is fascinating and best read directly.

Link to ACM Thermodynamic Computing Viewpoint: https://cacm.acm.org/magazines/2021/6/252841-a-vision-to-compute-like-nature/fulltext

Link to CCC blog: https://us5.campaign-archive.com/?e=afe05237d1&u=3403318289e02657adfc0822d&id=7b8ae80cfa


IBM has partnered with IITs, others to advance training, research in quantum computing – Elets

Posted: at 3:51 pm


Faculty and students at the selected institutions will be able to access IBM quantum systems, quantum learning resources and quantum tools over IBM Cloud for education and research purposes. This will allow these institutions to work on actual quantum computers and program them using the Qiskit open-source framework.
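
Qiskit is IBM's openly available Python framework, so a student's first program at one of these institutions might look like the Bell-state sketch below. Circuit construction is stable across Qiskit versions; actually running it on IBM hardware additionally requires an IBM account and backend setup, and the execution API has changed between releases, so only the circuit is shown here.

```python
from qiskit import QuantumCircuit

# Two-qubit Bell-state circuit: the "hello world" of quantum programming.
qc = QuantumCircuit(2, 2)
qc.h(0)                     # put qubit 0 into an equal superposition
qc.cx(0, 1)                 # entangle qubit 1 with qubit 0
qc.measure([0, 1], [0, 1])  # measurement yields '00' or '11', ~50/50

print(qc.draw())            # ASCII drawing of the circuit
```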

The selected institutions are Indian Institute of Science Education & Research (IISER) Pune, IISER Thiruvananthapuram, Indian Institute of Science (IISc) Bangalore, Indian Institute of Technology (IIT) Jodhpur, IIT Kanpur, IIT Kharagpur, IIT Madras, Indian Statistical Institute (ISI) Kolkata, Indraprastha Institute of Information Technology (IIIT) Delhi, Tata Institute of Fundamental Research (TIFR) Mumbai and the University of Calcutta.

The collaboration with India's top institutions is part of the IBM Quantum Educators program, which helps faculty in the quantum field connect with others. "The program offers multiple benefits like additional access to systems beyond IBM's open systems, pulse access on the additional systems, priority considerations when in queue and private collaboration channels with other educators in the program," read an IBM notice.



Malta Becomes Newest Participant in the EuroHPC Joint Undertaking – HPCwire

Posted: at 3:51 pm

June 4, 2021: Malta joins the European High Performance Computing Joint Undertaking (EuroHPC JU), a joint initiative between the EU, European countries, and private partners that pools resources to develop a world-class supercomputing ecosystem in Europe.

Malta, formerly an Observer on the EuroHPC JU Governing Board, will now be a full Member, alongside the other 32 Participating States.

Anders Dam Jensen, the European High Performance Computing Joint Undertaking (EuroHPC JU) Executive Director, said:

"We are delighted to welcome Malta to the EuroHPC Joint Undertaking family. Malta is joining the JU at an exciting moment for European digital autonomy, with the recent inauguration of the Vega supercomputer in Slovenia, and two more supercomputers will reinforce Europe's supercomputer ecosystem shortly: MeluXina in Luxembourg and Karolina in the Czech Republic. The coming years will see further acceleration and development of the EuroHPC JU project, as we strive towards Europe's ambition to become a world leader in high-performance computing, and we are thrilled that Malta is joining us on this journey."

Background information

The EuroHPC Joint Undertaking was established in 2018 and has been autonomous since September 2020.

The EuroHPC JU is currently equipping the EU with an infrastructure of petascale and precursor-of-exascale supercomputers, and developing the necessary technologies, applications and skills for reaching full exascale capabilities by 2023. One supercomputer is currently operational in Slovenia (Vega); another one (MeluXina) will be officially inaugurated in Luxembourg on 7 June 2021. Five more EuroHPC supercomputers have been procured and will be operational in 2021: Discoverer (Bulgaria), Karolina (Czech Republic), Deucalion (Portugal), Leonardo (Italy), and Lumi (Finland).

In addition, through its research and innovation agenda, the EuroHPC JU is also strengthening the European knowledge base in HPC technologies and bridging the digital skills gap, notably through the creation of a network of national HPC Competence Centres and other pan-European education initiatives.

The EuroHPC machines will be available to European researchers, industry, public administrations and SMEs. They will be a strategic resource for Europe, underpinning advances in sectors such as bio-engineering, weather forecasting, the fight against climate change, personalized medicine, as well as in the discovery of new materials and drugs that will benefit EU citizens.

A new regulation is currently being discussed at EU level and is expected to enter into force in the coming months, aiming to enable a further investment of EUR 7 billion in the next generation of supercomputers, such as exascale, post-exascale and quantum computers, and an ambitious R&I programme.

Source: EuroHPC JU


Quantum Computing – Intel

Posted: May 20, 2021 at 4:43 am

Ongoing Development in Partnership with Industry and Academia

The challenges in developing functioning quantum computing systems are manifold and daunting. For example, qubits themselves are extremely fragile, with any disturbance, including measurement, causing them to revert from their quantum state to a classical (binary) one, resulting in data loss. Tangle Lake also must operate at profoundly cold temperatures, within a small fraction of one kelvin from absolute zero.

Moreover, there are significant issues of scale, with real-world implementations at commercial scale likely requiring at least one million qubits. Given that reality, the relatively large size of quantum processors is a significant limitation in its own right; for example, Tangle Lake is about three inches square. To address these challenges, Intel is actively developing design, modeling, packaging, and fabrication techniques to enable the creation of more complex quantum processors.

Intel began collaborating with QuTech, a quantum computing organization in the Netherlands, in 2015; that involvement includes a US$50M investment by Intel in QuTech to provide ongoing engineering resources that will help accelerate developments in the field. QuTech was created as an advanced research and education center for quantum computing by the Netherlands Organisation for Applied Research and the Delft University of Technology. Combined with Intel's expertise in fabrication, control electronics, and architecture, this partnership is uniquely suited to the challenges of developing the first viable quantum computing systems.

Currently, Tangle Lake chips produced in Oregon are being shipped to QuTech in the Netherlands for analysis. QuTech has developed robust techniques for simulating quantum workloads as a means to address issues such as connecting, controlling, and measuring multiple, entangled qubits. In addition to helping drive system-level design of quantum computers, the insights uncovered through this work contribute to faster transition from design and fabrication to testing of future generations of the technology.

In addition to its collaboration with QuTech, Intel Labs is also working with other ecosystem members both on fundamental and system-level challenges on the entire quantum computing stack. Joint research being conducted with QuTech, the University of Toronto, the University of Chicago, and others builds upward from quantum devices to include mechanisms such as error correction, hardware- and software-based control mechanisms, and approaches and tools for developing quantum applications.

Beyond Superconduction: The Promise of Spin Qubits

One approach to addressing some of the challenges inherent to superconducting-qubit processors such as Tangle Lake is the investigation of spin qubits by Intel Labs and QuTech. Spin qubits function on the basis of the spin of a single electron in silicon, controlled by microwave pulses. Compared to superconducting qubits, spin qubits far more closely resemble existing semiconductor components operating in silicon, potentially taking advantage of existing fabrication techniques. In addition, this promising area of research holds the potential for advantages in the following areas:

Operating temperature: Spin qubits require extremely cold operating conditions, but to a lesser degree than superconducting qubits (approximately one kelvin, compared with 20 millikelvins); because the difficulty of achieving lower temperatures increases exponentially as one gets closer to absolute zero, this difference potentially offers significant reductions in system complexity.

Stability and duration: Spin qubits are expected to remain coherent for far longer than superconducting qubits, making it far simpler at the processor level to implement them for algorithms.

Physical size: Far smaller than superconducting qubits, a billion spin qubits could theoretically fit in one square millimeter of space. In combination with their structural similarity to conventional transistors, this property of spin qubits could be instrumental in scaling quantum computing systems upward to the estimated millions of qubits that will eventually be needed in production systems.

To date, researchers have developed a spin qubit fabrication flow using Intel's 300-millimeter process technology that is enabling the production of small spin-qubit arrays in silicon. In fact, QuTech has already begun testing small-scale spin-qubit-based quantum computer systems. As a publicly shared software foundation, QuTech has also developed the Quantum Technology Toolbox, a Python package for performing measurements and calibration of spin qubits.

Originally posted here:

Quantum Computing - Intel

Posted in Quantum Computing | Comments Off on Quantum Computing – Intel

Cloud-Based Quantum Computing, Explained | Freethink

Posted: at 4:43 am

Name a scientific breakthrough made in the last 50 years, and a computer probably played a role in it. Now, consider what sorts of breakthroughs may be possible with quantum computers.

These next-gen systems harness the weird physics of the subatomic world to complete computations far faster than classical computers, and that processing power promises to revolutionize everything from finance and healthcare to energy and aerospace.

But today's quantum computers are complex, expensive devices, not unlike those first gigantic modern computers: a person can't exactly pop down to an electronics retailer to pick one up (not yet, anyways).

However, there is a way for us to get a taste of that future, today: cloud-based quantum computing.

Cloud computing is the delivery of computing resources (data storage, processing power, software, etc.) on demand over the internet.

Today, there are countless cloud computing service providers, but a few of the biggest are Amazon Web Services (AWS), Microsoft Azure, and Google Cloud.

Amazon, Microsoft, and Google are massive companies, and their computing resources are equally expansive: AWS alone offers more than 175 cloud computing services, supported by more than 100 data centers across the globe.

A person might never be able to buy the resources those companies own, but through cloud computing, they can essentially rent them, only paying for what they actually use.

A scientist, for example, could pay AWS for 10 hours of access to one of the company's powerful virtual computers to run an experiment, rather than spending far more money to buy a comparable system.

That's just one use, though, and there are countless real-world examples of people using cloud computing services. The shows you watch on Netflix? The company stores them in a database in the cloud. The emails in your Gmail inbox? They're in the cloud, too.

Cloud-based quantum computing combines the benefits of the cloud with the next generation of computers.

In 2016, IBM connected a small quantum computer to the cloud, giving people their first chance to create and run small programs on a quantum computer online.

Since then, IBM has expanded its cloud-based quantum computing offerings and other companies, including Amazon, have developed and launched their own services.
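
Amazon's entry is Amazon Braket. A minimal sketch of what using it from Python looks like, run here on the Braket SDK's bundled local simulator rather than a managed cloud device (cloud execution would additionally require AWS credentials and incurs charges):

```python
from braket.circuits import Circuit
from braket.devices import LocalSimulator

# Bell pair on the Braket SDK's local simulator: no AWS account needed.
bell = Circuit().h(0).cnot(0, 1)
task = LocalSimulator().run(bell, shots=1000)
print(task.result().measurement_counts)  # roughly half '00', half '11'
```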

In 2019, Microsoft announced one such service, Azure Quantum, which includes access to quantum algorithms, hardware, and software. It made that service available to a small number of partners during a limited preview in May 2020.

"Azure Quantum enables every developer, every person, every enterprise to really tap in and succeed in their businesses and their endeavors with quantum solutions," Krysta Svore, the GM of Microsoft Quantum, said in 2019. "And that's incredibly powerful."

Now, Microsoft has expanded its Azure Quantum preview to the public, giving anyone with the funds access to the cutting-edge quantum resources.

Researchers at Case Western Reserve University have already used Microsoft's cloud-based quantum computing service to develop a way to improve the quality and speed of MRI scans for cancer.

Ford, meanwhile, is using it to try to solve the problem of traffic congestion.

Now that anyone can use the services, we could be seeing far more projects like these in the near future rather than years from now when quantum systems are (maybe) as accessible as classical computers are today.

We'd love to hear from you! If you have a comment about this article or if you have a tip for a future Freethink story, please email us at [emailprotected].


Google wants to build a useful quantum computer by 2029 – The Verge

Posted: at 4:43 am

Google is aiming to build a useful, error-corrected quantum computer by the end of the decade, the company explained in a blog post. The search giant hopes the technology will help solve a range of big problems, from feeding the world and climate change to developing better medicines. To develop the technology, Google has unveiled a new Quantum AI campus in Santa Barbara containing a quantum data center, hardware research labs, and quantum processor chip fabrication facilities. It will spend billions developing the technology over the next decade, The Wall Street Journal reports.

The target announced at Google I/O on Tuesday comes a year and a half after Google said it had achieved quantum supremacy, a milestone where a quantum computer has performed a calculation that would be impossible on a traditional classical computer. Google says its quantum computer was able to perform a calculation in 200 seconds that would have taken 10,000 years or more on a traditional supercomputer. But competitors racing to build quantum computers of their own cast doubt on Google's claimed progress. Rather than taking 10,000 years, IBM argued at the time that a traditional supercomputer could actually perform the task in 2.5 days or less.

This extra processing power could be useful to simulate molecules, and hence nature, accurately, Google says. This might help us design better batteries, create more carbon-efficient fertilizer, or develop more targeted medicines, because a quantum computer could run simulations before a company invests in building real-world prototypes. Google also expects quantum computing to have big benefits for AI development.

Despite claiming to have hit the quantum supremacy milestone, Google says it has a long way to go before such computers are useful. While current quantum computers are made up of fewer than 100 qubits, Google is targeting a machine built with 1,000,000. Getting there is a multistage process: Google says it first needs to cut down on the errors qubits make before it can think about building 1,000 physical qubits together into a single logical qubit. This will lay the groundwork for the quantum transistor, a building block of future quantum computers.
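
The arithmetic behind those targets is worth spelling out. The 1,000:1 encoding ratio below is the rough figure implied by the article, not a fixed law of error correction:

```python
# Rough error-correction overhead implied by the article's figures.
physical_target = 1_000_000    # Google's stated end-of-decade goal
physical_per_logical = 1_000   # physical qubits encoding one logical qubit

logical_qubits = physical_target // physical_per_logical
print(f"{physical_target:,} physical qubits -> ~{logical_qubits:,} logical qubits")

# Today's devices hold fewer than 100 physical qubits: not yet enough
# for even a single logical qubit at this overhead.
```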

Despite the challenges ahead, Google is optimistic about its chances. "We are at this inflection point," the scientist in charge of Google's Quantum AI program, Hartmut Neven, told the Wall Street Journal. "We now have the important components in hand that make us confident. We know how to execute the road map." Google eventually plans to offer quantum computing services over the cloud.

