Quantum Computing Is Hot And Noisy, But Zapata Opens Early Access – Forbes

Zapata's quantum coders, ready for a hot & noisy ride.

We're on the road to quantum computing. But these massively powerful machines are still at an embryonic prototype stage, and several key challenges remain before we can start to build more of them.

As a quantum reminder: traditional computers compute on the basis of binary 1s and 0s, so all values and mathematical logic are essentially built from those two values. Quantum particles in superposition (known as qubits) can be 1, 0, or anywhere in between, and the value expressed can differ depending on the angle from which the qubit is viewed. With that extra breadth, we can create far more algorithmic logic and computing power.
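A quick way to see what "anywhere in between" means is to model a single qubit as a pair of complex amplitudes whose squared magnitudes give the probabilities of reading 0 or 1. This is a minimal illustrative sketch (the `make_qubit` and `probabilities` helpers are hypothetical names, not any real SDK's API):

```python
import math

# A single qubit as a pair of complex amplitudes (alpha, beta):
# |psi> = alpha|0> + beta|1>, normalized so |alpha|^2 + |beta|^2 = 1.
def make_qubit(alpha, beta):
    norm = math.sqrt(abs(alpha) ** 2 + abs(beta) ** 2)
    return (alpha / norm, beta / norm)

# The squared magnitudes are the probabilities of measuring 0 or 1.
def probabilities(qubit):
    alpha, beta = qubit
    return abs(alpha) ** 2, abs(beta) ** 2

# An equal superposition: neither 0 nor 1, but both at once.
plus = make_qubit(1, 1)
print(probabilities(plus))  # ≈ (0.5, 0.5)
```

A classical bit can only ever be one of two points; the continuum of allowed (alpha, beta) pairs is the extra breadth the paragraph above refers to.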

One of the main challenges in building quantum computing machines is the massive heat they generate. Scientists have been working with different semiconducting materials, such as so-called quantum dots, to help overcome the heat challenge. The issue is that qubits are special, qubits are powerful, but qubits are also fragile... and heat is one of their sworn enemies.

Another core challenge is noise.

As computations pass through the quantum gates that make up the quantum circuits in our new super quantum machines, they accumulate a lot of noise disturbance (think of an engine revving louder as it speeds up). This is why we have come to define and accept the term NISQ, i.e. Noisy Intermediate-Scale Quantum, for this generation of quantum applications.

As beautifully clarified by theoretical physicist John Preskill in his 2018 paper: "Noisy Intermediate-Scale Quantum (NISQ) technology will be available in the near future. Quantum computers with 50-100 qubits may be able to perform tasks which surpass the capabilities of today's classical digital computers, but noise in quantum gates will limit the size of quantum circuits that can be executed reliably. Quantum technologists should continue to strive for more accurate quantum gates and, eventually, fully fault-tolerant quantum computing."

The fact that we know about the heat and noise challenges hasn't stopped companies like Strangeworks, D-Wave Systems, Coldquanta and others (including the usual suspects Intel, IBM and Microsoft) from forging on with development in the quantum space. Joining that list is Boston-headquartered Zapata Computing, Inc. The company describes itself as "the quantum software company for near-term/NISQ-based quantum applications empowering enterprise teams." Near-term in this case means, well, now, i.e. quantum software we can actually use on quantum devices of about 100-300 qubits.

Zapata's latest quantum leap (pun absolutely intended) is an early access program for Orquestra, its platform for quantum-enabled workflows. The company claims to have provided a software- and hardware-interoperable enterprise quantum toolset, i.e., again, quantum tools we can actually use in modern-day enterprise IT departments.

Using Zapata's unified Quantum Operating Environment, users can build, run and analyze quantum and quantum-inspired workflows. "This toolset will empower enterprises and institutions to make their quantum mark on the world, enabling them to develop quantum capabilities and foundational IP today while shoring up derivative IP for tomorrow," says CEO Christopher Savoie. "It is a new computing paradigm, built on a unified enterprise framework that spans quantum and classical programming and hardware tools. With Orquestra, we are accelerating quantum experiments at scale."

Zapata's Early Access Program for Orquestra is aimed at users with backgrounds in software engineering, machine learning, physics, computational chemistry or quantum information theory who are working on the most computationally complex problems.

Orquestra is agnostic across the entire software and hardware stack. It offers an extensible library of open source and Zapata-created components for writing, manipulating and optimizing quantum circuits and running them across quantum computers, quantum simulators and classical computing resources. It comes equipped with a versatile workflow system and Application Programming Interfaces (APIs) to connect all modes of quantum devices.

"We developed Orquestra to scale our own work for our customers and then realized the quantum community needs it, too. Orquestra is the only system for managing quantum workflows," said Zapata CTO Yudong Cao. "The way we design and deploy computing solutions is changing. Orquestra's interoperable nature enables extensible and modular implementations of algorithms and workflows across platforms and unlocks fast, fluid repeatability of experiments at scale."

So we're on a journey: the road from classical to quantum. The best advice is to insist on an interoperable vehicle (as Zapata has provided here) and to take a modular, extensible approach. In car-analogy terms, that means breaking the journey into bite-size chunks and making sure you have enough gas for the long haul. The quantum software parallel is obvious enough not to need spelling out.

Even when quantum evolves to become more ubiquitously available, many people think it will still be largely delivered as a cloud computing Quantum-as-a-Service (QaaS) package, but understanding the noisy overheated engine room in the meantime makes for a fascinating movie preview.

Read the original here:
Quantum Computing Is Hot And Noisy, But Zapata Opens Early Access - Forbes

Will Quantum Computing Really Change The World? Facts And Myths – Analytics India Magazine

In recent years, big tech companies like IBM, Microsoft, Intel and Google have been working in relative silence on something that sounds great: quantum computing. The main problem is that it is difficult to know what exactly it is and what it can be useful for.

Some questions can be answered easily. For example, quantum computing is not going to give you more FPS on your graphics card any time soon. Nor will it be as easy as swapping your computer's CPU for a quantum one to make it hyperfast. Quantum computing is fundamentally different from the computing we are used to, but how?

At the beginning of the 20th century, Planck and Einstein proposed that light is not a continuous wave (like the waves in a pond) but is divided into small packages, or quanta. This apparently simple idea served to solve a problem called the ultraviolet catastrophe. Over the years other physicists developed it and came to surprising conclusions about matter, two of which will interest us here: the superposition of states and entanglement.

To understand why we are interested, let's take a short break and think about how a classical computer works. The basic unit of information is the bit, which can have two possible states (1 or 0) and with which we can perform various logical operations (AND, NOT, OR). Putting together n bits we can represent numbers and operate on those numbers, but with limitations: we can only represent up to 2^n different states, and if we want to change x bits we have to perform at least x operations on them: there is no way to magically change them without touching them.
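Both limitations are easy to check directly. A small illustrative snippet, not tied to any particular machine:

```python
# With n classical bits we can represent exactly 2**n distinct states,
# and changing x bits means touching each of those x bits individually.
n = 8
print(2 ** n)  # 256 distinct states for one byte

# The basic logical operations on bit patterns:
a, b = 0b1100, 0b1010
print(bin(a & b))         # AND -> 0b1000
print(bin(a | b))         # OR  -> 0b1110
print(bin(~a & 0b1111))   # NOT (masked to 4 bits) -> 0b11
```

Everything a classical program does reduces to compositions of operations like these, applied one definite state at a time.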

Well, superposition and entanglement reduce these limitations: with superposition we can store far more than just 2^n states in n quantum bits (qubits), and entanglement maintains certain relations between qubits such that operations on one qubit necessarily affect the rest.
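The "operations on one qubit affect the rest" point can be caricatured with a maximally entangled (Bell) pair, where the two measurement outcomes always agree. This toy sketch only mimics the correlation statistics of the state (|00> + |11>)/sqrt(2), not the underlying quantum mechanics:

```python
import random

# A Bell pair in state (|00> + |11>)/sqrt(2): reading one qubit
# immediately fixes what the other will read, however far apart they are.
def measure_bell_pair():
    outcome = random.choice([0, 1])  # 50/50 between |00> and |11>
    return outcome, outcome          # the two qubits always agree

random.seed(42)
pairs = [measure_bell_pair() for _ in range(1000)]
print(all(a == b for a, b in pairs))  # True: perfectly correlated
```

Each qubit on its own looks like a fair coin; only the relation between them carries the entanglement.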

Superposition, while looking like a blessing at first glance, is also a problem. As Alexander Holevo showed in 1973, even though n qubits can hold many more states, in practice we can only read 2^n different ones. As we saw in a Genbeta article on the foundations of quantum computing: a qubit is not simply worth 1 or 0 like a normal bit; it can be, say, 80% 1 and 20% 0. The problem is that when we read it we can only obtain either 1 or 0, and the probabilities each value had are lost, because measuring the qubit modifies it.
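That "80% 1, 20% 0" qubit can be caricatured classically: each read collapses to a single bit, so the underlying probabilities only become visible across many repeated preparations and measurements. A toy sketch (the `measure` helper is a hypothetical stand-in for a real measurement):

```python
import random

# Each measurement collapses the qubit to a definite 0 or 1;
# the 80/20 split is only recoverable statistically, over many runs.
def measure(p_one):
    return 1 if random.random() < p_one else 0

random.seed(0)
samples = [measure(0.8) for _ in range(10_000)]
print(sum(samples) / len(samples))  # close to 0.8
```

No single measurement reveals the amplitudes, which is exactly the limitation Holevo's bound formalizes.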

This discrepancy between the information kept by the qubits and what we can read led Benioff and Feynman to argue that a classical computer could not simulate a quantum system without a disproportionate amount of resources, and to propose models of a quantum computer that could perform that simulation.

Those quantum computers would probably be nothing more than a scientific curiosity without the second concept, entanglement, which made possible two quite relevant algorithms: quantum annealing, in 1989, and Shor's algorithm, in 1994. The first finds minimum values of functions, which, put like that, does not sound very interesting, but it has applications in artificial intelligence and machine learning, as we discussed in another article. For example, if we manage to encode the error rate of a neural network as a function to which we can apply quantum annealing, the minimum value will tell us how to configure the network to be as efficient as possible.
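Quantum annealing itself needs special hardware, but its classical cousin, simulated annealing, illustrates the same idea: minimize a cost function by mostly moving downhill while occasionally accepting uphill moves to escape local minima. A rough sketch with an arbitrary bumpy test function (all names and parameters here are illustrative choices, not a standard implementation):

```python
import math
import random

# Classical simulated annealing -- a rough analogue of what a quantum
# annealer does physically: search for the minimum of a cost function.
def anneal(cost, x0, steps=20_000, temp0=2.0):
    x, best = x0, x0
    for i in range(steps):
        temp = temp0 * (1 - i / steps) + 1e-9   # cool down over time
        candidate = x + random.gauss(0, 0.5)     # propose a nearby point
        delta = cost(candidate) - cost(x)
        # Always accept improvements; accept worsening moves with a
        # probability that shrinks as the temperature drops.
        if delta < 0 or random.random() < math.exp(-delta / temp):
            x = candidate
        if cost(x) < cost(best):
            best = x
    return best

random.seed(1)
# A bumpy function: local minima everywhere, global minimum near x ≈ -0.3.
f = lambda x: x * x + math.sin(5 * x)
print(round(anneal(f, x0=4.0), 2))
```

The neural-network example in the text corresponds to swapping `f` for the network's error rate as a function of its configuration.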

The second, Shor's algorithm, decomposes a number into its prime factors much more efficiently than any known method on a normal computer. Again, put like that, it doesn't sound interesting. But if I tell you that RSA, one of the most widely used algorithms for protecting and encrypting data on the Internet, relies on factoring being exponentially slow (adding one bit to the key doubles the time a brute-force attack takes), then things change. A quantum computer with enough qubits would render many encryption systems completely obsolete.

So far, quantum computing has not been applied much in the real world. To give an idea: with the twenty qubits of the commercial quantum computer announced by IBM, we could apply Shor's factorization algorithm only to numbers smaller than 1048576 (2^20), which as you can imagine is not very impressive.
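For a sense of the classical baseline, here is naive trial-division factoring. (Real attacks on RSA use far more sophisticated classical algorithms; Shor's advantage is over those, so this snippet is purely illustrative.)

```python
# Naive classical factoring by trial division: the work grows with the
# size of the smallest prime factor, which is why RSA uses large primes.
def factor(n):
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

print(factor(15))            # [3, 5] -- 15 was the first number Shor's algorithm factored, in 2001
print(len(factor(2 ** 20)))  # 20: IBM's twenty qubits cover numbers below 2**20
```

At cryptographic key sizes (hundreds of digits), no classical method, naive or otherwise, finishes in a useful amount of time; that is the gap Shor's algorithm would close.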

Still, the field is evolving promisingly. In 1998 the first quantum computer was built (only two qubits, and it needed a nuclear magnetic resonance machine) to solve a toy problem, the so-called Deutsch-Jozsa problem. In 2001 Shor's algorithm was run for the first time. Only six years later, in 2007, D-Wave presented its first computer capable of quantum annealing with 16 qubits. This year, the same company announced a 2,000-qubit quantum annealing computer. The new IBM computers, meanwhile, though they have fewer qubits, can implement generic algorithms, not only quantum annealing. In short, the push is strong, and quantum computing will be increasingly applicable to real problems.

What might those applications be? As mentioned before, the quantum annealing algorithm is well suited to machine learning problems, which makes the computers that implement it extremely useful, even though that single algorithm is all they can run. If systems that, for example, transcribe conversations or identify objects in images can be translated into a form trainable on quantum computers, the results could be orders of magnitude better than those that already exist. The same algorithm could also be used to find solutions to problems in medicine or chemistry, such as finding optimal treatment plans for a patient or studying the possible structures of complex molecules.

Generic quantum computers, which for now have fewer qubits, could run more algorithms. For example, they could be used to break much of the cryptography in use today, as discussed earlier (which explains why the NSA wanted a quantum computer). They would also serve as super-fast search engines if Grover's search algorithm can be implemented, and for physics and chemistry they could be very useful as efficient simulators of quantum systems.

Unfortunately, algorithms and code written for classical computers cannot simply be run on a quantum computer for a magical speedup: you need to develop a quantum algorithm (not a trivial thing) and implement it in order to get that improvement. That, at first, greatly restricts the applications of quantum computers, and it will remain a hurdle even when those systems are more developed.

However, the main problem facing quantum computing is building the computers themselves. Compared with a normal computer, a quantum computer is an extremely complex machine: it operates at a temperature close to absolute zero (-273 °C), the qubits are implemented in superconducting circuits, and the components needed to read and manipulate the qubits are not simple either.

What would a non-quantum "quantum computer" look like? As explained before, the two relevant concepts are superposition and entanglement; without them, the speed-ups that quantum algorithms promise cannot exist. If disturbances quickly push superposed qubits into classical states, or break the entanglement between qubits, what we have is not a quantum computer but an extremely expensive machine that runs only a handful of algorithms no better than a normal computer (and will probably give erroneous results).

Of the two properties, entanglement is the harder to maintain and to prove. The more qubits there are, the easier it is for one of them to disentangle (which explains why increasing the number of qubits is not a trivial task). And it is not enough to build the computer and see correct results come out to claim the qubits are entangled: finding evidence of entanglement is a task in itself, and in fact the lack of such evidence was one of the main criticisms of D-Wave's systems in their early days.

A priori, with the materials quantum computers are currently built from, miniaturization does not seem very feasible. But there is already research into new materials that could be used to create more accessible quantum computers. Who knows, perhaps fifty years from now we will be able to buy quantum CPUs to improve the speed of our computers.


See the original post:
Will Quantum Computing Really Change The World? Facts And Myths - Analytics India Magazine

Google’s top quantum computing brain may or may not have quit – Fudzilla

We will know when someone opens his office door

John Martinis, who had established Googles quantum hardware group in 2014, has cleaned out his office, put the cats out and left the building.

Martinis says that a few months after he got Google's now-legendary quantum computing experiment to work, he was reassigned from a leadership position to an advisory one.

Martinis told Wired that the change led to disagreements with Hartmut Neven, the longtime leader of Google's quantum project.

Martinis said he had to go because his professional goal is for someone to build a quantum computer.

Google has not disputed this account, saying that the company is grateful for Martinis' contributions and that Neven continues to head the company's quantum project.

Martinis retains his position as a professor at UC Santa Barbara, which he held throughout his tenure at Google, and says he will continue to work on quantum computing.

To be fair, Google's quantum computing project was founded by Neven, who pioneered Google's image search technology, and got enough cats together.

The project took on greater scale and ambition when Martinis joined in 2014 to establish Google's quantum hardware lab in Santa Barbara, bringing along several members of his university research group. His nearby lab at UC Santa Barbara had produced some of the most prominent work in the field over the previous 20 years, helping to demonstrate the potential of using superconducting circuits to build qubits, the building blocks of quantum computers.

Google's ground-breaking supremacy experiment used 53 qubits working together. They took minutes to crunch through a carefully chosen math problem that the company calculated would take a supercomputer 10,000 years to work out. It still does not have a practical use, and the cats were said to be bored with the whole thing.

Originally posted here:
Google's top quantum computing brain may or may not have quit - Fudzilla

On the Heels of a Light Beam – Scientific American

As a 16-year-old boy, Albert Einstein imagined chasing after a beam of light in the vacuum of space. He mused on that vision for years, turning it over in his mind, asking questions about the relation between himself and the beam. Those mental investigations eventually led him to his special theory of relativity. Such thought experiments, which Einstein referred to by the German term gedankenexperiment, continue to nourish the heart of physics today, especially in the field of quantum mechanics, which he helped to establish.

"In quantum mechanics, things don't happen," theoretical physicist Stephen L. Adler tells our reporter Tim Folger, referring to the probabilistic nature of quantum reality.

Philosophically, this may be true, but it hasn't stopped researchers from testing quantum concepts. Using lasers to excite electrons into emitting photons, a group at Delft University of Technology in the Netherlands ruled out the existence of the hidden variables that Einstein believed were controlling so-called entangled particles, whose behavior is one of the main tenets of quantum theory. Without these mysterious forces, bizarre dynamics could indeed be at work in the quantum world, defying our notions of space and time. Physicist Lee Smolin argues that the fabric of the cosmos is a vast collection of atomic interactions within an evolving network of relations, where causality among events is complex and irrespective of distance.

Despite the theoretical mysteries of quantum theory, its real-world applications are growing. Researchers are cooling atomic systems to near absolute zero for use as quantum simulators to study applications in superconductors and superfluids. Others are using tabletop experiments to monitor the gravitational fields around entangled objects (minuscule gold or diamond spheres, for example), looking for signs that gravity itself is quantized into discrete bits. At a larger scale, tools such as the Event Horizon Telescope, which recently took the first picture of a black hole, and gravitational-wave detectors could help resolve long-standing, vexing contradictions between quantum mechanics and general relativity.

These quantum insights are fueling tremendous innovation. A team of researchers in China successfully tested entanglement over a distance of 1,200 kilometers, paving the way for an unhackable quantum-communications network. Computer scientists are using quantum algorithms to enhance traditional systems, ratcheting up progress toward the heralded quantum computing era. Such applications are still immature, as Elizabeth Gibney reports, yet that isn't stopping investors from pouring money into quantum start-ups.

Science historians have argued about whether Einstein accepted the elements of quantum theory that conflicted with his own theories. Who knows whether he could have imagined the applications his ideas engendered. In any case, the thought experiment continues.

See the original post:
On the Heels of a Light Beam - Scientific American

Eleven Princeton faculty elected to American Academy of Arts and Sciences – Princeton University

Princeton faculty members Rubén Gallo, M. Zahid Hasan, Amaney Jamal, Ruby Lee, Margaret Martonosi, Tom Muir, Eve Ostriker, Alexander Smits, Leeat Yariv and Muhammad Qasim Zaman have been named members of the American Academy of Arts and Sciences. Visiting faculty member Alondra Nelson also was elected to the academy.

They are among 276 scholars, scientists, artists and leaders in the public, nonprofit and private sectors elected this year in recognition of their contributions to their respective fields.

Gallo is the Walter S. Carpenter, Jr., Professor in Language, Literature, and Civilization of Spain and a professor of Spanish and Portuguese. He joined the Princeton faculty in 2002. His most recent book is Conversación en Princeton (2017), with Mario Vargas Llosa, who was teaching at Princeton when he received the Nobel Prize in Literature in 2010.

Gallo's other books include Proust's Latin Americans (2014); Freud's Mexico: Into the Wilds of Psychoanalysis (2010); Mexican Modernity: The Avant-Garde and the Technological Revolution (2005); New Tendencies in Mexican Art (2004); and The Mexico City Reader (2004). He is currently working on Cuba: A New Era, a book about the changes in Cuban culture after the diplomatic thaw with the United States.

Gallo received the Gradiva award for the best book on a psychoanalytic theme and the Modern Language Association's Katherine Singer Kovacs Prize for the best book on a Latin American topic. He is a member of the board of the Sigmund Freud Museum in Vienna, where he also serves as research director.


Hasan is the Eugene Higgins Professor of Physics. He studies fundamental quantum effects in exotic superconductors, topological insulators and quantum magnets to make new discoveries about the nature of matter, work that may have future applications in areas such as quantum computing. He joined the faculty in 2002 and has since led his research team to publish many influential findings.

Last year, Hasan's lab led research that discovered that certain classes of crystals with an asymmetry like biological handedness, known as chiral crystals, may harbor electrons that behave in unexpected ways. In 2015, he led a research team that first observed Weyl fermions, which, if applied to next-generation electronics, could allow for a nearly free and efficient flow of electricity, and thus greater power, especially for computers.

In 2013, Hasan was named a fellow of the American Physical Society for the experimental discovery of three-dimensional topological insulators, a new kind of quantum matter. In 2009, he received a Sloan Research Fellowship for groundbreaking research.


Jamal is the Edwards S. Sanford Professor of Politics and director of the Mamdouha S. Bobst Center for Peace and Justice. She has taught at Princeton since 2003. Her current research focuses on the drivers of political behavior in the Arab world, Muslim immigration to the U.S. and Europe, and the effect of inequality and poverty on political outcomes.

Jamal also directs the Workshop on Arab Political Development and the Bobst-AUB Collaborative Initiative, and is principal investigator for the Arab Barometer project, which measures public opinion in the Arab world. She is a former president of the Association of Middle East Women's Studies.

Her books include Barriers to Democracy (2007), which won the 2008 APSA Best Book Award in comparative democratization, and Of Empires and Citizens (Princeton University Press, 2012). She is co-editor of Race and Arab Americans Before and After 9/11: From Invisible Citizens to Visible Subjects (2007) and Citizenship and Crisis: Arab Detroit After 9/11 (2009).


Lee is the Forrest G. Hamrick Professor in Engineering and professor of electrical engineering, and an associated faculty member in computer science. Lee joined the Princeton faculty in 1998. Her work at Princeton explores how the security and performance of computing systems can be significantly and simultaneously improved by hardware architecture. Her designs of secure processor architectures have strongly influenced industry security offerings and inspired new generations of academic researchers in hardware security, side-channel attacks and defenses, secure processors and caches, and enhanced cloud computing and smartphone security.

Her research lies at the intersection of computer architecture, cybersecurity and, more recently, the branch of artificial intelligence known as deep learning.

Lee spent 17 years designing computers at Hewlett-Packard, where she was a chief architect before coming to Princeton. Among many achievements, Lee is known in the computer industry for her design of the HP Precision Architecture (HPPA, or PA-RISC), which powered HP's commercial and technical computer product families for several decades and was widely regarded as introducing key forward-looking features. In the '90s she spearheaded the development of microprocessor instructions for accelerating multimedia, which enabled video and audio streaming and led to ubiquitous digital media. Lee is a fellow of the Association for Computing Machinery and the Institute of Electrical and Electronics Engineers.

Margaret Martonosi, the Hugh Trumbull Adams '35 Professor of Computer Science, specializes in computer architecture and mobile computing with an emphasis on power efficiency. She was one of the architects of the Wattch power modeling infrastructure, a tool that was among the first to allow computer scientists to incorporate power consumption into early-stage computer systems design. Her work helped demonstrate that power needs can help dictate the design of computing systems. More recently, Martonosi's work has also focused on architecture and compiler issues in quantum computing.

She currently serves as head of the National Science Foundation's Directorate for Computer and Information Science and Engineering, one of seven top-level divisions within the NSF. From 2017 until February 2020, she directed Princeton's Keller Center for Innovation in Engineering Education, a center focused on enabling students across the University to realize their aspirations for addressing societal problems. She is an inventor who holds seven U.S. patents and has co-authored two technical reference books on power-aware computer architecture. In 2018, she was one of 13 co-authors of a National Academies consensus study report on progress and challenges in quantum computing.

Martonosi is a fellow of the Association for Computing Machinery (ACM) and the Institute of Electrical and Electronics Engineers (IEEE). Among other honors, she has received a Jefferson Science Fellowship, the IEEE Technical Achievement Award, and the ACM SIGARCH Alan D. Berenbaum Distinguished Service Award. She joined the Princeton faculty in 1994.

Muir is the Van Zandt Williams, Jr., Class of '65 Professor of Chemistry and chair of the chemistry department. He joined Princeton in 2011 and is also an associated faculty member in molecular biology.

He leads research in investigating the physiochemical basis of protein function in complex systems of biomedical interest. By combining tools of organic chemistry, biochemistry, biophysics and cell biology, his lab has developed a suite of new technologies that provide fundamental insight into how proteins work. The chemistry-driven approaches pioneered by Muirs lab are now widely used by chemical biologists around the world.

Muir has published over 150 scientific articles and has won a number of honors for his research. He received a MERIT Award from the National Institutes of Health and is a fellow of the American Association for the Advancement of Science and the Royal Society of Edinburgh.


Nelson is the Harold F. Linder Chair in the School of Social Science at the Institute for Advanced Study and a visiting lecturer with the rank of professor in sociology at Princeton. She is president of the Social Science Research Council and one of the country's foremost thinkers in the fields of science, technology, social inequality and race. Her groundbreaking books include "The Social Life of DNA: Race, Reparations, and Reconciliation after the Genome" (2016) and "Body and Soul: The Black Panther Party and the Fight Against Medical Discrimination" (2011). Her other books include "Genetics and the Unsettled Past: The Collision of DNA, Race, and History" (with Keith Wailoo of Princeton and Catherine Lee) and "Technicolor: Race, Technology, and Everyday Life" (with Thuy Linh Tu). In 2002 she edited "Afrofuturism," a special issue of Social Text.

Nelson's writings and commentary also have reached the broader public through a variety of outlets. She has contributed to national policy discussions on inequality and the implications of new technology on society.

She is an elected fellow of the American Academy of Political and Social Science, the Hastings Center and the Sociological Research Association. She serves on several advisory boards, including those of the Andrew W. Mellon Foundation and the American Association for the Advancement of Science.

Ostriker, professor of astrophysical sciences, studies the universe. Her research is in theoretical and computational astrophysics, and her tools are powerful supercomputers and algorithms capable of simulating the birth, life, death and reincarnation of stars in their galactic homes. Ostriker and her fellow researchers build computer models using fundamental physical laws, ones that govern gravity, fluid dynamics and electromagnetic radiation, to follow the evolution of conditions found in deep space.

Ostriker, who came to Princeton in 2012, and her team have explored the formation of superbubbles, giant fronts of hot gas that billow out from a cluster of supernova explosions. More recently, she and her colleagues turned their focus toward interstellar clouds.

The research team uses computing resources through the Princeton Institute for Computational Science and Engineering and its TIGER and Perseus research computing clusters, as well as supercomputers administered through NASA. In 2017, Ostriker received a Simons Investigator Award.


Smits is the Eugene Higgins Professor of Mechanical and Aerospace Engineering, Emeritus. His research spans the field of fluid mechanics, including fundamental turbulence, supersonic and hypersonic flows, bio-inspired flows, sports aerodynamics, and novel energy-harvesting concepts.

He joined the Princeton faculty in 1981 and transferred to emeritus status in 2018. Smits served as chair of the Department of Mechanical and Aerospace Engineering for 13 years and was director of the Gas Dynamics Laboratory on the Forrestal Campus for 33 years. During that time, he received several teaching awards, including the President's Award for Distinguished Teaching.

Smits has written more than 240 articles and three books, and edited seven volumes. He was awarded seven patents and helped found three companies. He is a member of the National Academy of Engineering and a fellow of the American Physical Society, the American Institute of Aeronautics and Astronautics, the American Society of Mechanical Engineers, the American Association for the Advancement of Science, and the Australasian Fluid Mechanics Society.

Yariv is the Uwe Reinhardt Professor of Economics. An expert in applied theory and experimental economics, her research interests concentrate on game theory, political economy, psychology and economics. She joined the faculty in 2018. Yariv also is director of the Princeton Experimental Laboratory for the Social Sciences.

She is a member of several professional organizations and is lead editor of American Economic Journal: Microeconomics, a research associate with the Political Economy Program of the National Bureau of Economic Research, and a research fellow with the Industrial Organization Programme of the Centre for Economic Policy Research.

She is also a fellow of the Econometric Society and the Society for the Advancement of Economic Theory, and has received numerous grants for research and awards for her many publications.

Zaman, who joined the Princeton faculty in 2006, is the Robert H. Niehaus '77 Professor of Near Eastern Studies and Religion and chair of the Department of Near Eastern Studies.

He has written on the relationship between religious and political institutions in medieval and modern Islam, on social and legal thought in the modern Muslim world, on institutions and traditions of learning in Islam, and on the flow of ideas between South Asia and the Arab Middle East. He is the author of Religion and Politics under the Early Abbasids (1997), The Ulama in Contemporary Islam: Custodians of Change (2002), Ashraf Ali Thanawi: Islam in Modern South Asia (2008), Modern Islamic Thought in a Radical Age: Religious Authority and Internal Criticism (2012), and Islam in Pakistan: A History (2018). With Robert W. Hefner, he is also the co-editor of Schooling Islam: The Culture and Politics of Modern Muslim Education (2007); with Roxanne L. Euben, of Princeton Readings in Islamist Thought (2009); and, as associate editor, with Gerhard Bowering et al., of the Princeton Encyclopedia of Islamic Political Thought (2013). Among his current projects is a book on South Asia and the wider Muslim world in the 18th and 19th centuries.

In 2017, Zaman received Princeton's Graduate Mentoring Award. In 2009, he received a Guggenheim Fellowship.

The mission of the academy: Founded in 1780, the American Academy of Arts and Sciences honors excellence and convenes leaders from every field of human endeavor to examine new ideas, address issues of importance to the nation and the world, and work together to cultivate every art and science which may tend to advance the interest, honor, dignity, and happiness of a free, independent, and virtuous people.

Read the original here:
Eleven Princeton faculty elected to American Academy of Arts and Sciences - Princeton University

The Economic Impact of Coronavirus on Value of Quantum Computing Market Predicted to Surpass US$ by the of 20702019-2019 – Jewish Life News

Persistence Market Research recently published a market study that sheds light on the growth prospects of the global Quantum Computing market during the forecast period (20XX-20XX). In addition, the report includes a detailed analysis of the impact of the novel COVID-19 pandemic on the future prospects of the Quantum Computing market. The report provides a thorough evaluation of the latest trends, market drivers, opportunities, and challenges within the global Quantum Computing market to help clients arrive at beneficial business decisions.

The recently published research report sheds light on critical aspects of the global Quantum Computing market, such as the vendor landscape, competitive strategies, market drivers and challenges, along with a regional analysis. The report helps readers draw suitable conclusions and clearly understand the current and future scenario and trends of the global Quantum Computing market. The research study serves as a compilation of useful guidelines for players to define their strategies more efficiently and stay ahead of their competitors. The report profiles leading companies of the global Quantum Computing market, along with emerging new ventures that are creating an impact on the global market with their latest innovations and technologies.

Request Sample Report @ https://www.persistencemarketresearch.co/samples/14758

The recently published study includes information on the key segmentation of the global Quantum Computing market on the basis of type/product, application and geography (country/region). Each of the segments included in the report is studied in relation to different factors such as market size, market share, value, growth rate and other quantitative information.

The competitive analysis included in the global Quantum Computing market study allows readers to understand the differences between players and how they operate amongst themselves on a global scale. The research study gives a deep insight into the current and future trends of the market, along with opportunities for new players who are in the process of entering the global Quantum Computing market. Market dynamics such as market drivers and market restraints are explained thoroughly in the most detailed and accessible manner possible. Companies can also find several recommendations to improve their business on the global scale.

Readers of the Quantum Computing Market report can also extract several key insights, such as the market sizes of various products and applications along with their market shares and growth rates. The report also includes five years of forecast data and five years of historical data, along with the market shares of several key segments.

Request Report Methodology @ https://www.persistencemarketresearch.co/methodology/14758

Global Quantum Computing Market by Companies:

The company profile section of the report offers great insights such as market revenue and market share of global Quantum Computing market. Key companies listed in the report are:

Company Profiles

Global Quantum Computing Market by Geography:

For any queries get in touch with Industry Expert @ https://www.persistencemarketresearch.co/ask-an-expert/14758

Some of the Major Highlights of TOC covers in Quantum Computing Market Report:

Chapter 1: Methodology & Scope of Quantum Computing Market

Chapter 2: Executive Summary of Quantum Computing Market

Chapter 3: Quantum Computing Industry Insights

Chapter 4: Quantum Computing Market, By Region

Chapter 5: Company Profile

And Continue

See more here:
The Economic Impact of Coronavirus on Value of Quantum Computing Market Predicted to Surpass US$ by the of 20702019-2019 - Jewish Life News

Google’s Head of Quantum Computing Hardware Resigns – WIRED

In late October 2019, Google CEO Sundar Pichai likened the latest result from the company's quantum computing hardware lab in Santa Barbara, California, to the Wright brothers' first flight.

One of the lab's prototype processors had achieved quantum supremacy, evocative jargon for the moment a quantum computer harnesses quantum mechanics to do something seemingly impossible for a conventional computer. In a blog post, Pichai said the milestone affirmed his belief that quantum computers might one day tackle problems like climate change, and the CEO also name-checked John Martinis, who had established Google's quantum hardware group in 2014.

Here's what Pichai didn't mention: Soon after the team had first got its quantum supremacy experiment working a few months earlier, Martinis says, he had been reassigned from a leadership position to an advisory one. Martinis tells WIRED that the change led to disagreements with Hartmut Neven, the longtime leader of Google's quantum project.

Martinis resigned from Google early this month. "Since my professional goal is for someone to build a quantum computer, I think my resignation is the best course of action for everyone," he adds.

A Google spokesman did not dispute this account, and says that the company is grateful for Martinis' contributions and that Neven continues to head the company's quantum project. Parent company Alphabet has a second, smaller quantum computing group at its X Labs research unit. Martinis retains his position as a professor at UC Santa Barbara, which he held throughout his tenure at Google, and says he will continue to work on quantum computing.

Google's quantum computing project was founded by Neven, who pioneered Google's image search technology, in 2006, and initially focused on software. To start, the small group accessed quantum hardware from Canadian startup D-Wave Systems, including in collaboration with NASA.


The project took on greater scale and ambition when Martinis joined in 2014 to establish Google's quantum hardware lab in Santa Barbara, bringing along several members of his university research group. His nearby lab at UC Santa Barbara had produced some of the most prominent work in the field over the past 20 years, helping to demonstrate the potential of using superconducting circuits to build qubits, the building blocks of quantum computers.

Qubits are analogous to the bits of a conventional computer, but in addition to representing 1s and 0s, they can use quantum mechanical effects to attain a third state, dubbed a superposition, something like a combination of both. Qubits in superposition can work through some very complex problems, such as modeling the interactions of atoms and molecules, much more efficiently than conventional computer hardware.
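The bit-versus-qubit distinction sketched above can be made concrete in a few lines of NumPy. This is an illustrative sketch, not anything from Google's codebase: a single qubit is modeled as two complex amplitudes, and a Hadamard gate puts it into an equal superposition of 0 and 1.

```python
import numpy as np

# Model a qubit as two complex amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1. Measurement yields 0 with probability
# |alpha|^2 and 1 with probability |beta|^2.
ket0 = np.array([1.0, 0.0], dtype=complex)            # the classical-like "0" state
hadamard = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # gate that creates superposition

superposed = hadamard @ ket0
probabilities = np.abs(superposed) ** 2
print(probabilities)  # [0.5 0.5]: an equal blend of 0 and 1
```

The "third state" the article mentions is exactly this: the qubit is not secretly 0 or 1, but a weighted combination of both until measured.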

How useful that is depends on the number and reliability of qubits in your quantum computing processor. So far the best demonstrations have used only tens of qubits, a far cry from the hundreds or thousands of high-quality qubits experts believe will be needed to do useful work in chemistry or other fields. Google's supremacy experiment used 53 qubits working together. They took minutes to crunch through a carefully chosen math problem that the company calculated would take a supercomputer on the order of 10,000 years, though the problem itself has no practical application.

Martinis leaves Google as the company and the rivals working on quantum computing face crucial questions about the technology's path. Amazon, IBM, and Microsoft, as well as Google, offer their prototype technology to companies such as Daimler and JP Morgan so they can run experiments. But those processors are not large enough to work on practical problems, and it is not clear how quickly they can be scaled up.

When WIRED visited Google's quantum hardware lab in Santa Barbara last fall, Martinis responded optimistically when asked if his hardware team could see a path to making the technology practical. "I feel we know how to scale up to hundreds and maybe thousands of qubits," he said at the time. Google will now have to do it without him.


See original here:
Google's Head of Quantum Computing Hardware Resigns - WIRED

Explainer: What is a quantum computer? | MIT Technology Review

This is the first in a series of explainers on quantum technology. The other two are on quantum communication and post-quantum cryptography.

A quantum computer harnesses some of the almost-mystical phenomena of quantum mechanics to deliver huge leaps forward in processing power. Quantum machines promise to outstrip even the most capable of today's (and tomorrow's) supercomputers.

They won't wipe out conventional computers, though. Using a classical machine will still be the easiest and most economical solution for tackling most problems. But quantum computers promise to power exciting advances in various fields, from materials science to pharmaceuticals research. Companies are already experimenting with them to develop things like lighter and more powerful batteries for electric cars, and to help create novel drugs.

The secret to a quantum computer's power lies in its ability to generate and manipulate quantum bits, or qubits.

Today's computers use bits: a stream of electrical or optical pulses representing 1s or 0s. Everything from your tweets and e-mails to your iTunes songs and YouTube videos is essentially a long string of these binary digits.

Quantum computers, on the other hand, use qubits, which are typically subatomic particles such as electrons or photons. Generating and managing qubits is a scientific and engineering challenge. Some companies, such as IBM, Google, and Rigetti Computing, use superconducting circuits cooled to temperatures colder than deep space. Others, like IonQ, trap individual atoms in electromagnetic fields on a silicon chip in ultra-high-vacuum chambers. In both cases, the goal is to isolate the qubits in a controlled quantum state.

Qubits have some quirky quantum properties that mean a connected group of them can provide way more processing power than the same number of binary bits. One of those properties is known as superposition and another is called entanglement.

Qubits can represent numerous possible combinations of 1 and 0 at the same time. This ability to simultaneously be in multiple states is called superposition. To put qubits into superposition, researchers manipulate them using precision lasers or microwave beams.

Thanks to this counterintuitive phenomenon, a quantum computer with several qubits in superposition can crunch through a vast number of potential outcomes simultaneously. The final result of a calculation emerges only once the qubits are measured, which immediately causes their quantum state to collapse to either 1 or 0.
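That prepare-then-measure cycle can be mimicked with classical sampling. The sketch below (plain NumPy, no quantum SDK) treats each simulated measurement as a draw of 0 or 1 with the Born-rule probabilities; only by repeating the whole cycle many times do the underlying probabilities become visible.

```python
import numpy as np

rng = np.random.default_rng(0)
amplitudes = np.array([1.0, 1.0], dtype=complex) / np.sqrt(2)  # equal superposition
probs = np.abs(amplitudes) ** 2                                # Born rule: [0.5, 0.5]

# Each "measurement" collapses the state to a definite 0 or 1;
# repeating the prepare-and-measure cycle reveals the probabilities.
outcomes = rng.choice([0, 1], size=10_000, p=probs)
print(outcomes.mean())  # hovers near 0.5
```

A single run returns only one bit; the richness of the superposition shows up in the statistics, not in any individual answer.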

Researchers can generate pairs of qubits that are entangled, which means the two members of a pair exist in a single quantum state. Changing the state of one of the qubits will instantaneously change the state of the other one in a predictable way. This happens even if they are separated by very long distances.

Nobody really knows quite how or why entanglement works. It even baffled Einstein, who famously described it as "spooky action at a distance." But it's key to the power of quantum computers. In a conventional computer, doubling the number of bits doubles its processing power. But thanks to entanglement, adding extra qubits to a quantum machine produces an exponential increase in its number-crunching ability.
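The exponential growth from entanglement has a simple bookkeeping counterpart: describing the joint state of n qubits classically takes 2**n complex amplitudes. A small illustrative sketch, including a Bell pair whose measurement outcomes are perfectly correlated:

```python
import numpy as np

# Classical bookkeeping cost: n entangled qubits need 2**n amplitudes.
for n in (1, 2, 10, 20):
    print(n, "qubits ->", 2 ** n, "amplitudes")

# A Bell pair (|00> + |11>)/sqrt(2): the only possible joint outcomes
# are 00 and 11, so measuring one qubit fixes the other.
bell = np.zeros(4, dtype=complex)
bell[0b00] = bell[0b11] = 1 / np.sqrt(2)
probs = np.abs(bell) ** 2
print(probs)  # [0.5 0.  0.  0.5]
```

Doubling from 10 to 20 qubits multiplies the amplitude count by roughly a thousand, which is the "exponential increase" the article describes.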

Quantum computers harness entangled qubits in a kind of quantum daisy chain to work their magic. The machines' ability to speed up calculations using specially designed quantum algorithms is why there's so much buzz about their potential.

That's the good news. The bad news is that quantum machines are way more error-prone than classical computers because of decoherence.

The interaction of qubits with their environment in ways that cause their quantum behavior to decay and ultimately disappear is called decoherence. Their quantum state is extremely fragile. The slightest vibration or change in temperature (disturbances known as "noise" in quantum-speak) can cause them to tumble out of superposition before their job has been properly done. That's why researchers do their best to protect qubits from the outside world in those supercooled fridges and vacuum chambers.

But despite their efforts, noise still causes lots of errors to creep into calculations. Smart quantum algorithms can compensate for some of these, and adding more qubits also helps. However, it will likely take thousands of standard qubits to create a single, highly reliable one, known as a logical qubit. This will sap a lot of a quantum computer's computational capacity.

And there's the rub: so far, researchers haven't been able to generate more than 128 standard qubits. So we're still many years away from getting quantum computers that will be broadly useful.

That hasn't dented pioneers' hopes of being the first to demonstrate quantum supremacy.

It's the point at which a quantum computer can complete a mathematical calculation that is demonstrably beyond the reach of even the most powerful supercomputer.

It's still unclear exactly how many qubits will be needed to achieve this, because researchers keep finding new algorithms to boost the performance of classical machines, and supercomputing hardware keeps getting better. But researchers and companies are working hard to claim the title, running tests against some of the world's most powerful supercomputers.

There's plenty of debate in the research world about just how significant achieving this milestone will be. Rather than wait for supremacy to be declared, companies are already starting to experiment with quantum computers made by companies like IBM, Rigetti, and D-Wave, a Canadian firm. Chinese firms like Alibaba are also offering access to quantum machines. Some businesses are buying quantum computers, while others are using ones made available through cloud computing services.

One of the most promising applications of quantum computers is simulating the behavior of matter down to the molecular level. Auto manufacturers like Volkswagen and Daimler are using quantum computers to simulate the chemical composition of electric-vehicle batteries to help find new ways to improve their performance. And pharmaceutical companies are leveraging them to analyze and compare compounds that could lead to the creation of new drugs.

The machines are also great for optimization problems because they can crunch through vast numbers of potential solutions extremely fast. Airbus, for instance, is using them to help calculate the most fuel-efficient ascent and descent paths for aircraft. And Volkswagen has unveiled a service that calculates the optimal routes for buses and taxis in cities in order to minimize congestion. Some researchers also think the machines could be used to accelerate artificial intelligence.

It could take quite a few years for quantum computers to achieve their full potential. Universities and businesses working on them face a shortage of skilled researchers in the field and a lack of suppliers of some key components. But if these exotic new computing machines live up to their promise, they could transform entire industries and turbocharge global innovation.

Read more from the original source:
Explainer: What is a quantum computer? | MIT Technology Review

Quantum computing heats up down under as researchers reckon they know how to cut costs and improve stability – The Register

Boffins claim to have found path to 'real-world applications' by running hot

Dr Henry Yang and Professor Andrew Dzurak: hot qubits are a game-changer for quantum computing development. Pic credit: Paul Henderson-Kelly

Scientists in Australia are claiming to have made a breakthrough in the field of quantum computing which could ease the technology's progress to affordability and mass production.

A paper by researchers led by Professor Andrew Dzurak at Sydney's University of New South Wales, published in Nature today, says they have demonstrated quantum computing at temperatures 15 times warmer than previously thought possible.

Temperature is important to quantum computing because quantum bits (qubits), the equivalent of the classical computing bits running the computer displaying this story, can exist in superconducting circuits or form within semiconductors only at very low temperatures.

Most quantum computers being developed by the likes of IBM and Google form qubits at temperatures within 0.1 degrees above absolute zero or -273.15C (-459.67F). These solid-state platforms require cooling to extremely low temperatures because vibrations generated by heat disrupt the qubits, which can impede performance. Getting this cold requires expensive dilution refrigerators.

Artistic representation of quantum entanglement. Pic credit: Luca Petit for QuTech

But Dzurak's team has shown that they can maintain stable "hot qubits" at temperatures up to 15 times higher than existing technologies. That is a sweltering 1.5 Kelvin (-271.65C). It might not seem like much, but it could make a big difference when it comes to scaling quantum computers and getting them one step closer to practical applications.

"For most solid-state qubit technologies (for example, those using superconducting circuits or semiconductor spins), scaling poses a considerable challenge because every additional qubit increases the heat generated, whereas the cooling power of dilution refrigerators is severely limited at their operating temperature. As temperatures rise above 1 Kelvin, the cost drops substantially and the efficiency improves. In addition, using silicon-based platforms is attractive, as this can assist integration into classical systems that use existing silicon-based hardware," the paper says.

Keeping temperature at around 1.5 Kelvin can be achieved using a few thousand dollars' worth of refrigeration, rather than the millions of dollars needed to cool chips to 0.1 Kelvin, Dzurak said.

"Our new results open a path from experimental devices to affordable quantum computers for real-world business and government applications," he added.

The researchers used "isotopically enriched silicon," but the proof of concept published today promises cheaper and more robust quantum computing that can be built on hardware using conventional silicon chip foundries, they said.

Nature published another independent study by Dr Menno Veldhorst and colleagues at Delft University of Technology in the Netherlands which details a quantum circuit that operates at 1.1 Kelvin, confirming the breakthrough.

If made more practical and cheaper, quantum computers could represent a leap forward in information science. Whereas a bit in classical computing represents either a one or a zero, qubits superpose one and zero, representing both states at the same time. This gives an exponential improvement in capacity: eight qubits are jointly described by 2^8 = 256 amplitudes, rather than the single eight-bit value a classical register holds at any moment. For example, Google and NASA have demonstrated that a quantum computer with 1,097 qubits outperformed existing supercomputers by more than 3,600 times, and personal computers by 100 million times.
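The exponential scaling this paragraph gestures at is two raised to the power of the number of qubits: n classical bits hold one of 2**n values at a time, while n qubits are jointly described by 2**n complex amplitudes. A quick check using the qubit counts mentioned in these articles (53 for Google's chip, 1,097 for the Google/NASA machine):

```python
# A register of n classical bits stores exactly one of 2**n values at once;
# describing n qubits classically requires 2**n complex amplitudes.
for n in (8, 53, 1097):
    print(n, "->", 2 ** n)
```

Note the hedge: the amplitudes describe the state, they do not hand back 2**n usable answers per run, which is why quantum speedups require specially designed algorithms.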

While the experimental nature and cost of quantum computing means it is unlikely to make it into any business setup soon, anything to make the approach more practical could make a big difference to scientific computational challenges such as protein folding. The problem of how to predict the structure of a protein from its amino acid sequence is important for understanding how proteins function in a wide range of biological processes and could potentially help design better medicines.


See the article here:
Quantum computing heats up down under as researchers reckon they know how to cut costs and improve stability - The Register

What To Expect In The Emerging Age Of Quantum Computing – Law360

Law360 (April 21, 2020, 5:23 PM EDT) -- Once considered a scientific impossibility, quantum computing is now expected to have a far-reaching commercial impact thanks to an increase in investment and a myriad of new discoveries by physicists and computer scientists. Quantum computers have the potential to transform industries from auto manufacturing to pharmaceuticals to finance, but the technology has only recently moved from the laboratory to the commercial market.

At the Consumer Electronics Show in Las Vegas in January, IBM Corp. announced it had struck partnerships with Daimler AG (the parent company of Mercedes-Benz) and Delta Air Lines Inc. to harness quantum computing to solve real-world issues for...


Follow this link:
What To Expect In The Emerging Age Of Quantum Computing - Law360