
Category Archives: Quantum Computing

Riverlane and Astex form quantum chemistry alliance – Business Weekly

Posted: June 6, 2020 at 4:45 pm

Quantum computing software specialist Riverlane is collaborating with Cambridge neighbour and world-leading fragment-based drug discovery company Astex to demonstrate the future potential of quantum chemistry.

Riverlane builds ground-breaking software to unleash the power of quantum computers. Chemistry is a key application in which quantum computing can be of significant value, as high-level quantum chemistry calculations can be solved far faster than using classical methods.

A world leader in drug discovery and development, Astex Pharmaceuticals applies innovative solutions to treat cancer and diseases of the central nervous system.

The two companies are combining their expertise in quantum computing software and quantum chemistry applications to speed up drug development and move us closer to quantum advantage.

As part of the collaboration, Astex is funding a post-doctoral research scientist at Riverlane. They will apply very high levels of quantum theory to study the properties of covalent drugs, in which protein function is blocked by the formation of a specific chemical bond.

So far in this field of research, only empirical methods and relatively low levels of quantum theory have been applied. Riverlane will provide access to specialised quantum software to enable simulations of the target drug-protein complexes.
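
To make the task concrete: at its core, a quantum-chemistry calculation of this kind searches for the ground-state energy of a molecular Hamiltonian. The sketch below is not Riverlane's software and is vastly smaller than any drug-protein complex; it is a minimal, illustrative NumPy example that exactly diagonalizes a toy two-qubit Hamiltonian, the same mathematical task that quantum algorithms aim to scale up.

```python
# A minimal illustration (NOT Riverlane's software) of the core task: find the
# ground-state energy of a Hamiltonian. Here, a toy two-qubit model
# H = -J*ZZ - h*(XI + IX) is diagonalized exactly with NumPy; real drug-protein
# complexes involve vastly larger, fermionic Hamiltonians.
import numpy as np

I = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.array([[1.0, 0.0], [0.0, -1.0]])

J, h = 1.0, 0.5  # illustrative coupling strengths
H = -J * np.kron(Z, Z) - h * (np.kron(X, I) + np.kron(I, X))

# The ground-state energy is the smallest eigenvalue of the Hamiltonian.
print(f"ground-state energy: {np.linalg.eigvalsh(H).min():.4f}")
```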

Dave Plant, Principal Research Scientist at Riverlane, said: "This collaboration will produce newly enhanced quantum chemical calculations to drive efficiencies in the drug discovery process. It will hopefully lead to the next generation of quantum-inspired pharmaceutical products."

Chris Murray, SVP of Discovery Technology at Astex, added: "We are excited about the prospect of exploring quantum computing in drug discovery applications.

"It offers the opportunity to deliver much more accurate calculations of the energetics associated with the interaction of drugs with biological molecules, leading to potential improvements in drug discovery productivity."



Top Artificial Intelligence Investments and Funding in May 2020 – Analytics Insight

Posted: at 4:45 pm

The startup landscape is being reshaped by investment and deal activity around intelligent automation, artificial intelligence, big data and machine learning. The data plainly shows that startups with AI as a core product, building narrow AI technology, are attracting the heaviest investment from leading VC firms and investors, who are betting heavily on deep-tech startups in big data, enterprise AI and automation. It also underscores that much of the funding is going into domain-specific breakthrough innovations, not general-purpose AI technology.

Investment funds, venture capital (VC) firms and corporate investors are stepping up equity investments in artificial intelligence (AI) startups, reflecting growing worldwide demand for AI technologies and their business applications.

The total amount invested and the worldwide number of deals have grown enormously since 2011, yet wide variations in investment profiles persist among countries and regions.

Let's look at some of the top AI investments that took place in May 2020.

Runa Capital has closed its third investment fund with $157 million to back startups in deep tech areas such as artificial intelligence and quantum computing. The firm said Runa Capital Fund III surpassed its target of $135 million. The new capital will allow the company to continue its strategy of making investments that range between $1 million and $10 million in early-stage companies.

Cybersecurity threat remediation provider Dtex recently announced it has raised $17.5 million. The funds will be used to expand into new and existing verticals, including banking and financial services, critical infrastructure, government, defense, pharmaceuticals, life sciences, and manufacturing.

GigaSpaces, a startup developing in-memory computing solutions for AI and machine learning workloads, last month announced it has raised $12 million. The funds will be used to scale expansion and accelerate product R&D, according to CEO Adi Paz. Fortissimo Capital led the investment in three-year-old, New York-based GigaSpaces, joined by existing investors Claridge Israel and BRM Group. The round brings GigaSpaces' total raised to $53 million, following a $20 million series D in January 2016.

Omilia, a startup developing natural language technologies, today announced it raised $20 million in its first-ever financing round. Founder and CEO Dimitris Vassos says the capital will help strengthen Omilia's go-to-market efforts as it eyes expansion in North America and Western Europe. Omilia's product portfolio spans a conversational platform and solutions targeting voice biometrics, speech recognition, and fraud prevention.

Logistics startup DispatchTrack announced it raised $144 million in the company's first-ever financing round. CEO Satish Natarajan says it will be used to support product research and development, as well as business, segment, and geographic expansion. DispatchTrack was founded in 2010 by Satish Natarajan and Shailu Satish, a husband-and-wife team who focused on the furniture industry before expanding into building materials, appliances, food and beverage distribution, restaurants, field and home services, and third-party logistics.



If transistors can’t get smaller, then coders have to get smarter – MIT News

Posted: at 4:45 pm

In 1965, Intel co-founder Gordon Moore predicted that the number of transistors that could fit on a computer chip would grow exponentially, and they did, doubling about every two years. For half a century, Moore's Law has endured: computers have gotten smaller, faster, cheaper, and more efficient, enabling the rapid worldwide adoption of PCs, smartphones, high-speed internet, and more.

This miniaturization trend has led to silicon chips today that have almost unimaginably small circuitry. Transistors, the tiny switches that implement computer microprocessors, are so small that 1,000 of them laid end-to-end are no wider than a human hair. And for a long time, the smaller the transistors were, the faster they could switch. But today, we're approaching the limit of how small transistors can get. As a result, over the past decade researchers have been scratching their heads to find other ways to improve performance so that the computer industry can continue to innovate.

While we wait for the maturation of new computing technologies like quantum, carbon nanotubes, or photonics (which may take a while), other approaches will be needed to sustain performance gains as Moore's Law comes to an end. In a recent journal article published in Science, a team from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) identifies three key areas to prioritize to continue to deliver computing speed-ups: better software, new algorithms, and more streamlined hardware.

Senior author Charles E. Leiserson says that the performance benefits from miniaturization have been so great that, for decades, programmers have been able to prioritize making code-writing easier rather than making the code itself run faster. The inefficiency that this tendency introduces has been acceptable, because faster computer chips have always been able to pick up the slack.

But nowadays, being able to make further advances in fields like machine learning, robotics, and virtual reality will require huge amounts of computational power that miniaturization can no longer provide, says Leiserson, the Edwin Sibley Webster Professor in MIT's Department of Electrical Engineering and Computer Science. "If we want to harness the full potential of these technologies, we must change our approach to computing."

Leiserson co-wrote the paper, published this week, with Research Scientist Neil Thompson, Professor Daniel Sanchez, Adjunct Professor Butler Lampson, and research scientists Joel Emer, Bradley Kuszmaul, and Tao Schardl.

No more Moore

The authors make recommendations about three areas of computing: software, algorithms, and hardware architecture.

With software, they say that programmers' previous prioritization of productivity over performance has led to problematic strategies like "reduction": taking code that worked on problem A and using it to solve problem B. For example, if someone has to create a system to recognize yes-or-no voice commands, but doesn't want to code a whole new custom program, they could take an existing program that recognizes a wide range of words and tweak it to respond only to yes-or-no answers.

While this approach reduces coding time, the inefficiencies it creates quickly compound: if a single reduction is 80 percent as efficient as a custom solution, and you then add 20 layers of reduction, the code will ultimately be 100 times less efficient than it could be.
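
The arithmetic behind that figure is simple compounding; here is a two-line sketch (our own, matching the numbers quoted above):

```python
# Each reduction layer is 80% as efficient as the one below it, so 20 layers
# retain 0.8**20 of the custom solution's performance.
efficiency = 0.8 ** 20                   # ~0.0115 of the custom solution's speed
print(f"~{1 / efficiency:.0f}x slower")  # ~87x, i.e. roughly 100x less efficient
```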

"These are the kinds of strategies that programmers have to rethink as hardware improvements slow down," says Thompson. "We can't keep doing business as usual if we want to continue to get the speed-ups we've grown accustomed to."

Instead, the researchers recommend techniques like parallelizing code. Much existing software has been designed using the ancient assumption that processors can do only one operation at a time. But in recent years multicore technology has enabled complex tasks to be completed thousands of times faster and in a much more energy-efficient way.
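
As an illustration of the parallelization the researchers recommend, here is a minimal, hypothetical Python sketch using only the standard library; `work` and its inputs are stand-ins for a real set of independent tasks.

```python
# A minimal sketch of spreading independent work across CPU cores with
# Python's standard library; the function and inputs are illustrative.
from concurrent.futures import ProcessPoolExecutor

def work(n):
    # Stand-in for an expensive, independent computation.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    inputs = [2_000_000] * 8
    with ProcessPoolExecutor() as pool:   # one worker per core by default
        results = list(pool.map(work, inputs))
    print(len(results), "tasks completed in parallel")
```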

"Since Moore's Law will not be handing us improved performance on a silver platter, we will have to deliver performance the hard way," says Moshe Vardi, a professor in computational engineering at Rice University. "This is a great opportunity for computing research, and the [MIT CSAIL] report provides a road map for such research."

As for algorithms, the team suggests a three-pronged approach that includes exploring new problem areas, addressing concerns about how algorithms scale, and tailoring them to better take advantage of modern hardware.

Lastly, in terms of hardware architecture, the team advocates that hardware be streamlined so that problems can be solved with fewer transistors and less silicon. Streamlining includes using simpler processors and creating hardware tailored to specific applications, as the graphics-processing unit is tailored for computer graphics.

"Hardware customized for particular domains can be much more efficient and use far fewer transistors, enabling applications to run tens to hundreds of times faster," says Schardl. "More generally, hardware streamlining would further encourage parallel programming, creating additional chip area to be used for more circuitry that can operate in parallel."

While these approaches may be the best path forward, the researchers say that it won't always be an easy one. Organizations that use such techniques may not know the benefits of their efforts until after they've invested a lot of engineering time. Plus, the speed-ups won't be as consistent as they were with Moore's Law: they may be dramatic at first, and then require large amounts of effort for smaller improvements.

Certain companies have already gotten the memo.

"For tech giants like Google and Amazon, the huge scale of their data centers means that even small improvements in software performance can result in large financial returns," says Thompson. "But while these firms may be leading the charge, many others will need to take these issues seriously if they want to stay competitive."

Getting improvements in the areas identified by the team will also require building up the infrastructure and workforce that make them possible.

"Performance growth will require new tools, programming languages, and hardware to facilitate more and better performance engineering," says Leiserson. "It also means computer scientists being better educated about how we can make software, algorithms, and hardware work together, instead of putting them in different silos."

This work was supported, in part, by the National Science Foundation.


From 47,000 annually to 2 lakh daily, PPE production skyrockets – The Tribune India

Posted: May 11, 2020 at 11:27 am

Tribune News Service

Chandigarh, May 10

There has been a massive spike in the production of personal protective equipment (PPE) kits in the country following the outbreak of the Covid-19 pandemic. From just 47,000 kits produced annually, output has risen to about two lakh (200,000) per day.

Stating this here today, Dr G Satheesh Reddy, Chairman, Defence Research and Development Organisation (DRDO), said Covid-19 had provided plenty of opportunity for research and development and industrial production, but cautioned that delayed development is of no use.

He was addressing scientists and staff at the Centre for Development of Advanced Computing (C-DAC), Mohali, via video-conferencing on the occasion of the centre's 32nd foundation day. Directors and scientists from various laboratories of the DRDO, the Council of Scientific and Industrial Research and other institutions also participated in the conference.

He also spoke about medical ventilators produced by the industry with assistance from the DRDO, which cost from Rs 1.5 lakh to Rs 4 lakh and have export potential.

Dr Reddy said the C-DAC, an autonomous body under the Ministry of Electronics and Information Technology, would be considered an extended arm of the DRDO for undertaking applied research.

Lauding the role of the Mohali centre in research and development in electronics and information technology, Dr Reddy said artificial intelligence tools developed by it would be required in almost every field.

Dr PK Khosla, Director, C-DAC, Mohali, gave an overview of the work done in the organisation's four verticals: healthcare technology, cyber security, e-governance, and education and training. He also spoke about four new focus areas: artificial intelligence, augmented and virtual reality, robotics and quantum computing.

Dr Hemant Darbari, Director General, C-DAC, spoke on e-Sanjeevni OPD, a recently launched national-level telemedicine project rolled out by C-DAC, Mohali. It has been extended to 15 states within three weeks and provides access to over a thousand doctors.


RAND report finds that, like fusion power and Half Life 3, quantum computing is still 15 years away – The Register

Posted: April 11, 2020 at 7:41 pm

Quantum computers pose an "urgent but manageable" threat to the security of modern communications systems, according to a report published Thursday by the influential US RAND Corporation.

The non-profit think tank's report, "Securing Communications in the Quantum Computing Age: Managing the Risks to Encryption," urges the US government to act quickly because quantum code-breaking could be a thing in, say, 12-15 years.

"If adequate implementation of new security measures has not taken place by the time capable quantum computers are developed, it may become impossible to ensure secure authentication and communication privacy without major, disruptive changes," said Michael Vermeer, a RAND scientist and lead author of the report, in a statement.

Experts in the field of quantum computing like University of Texas at Austin computer scientist Scott Aaronson have proposed an even hazier timeline.

Noting that the quantum computers built by Google and IBM have been in the neighborhood of 50 to 100 quantum bits (qubits), and that running Shor's algorithm to break public-key RSA cryptosystems would probably take several thousand logical qubits (meaning millions of physical qubits, due to error correction), Aaronson recently opined: "I don't think anyone is close to that, and we have no idea how long it will take."
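
For a rough sense of those numbers, here is a back-of-envelope sketch. The 2n + 3 logical-qubit figure for Shor's algorithm and the ~1,000x error-correction overhead are common textbook-style estimates assumed here for illustration, not numbers taken from Aaronson or the RAND report.

```python
# Back-of-envelope qubit arithmetic behind the quote above (assumed estimates).
n = 2048                              # RSA key size in bits
logical_qubits = 2 * n + 3            # ~4,099: "several thousand logical qubits"
physical_per_logical = 1000           # assumed error-correction overhead
physical_qubits = logical_qubits * physical_per_logical
print(f"{logical_qubits} logical -> ~{physical_qubits / 1e6:.1f} million physical")
```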

But other boffins, like University of Chicago computer science professor Diana Franklin, have suggested Shor's algorithm might be a possibility in a decade and a half.

So even though quantum computing poses a theoretical threat to most current public-key cryptography (and less risk to lattice-based, symmetric, private-key, post-quantum, and quantum cryptography), there's not much consensus about how and when this threat might manifest itself.

Nonetheless, the National Institute of Standards and Technology, the US government agency overseeing tech standards, has been pushing the development of quantum-resistant cryptography since at least 2016. Last year it winnowed a list of proposed post-quantum crypto (PQC) algorithms down to a field of 26 contenders.

The RAND report anticipates quantum computers capable of crypto-cracking will be functional by 2033, with the caveat that experts propose dates both before and after that. PQC algorithm standards should gel within the next five years, with adoption not expected until the mid-to-late 2030s, or later.

But the amount of time required for the US and the rest of the world to fully implement those protocols to mitigate the risk of quantum crypto cracking may take longer still. Note that the US government is still running COBOL applications on ancient mainframes.

"If adequate implementation of PQC has not taken place by the time capable quantum computers are developed, it may become impossible to ensure secure authentication and communication privacy without major, disruptive changes to our infrastructure," the report says.

RAND's report further notes that consumer lack of awareness and indifference to the issue means there will be no civic demand for change.

Hence, the report urges federal leadership to protect consumers, perhaps unaware that Congress is considering the EARN-IT Act, which critics characterize as an "all-out assault on encryption."

"If we act in time with appropriate policies, risk reduction measures, and a collective urgency to prepare for the threat, then we have an opportunity for a future communications infrastructure that is as safe as or more safe than the current status quo, despite overlapping cyber threats from conventional and quantum computers," the report concludes.

It's worth recalling that a 2017 National Academies of Sciences, Engineering, and Medicine report, "Global Health and the Future Role of the United States," urged the US to maintain its focus on global health security and to prepare for infectious disease threats.

That was the same year nonprofit PATH issued a pandemic prevention report urging the US government to "maintain its leadership position backed up by the necessary resources to ensure continued vigilance against emerging pandemic threats, both at home and abroad."

The federal government's reaction to COVID-19 is a testament to the impact of reports from external organizations. We can only hope that the threat of crypto-cracking quantum computers elicits a response that's at least as vigorous.



Seeqc Gains Over $11M in Funding to Introduce Fully Digital Quantum Computing – HPCwire

Posted: at 7:41 pm

ELMSFORD, N.Y., April 10, 2020 – Seeqc announced funding of $5 million from M Ventures, the strategic corporate venture capital arm of Merck KGaA, Darmstadt, Germany, to develop commercially viable quantum computing systems for problem-specific applications.

The M Ventures funding follows a $6.8 million seed round from investors including BlueYard Capital, Cambium, NewLab and the Partnership Fund for New York City.

Seeqc is developing a new approach to making quantum computing useful, via fully Digital Quantum Computing. The solution combines classical and quantum computing to form an all-digital architecture through a system-on-a-chip design that utilizes 10-40 GHz superconductive classical co-processing to address the efficiency, stability and cost issues endemic to quantum computing systems. Seeqc recently spun out of HYPRES, Inc., the world's leading developer of superconductor electronics, to pursue a vision of making quantum computing useful, commercially and at scale.

Through the spin-out, Seeqc acquired significant infrastructure and intellectual property from HYPRES. HYPRES had garnered over $100 million from public and private investments to develop a multi-layer commercial superconductor chip foundry and intellectual property. This investment was in support of creating commercial superconductive computing solutions, much of which is now part of Seeqc.

These assets give Seeqc the advantage of having sophisticated tools, facilities and IP for the design, testing and manufacturing of quantum-ready superconductor chips and wafers.

Merck KGaA, Darmstadt, Germany, will be a strategic partner for Seeqc, supporting its research and development towards useful, application-specific quantum computers, both as a semiconductor materials solutions provider and as an end user of high-performance quantum computing.

The team of executives and scientists at Seeqc has deep expertise and experience in commercial superconductive computing solutions and quantum computing. The company's executive leadership team consists of John Levy, co-CEO, chair, and co-founder; Oleg Mukhanov, PhD, co-CEO, CTO, and co-founder; and Matthew Hutchings, PhD, chief product officer and co-founder.

Seeqc enters the quantum computing market as the world leader in superconductive electronics and one of the few companies ever to design, manufacture and deliver multi-layer superconductive digital chips operating at tens of GHz into complex cryogenic systems.

Quantum Computing is Approaching Its Next Phase

While major advancements have been made recently with early-generation quantum computing systems, such systems and architectures are inherently unstable and unscalable. The industry continues to struggle with cost, readout and control challenges, and excessive input/output connection counts, as well as the complexities of managing the microwave pulses used to control and read out qubits.

Leading state-of-the-art qubits need to be kept at near absolute-zero temperatures to function, so as solutions require hundreds, even thousands, of qubits (and maybe even more), cost, control/readout complexity and heat management greatly impact scale.

Seeqc's digital classical-quantum hybrid approach mitigates many of these challenges.

"The brute-force or labware approach to quantum computing contemplates building machines with thousands or even millions of qubits, requiring multiple analog cables and, in some cases, complex CMOS readout/control for each qubit, but that doesn't scale effectively as the industry strives to deliver business-applicable solutions," said John Levy, co-chief executive officer at Seeqc. "With Seeqc's hybrid approach, we utilize the power of quantum computers in a digital system-on-a-chip environment, offering greater control and cost reduction, with a massive reduction in energy, introducing a more viable path to commercial scalability."

"We believe that the best way to make quantum computing commercially viable is to ensure that early engagement with customers is ultra-focused and problem-specific, with the goal to solve previously insurmountable challenges with a targeted application-specific hardware and software solution," continued Levy. "A close partnership with customers, academic experts and application developers to work through the early use cases of quantum computing is critical to extracting its novel and potentially world-changing value."

"We're excited to be working with a world-leading team and fab on one of the most pressing issues in modern quantum computing," says Owen Lozman, Vice President at M Ventures. "We recognize that scaling the current generations of superconducting quantum computers beyond the noisy intermediate-scale quantum era will require fundamental changes in qubit control and wiring. Building on deep expertise in single flux quantum technologies, Seeqc has a clear, and importantly cost-efficient, pathway towards addressing existing challenges and disrupting analog, microwave-controlled architectures."

"As a leading science and technology company, it is one of our key missions to stay on top of emerging trends. In quantum computing, we see a huge potential to advance major parts of our existing business," says John Langan, Chief Technology Officer, Performance Materials, at Merck KGaA, Darmstadt, Germany. "We're therefore excited to be embarking on a close partnership with Seeqc to develop the next generation of quantum machines. For one, our Semiconductor Solutions business unit will be working with Seeqc to develop and optimize semiconductor materials and processes utilized in the manufacturing of quantum technologies. In parallel, our Quantum Computing Task Force will support Seeqc in developing application-specific quantum co-processors to expedite the advent of industrially viable quantum computers that can be used by our Digital Organization and by our computational researchers across our three main business units."

The company holds 36 patents, and its team includes 16 PhDs with senior leadership experience from HYPRES.

About Seeqc

Seeqc is developing the first fully digital quantum computing platform for global businesses. Seeqc combines classical and quantum technologies to address the efficiency, stability and cost issues endemic to quantum computing systems. The company applies classical and quantum technology through digital readout and control technology and a unique chip-scale architecture. Seeqc's quantum system provides the energy- and cost-efficiency, speed and digital control required to make quantum computing useful and bring the first commercially scalable, problem-specific quantum computing applications to market.

About M Ventures

M Ventures is the strategic, corporate venture capital arm of Merck KGaA, Darmstadt, Germany. Its mandate is to invest in innovative technologies and products with the potential to significantly impact the company's core business areas. From its headquarters in Amsterdam and offices in the US and Israel, M Ventures invests globally in transformational ideas driven by great entrepreneurs. M Ventures takes an active role in its portfolio companies and teams up with entrepreneurs and co-investors to translate innovation towards commercial success. For more information, visit www.m-ventures.com

About Merck KGaA, Darmstadt, Germany

Merck KGaA, Darmstadt, Germany, a leading science and technology company, operates across healthcare, life science and performance materials. Around 52,000 employees work to make a positive difference to millions of people's lives every day by creating more joyful and sustainable ways to live. From advancing gene editing technologies and discovering unique ways to treat the most challenging diseases to enabling the intelligence of devices, the company is everywhere. In 2019, Merck KGaA, Darmstadt, Germany, generated sales of €16.2 billion in 66 countries. The company holds the global rights to the name and trademark "Merck" internationally. The only exceptions are the United States and Canada, where the business sectors of Merck KGaA, Darmstadt, Germany, operate as EMD Serono in healthcare, MilliporeSigma in life science, and EMD Performance Materials. Since its founding in 1668, scientific exploration and responsible entrepreneurship have been key to the company's technological and scientific advances. To this day, the founding family remains the majority owner of the publicly listed company.

Source: Seeqc


Supercomputer Simulations Illuminate the Origins of Planets – HPCwire

Posted: at 7:41 pm

Astronomers believe that many planets, including those of our own solar system, emerged from giant disks of gas and dust spinning around stars. To understand these cosmic mechanisms, researchers have typically used simulations to separately examine planetary development and magnetic field formation. Now, new work by researchers from the University of Zurich and the University of Cambridge has unified these fields of study in a single simulation for the first time.

Researchers knew that planets likely formed as a result of gravitational instabilities in the disks that allowed particles to congeal, slowly building up planets over hundreds of thousands of years. With the new study, the research team aimed to examine the effects that magnetic fields have on planet formation in the context of those gravitational instabilities.

To do that, they modified a hybrid mesh-particle method that calculated the mass and gravity using particles, creating a virtual adaptive mesh that allowed the researchers to simultaneously incorporate magnetic fields, fluid dynamics and gravity.
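
As a rough, self-contained illustration of the particle-mesh idea (our own simplified 1D sketch, not the researchers' code), cloud-in-cell deposition spreads each particle's mass over its nearest grid cells, producing a density field on which gravity can then be solved alongside the fluid and magnetic fields.

```python
# Our own simplified 1D cloud-in-cell (CIC) sketch of the particle-mesh idea:
# each particle's mass is shared between its two nearest cells with linear
# weights, turning discrete particles into a mesh density field.
import numpy as np

def deposit_cic(positions, masses, n_cells, box_size):
    dx = box_size / n_cells
    x = positions / dx - 0.5            # particle position in cell coordinates
    left = np.floor(x).astype(int)
    w = x - left                        # linear weight for the right-hand cell
    rho = np.zeros(n_cells)
    np.add.at(rho, left % n_cells, masses * (1 - w))      # periodic box
    np.add.at(rho, (left + 1) % n_cells, masses * w)
    return rho / dx                     # density = mass per unit length

rho = deposit_cic(np.array([2.3, 7.9]), np.array([1.0, 1.0]), 16, 16.0)
print(rho.sum())   # 2.0 with dx = 1 here: total mass is conserved
```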

Applying that method required supercomputing power. The researchers turned to Piz Daint, the in-house Cray supercomputer of the Swiss National Supercomputing Centre (CSCS). Piz Daint's 5,704 XC50 nodes each pack an Intel Xeon E5-2690 v3 CPU and an Nvidia Tesla P100 GPU, and its 1,813 XC40 nodes each carry two Intel Xeon E5-2695 v4 CPUs. The two sections stack up at 21.2 Linpack petaflops and 1.9 Linpack petaflops respectively, placing 6th and 185th on the most recent Top500 list of the world's most powerful supercomputers.

After running the simulations on Piz Daint, the researchers got some very interesting results. For some time, the astronomy community has puzzled over why planets spin slower than the disks from which they were born. But now, it seems, they might have their answer.

"Our new mechanism seems to be able to solve and explain this very general problem," said Lucio Mayer, professor of computational astrophysics at the University of Zurich and project manager at the National Centre of Competence in Research PlanetS.

The simulation shows that the energy generated by the interaction of the forming magnetic field with gravity acts outwards and drives a wind that throws matter out of the disk, Mayer said. "If this is true, this would be a desirable prediction, because many of the protoplanetary disks studied with telescopes that are a million years old have about 90 percent less mass than predicted by the simulations of disk formation so far."

The researchers believe that this matter-ejecting mechanism is the culprit behind the loss of angular momentum in the disks and, ultimately, the planets they birth. The discovery of this mechanism, of course, was only made possible by conducting their unified simulation on Piz Daint.



Is Machine Learning The Quantum Physics Of Computer Science ? – Forbes

Posted: March 26, 2020 at 6:43 am

Preamble: Intermittently, I will be introducing columns that present some seemingly outlandish concepts. The purpose is partly humor, but also to provoke some thought. Enjoy.


"God does not play dice with the universe," Albert Einstein is reported to have said about the field of Quantum Physics. He was referring to the great divide at the time in the physics community between general relativity and quantum physics. General relativity was a theory which beautifully explained a great deal of physical phenomena in a deterministic fashion. Meanwhile, quantum physics grew out of a model which fundamentally had a probabilistic view of the world. Since Einstein made that statement in the mid-1950s, quantum physics has proven to be quite a durable theory, and in fact, it is used in a variety of applications such as semiconductors.

One might imagine a past leader in computer science such as Donald Knuth exclaiming, "Algorithms should be deterministic." That is, given any input, the output should be exact and known. Indeed, since its formation, the field of computer science has focused on building elegant deterministic algorithms which have a clear view of the transformation between inputs and outputs. Even in the regime of non-determinism such as parallel processing, the objective of the overall algorithm is to be deterministic. That is, despite the fact that operations can run out of order, the outputs are still exact and known. Computer scientists work very hard to make that a reality.

As computer scientists have engaged with the real world, they frequently face very noisy inputs such as sensors or even worse, human beings. Computer algorithms continue to focus on faithfully and precisely translating input noise to output noise. This has given rise to the Junk In Junk Out (JIJO) paradigm. One of the key motivations for pursuing such a structure has been the notion of causality and diagnosability. After all, if the algorithms are noisy, how is one to know the issue is not a bug in the algorithm? Good point.

With machine learning, computer science has transitioned to a model where one trains a machine to build an algorithm, and this machine can then be used to transform inputs to outputs. Since the process of training is dynamic and often ongoing, the data and the algorithm are intertwined in a manner which is not easily unwound. Similar to quantum physics, there is a class of applications where this model seems to work. Recognizing patterns seems to be a good application. This is a key building block for autonomous vehicles, but the results are probabilistic in nature.

In quantum physics, there is an implicit understanding that the answers are often probabilistic. Perhaps this is the key insight which can allow us to leverage the power of machine learning techniques and avoid the pitfalls. That is, if the requirements of the algorithm must be exact, perhaps machine learning methods are not appropriate. As an example, if your bank statement were correct with somewhat high probability, this may not be comforting. However, if machine learning algorithms can provide with high probability the instances of potential fraud, the job of a forensic CPA is made quite a bit more productive. Similar analogies exist in the area of autonomous vehicles.
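
A toy sketch of that fraud example, with purely illustrative numbers: the model outputs a probability per transaction rather than an exact verdict, and only high-probability cases reach the human reviewer.

```python
# Toy version of the fraud-flagging example above; all numbers are synthetic
# and illustrative. The model scores each transaction with a probability, and
# only high-probability cases are queued for human (forensic CPA) review.
import numpy as np

rng = np.random.default_rng(seed=0)
amounts = rng.uniform(10, 10_000, size=1_000)          # synthetic transactions
scores = 1 / (1 + np.exp(-(amounts - 8_000) / 500))    # stand-in fraud model
review_queue = np.flatnonzero(scores > 0.9)            # probabilistic flagging
print(f"{review_queue.size} of {amounts.size} transactions flagged for audit")
```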

Overall, machine learning seems to define the notion of probabilistic algorithms in computer science in a similar manner as quantum physics. The critical challenge for computing is to find the correct mechanisms to design and validate probabilistic results.


Research by University of Chicago PhD Student and EPiQC Wins IBM Q Best Paper – Quantaneo, the Quantum Computing Source

Posted: at 6:43 am

The interdisciplinary team of researchers from UChicago, University of California, Berkeley, Princeton University and Argonne National Laboratory won the $2,500 first-place award for Best Paper. Their research examined how the VQE quantum algorithm could improve the ability of current and near-term quantum computers to solve highly complex problems, such as finding the ground state energy of a molecule, an important and computationally difficult chemical calculation the authors refer to as a "killer app" for quantum computing.

Quantum computers are expected to perform complex calculations in chemistry, cryptography and other fields that are prohibitively slow or even impossible for classical computers. A significant gap remains, however, between the capabilities of today's quantum computers and the algorithms proposed by computational theorists.

"VQE can perform some pretty complicated chemical simulations in just 1,000 or even 10,000 operations, which is good," Gokhale says. "The downside is that VQE requires millions, even tens of millions, of measurements, which is what our research seeks to correct by exploring the possibility of doing multiple measurements simultaneously."


With their approach, the authors reduced the computational cost of running the VQE algorithm by 7-12 times. When they validated the approach on one of IBM's cloud-service 20-qubit quantum computers, they also found lower error as compared to traditional methods of solving the problem. The authors have shared their Python and Qiskit code for generating circuits for simultaneous measurement, and have already received numerous citations in the months since the paper was published.
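
The authors' actual Qiskit code is available with the paper; the sketch below is an independent, minimal Python illustration of the underlying idea, grouping Pauli terms that commute qubit-wise so a whole group can share a single measurement setting.

```python
# Independent, minimal sketch of simultaneous measurement via qubit-wise
# commuting Pauli groups (not the authors' code). Fewer groups means fewer
# distinct circuit executions for VQE.
def qubitwise_commute(p, q):
    # Two Pauli strings commute qubit-wise if, at every position, the letters
    # match or at least one of them is the identity 'I'.
    return all(a == b or 'I' in (a, b) for a, b in zip(p, q))

def greedy_group(paulis):
    # Greedily pack each term into the first group it is compatible with.
    groups = []
    for p in paulis:
        for g in groups:
            if all(qubitwise_commute(p, q) for q in g):
                g.append(p)
                break
        else:
            groups.append([p])
    return groups

terms = ['ZZI', 'ZIZ', 'XXI', 'IZZ', 'XIX', 'IXX']
print(greedy_group(terms))   # 6 terms collapse into 2 measurement settings
```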

For more on the research and the IBM Q Best Paper Award, see the IBM Research Blog. Additional authors on the paper include Professor Fred Chong and PhD student Yongshan Ding of UChicago CS, Kaiwen Gui and Martin Suchara of the Pritzker School of Molecular Engineering at UChicago, Olivia Angiuli of University of California, Berkeley, and Teague Tomesh and Margaret Martonosi of Princeton University.


Organisms grow in wave pattern, similar to ocean circulation – Big Think

Posted: at 6:43 am

When an egg cell of almost any sexually reproducing species is fertilized, it sets off a series of waves that ripple across the egg's surface.

These waves are produced by billions of activated proteins that surge through the egg's membrane like streams of tiny burrowing sentinels, signaling the egg to start dividing, folding, and dividing again, to form the first cellular seeds of an organism.

Now MIT scientists have taken a detailed look at the pattern of these waves, produced on the surface of starfish eggs. These eggs are large and therefore easy to observe, and scientists consider starfish eggs to be representative of the eggs of many other animal species.

In each egg, the team introduced a protein to mimic the onset of fertilization, and recorded the pattern of waves that rippled across their surfaces in response. They observed that each wave emerged in a spiral pattern, and that multiple spirals whirled across an egg's surface at a time. Some spirals spontaneously appeared and swirled away in opposite directions, while others collided head-on and immediately disappeared.

The behavior of these swirling waves, the researchers realized, is similar to the waves generated in other, seemingly unrelated systems, such as the vortices in quantum fluids, the circulations in the atmosphere and oceans, and the electrical signals that propagate through the heart and brain.

"Not much was known about the dynamics of these surface waves in eggs, and after we started analyzing and modeling these waves, we found these same patterns show up in all these other systems," says physicist Nikta Fakhri, the Thomas D. and Virginia W. Cabot Assistant Professor at MIT. "It's a manifestation of this very universal wave pattern."

"It opens a completely new perspective," adds Jrn Dunkel, associate professor of mathematics at MIT. "You can borrow a lot of techniques people have developed to study similar patterns in other systems, to learn something about biology."

Fakhri and Dunkel have published their results today in the journal Nature Physics. Their co-authors are Tzer Han Tan, Jinghui Liu, Pearson Miller, and Melis Tekant of MIT.

Previous studies have shown that the fertilization of an egg immediately activates Rho-GTP, a protein within the egg which normally floats around in the cell's cytoplasm in an inactive state. Once activated, billions of the protein rise up out of the cytoplasm's morass to attach to the egg's membrane, snaking along the wall in waves.

"Imagine if you have a very dirty aquarium, and once a fish swims close to the glass, you can see it," Dunkel explains. "In a similar way, the proteins are somewhere inside the cell, and when they become activated, they attach to the membrane, and you start to see them move."

Fakhri says the waves of proteins moving across the egg's membrane serve, in part, to organize cell division around the cell's core.

"The egg is a huge cell, and these proteins have to work together to find its center, so that the cell knows where to divide and fold, many times over, to form an organism," Fakhri says. "Without these proteins making waves, there would be no cell division."

MIT researchers observe ripples across a newly fertilized egg that are similar to other systems, from ocean and atmospheric circulations to quantum fluids. Courtesy of the researchers.

In their study, the team focused on the active form of Rho-GTP and the pattern of waves produced on an egg's surface when they altered the protein's concentration.

For their experiments, they obtained about 10 eggs from the ovaries of starfish through a minimally invasive surgical procedure. They introduced a hormone to stimulate maturation, and also injected fluorescent markers to attach to any active forms of Rho-GTP that rose up in response. They then observed each egg through a confocal microscope and watched as billions of the proteins activated and rippled across the egg's surface in response to varying concentrations of the artificial hormonal protein.

"In this way, we created a kaleidoscope of different patterns and looked at their resulting dynamics," Fakhri says.

The researchers first assembled black-and-white videos of each egg, showing the bright waves that traveled over its surface. The brighter a region in a wave, the higher the concentration of Rho-GTP in that particular region. For each video, they compared the brightness, or concentration of protein from pixel to pixel, and used these comparisons to generate an animation of the same wave patterns.

From their videos, the team observed that waves seemed to oscillate outward as tiny, hurricane-like spirals. The researchers traced the origin of each wave to the core of each spiral, which they refer to as a "topological defect." Out of curiosity, they tracked the movement of these defects themselves. They did some statistical analysis to determine how fast certain defects moved across an egg's surface, and how often, and in what configurations the spirals popped up, collided, and disappeared.
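
A minimal sketch of how such defect tracking can work in principle (our own construction, not the authors' pipeline): a spiral core is a phase singularity, so summing the wrapped phase differences around each grid plaquette yields a winding number of plus or minus one at a defect and zero elsewhere.

```python
# Our own illustration of phase-singularity (defect) detection: walk around
# each 2x2 plaquette of a 2D phase field, accumulate wrapped phase
# differences, and read off the winding number.
import numpy as np

def winding(phase):
    wrap = lambda d: (d + np.pi) % (2 * np.pi) - np.pi   # map into (-pi, pi]
    loop = (wrap(phase[:-1, 1:] - phase[:-1, :-1])       # plaquette circuit
            + wrap(phase[1:, 1:] - phase[:-1, 1:])
            + wrap(phase[1:, :-1] - phase[1:, 1:])
            + wrap(phase[:-1, :-1] - phase[1:, :-1]))
    return np.round(loop / (2 * np.pi))

# A single spiral centered between grid points carries exactly one defect.
y, x = np.mgrid[-8:8, -8:8]
print(int(np.abs(winding(np.arctan2(y + 0.5, x + 0.5))).sum()))   # -> 1
```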

In a surprising twist, they found that their statistical results, and the behavior of waves in an egg's surface, were the same as the behavior of waves in other larger and seemingly unrelated systems.

"When you look at the statistics of these defects, it's essentially the same as vortices in a fluid, or waves in the brain, or systems on a larger scale," Dunkel says. "It's the same universal phenomenon, just scaled down to the level of a cell."

The researchers are particularly interested in the waves' similarity to ideas in quantum computing. Just as the pattern of waves in an egg convey specific signals, in this case of cell division, quantum computing is a field that aims to manipulate atoms in a fluid, in precise patterns, in order to translate information and perform calculations.

"Perhaps now we can borrow ideas from quantum fluids, to build minicomputers from biological cells," Fakhri says. "We expect some differences, but we will try to explore [biological signaling waves] further as a tool for computation."

This research was supported, in part, by the James S. McDonnell Foundation, the Alfred P. Sloan Foundation, and the National Science Foundation.

Reprinted with permission of MIT News. Read the original article.


