The Prometheus League
Breaking News and Updates
Category Archives: Quantum Computing
Back to the Future: Protecting Against Quantum Computing – Nextgov
Posted: May 27, 2022 at 2:06 am
The previous two years have proven the importance of proactively securing our data, especially as organizations underwent digital transformations and suffered increased cyberattacks as a result. For organizations that have already been breached, even if their data hasn't yet been exploited and released into the wild, it may already be too late.
Organizations that have already experienced a data breach may become victims of "harvest today, decrypt tomorrow" or "capture now, decrypt later" attacks. These attacks, referred to as harvesting for short, capitalize on known vulnerabilities to steal encrypted data that cannot yet be read with today's decryption capabilities, in the expectation that it will become readable later.
These attacks require long-term planning and projections about the advancement of quantum computing technologies. While those technologies may still be years away from commercial availability and wide use, organizations should protect against the threat now to avoid becoming a future casualty.
Before getting into more detail on the future threat posed by quantum computing, we should look to a historical example to inform our present decision-making.
Lessons from the Enigma
In 1919, a Dutch inventor patented a rotor encoding machine; the design was later universally adopted by the German military as the Enigma. Unbeknownst to Germany, the Allied powers managed to break the coding scheme and, building on Polish cryptanalysis dating to 1932, were able to decode some messages as early as 1939, when the first German boots set foot in Poland. For years, however, the German army believed the Enigma codes were unbreakable and communicated in confidence, never realizing its messages were out in the open.
History may already be repeating itself. I can't help but think that most organizations today also believe their encrypted data is safe, while someone may be close to reading, or already reading, their secure mail without them even knowing.
Today's cryptography is often deemed unbreakable, but a big, shiny black building in Maryland suggests that governments may be better at this than is widely believed. Although a lot of credit goes to the magical and elusive quantum computer, the reality is different: poor implementations of crypto suites are the primary vector for breaking the encryption of captured traffic. So are certificates captured through other means, brute-forced passwords, and even brute-forced cryptography where insufficient entropy was used to generate random numbers.
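That last failure mode, weak randomness, is easy to see in miniature. The sketch below is a hypothetical toy, not any real system's key generator: it derives a key from a timestamp-seeded PRNG and then recovers it by enumerating the seed space (Python 3.9+ assumed for `Random.randbytes`).

```python
# Toy illustration of brute-forcing crypto born from insufficient entropy.
# Hypothetical throughout; real systems must seed keys from os.urandom or
# another CSPRNG, never from a guessable value such as a timestamp.
import hashlib
import random

def weak_key(seed: int) -> bytes:
    """Derive a 128-bit key from a PRNG seeded with a low-entropy value."""
    rng = random.Random(seed)
    return hashlib.sha256(rng.randbytes(16)).digest()[:16]

# Victim keys off a Unix timestamp: one day is only 86,400 possible seeds.
victim_seed = 1_650_000_123
key = weak_key(victim_seed)

# Attacker enumerates the entire seed space in seconds.
for guess in range(1_650_000_000, 1_650_086_400):
    if weak_key(guess) == key:
        print(f"key recovered from seed {guess}")
        break
```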
All these techniques are part of the arsenal of any nation that wants to strategically collect information on the activities of other international players, whether governments or private companies. These techniques also require high levels of coordination and financial backing to be a successful part of an intelligence strategy. As I continue to see, when the value of the captured information is high enough, the investment is worth it. Consider, then, the vast data centers being built by many governments: they are full of spinning disks storing captured data, just in case current approaches don't yield access. Data storage has become an investment in the future of intelligence gathering.
Looking towards the future
Harvesting is not a strategy that pays off only with quantum computers. We will likely have more powerful processors for brute-forcing in the future. Other types of stochastic computation machines, such as spintronics, are showing promise, and even the de-quantification of popular algorithms may one day yield a binary-computer version of Peter Shor's algorithm, the quantum algorithm that shows how quantum computing could make quick work of current encryption by factoring large numbers efficiently. That would allow Diffie-Hellman key exchanges or RSA to be broken on a conventional computer in practical time frames.
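To see why a fast factoring routine, quantum or otherwise, is fatal to RSA, consider a deliberately tiny example. This is a textbook toy with insecure parameters; trial division stands in for the step Shor's algorithm would accelerate, and Python 3.8+ is assumed for the modular inverse via `pow`.

```python
# Toy RSA break via factoring (insecure textbook parameters).
p, q, e = 61, 53, 17                 # secret primes and public exponent
n = p * q                            # public modulus, 3233
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent

ciphertext = pow(42, e, n)           # "harvested" ciphertext of message 42

# Attacker's view: only (n, e, ciphertext). Factoring n rebuilds the key.
f = next(f for f in range(2, n) if n % f == 0)   # the step Shor speeds up
d_recovered = pow(e, -1, (f - 1) * (n // f - 1))
assert pow(ciphertext, d_recovered, n) == 42     # plaintext recovered
```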
So how do we shield ourselves? It is hard to imagine armoring oneself against every possible threat to encryption, just as it is difficult to predict exactly which stocks will do well and which won't: there are too many factors and too much chaos. One is left with the option of diversification: an out-of-band key-distribution strategy that allows multiple paths for keys and data to flow, and a range of algorithms and keys to be used. By diversifying our cryptographic approaches, we minimize the damage if any particular approach fails us. Monocultures are at risk of pandemics; let's not fall victim to an encryption monoculture as we move into the future.
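One way to read that diversification advice concretely: derive the session key from several shares that travel over independent paths and come from independent algorithms, so every path and every algorithm must fail before the key does. Below is a minimal sketch of the idea, with all channel names assumed for illustration rather than taken from any real design.

```python
# Minimal key-diversification sketch (illustrative, not a vetted design):
# the session key survives unless *every* share is compromised.
import hashlib
import os
from functools import reduce

def combine(shares: list[bytes]) -> bytes:
    """Hash-combine independent 32-byte shares into one session key."""
    xored = reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), shares)
    return hashlib.sha256(xored + b"".join(shares)).digest()

share_classical = os.urandom(32)  # e.g. from an ECDH exchange
share_pqc = os.urandom(32)        # e.g. from a post-quantum KEM
share_oob = os.urandom(32)        # e.g. delivered out of band

session_key = combine([share_classical, share_pqc, share_oob])
print(session_key.hex())
```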
It is past time to take the steps that will protect organizations from future threats. This includes developing actionable standards. Both federal agencies and the private sector need to embrace quantum-safe encryption, and they should develop next-generation, standards-based systems that address the shortcomings of current encryption methods and poor key-management practices. This will help ensure not only quantum-safe protection from future threats, but also stronger security against contemporary ones.
Organizations face a dizzying array of threats and must remain constantly vigilant to thwart attacks. While protecting against current threats is certainly important, organizations should also begin projecting future threats, including the threat posed by quantum computing. As technology advances, remember that past encryption, like the Enigma machine, didn't remain an enigma for long. The advent of quantum computing may soon make our "unbreakable" codes go the way of the dinosaur. Prepare accordingly.
Q&A with Atos’ Eric Eppe, an HPCwire Person to Watch in 2022 – HPCwire
Posted: at 2:06 am
HPCwire presents our interview with Eric Eppe, head of portfolio & solutions, HPC & Quantum at Atos, and an HPCwire 2022 Person to Watch. In this exclusive Q&A, Eppe recounts Atos' major milestones from the past year and previews what's in store for the year ahead. Exascale computing, quantum hybridization and decarbonization are focus areas for the company, and having won five of the seven EuroHPC system contracts, Atos is playing a big role in Europe's sovereign technology plans. Eppe also shares his views on HPC trends, what's going well and what needs to change, and offers advice for the next generation of HPC professionals.
Eric, congratulations on your selection as a 2022 HPCwire Person to Watch. Summarize the major milestones achieved last year for Atos in your division and briefly outline your HPC/AI/quantum agenda for 2022.
2021 was a strong year for Atos' Big Data and Security teams, despite the pandemic. The Atos BullSequana XH2000 was in its third year and already exceeding all sales expectations. More than 100,000 top-bin AMD CPUs were sold on this platform, and it made one of the first Top500 entries for AMD Epyc.
We not only won five of the seven EuroHPC petascale projects, but also delivered some of the most significant HPC systems. For example, we delivered one of the largest climate-study and weather-forecasting systems in the world to the European Centre for Medium-Range Weather Forecasts (ECMWF). In addition, Atos delivered a full BullSequana XH2000 cluster to the German climate research center (DKRZ). 2021 also saw the launch of Atos ThinkAI and the delivery of a number of very large AI systems, such as WASP in Sweden.
2022 is the year in which we prepare the future with our next-gen Atos BullSequana XH3000 supercomputer, a hybrid computing platform bringing together flexibility, performance and energy efficiency. Announced recently in Paris, it goes along with work that has started on hybrid computing frameworks to integrate AI and quantum acceleration into supercomputing workflows.
Sovereignty and sustainability were key themes at Atos' launch of its exascale supercomputing architecture, the BullSequana XH3000. Please address in a couple of paragraphs how Atos views these areas and why they are important.
This was a key point I mentioned during the supercomputer's reveal. For Europe, the real question is: should we indefinitely rely on foreign technologies to find new vaccines, develop autonomous electric vehicles, and devise strategies to face climate change?
The paradox is that Europe leads the semiconductor substrate and manufacturing-equipment markets (with Soitec and ASML) but has no European foundry in the sub-10nm class yet. It is participating in the European Processor Initiative (EPI) and will implement SiPearl technologies in the BullSequana XH3000, but these will take time to mature enough to replace other technologies.
Atos has built a full HPC business in less than 15 years, becoming number one in Europe and in the top four worldwide in the supercomputer segment, with its entire production localized in its French factory. We are heavily involved in all projects that are improving European sovereignty.
EU authorities today stand a bit behind the USA and China in how they regulate and manage large petascale or exascale procurements, and in how funding flows to local companies developing HPC technologies. This is a major topic.
Atos has developed a significant amount of IP, ranging from supercomputing platforms, low-latency networks and cooling technologies to software, AI and security, along with large manufacturing capabilities in France, with sustainability and sovereignty as guidelines. We are partnering with a number of European companies, such as SiPearl, IQM, Pasqal, AQT, Graphcore, Arm, OVH and many labs, to continue building this European sovereignty.
Atos has announced its intention to develop and support quantum accelerators. What is Atos' quantum computing strategy?
Atos has taken a hardware-agnostic approach to crafting quantum-powered supercomputers and enabling end-user applications. Atos' ambition is to be a major player in multiple domains, among them quantum programming and simulation, next-generation quantum-powered supercomputers, consulting services and, of course, quantum-safe cybersecurity. Atos launched the Atos Quantum Learning Machine (QLM) in 2017, a quantum appliance emulating almost all target quantum processing units, with abstractions to connect to real quantum computing hardware when available. We have been very successful with the QLM in large academic and research centers on all continents. In 2021 there was a shift, with many commercial companies starting to work on real use cases, and the QLM is the best platform for starting these projects without waiting for hardware to be available at scale.
Atos plays a central role in European-funded quantum computing projects. We are cooperating with NISQ-era QPU makers to develop new technologies and increase their effectiveness in hybrid computing scenarios. This includes, but is not limited to, hybrid frameworks, containerization, parallelization, VQE and GPU usage.
Where do you see HPC headed? What trends and in particular emerging trends do you find most notable? Any areas you are concerned about, or identify as in need of more attention/investment?
As for upcoming trends in the world of supercomputing, I see a few low-noise ones: technological barriers that may trigger drastic changes, and emerging technologies that may have a large impact on how we do HPC in the future. Most players, and Atos more specifically, are looking into quantum hybridization and decarbonization, which will open many doors in the near future.
Up to this point, the HPC environment has been quite conservative. I believe administrators are starting to see the benefits of orchestration and microservice-based cluster management. There are some obstacles, but I see more merits than issues in containerizing and orchestrating HPC workloads. There are also rising technological barriers that may push our industry into a corner, while at the same time giving us opportunities to change the way we architect our systems.
High-performance, low-latency networks make massive use of copper cables. With higher data rates (400Gb/s in 2022 and 800Gb/s in 2025), the workable copper cable length will be divided by four, and copper will be replaced by active or fiber cables, with cabling costs certainly increasing five- or six-fold. This is clearly an obstacle for systems in the range of 25,000 endpoints, where the cabling budget runs into the tens of millions.
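A back-of-envelope version of that budget, with unit prices that are my own assumptions rather than figures from the interview:

```python
# Rough cabling arithmetic for a ~25,000-endpoint fabric. The per-cable
# prices below are illustrative assumptions, not Atos figures.
endpoints = 25_000
cables = endpoints * 2     # crude fat-tree estimate: ~2 fabric links per endpoint
copper = cables * 150      # assumed $150 per passive copper DAC
optical = copper * 5.5     # the interview's "5 or 6x" multiplier, midpoint
print(f"copper ~ ${copper/1e6:.1f}M, active/optical ~ ${optical/1e6:.1f}M")
# -> copper ~ $7.5M, active/optical ~ $41.2M: "tens of millions"
```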
This very simple problem may impose a paradigm shift in the way devices, generally speaking, are connected and communicate. It triggers deeper changes in architectural design points, from racks to nodes and down to elements that are deeply integrated today, such as compute cores, buses, memory and associated controllers, and switches. I won't say the 800Gb/s step alone will change everything, but the maturity of some technologies, such as silicon photonics, and the emerging standardization of very powerful protocols like CXL will enable a lot more flexibility while continuing to push the limits. Also note that CXL is just in its infancy but already shows promise for a memory-coherent space between heterogeneous devices: centralized or distributed, mono- or multi-tenant memory pools.
Silicon photonic integrated circuits (PICs), because they theoretically offer Tb/s bandwidth through native fiber connections, should allow a real disaggregation of devices that today are very tightly connected on ever more complex and expensive PCBs.
What will be possible inside a node will be possible outside of it, blurring the traditional frontier between a node, a blade, a rack and a supercomputer, offering a world of possibilities and new architectures.
The market is probably not much interested in finding an alternative to the ultra-dominance of Linpack, or in questioning its impact on how we imagine, engineer, size and deliver our supercomputers. Ultimately, how relevant is its associated ranking to real-life problems? I wish we could initiate a trend that ranks global system efficiency against available peak power. This would help HPC players consider working on all optimization paths rather than piling on more and more compute power.
Lastly, I am concerned that almost nothing has changed in the last 30 years in how applications interact with data. Yes, HPC uses faster devices, we have clustered shared file systems like Lustre, and we have invented object-oriented key-value abstractions, but in reality storage subsystems are most of the time centralized. They are connected to the high-speed fabric, and they are oversized to absorb checkpoints from an ever-growing node count, while in the nominal regime they use only a portion of the available bandwidth. Ultimately, with workloads by nature spread across the whole fabric, most of the power consumption comes from I/O.
However, it's time to change this situation. There are some possible avenues, and as a side effect they will improve the global efficiency of HPC workloads, and hence the sustainability and value of HPC solutions.
More generally, what excites you about working in high-performance computing?
I've always loved to learn and be intellectually stimulated, especially in my career. High-performance computing, along with AI and now quantum, gives me constant food for thought and more options for solving big problems than I will ever be able to absorb.
I appreciate pushing the limits every day, driving the Atos portfolio and setting its direction, and ultimately helping our customers solve their toughest problems. This is really rewarding for me and our Atos team. I'm never satisfied, but I'm very proud of what we have achieved together, bringing Atos into the top four worldwide in supercomputers.
What led you to pursue a career in the computing field and what are your suggestions for engaging the next generation of IT professionals?
I've always been interested in technology, initially attracted by everything that either flew or sailed; really, everything that plays with the wind. In my teenage years, after experiencing sailboards and gliders, I was fortunate enough to get access to my first computer in late 1979, when I was 16. My field of vision prevented me from becoming a commercial pilot, so I pursued a master's degree in software engineering that led me into the information technology world.
When I began my career in IT, I was not planning any specific path to a specific domain. I simply took every opportunity to learn a new domain, worked hard to succeed, and jumped to something new that excited me. In my first position, I was lucky enough to work on an IBM mainframe doing CAD with some software development, as well as embracing a completely unfamiliar system engineering role that I had to learn from scratch. Very educational! I went from developing in Fortran and doing system engineering on VM/SP and Unix to learning Oracle RDBMS and the Internet at Intergraph, then HPC servers and storage at SGI. I pursued my own startups, and now I'm leading the HPC, AI and quantum portfolio at Atos.
What I would tell the next generation of IT professionals about their careers is this:
First, only take roles in which you will learn new things. It could be managerial, financial, technical; it doesn't matter. To evolve in your future career, the more diverse your experience, the better you will be able to react and be effective. Move to another role when you are no longer learning or when you have been in your comfort zone far too long.
Second, look at problems to solve, think out of the box and with a 360-degree vision. Break the barriers, and change the angle of view to give new perspectives and solutions to your management and customers.
Also, compensation is important, but it's not everything. What you do, how it makes you happy in your life, and what you achieve professionally matter more. Ultimately, compare your salary with the free time that remains to spend with your family and friends. Lastly, compensation is not always an indicator of success; changing the world for the better and making our planet a better place to live is the most important benefit you will find in high-performance computing.
Outside of the professional sphere, what can you tell us about yourself: family stories, unique hobbies, favorite places, etc.? Is there anything about you that your colleagues might be surprised to learn?
Together with my wife, I am the proud parent of two beautiful adult daughters. We also have a three-year-old bombshell of a Jack Russell named Pepsy, who brings a lot of energy to our house.
We live northwest of Paris, in a small city on the Seine. I'm still a private pilot and still cruise on sailboats with family and friends. I recently participated in the ARC 2021 transatlantic race with three friends on a trimaran, a real challenge and a great experience. Soon, we're off to Scotland for a family vacation!
Eppe is one of 12 HPCwire People to Watch for 2022. You can read the interviews with the other honorees at this link.
@HPCpodcast: Satoshi Matsuoka on the TOP500, Fugaku and Arm, Quantum and Winning Japan’s Purple Ribbon Medal of Honor – insideHPC – insideHPC
Posted: at 2:06 am
Satoshi Matsuoka
An eminent figure in the HPC community, Prof. Satoshi Matsuoka, director of the RIKEN Center for Computational Science (R-CCS) and professor of computer science at Tokyo Institute of Technology, joined our @HPCpodcast for a far-ranging discussion of supercomputing past, present and future.
At RIKEN, Matsuoka has overseen development of Fugaku, number 1 on the TOP500 list of the world's most powerful supercomputers (the list will be updated next week during the ISC 2022 conference in Hamburg; as of now it's not known whether Fugaku will retain its position). Previously, Matsuoka was lead developer of another well-known supercomputer, TSUBAME, the most powerful in Japan at the time.
He is also a recent winner of the Purple Ribbon Medal, one of Japan's highest honors, and in our conversation Matsuoka explains why the award ceremony did not include the usual presence of the Emperor of Japan. That's how our discussion starts; other topics are time-stamped below:
start The Purple Ribbon Medal of Honor
2:15 The role of Japan in supercomputing
3:45 TOP500 and ORNL's exascale system
5:00 Fugaku and Arm
8:00 Why not SPARC
11:30 The balance and beauty of Fugaku and its predecessor, the K-Computer
15:15 Notable applications of Fugaku, including Covid research
25:00 Future of supercomputing and what's next after Fugaku
31:45 FPGA and CGRA
36:00 Quantum Computing
40:30 Nintendo days and working with the late, great Satoru Iwata
48:30 Pursuit of perfection, with a mention of the movie Jiro Dreams of Sushi
You can find our podcasts at insideHPC's @HPCpodcast page, on Twitter and at the OrionX.net blog. Here's the RSS feed.
Could quantum computing bring down Bitcoin and end the age of crypto? – OODA Loop
Posted: at 2:06 am
Quantum computers will eventually break much of today's encryption, and that includes the signing algorithm of Bitcoin and other cryptocurrencies. Approximately one-quarter of the Bitcoin in circulation in 2022 ($168bn worth) is vulnerable to quantum attack, according to a study by Deloitte. Cybersecurity specialist Itan Barmes led the vulnerability study of the Bitcoin blockchain. He found that the level of exposure a large enough quantum computer would create for the Bitcoin blockchain presents a systemic risk. "If [4 million] coins are eventually stolen in this way, then trust in the system will be lost and the value of Bitcoin will probably go to zero," he says. Today's cryptocurrency market is valued at approximately $3trn, and Bitcoin reached an all-time high of more than $65,000 per coin in 2021, making crypto the best-performing asset class of the past ten years, according to Gemini's Global State of Crypto report for 2022. However, Bitcoin's bumpy journey into mainstream investor portfolios coincides with major advances in quantum computing.
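The reason only a fraction of Bitcoin is exposed today: most coins sit behind hashed public keys, and a quantum attacker needs the public key itself to run Shor's algorithm against it. The sketch below is simplified (real addresses add a version byte and checksum encoding, and `ripemd160` availability depends on the local OpenSSL build) but shows the hashing step that shields an unspent output.

```python
# Simplified view of Bitcoin's pay-to-public-key-hash shielding.
import hashlib

def hash160(pubkey: bytes) -> bytes:
    """Bitcoin's HASH160 = RIPEMD160(SHA256(pubkey))."""
    return hashlib.new("ripemd160", hashlib.sha256(pubkey).digest()).digest()

pubkey = bytes.fromhex("02" + "11" * 32)   # placeholder compressed public key

# The chain stores only this 20-byte hash until the owner spends, so there is
# no public key for Shor's algorithm to attack. Early pay-to-public-key coins
# and reused addresses have already revealed their keys: that is the exposed
# quarter the Deloitte study counts.
print(hash160(pubkey).hex())
```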
Cancer Detection, Oil Spill Cleanup, Quantum Computing, and a New Medical Device: Meet the Spring 2022 Innovation Fund Finalists – Polsky Center for…
Posted: May 15, 2022 at 9:59 pm
Published on Wednesday, May 11, 2022
Previous finalists have built exciting technologies across a wide range of industries representative of the core strengths of the University and its partners. (Image: iStock.com/AntonioSolano)
Three teams have been selected as finalists for the George Shultz Innovation Fund's spring 2022 investment cycle, bringing to the forefront solutions for scaling quantum computers, sustainably cleaning up oil spills, detecting cancer from saliva, and improving medical devices.
Managed by the Polsky Center for Entrepreneurship and Innovation, the George Shultz Innovation Fund provides up to $250,000 in co-investment funding for early-stage tech ventures coming out of the University of Chicago, Argonne National Laboratory, Fermilab, and the Marine Biological Laboratory.
"Through the Innovation Fund's programmatic scope, we are able to surround these tech startups with a community of support, including distinguished angel and venture capital investors, potential customers, advisors, scientists, entrepreneurs, and industry partners, to help move their projects forward," said Ozge Guney Altay, director of Polsky Science Ventures.
"The startups we invest in go through rigorous, venture-capital-style due diligence conducted by a multidisciplinary team of Innovation Fund Associates," added Altay. "Our core mission with the Innovation Fund is to help researchers turn their innovations into ventures that are positioned to succeed in their fundraising efforts."
The spring 2022 finalist teams span cancer detection from saliva, sustainable oil spill cleanup, quantum computer scaling, and a new medical device.
The core mission of the Shultz Innovation Fund is to help researchers turn their innovations into ventures that advance cutting-edge technologies, generate significant financial returns, and create lasting impact for humankind.
The teams receive guidance and dedicated support from the Polsky Center, business experts, an advisory committee, and student Innovation Fund Associates who are training in venture capital.
Over the last 11 years, the George Shultz Innovation Fund has invested $9.2 million in 90 companies that have gone on to raise $235 million in follow-on funding. Companies launched with the fund's support include ExplORer Surgical, Corvidia, ClostraBio, and Super.Tech.
"Discussions with the Innovation Fund leadership and associates helped us clarify our core strategy and business model. We also benefited from the training and support on crafting a pitch deck, which will help us in conversations with future investors as well," said Pranav Gokhale, CEO and cofounder of Super.tech. Building on its success after securing $150,000 from the Innovation Fund in 2020, Super.tech went on to secure millions of dollars in federal research funding and was recently acquired by the global quantum ecosystem leader, ColdQuanta. Major organizations, including Fortune 500 companies and national research laboratories, today rely on its software as part of their strategic quantum initiatives.
A finalist in the 2021 spring cohort, Esya Labs CEO and cofounder Dhivya Venkat said: "All startups at the University of Chicago should apply to go through the George Shultz Innovation Fund. It was such a beneficial experience." Esya Labs earlier this year announced that Novartis is among the first companies to use its technology.
Quantum-South Declares their Air Cargo Optimization Application Ready for Production – Quantum Computing Report
Posted: at 9:59 pm
We reported in February how Quantum-South had created a prototype quantum application that optimizes how air cargo is loaded into a freighter aircraft. It is a very complex problem for classical computers because one must take into account many different targets and constraints, including revenue, priority, center of gravity, shear force, volume, and industry-standard weight-and-balance restrictions. Researchers often refer to this type of problem as a knapsack problem (see the sketch below). Providing an optimal solution for each flight can have a tremendous impact on airline profitability, because over a hundred billion dollars is spent each year on air freight shipments. Quantum-South started developing this application as a submission to the Airbus Quantum Computing Challenge in 2019, in which it was selected as one of the global finalists.
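For readers unfamiliar with the problem class, here is a minimal classical 0/1 knapsack in Python. It is illustrative only: Quantum-South's real model handles many simultaneous constraints (center of gravity, shear force, and so on) and maps the problem to a D-Wave annealer rather than to dynamic programming.

```python
# Minimal 0/1 knapsack by dynamic programming: pick pallets to maximize
# revenue without exceeding the hold's weight capacity.
def knapsack(weights: list[int], values: list[int], capacity: int) -> int:
    best = [0] * (capacity + 1)
    for w, v in zip(weights, values):
        for c in range(capacity, w - 1, -1):  # descend so each item is used once
            best[c] = max(best[c], best[c - w] + v)
    return best[capacity]

# Toy cargo: (weight in tonnes, revenue) per pallet, 10-tonne hold.
print(knapsack([3, 4, 5, 6], [40, 50, 60, 90], 10))  # -> 140 (the 4t and 6t pallets)
```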
Quantum-South has now declared the application ready for production use with the popular Airbus A330-200F and Boeing 747-400 freighters. The software currently runs on a D-Wave quantum annealer for these aircraft and can provide a solution within minutes. Quantum-South also has a version that runs on gate-based quantum computers for small-scale freighters, but those processors will need to grow in size before the application can support larger aircraft. Quantum-South is also working on a version that supports maritime cargo; the problem is similar, but the constraints differ.
Additional information about Quantum-South's air cargo optimization application is available in a news release posted on the company's website.
IonQ Launches Native Gate Access, Extends Open-Source Capabilities for Researchers and Developers – HPCwire
Posted: at 9:59 pm
COLLEGE PARK, Md., May 12, 2022 – IonQ, an industry leader in quantum computing, today announced support for specifying quantum circuits in a hardware-native gate format across its systems. Researchers, academic institutions, and developers looking for new ways to test, learn and discover real-world solutions can now define the algorithms they run on IonQ quantum hardware more precisely and expressively.
IonQ provides customers with access to its cloud quantum computing platform, the IonQ Quantum Cloud, which allows users to run quantum programs on IonQ's hardware remotely. Customers have the flexibility and simplicity to define quantum algorithms in whatever format best suits their needs, and the platform's proprietary compilation, optimization and post-processing stack ensures consistent, high-quality results. However, advanced researchers and developers often need more fine-grained control over each individual gate run on the hardware when exploring novel algorithms, solutions, and fundamental techniques.
To serve this group of innovators more effectively, IonQ is further democratizing access to its industry-leading hardware by letting users submit quantum programs in its hardware-native gate format. Developers can now specify precisely what happens to every qubit throughout their entire algorithm, improving overall usefulness through new error-mitigation or post-processing techniques. The feature is now available via IonQ's direct API, its Google Cloud Marketplace integration, and a variety of open-source tools such as Qiskit, Cirq, PennyLane, and others.
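As a rough sketch of what native-gate access looks like from Qiskit, the snippet below uses the qiskit-ionq provider's GPI/GPI2 single-qubit gates and the Mølmer-Sørensen (MS) entangling gate. Gate names, angle conventions (fractions of a turn), and the `gateset="native"` backend option follow the provider's documentation as I understand it; treat the details as assumptions to verify against IonQ's docs, and note that an IonQ API key is expected in the environment.

```python
# Hedged sketch: submitting a hardware-native-gate circuit via qiskit-ionq.
from qiskit import QuantumCircuit
from qiskit_ionq import IonQProvider, GPIGate, GPI2Gate, MSGate

qc = QuantumCircuit(2, 2)
qc.append(GPI2Gate(0.25), [0])    # single-qubit rotation, angle in turns
qc.append(MSGate(0, 0), [0, 1])   # two-qubit Molmer-Sorensen entangler
qc.append(GPIGate(0.5), [1])
qc.measure([0, 1], [0, 1])

provider = IonQProvider()         # picks up the API key from the environment
backend = provider.get_backend("ionq_simulator", gateset="native")
job = backend.run(qc, shots=1000)
print(job.get_counts())
```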
"Researchers, academics, developers, and other tinkerers like to be as close to the metal as possible when designing quantum experiments that can surpass today's benchmarks; they want to be able to play at every layer of the stack to extract as much performance and novel insight as possible from these systems," said Nathan Shammah of Unitary Fund, the nonprofit organization developing Mitiq, the first open-source software for quantum error mitigation. "IonQ providing a native gate interface across several open-source tools further opens access and paves the way for the open-source community to maximize control and improve performance in quantum computing software."
"By providing the open-source community with greater access to IonQ's quantum hardware through native gates, we are doubling down on our commitment to give researchers the tools they need to experiment with quantum computers as they best see fit," said Jungsang Kim, co-founder and CTO at IonQ. "Quantum's true potential will only be realized by those willing to push the boundaries of what's possible, and IonQ's industry-leading hardware provides the ideal platform to build on top of and seek out solutions for the world's most complex problems."
Today's news is the latest in a series of announcements by IonQ designed to push accessibility to quantum systems forward. In March, IonQ unveiled #AQ, an industry-standard performance benchmark for evaluating the quality of results output by a quantum computer. In February, the company announced the development of the N-qubit Toffoli gate alongside Duke University, introducing a new way to operate on many connected qubits at once by leveraging multi-qubit communication. More recently, IonQ announced the extension of its commercial partnership with Hyundai Motors to use quantum machine learning to improve computation for tasks like road-sign image classification and simulation in a real-world test environment.
About IonQ
IonQ, Inc. is a leader in quantum computing, with a proven track record of innovation and deployment. IonQ's latest-generation quantum computer, IonQ Aria, is the world's most powerful quantum computer, and IonQ has defined what it believes is the best path forward to scale. IonQ is the only company with its quantum systems available through the cloud on Amazon Braket, Microsoft Azure, and Google Cloud, as well as through direct API access. IonQ was founded in 2015 by Christopher Monroe and Jungsang Kim based on 25 years of pioneering research. To learn more, visit http://www.ionq.com.
Source: IonQ
How Properties of Mechanical Quantum Systems Can Be Measured Without Destroying the Quantum State – SciTechDaily
Posted: at 9:58 pm
Optical microscope image of the acoustic resonator viewed from above (two larger disks, the inner of which is the piezoelectric transducer) and of the antenna connected to the superconducting qubit (white structure). Credit: Adapted from von Lüpke et al., Nat. Phys., DOI: 10.1038/s41567-022-01591-2 (2022)
New experimental work establishes how quantum properties of mechanical quantum systems can be measured without destroying the quantum state.
Systems in which mechanical motion is controlled at the level of individual quanta are emerging as a promising quantum-technology platform. New experimental work now establishes how quantum properties of such systems can be measured without destroying the quantum state a key ingredient for tapping the full potential of mechanical quantum systems.
When thinking about quantum mechanical systems, single photons and well-isolated ions and atoms may spring to mind, or electrons spreading through a crystal. More exotic in the context of quantum mechanics are genuinely mechanical quantum systems; that is, massive objects in which mechanical motion such as vibration is quantized.
In a series of seminal experiments, quintessential quantum-mechanical features have been observed in mechanical systems, including energy quantization and entanglement. However, with a view to putting such systems to use in fundamental studies and technological applications, observing quantum properties is but a first step. The next one is to master the handling of mechanical quantum objects, so that their quantum states can be controlled, measured, and eventually exploited in device-like structures.
The group of Yiwen Chu in the Laboratory of Solid State Physics at ETH Zurich has now made major progress in that direction. Writing in the journal Nature Physics, they report the extraction of information from a mechanical quantum system without destroying the precious quantum state. This advance paves the path to applications such as quantum error correction, and beyond.
The ETH physicists employ a slab of high-quality sapphire, a little under half a millimeter thick, as their mechanical system. On its top sits a thin piezoelectric transducer that can excite acoustic waves, which are reflected at the bottom and thus extend across a well-defined volume inside the slab. These excitations are the collective motion of a large number of atoms, yet they are quantized (in energy units known as phonons) and can be subjected, in principle at least, to quantum operations in very much the same way as the quantum states of atoms, photons and electrons.
Intriguingly, it is possible to interface the mechanical resonator with other quantum systems, and with superconducting qubits in particular. The latter are tiny electronic circuits in which electromagnetic energy states are quantized, and they are currently one of the leading platforms for building scalable quantum computers. The electromagnetic fields associated with the superconducting circuit enable the coupling of the qubit to the piezoelectrical transducer of the acoustic resonator, and thereby to its mechanical quantum states.
Photograph of the flip-chip-bonded hybrid device, with the acoustical-resonator chip on top of the superconducting-qubit chip. The bottom chip is 7 mm in length. Credit: Adapted from von Lüpke et al., Nat. Phys., DOI: 10.1038/s41567-022-01591-2 (2022)
In such hybrid qubit-resonator devices, the best of both worlds can be combined. Specifically, the highly developed computational capabilities of superconducting qubits can be used in synchrony with the robustness and long lifetime of acoustic modes, which can serve as quantum memories or transducers. For such applications, however, merely coupling qubit and resonator states will not be enough. For example, a straightforward measurement of the quantum state in the resonator destroys it, making repeated measurements impossible. What is needed instead is the capability to extract information about the mechanical quantum state in a gentler, well-controlled manner.
Demonstrating a protocol for such so-called quantum non-demolition measurements is what Chu's doctoral students Uwe von Lüpke, Yu Yang, and Marius Bild, working with Branco Weiss fellow Matteo Fadel and with support from semester-project student Laurent Michaud, have now achieved. In their experiments there is no direct energy exchange between the superconducting qubit and the acoustic resonator during the measurement. Instead, the properties of the qubit are made to depend on the number of phonons in the acoustic resonator, with no need to directly touch the mechanical quantum state. Think of a theremin, the musical instrument whose pitch depends on the position of the musician's hand without any physical contact with the instrument.
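The standard way to obtain such a number-dependent qubit frequency, in textbook circuit-QED language rather than anything taken from the paper itself, is the dispersive limit of a Jaynes-Cummings coupling:

```latex
% Dispersive qubit-resonator coupling (textbook form; notation assumed).
% With qubit-phonon detuning \Delta = \omega_q - \omega_m much larger than
% the coupling g, the interaction reduces to
H/\hbar \;\approx\; \omega_m\,\hat{a}^\dagger\hat{a}
  \;+\; \frac{\omega_q}{2}\,\hat{\sigma}_z
  \;+\; \chi\,\hat{a}^\dagger\hat{a}\,\hat{\sigma}_z,
\qquad \chi \approx \frac{g^2}{\Delta}.
```

Each phonon thus shifts the qubit frequency by 2χ without exchanging energy with it. In the strong dispersive regime this shift exceeds both linewidths, so individual phonon-number peaks become resolvable in the qubit spectrum, and letting the qubit precess for a time t = π/(2χ) maps even and odd phonon numbers onto opposite qubit states, which is the textbook route to the parity measurement described below.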
Creating a hybrid system in which the state of the resonator is reflected in the spectrum of the qubit is highly challenging. There are stringent demands on how long the quantum states can be sustained, in both the qubit and the resonator, before they fade away due to imperfections and perturbations from the outside. So the team's task was to push the lifetimes of both the qubit and the resonator quantum states. They succeeded through a series of improvements, including a careful choice of superconducting qubit type and encapsulation of the hybrid device in a superconducting aluminum cavity to ensure tight electromagnetic shielding.
Having successfully pushed their system into the desired operational regime (known as the strong dispersive regime), the team was able to gently extract the phonon-number distribution in their acoustic resonator after exciting it with different amplitudes. Moreover, they demonstrated a way to determine, in one single measurement, whether the number of phonons in the resonator is even or odd, a so-called parity measurement, without learning anything else about the phonon distribution. Obtaining such very specific information, and nothing more, is crucial in a number of quantum-technology applications. For instance, a change in parity (a transition from an odd to an even number or vice versa) can signal that an error has affected the quantum state and that correction is needed. Here it is essential, of course, that the state to be corrected is not destroyed.
Before such error-correction schemes can be implemented, however, further refinement of the hybrid system is necessary, in particular to improve the fidelity of the operations. And quantum error correction is far from the only use on the horizon. The scientific literature abounds with exciting theoretical proposals for quantum-information protocols, as well as for fundamental studies, that benefit from the fact that the acoustic quantum states reside in massive objects. These offer, for example, unique opportunities for exploring the scope of quantum mechanics in the limit of large systems and for harnessing mechanical quantum systems as sensors.
References:
"Parity measurement in the strong dispersive regime of circuit quantum acoustodynamics" by Uwe von Lüpke, Yu Yang, Marius Bild, Laurent Michaud, Matteo Fadel and Yiwen Chu, 12 May 2022, Nature Physics. DOI: 10.1038/s41567-022-01591-2
"Good vibrations for quantum computing" by Amy Navarathna and Warwick P. Bowen, 12 May 2022, Nature Physics. DOI: 10.1038/s41567-022-01613-z
Atos Talks European HPC Openness and a Hybrid Future of AI, Machine Learning and Quantum Supercomputing – insideHPC
Posted: at 9:58 pm
This exclusive Q&A interview was conducted by Nages Sieslack of the ISC 2022 conference organization, with Eric Eppe, head of portfolio and solutions, HPC & Quantum at France-based HPC systems vendor Atos.
Nages Sieslack: How are the needs of the European HPC market changing with regard to traditional supercomputing and things like deep learning/AI and data-centric computing?
Eric Eppe: Supercomputers are soft power for all nations. They are essential for numerical simulations, accelerating technological, industrial, and scientific innovation. We see a transition from traditional compute-centric simulation toward data-centric computing, resulting in more heterogeneous workloads. We believe the future of HPC is hybrid. This means combining traditional simulation workflows using CPUs and GPUs (even TPUs, FPGAs, IPUs, why not QPUs) with advanced techniques to accelerate parts of those workflows using machine learning, artificial intelligence (AI), or even quantum computing (QC). The virtue of deep learning/AI is not limited to GPU accelerators on the hardware side; it also serves as the foundation of smart software within HPC cluster management and workload optimization. In this regard, deep-learning/AI-empowered software optimizes workloads while increasing the system's global efficiency.
Sieslack: How are those changes affecting the types of systems and services you are offering?
Eppe: Atos leads the hybrid computing trend with its existing HPC portfolio and the newly revealed BullSequana XH3000. It is the next-generation hybrid computing platform, the foundation for simulation at any scale up to exascale. It has unparalleled flexibility, industry-leading density, and embedded security. For Atos, exascale doesn't mean exaflopic performance only. We believe that increasing global system and application efficiency is the only way to decrease system cost and stay within reasonable power consumption at that scale. Thus, we have incorporated ML/AI mechanisms into our HPC software suites to optimize simulation and keep control of energy consumption, for unprecedented efficiency. We have also witnessed the need for high-performance AI simulation and launched the ThinkAI solution last year. With ThinkAI, we eliminate the roadblocks in designing, developing, and installing high-performance AI systems, putting AI simulation at the fingertips of businesses and academics alike. Furthermore, we leverage our HPC-as-a-Service portfolio, enabling any customer to run their simulations wherever they want.
Sieslack: Geographically, where do you see your biggest opportunities for growth in the HPC market, both within Europe and globally?
Eppe: Compared with China, the U.S., and Japan, which are relatively closed HPC economies (they build their own HPC systems for their own use), Europe is the most dynamic and open HPC market. Europe has invested significantly in the EuroHPC JU, and Atos powers five of the seven EuroHPC centers. Europe continues to invest in supercomputing, including HPC and quantum computing, e.g., the upcoming exascale tenders, as an extension of the EuroHPC JU. Atos designs, develops, and builds its HPC systems in Angers, France, and is number one in HPC in Europe. We also have our HPC, AI and QC R&D centers in France. We actively participate in European initiatives to develop the European microprocessor with EPI, and in the GAIA-X initiative. Atos is the undisputed leader in the European HPC market, instrumental to its technological and economic sovereignty.
Sieslack: What trends are you seeing for your HPC on-demand service via your Nimbix cloud offering with regard to use cases and the types of customers?
Eric Eppe of Atos
Eppe: As industry analysts have predicted, cloud computing will continue to grow at double-digit rates through 2025.* Our on-demand service through Nimbix is seeing this growth, with customers across the globe consuming compute power in record numbers. We have seen on-demand usage increase specifically within automotive manufacturing, life sciences, and academic research organizations. We are pleased to offer these industries the most comprehensive hybrid HPC cloud portfolio and are excited to be advancing this space with new offerings and technology. In fact, in the second half of the year, we will deploy our first public cloud offering in partnership with a top hyperscaler, providing genomic analytics of sequencing data from specialized cluster resources delivered by Atos Nimbix.
Sieslack: What is Atos doing now on the quantum computing front? Which companies and partners are currently using your Quantum Learning Machine simulator?
Eppe: Quantum computing will reinvent how we simulate, co-existing with HPC. In December 2021, Atos confirmed its role as a global leader in quantum hybridization technologies at its 8th Quantum Advisory Board. At Atos, we work mainly on five strategic directions to accelerate quantum computing.
On top of these five strategic paths, we have launched Q-score, a universal metric to benchmark quantum application performance. Together with clients worldwide, such as Argonne Labs, BMW, CESGA, SENAI CIMATEC, and Total, we are accelerating the arrival of the quantum era.
*Source: Intersect360 Research forecasts cloud computing will continue to grow at double-digit rates through 2025.
Colocation consolidation: Analysts look at what’s driving the feeding frenzy – The Register
Posted: at 9:58 pm
Analysis Colocation facilities aren't just a place to drop a couple of servers anymore. Many are quickly becoming full-fledged infrastructure-as-a-service providers as they embrace new consumption-based models and place a stronger emphasis on networking and edge connectivity.
But supporting the growing menagerie of value-added services takes a substantial footprint and an even larger customer base, a dynamic that's driven a wave of consolidation throughout the industry, analysts from Forrester Research and Gartner told The Register.
"You can only provide those value-added services if you're big enough," Forrester research director Glenn O'Donnell said.
The past few months have seen this trend play out en masse, the latest example being private equity firm DigitalBridge Investment Management's takeover of datacenter provider Switch Inc in a deal valued at $11 billion.
Switch operates datacenters specializing in high-performance infrastructure. The company completed its fifth Prime datacenter campus in Texas last year, but this is only the latest colo acquisition in recent memory.
"There have been a pile of smaller colocation providers that have been coming together, either being acquired by the big boys, or they've been merging," O'Donnell said.
There's been a flurry of colocation mergers and acquisitions over the past few months. Here's just a sampling: NorthC acquired Netrics, LightEdge bought NFinit, EdgeConnex made off with GTN, Unitas Global snapped up INAP, VPLS nabbed a Carrier-1 datacenter in Texas, and Digital 9 absorbed Finnish colo Ficolo and Volta's London datacenters.
So what's driving this ramp in M&A activity? You might think it's the cloud, and while there's certainly some truth to that, O'Donnell says it's not the full story.
"I always like to remind people that just because cloud is so big and growing does not mean the datacenter is dead," he said, adding that to some extent cloud has actually driven people to colos more than it has hurt them.
"I won't give cloud all of the credit, but cloud certainly proved that this is a viable way of doing things," O'Donnell added.
What the cloud has managed to do is force colocation providers to innovate around new consumption models and platform services, while simultaneously expanding their reach closer to the edge.
The major cloud providers operate a relatively small number of extremely large datacenters located in key metros around the world. By contrast, colocation providers like Equinix and Digital Realty operate hundreds of datacenters around the globe.
This reach is not only one of the big attractions of colocation providers, Gartner analyst Matthew Brisse said, but it also turns out to be one of the biggest drivers of M&A activity.
"Size matters in this business because customers, especially multinational customers, want datacenters in a lot of different places," O'Donnell said.
According to Brisse, when enterprises start looking into colocation facilities, their main concern is getting workloads spun up in the right place. "The main reason that people go to colos, is location, location, location," he said.
And this demand has only accelerated as colocation providers look to offer services closer to the edge.
"We see the colocation providers starting to build out their edge offering as opposed to a simple hoteling experience for your infrastructure," Brisse said.
These aren't necessarily large datacenter facilities in the traditional sense, either, he explained. These can be as small as a half-sized shipping container positioned at the base of a cell tower.
Smaller regional colocation providers also serve an important role because they tend to build in places the larger players overlook, Brisse explained.
"A lot of companies don't have the luxury of sitting right next to an Equinix facility," he said. "There's lots of opportunities out there for colocation market in totality."
And as colocation providers inch closer to the edge, Brisse argues networking and automation are only becoming more important.
One of the most potent value adds offered by major colocation providers today is networking.
"As you look at the colocation services, the networking services have become a pretty big deal to differentiate them from just being a simple chunk of real estate to plop your servers," O'Donnell said.
And here again the larger players have the advantage. "Networking connectivity requires a big provider with lots of locations connected by their own fiber," he added.
These backbone networks allow workloads running in a datacenter on one side of the country to communicate with another without ever going out over the open internet.
But it's not just networking between colocation datacenters that's important. Many of these colocation facilities are located directly adjacent to the major cloud and software-as-a-service providers.
"So AWS, for example, or Microsoft Azure might be in the same building as you and connecting to it is just a matter of connecting to a different cage in that same building," O'Donnell said. "Smaller players can't do that, but the bigger guys can."
However, as customers increasingly turn to colocation providers for edge compute and networking, complexity rears its ugly head, Brisse argues.
In the future, "we're going to have lots of datacenters everywhere; we're going to have lots of data distributed in the right location; we're going to have edge facilities everywhere bringing data close to the edge," he said. "It is not going to be possible for humans to monitor all of that activity."
So, in addition to growing their footprint and network services, Brisse believes colos will also need to invest in AI operations capabilities to manage this complexity.
Both Brisse and O'Donnell expect the colocation market to continue consolidating as macroeconomic forces put pressure on smaller players.
"If the economic troubles we're seeing are persistent, I think we will see an acceleration of this kind of [M&A] activity," O'Donnell said.
It's important to remember that while colos may look like tech companies on the inside, on the books, they're really real estate investment trusts, he said, adding that in the current economic environment, colos are a comparatively safe bet in an otherwise dismal commercial real estate market.
"Colo is a hot market and getting hotter," O'Donnell said.