Category Archives: Quantum Computing
IBM Provides Harris-Stowe State University with $2M in AI, Cloud Resources for Student Skill Building – HPCwire
Posted: January 7, 2021 at 5:53 am
ST. LOUIS, Jan. 6, 2021 – Harris-Stowe State University has announced a multi-million dollar collaboration with IBM on a comprehensive program designed to develop diverse, high-demand skill sets that align with industry needs and trends, so both students and faculty can develop the skills they need today for the jobs of tomorrow.
IBM and Harris-Stowe State University are building on the need to advance digital skills in education and are dedicated to providing future-focused curriculum and educational tools to help train the diverse workforce of tomorrow in fast-growing technologies such as artificial intelligence (AI), blockchain, data science, cybersecurity, cloud and quantum.
"Harris-Stowe State University is thrilled to collaborate with IBM to provide greater access to skills and training in the tech industry," said Dr. Corey S. Bradford, Sr., president of Harris-Stowe State University. "As the world, more than ever, relies on the use of science, technology, engineering, and mathematics to solve grand societal challenges, Harris-Stowe must continue to develop well-prepared and ready graduates to join the STEM workforce. This collaboration is yet another example of our commitment to supporting student and faculty development and assisting in preparing students to compete and lead globally."
The collaboration extends IBM's recent investment in technology, assets, resources and skills development with HBCUs across the United States through the IBM Skills Academy and enhanced IBM Academic Initiative.
"Equal access to skills and jobs is the key to unlocking economic opportunity and prosperity for diverse populations," said Valinda Scarbro Kennedy, HBCU Program Lead, IBM Global University Programs. "As we announced earlier this fall, IBM is deeply committed to helping HBCU students build their skills to better prepare for the future of work. Through this collaboration, Harris-Stowe State University students will have an opportunity to gain modern skills in emerging technologies across hybrid cloud, quantum and AI so they can be better prepared for the future of work in the digital economy."
As part of its multi-year Global University Programs, which include the IBM Academic Initiative and the IBM Skills Academy, IBM is providing more than $100M in assets, faculty training, pre-built and maintained curriculum content, hands-on labs, use cases, digital badges and software to participating HBCUs. The IBM Academic Initiative provides access to resources at no charge for teaching, learning and non-commercial research, with recent enhancements including access to guest lectures. The IBM Skills Academy is a comprehensive, integrated program delivered through an education portal, designed to create a foundation of diverse and high-demand skill sets that directly correlate to what students will need in the workplace. The learning tracks address topics such as artificial intelligence, cybersecurity, blockchain, data science and quantum computing.
IBM's investment in HBCUs like Harris-Stowe State University is part of the company's dedicated work to promote social justice and racial equality by creating equitable, innovative experiences for HBCU students to acquire the necessary skills to help unlock economic opportunity and prosperity.
About IBM
IBM is a global leader in business transformation, serving clients in more than 170 countries around the world with open hybrid cloud and AI technology. For more information, please visit here.
About Harris-Stowe State University
Harris-Stowe State University (HSSU), located in midtown St. Louis, offers the most affordable bachelor's degree in the state of Missouri. The University is a fully accredited four-year institution with more than 50 majors, minors and certificate programs in education, business and arts and sciences. Harris-Stowe's mission is to provide outstanding educational opportunities for individuals seeking a rich and engaging academic experience. HSSU's programs are designed to nurture intellectual curiosity and build authentic skills that prepare students for leadership roles in a global society.
Source: IBM
Photonic processor heralds new computing era – The Engineer
Posted: at 5:53 am
A multinational team of researchers has developed a photonic processor that uses light instead of electronics and could help usher in a new dawn in computing.
Current computing relies on electrical current passed through circuitry on ever-smaller chips, but in recent years this technology has been bumping up against its physical limits.
To facilitate the next generation of computation-hungry technology such as artificial intelligence and autonomous vehicles, researchers have been searching for new methods to process and store data that circumvent those limits, and photonic processors are the obvious candidate.
Featuring scientists from the Universities of Oxford, Münster, Exeter and Pittsburgh, École Polytechnique Fédérale de Lausanne (EPFL) and IBM Research Europe, the team developed a new approach and processor architecture.
The photonic prototype essentially combines processing and data storage functionalities onto a single chip – so-called in-memory processing – but using light.
"Light-based processors for speeding up tasks in the field of machine learning enable complex mathematical tasks to be processed at high speeds and throughputs," said Münster University's Wolfram Pernice, one of the professors who led the research.
"This is much faster than conventional chips which rely on electronic data transfer, such as graphic cards or specialised hardware like TPUs [tensor processing units]."
Led by Pernice, the team combined integrated photonic devices with phase-change materials (PCMs) to deliver super-fast, energy-efficient matrix-vector (MV) multiplications. MV multiplications underpin much of modern computing – from AI to machine learning and neural network processing – and the imperative to carry out such calculations at ever-increasing speeds, but with lower energy consumption, is driving the development of a whole new class of processor chips, so-called tensor processing units (TPUs).
The team developed a new type of photonic TPU capable of carrying out multiple MV multiplications simultaneously and in parallel. This was facilitated by using a chip-based frequency comb as a light source, which enabled the team to use multiple wavelengths of light to do parallel calculations since light has the property of having different colours that do not interfere.
"Our study is the first to apply frequency combs in the field of artificial neural networks," said Tobias Kippenberg, professor at EPFL.
"The frequency comb provides a variety of optical wavelengths, which are processed independently of one another in the same photonic chip."
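To make that parallelism concrete, here is a minimal NumPy sketch of the idea (sizes and values are invented for illustration, not taken from the paper): each wavelength carries its own input vector through the same stored weight matrix, so all of the matrix-vector products amount to a single pass.

```python
import numpy as np

rng = np.random.default_rng(0)

# One shared weight matrix, standing in for values stored in the
# chip's phase-change material cells.
weights = rng.random((4, 8))             # 4 outputs, 8 inputs

# Each row is an input vector modulated onto one comb wavelength.
n_wavelengths = 16
inputs = rng.random((n_wavelengths, 8))

# Because the colours do not interfere, every wavelength's matrix-vector
# product happens simultaneously - numerically, one matrix-matrix product.
outputs = inputs @ weights.T             # shape (16, 4): one result per wavelength
print(outputs.shape)
```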
Described in Nature, the photonic processor is part of a new wave of light-based computing that could fundamentally reshape the digital world and prompt major advances in a range of areas, from AI and neural networks to medical diagnosis.
"Our results could have a wide range of applications," said Prof Harish Bhaskaran from the University of Oxford.
"A photonic TPU could quickly and efficiently process huge data sets used for medical diagnoses, such as those from CT, MRI and PET scanners."
Quantum computing is so last-decade. Get ready to invest in the final frontier… teleportation – MarketWatch
Posted: January 5, 2021 at 2:33 pm
If 2020 had you wishing you could say "Beam me up, Scotty," you're not alone. You may be one tiny step closer to getting your wish – in a few decades or so.
Scientists from Fermilab, Caltech, NASA's Jet Propulsion Laboratory and the University of Calgary achieved long-distance quantum teleportation in mid-2020, they confirmed in an academic journal article published last month. It's another step toward realizing what's often called quantum computing, and also toward understanding physics on a different level than we do now – perhaps well enough to someday teleport humans. And while there is no ETF specifically for that yet, here are some broad guidelines for thinking about how to invest in very nascent technologies.
For starters, it's good to understand the broad contours of the industry supporting the idea. A 2020 market research analysis estimates the quantum computing market will top $65 billion per year by 2030, while a 2019 BCG report makes the case for investing now, rather than waiting for things to take off. As MarketWatch reported in late 2019, quantum computing is expected to remake everything from pharmaceuticals to cybersecurity.
Right now, there are several blue-chip biggies involved in the quantum race. Scientists from AT&T were involved in the 2020 experiments, and big companies like Microsoft, Tencent and IBM all have initiatives.
It's easy enough to find exchange-traded funds with big holdings of those giants – likely easier than finding publicly-traded small companies on the bleeding edge of these technologies – but it's also important to remember how small a share of their revenues experimental ventures like these are.
There are still some good models for funds constructed around developing industries like this one, noted Todd Rosenbluth, head of mutual fund and ETF research at CFRA. One is the Procure Space ETF, which sports the ticker UFO. UFO launched before Virgin Galactic went public, at a moment when it was hard to call it a true pure-play space fund. As MarketWatch noted at the time, UFO is composed of companies involved in existing space-related business lines: ground equipment manufacturing that uses satellite systems, rocket and satellite manufacturing and operation, satellite-based telecommunications and broadcasting, and so on.
The one ETF that might now be said to be closest to offering access to quantum technology takes a similar approach. The Defiance Quantum ETF (ticker: QTUM) has quantum in its name, but says it provides exposure to companies "on the forefront of cloud computing, quantum computing, machine learning, and other transformative computing technologies."
Another consideration might be an ETF specializing in very early-stage technology. In December, MarketWatch profiled the Innovator Loup Frontier Technology ETF (LOUP). Rosenbluth has also been watching the Direxion Moonshot Innovators ETF (MOON).
"Disruptive technology themes have gotten a boost from one of the biggest success stories of 2020," he said in an interview. ARK Invest's fund lineup took in billions of dollars and enjoyed triple-digit gains as their bets on technology had a moment.
The next-gen narrative seems to resonate with investors, and complex themes like these make a good case for investing in actively-managed funds that benefit from researchers' expertise. That means that when it succeeds, "there's a snowball effect of investors coming to see the benefits of using ETFs for these kinds of themes," Rosenbluth said.
"I think the future is bright for these types of ETFs," Rosenbluth told MarketWatch. "There's less white space in the ETF world than there was before, but it's inevitable that there will be a teleportation-related ETF."
Major Quantum Computing Projects And Innovations Of 2020 – Analytics India Magazine
Posted: at 2:33 pm
Quantum computing has opened multiple doors to quick and accurate computation of complex problems – something traditional methods fail at. The pace of experimentation in quantum computing has naturally increased in recent years, and 2020 saw its share of breakthroughs, which lay the groundwork for future innovations. We list some of the significant quantum computing projects and experiments of 2020.
IT services company Atos devised Q-Score for measuring quantum performance. As per the company, this is the first universal quantum metric that applies to all programmable quantum processors. The company said that, in comparison to qubit count – the standard figure of merit for performance assessment – Q-Score provides explicit, reliable, objective and comparable results when solving real-world optimisation problems.
The Q-Score is calculated against three parameters: application-driven, ease of use, and objectiveness and reliability.
Google's AI Quantum team performed the largest chemical simulation to date on a quantum computer. Explaining the experiment in a paper titled "Hartree-Fock on a superconducting qubit quantum computer," the team said it used a variational quantum eigensolver (VQE) to simulate chemical mechanisms using quantum algorithms.
It was found that the calculations performed in this experiment were two times larger than the previous similar experiments and contained about ten times the number of quantum gate operations.
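For readers unfamiliar with VQE, the loop is: prepare a parameterized quantum state, measure the energy expectation value, and let a classical optimizer adjust the parameters. The toy below is a hypothetical single-qubit example in NumPy/SciPy – far smaller than Google's Hartree-Fock experiment, and purely a sketch of the structure:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Toy one-qubit Hamiltonian H = Z + 0.5*X (Pauli matrices); illustrative only.
Z = np.array([[1.0, 0.0], [0.0, -1.0]])
X = np.array([[0.0, 1.0], [1.0, 0.0]])
H = Z + 0.5 * X

def energy(theta: float) -> float:
    # Ansatz |psi(theta)> = Ry(theta)|0> = [cos(theta/2), sin(theta/2)]
    psi = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    return float(psi @ H @ psi)          # expectation value <psi|H|psi>

# The "classical outer loop": minimize the measured energy over theta.
result = minimize_scalar(energy, bounds=(0.0, 2 * np.pi), method="bounded")
print(f"VQE estimate: {result.fun:.4f}")
print(f"Exact ground energy: {np.linalg.eigvalsh(H)[0]:.4f}")
```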
The University of Sydney developed an algorithm for characterising noise in large-scale quantum computers. Noise is one of the major obstacles to building quantum computers; with this newly developed algorithm, the researchers have tried to tame it by reducing interference and instability.
The new method returns an estimate of the effective noise with relative precision. It can also detect all correlated errors, enabling the discovery of long-range two-qubit correlations in a 14-qubit device. By comparison, previous methods were infeasible for devices above 10 qubits.
The tool is highly scalable, and it has been tested successfully on the IBM Quantum Experience device. The team believes that with this, the efficiency of quantum computers in solving computing problems will be addressed.
Canadian quantum computing company D-Wave Systems announced the general availability of its next-generation quantum computing platform. This platform offers new hardware, software and tools for accelerating the delivery of quantum computing applications. It is now available in the Leap quantum cloud service and has additions such as the Advantage quantum system, with 5,000 qubits and 15-way qubit connectivity.
It also has an expanded solver service that can perform calculations with up to one million variables. With these capabilities, the platform is expected to assist businesses that are running real-time quantum applications for the first time.
Physicists at MIT reported evidence of Majorana fermions on the surface of gold. Majorana fermions are particles that are theoretically their own antiparticles, and this is the first time they have been observed on a metal as common as gold. Physicists believe the discovery could prove to be a breakthrough toward stable, error-free qubits for quantum computing.
Future innovation in this direction would build on the idea that pairs of Majorana fermions can form a qubit in such a way that if a noise error affects one of them, the other remains unaffected, preserving the integrity of the computation.
In December, Intel introduced Horse Ridge II, the second generation of its cryogenic control chip and a milestone toward developing scalable quantum computers. Building on its predecessor, Horse Ridge I, it supports a higher level of integration for quantum system control: it can read qubit states and control several gates simultaneously to entangle multiple qubits. One of its key features is qubit readout, the ability to read the current qubit state.
With this feature, Horse Ridge II allows for faster on-chip, low-latency qubit state detection. Its multigate pulsing helps control the potential of qubit gates, an ability that supports the scalability of quantum computers.
Quantum Computing Entwined with AI is Driving the Impossible to Possible – Analytics Insight
Posted: at 2:33 pm
Merging quantum computing with artificial intelligence (AI) has been high on the priority list for researchers and scientists. Even though quantum computing is still in the early phases of development, there have been many innovations and breakthroughs. However, it is still unclear whether the world will change for good or bad when AI is fully shaped by quantum computing.
Quantum computing is often introduced by analogy to traditional computing, which relies on bits – 0s and 1s – to encode information, and the volume of data keeps growing regardless of efforts to limit it. Moore's law observes that the number of transistors on integrated circuits doubles roughly every two years, setting up tech giants to race toward ever-smaller chips. This has also pushed tech companies to compete to launch the first viable quantum computer, which would be exponentially more powerful than today's computers. Such a machine would process all the data we generate and solve increasingly complex problems.
Remarkably, the use of quantum algorithms in artificial intelligence techniques is expected to boost machines' learning abilities, leading to improvements on an unprecedented scale. The main goal of the merger is to achieve a so-called quantum advantage, where complex algorithms can be calculated significantly faster than with the best classical computer. Experts and business leaders predict that quantum computing's processing power could begin to improve AI systems within about five years. Viewed from one angle, however, combining AI and quantum is frightening. The late physicist Stephen Hawking warned that "the development of full artificial intelligence could spell the end of the human race": once humans develop AI, it would take off on its own and redesign itself at an ever-increasing rate, while humans, limited by slow biological evolution, could not compete and would be superseded.
Can solve complex problems quickly
One of the major expectations people have of quantum computing is vastly increased computational power. It is predicted that quantum computers will complete calculations within seconds that would take today's machines thousands of years; Google claims it has a quantum computer 100 million times faster than any existing computer. That kind of speed could dispatch data problems in minutes, if not seconds. The key to the transition is converting existing data into "quantum language."
Enhance warfighter capabilities
Even though quantum computing is at an early stage, it is expected to significantly enhance warfighter capabilities in the future. It is predicted to impact ISR (intelligence, surveillance and reconnaissance) by solving logistics problems more quickly. While the types of problems and the general application space are known, optimisation problems will be among the first where advantages appear.
Applications in the banking sector
Malpractice and forgery are common in the banking and financial sector. Fortunately, combining AI with quantum computing might help improve fraud detection. Models trained using a quantum computer will be capable of detecting patterns that are hard to spot using conventional equipment, while faster algorithms will yield great advantages in the volume of information the machines can handle for this purpose.
Help integrate data from different datasets
Quantum computers are anticipated to excel at merging different datasets. Although this seems impossible without human intervention in the initial phase, computers will eventually learn to integrate data on their own. For instance, if different raw data sources come with unique schemas attached and a research team wants to compare them, a computer would have to understand the relationship between the schemas before the data could be compared.
All is not good though
In some ways, AI and quantum computing provoke as much worry as expectation. Quantum computing technology will be futuristic, but there is no assurance it will be human-friendly: it could far outstrip humans and displace people from their jobs. Quantum computing also poses a threat to security. The latest Thales Data Threat report says that 72% of surveyed security experts worldwide believe quantum computing will have a negative impact on data security within the next five years.
The future is here for blockchain and IIoT – Electronic Products & Technology
Posted: at 2:33 pm
In recent years, blockchain technology has created quite a buzz for its potential to disrupt entire industries. The industrial sector is certainly no exception, thanks to the many different potential applications for blockchain. If you're unfamiliar with this revolutionary new process, you're not alone. However, to maintain your business's competitive edge, adapting to it may be essential as it quickly moves from buzzword to industry standard.
To understand why it is important, we should quickly explore the roots of blockchain. The concept is not that new: it sprouted from a data structure known as a Merkle tree, named after Ralph Merkle, who patented the cryptographic process in 1979. His patent expired in 2002, and Merkle trees began the now-infamous evolution into blockchain technology soon after.
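As a refresher on the underlying structure: a Merkle tree hashes every record into a leaf, then repeatedly hashes pairs of nodes until a single root remains, so changing any record changes the root. A minimal sketch (SHA-256 is chosen here purely for illustration):

```python
import hashlib

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(records: list) -> bytes:
    # Leaves are the hashes of the records themselves.
    level = [sha256(r) for r in records]
    while len(level) > 1:
        if len(level) % 2:                       # duplicate last node if odd
            level.append(level[-1])
        # Each parent hashes the concatenation of its two children.
        level = [sha256(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

records = [b"record-1", b"record-2", b"record-3"]
print(merkle_root(records).hex())
records[1] = b"record-2-tampered"                # any edit changes the root
print(merkle_root(records).hex())
```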
You unknowingly interact with various types of Merkle trees daily; the entire internet utilizes them in one form or another. However, Merkle trees have glaring problems that the updated version – blockchain – can dramatically improve upon. This is especially true in the industrial sector, where information is often siloed, which can hinder a business's efforts to streamline its processes.
As the landscape of connectivity worldwide continues to grow, the Internet of Things (IoT) is growing exponentially. This is especially true in the industrial sector, as connectivity helps businesses streamline their procurement process and strengthen supply chains. Even though the IoT is still in its infancy, it has proven invaluable for industries.
However, the Fourth Industrial Revolution that IoT is ushering in has not arrived without significant new vulnerabilities and problems. To combat these vulnerabilities, blockchain utilizes a type of database that makes transactional information immutable, meaning the data cannot be changed once verified and committed to the ledger. It can therefore dramatically improve trust and security through a distributed, immutable ledger.
While blockchain is most often used for transactions, the kinds of information it can store are limitless. Because of this, blockchain is poised to become the next stage of security for IoT and IIoT, precisely because it can keep certain critical information siloed while allowing connectivity in the vital areas that streamline processes.
IIoT devices are highly vulnerable to attacks, primarily due to their subpar manufacturing standards, low processing power and meager storage capabilities. In general, IoT devices can fall victim to four broad types of attacks.
The threat of security breaches increases as IoT devices grow in popularity because conventional systems cannot handle the scale of complex information that IoT devices generate. However, blockchain can reduce damage from attacks or eliminate these threats.
The numerous benefits of blockchain often far outweigh the cost of implementation and integration. Blockchain attempts to eliminate hackers' ability to exploit the traditional single point of entry for attacks: each transaction is given a cryptographic, unforgeable signature, making the entire chain more robust and attack-resistant.
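The immutability claim rests on a simple mechanism worth seeing in miniature: each block stores a hash of its predecessor, so tampering with any historical entry breaks every later link. The sketch below shows only that chaining (real ledgers add signatures, timestamps and consensus on top):

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    # Hash the block's full contents, including the stored previous hash.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, data: str) -> None:
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"data": data, "prev_hash": prev})

def verify(chain: list) -> bool:
    # Every stored prev_hash must match a recomputed hash of the prior block.
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain = []
for reading in ["sensor A: 21.3 C", "sensor A: 21.5 C", "valve B: open"]:
    append_block(chain, reading)

print(verify(chain))                   # True
chain[0]["data"] = "sensor A: 99.9 C"  # tamper with history
print(verify(chain))                   # False
```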
Summed up, blockchain can deliver coveted advantages in the areas known as the Three Ds: Distribution, Data and Delivery. Some enterprises may be hesitant to implement blockchain in their processes due to a misunderstanding of the costs, but the return on investment can be realized in several ways.
Of course, even the best-laid plans often come with some drawbacks. While blockchain is relatively inexpensive to implement, the overhead involved can cut into its overall benefits. Because blockchain is decentralized, each node (a computer within the network) must maintain its own copy of the ledger.
This can lead to high energy usage, as these nodes must stay online and available at all times to confirm transactions. For example, one Bitcoin transaction can consume as much energy as 90% of an average American home's consumption for an entire day. Many enterprises are working on innovative ways to incorporate renewable energy to offset this tremendous consumption.
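As a rough sanity check on that comparison – assuming the commonly cited figure of about 10,700 kWh per year for an average US household, which is our assumption rather than a number from the article:

```python
ANNUAL_HOME_KWH = 10_700            # assumed average US household consumption
daily_home_kwh = ANNUAL_HOME_KWH / 365
per_tx_kwh = 0.90 * daily_home_kwh  # "90% of a home's daily consumption"
print(f"~{per_tx_kwh:.0f} kWh per Bitcoin transaction")  # roughly 26 kWh
```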
Proposed solutions include moving away from the proof-of-work validation method and replacing it with the proof-of-stake or proof-of-authority method. Others, like Red Belly Blockchain, aim to reduce energy consumption while simultaneously improving scalability and reducing transaction bottlenecks. Red Belly uses a process called verification sharding that can grow with demand.
Another notable solution for the scalability problem is the use of state channels, which allow users to interact with one another off-chain to reduce the gridlock that can happen during high-traffic times. Fasttoken has revolutionized the use of state channels and open-sourced their code on GitHub. Many other enterprises are rushing to implement this solution, as well.
As global governments and corporations compete for supremacy in quantum computing, many questions arise about the viability of our current systems – and even whether blockchain will be enough to keep IoT devices and information secure.
Current encryption methods – even blockchain's – are breakable. However, hacking them requires access to a significant amount of computing power, and most hacks on robust cybersecurity systems involve a large number of people working for long periods.
What makes quantum computing so problematic is that it drastically reduces the time and resources needed to break encryption. A quantum search algorithm known as Grover's algorithm can turn a 128-bit key into the equivalent of a 64-bit key; continuing along this line, a 64-bit key becomes a 32-bit key, and so on.
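The halving follows from Grover's quadratic speedup: searching a key space of N = 2^k candidates takes on the order of sqrt(N) = 2^(k/2) quantum queries, so a k-bit key offers only about k/2 bits of effective security. A quick illustration:

```python
import math

def effective_bits_vs_grover(key_bits: int) -> float:
    n = 2 ** key_bits                    # size of the key space
    quantum_queries = math.isqrt(n)      # Grover: ~sqrt(N) iterations
    return math.log2(quantum_queries)    # effective security in bits

for k in (64, 128, 256):
    print(f"{k}-bit key -> ~{effective_bits_vs_grover(k):.0f} effective bits")
# 128 -> 64 and 256 -> 128: hence the advice below to double key sizes.
```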
One defense against attack by a quantum computer is to double key sizes, so a 128-bit key becomes a 256-bit key. But as quantum computers become more robust, will doubling keys be our only line of defense? As one article from American Scientist points out, the U.S. National Institute of Standards and Technology is already evaluating 69 potential new methods for what it calls post-quantum cryptography, and it expects to have a draft standard by 2024.
IIoT has become an offshoot area of research due largely to the broad application possibilities of IoT in industry. As the world continues to embrace IoT and augment the physical world with the virtual world, vulnerabilities in these systems will consistently present new challenges. Using a proactive approach to introduce time-tested and innovative new technologies like blockchain will help address these vulnerabilities before they become much larger problems.
The Year Ahead: 3 Predictions From the ‘Father of the Internet’ Vint Cerf – Nextgov
Posted: at 2:33 pm
In 2011, the movie "Contagion" eerily predicted what a future world fighting a deadly pandemic would look like. In 2020, I, along with hundreds of thousands of people around the world, saw this Hollywood prediction play out by being diagnosed with COVID-19. It was a frightening year by any measure, as every person was impacted in unique ways.
Having been involved in the development of the Internet in the 1970s, I've seen first-hand the impact of technology on people's lives. We are now seeing another major milestone in our lifetime – the development of a COVID-19 vaccine.
What "Contagion" didn't show is what happens after a vaccine is developed. Now, as we enter 2021, with the first doses of a COVID-19 vaccine being administered, a return to normal feels within reach. But what will our return to normal really look like? Here are three predictions for 2021.
1. Continuous and episodic Internet of Medical Things monitoring devices will prove popular for remote medical diagnosis. The COVID-19 pandemic has dramatically changed the practice of clinical medicine, at least in the parts of the world where Internet access is widely available at high enough speeds to support video conferencing. A video consult is often the only choice open to patients, short of going to a hospital, when outpatient care is insufficient. Video-medicine is unsatisfying in the absence of good clinical data (temperature, blood pressure and pulse, for example). The consequence is that health monitoring and measurement devices are increasingly valued to support remote medical diagnosis.
My Prediction: While the COVID-19 pandemic persists into 2021, demand for remote monitoring and measurement will increase. In the long run, this will lead to periodic and continuous monitoring and alerting for a wide range of chronic medical conditions. Remote medicine and early warning health prediction will in turn help citizens save on health care costs and improve and further extend life expectancy.
2. Cities will (finally) adopt self-driving cars. Self-driving cars are anything but new, having emerged from a Defense Advanced Research Projects Agency Grand Challenge in 2004. Sixteen years later, many companies are competing to make this a reality, but skeptics of the technology remain.
My Prediction: In the COVID-19 aftermath, I predict driverless car service will grow in 2021, as people opt for rides that minimize exposure to drivers and that self-clean after every passenger. More cities and states will embrace driverless technology to accommodate changing transportation and public transportation preferences.
3. A practical quantum computation will be demonstrated. In 2019, Google reported that it had demonstrated an important quantum supremacy milestone by completing in minutes a computation that would have taken a conventional computer thousands of years. The computation, however, did not solve any particular practical problem.
My Prediction: In the intervening period, progress has been made, and it seems likely that in 2021 we will see some serious application of quantum computing to solve one or more optimization problems in mechanical design, logistics scheduling or resource allocation that would be impractical with conventional supercomputing.
Despite the challenges 2020 presented, it also unlocked opportunities, like leapfrogging in tech adoption. My hope is that the public sector sustains this speed of innovation and development to unlock even greater advancements in the year ahead.
Vinton G. Cerf is vice president and chief Internet evangelist for Google. Cerf has held positions at MCI, the Corporation for National Research Initiatives, Stanford University, UCLA and IBM. Vint Cerf served as chairman of the board of the Internet Corporation for Assigned Names and Numbers (ICANN) and was founding president of the Internet Society. He served on the U.S. National Science Board from 2013-2018.
Farewell 2020: Bleak, Yes. But a Lot of Good Happened Too – HPCwire
Posted: at 2:33 pm
Here on the cusp of the new year, the catchphrase "2020 hindsight" has a distinctly different feel. Good riddance, yes. But also proof of science's power to mobilize and do good when called upon. There's gratitude from those who came through less scathed and, maybe, more willingness to assist those who didn't.
Despite the unrelenting pandemic, high performance computing (HPC) proved itself an able member of the worldwide community of pandemic fighters. We should celebrate that, perhaps quietly, since the work isn't done. HPC made a significant difference in speeding up and enabling vastly distributed research and funneling the results to those who could turn them into patient care, epidemiology guidance and now vaccines. Remarkable, really. Necessary, of course, but it actually got done too. (Forget the quarreling; that's who we are.)
Across the Tabor family of publications, we've run more than 200 pandemic-related articles. I counted nearly 70 significant pieces in HPCwire. The early standing up of Fugaku at RIKEN – now comfortably astride the Top500 for a second time, and by a significant margin – to participate in COVID-19 research is a good metaphor for HPC's mobilization. Many people and organizations contributed to the HPC v. pandemic effort, and that continues.
Before spotlighting a few pandemic-related HPC activities and digging into a few other topics, let's do a speed-drive through the 2020 HPC/AI technology landscape.
Consolidation continued among chip players (Nvidia/Arm, AMD/Xilinx) while the AI chip newcomers (Cerebras, Habana (now Intel), SambaNova, Graphcore et al.) were winning deals. Nvidia's new A100 GPU is amazing, and virtually everyone else is taking potshots for just that reason. Suddenly RISC-V looks very promising. Systems makers weathered 2020's storm with varying success, while IBM seems to be winding down its HPC focus; it also plans to spin off its managed infrastructure services. Firing up Fugaku (notably a non-accelerated system) quickly was remarkable. The planned Frontier (ORNL) supercomputer now has the pole position in the U.S. exascale race, ahead of the delayed Aurora (ANL).
The worldwide quantum computing frenzy is in full froth as the U.S. looks for constructive ways to spend its roughly $1.25 billion (National Quantum Initiative) and, impressively, China just issued a demonstration of quantum supremacy. There's a quiet revolution going on in storage and memory (just ask VAST Data). Nvidia/Mellanox introduced its line of 400 Gbps network devices while Ethernet launched its 800 Gbps spec. HPC-in-the-cloud is now a thing, not a soon-to-be thing. AI is no longer an oddity but is quickly infusing throughout HPC. (That happened fast.)
Last but not least, hyperscalers demonstrably rule the IT roost. Chipmakers used to, consistently punching above their weight (sales volume). Not so much now.
Ok then. Apologies for the many important topics omitted (e.g. exascale and leadership systems, neuromorphic tech, software tools (can oneAPI flourish?), newer fabrics, optical interconnect, etc.).
Lets start.
I want to highlight two HPC pandemic-related efforts, one current and one early on, and also single out the efforts of Oliver Peckham, the HPCwire editor who leads our pandemic coverage, which began in earnest with articles on March 6 (Summit Joins the Fight Against the Coronavirus) and March 13 (Global Supercomputing Is Mobilizing Against COVID-19). Actually, the very first piece – Tech Conferences Are Being Canceled Due to Coronavirus, March 3 – was more about interrupted technology events, and we picked it up from our sister pub, Datanami, which ran it on March 2. We've since become a virtualized event world.
Here's an excerpt from the first Summit piece about modeling COVID-19's notorious spike:
Micholas Smith, a postdoctoral researcher at the University of Tennessee/ORNL Center for Molecular Biophysics (UT/ORNL CMB), used early studies and sequencing of the virus to build a virtual model of the spike protein. [A]fter being granted time on Summit through a discretionary allocation, Smith and his colleagues performed a series of molecular dynamics simulations on the protein, cycling through 8,000 compounds within a few days and analyzing how they bound to the spike protein, if at all.
"Using Summit, we ranked these compounds based on a set of criteria related to how likely they were to bind to the S-protein spike," Smith said in an interview with ORNL. In total, the team identified 77 candidate small-molecule compounds (such as medications) that they considered worthy of further experimentation, helping to narrow the field for medical researchers.
"It took us a day or two, whereas it would have taken months on a normal computer," said Jeremy Smith, director of UT/ORNL CMB and principal researcher for the study. "Our results don't mean that we have found a cure or treatment for the Wuhan coronavirus. We are very hopeful, though, that our computational findings will both inform future studies and provide a framework that experimentalists will use to further investigate these compounds. Only then will we know whether any of them exhibit the characteristics needed to mitigate this virus."
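The pattern in this story – simulate, score, rank, hand a shortlist to the lab – is standard virtual screening. A toy sketch of the ranking step (compound names and scores are invented; in the real study the criteria came from the molecular dynamics trajectories):

```python
import random

random.seed(0)
# Hypothetical docking scores: more negative = stronger predicted binding.
scores = {f"compound_{i:04d}": random.uniform(-12.0, 0.0) for i in range(8000)}

# Rank all 8,000 candidates and keep a shortlist for experimental follow-up.
shortlist = sorted(scores.items(), key=lambda kv: kv[1])[:77]
for name, score in shortlist[:5]:
    print(f"{name}: {score:.2f} (predicted binding score)")
```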
The flood (and diversity) of efforts that followed was startling. Oliver's advice on what to highlight catches the flavor of the challenge: "You could go with something like the Fugaku vs. COVID-19 piece or the grocery store piece, maybe contrast them a bit, earliest vs. current simulations of viral particle spread… or something like the LANL retrospective piece vs. the piece I just wrote up on their vaccine modeling. Think that might work for a 'how far we've come' angle, either way."
There's too much to cover.
Last week we ran Oliver's article on LANL efforts to optimize vaccine distribution (At Los Alamos National Lab, Supercomputers Are Optimizing Vaccine Distribution). Here's a brief excerpt:
The new vaccines from Pfizer and Moderna have been deemed highly effective by the FDA; unfortunately, doses are likely to be limited for some time. As a result, many state governments are struggling to weigh difficult choices: should the most exposed, like frontline workers, be vaccinated first? Or perhaps the most vulnerable, like the elderly and immunocompromised? And after them, who's next?
LANL was no stranger to this kind of analysis: earlier in the year, the lab had used supercomputer-powered tools like EpiCast to simulate virtual cities populated by individuals with demographic characteristics to model how COVID-19 would spread under different conditions. "The first thing we looked at was whether it made a difference to prioritize certain populations such as healthcare workers or to just distribute the vaccine randomly," said Sara Del Valle, the LANL computational epidemiologist who is leading the lab's COVID-19 modeling efforts. "We learned that prioritizing healthcare workers first was more effective in reducing the number of COVID cases and deaths."
You get the idea. The well of HPC efforts to tackle and stymie COVID-19 is extremely deep. Turning unproven mRNA technology into a vaccine in record time was awe-inspiring and required many disciplines. For those unfamiliar with the mRNA mechanism, here's a brief CDC explanation as it relates to the new vaccines. Below are links to a few HPCwire articles on the worldwide effort to bring HPC computational power to bear. (The last is a link to the HPCwire COVID-19 Archive, which has links to all our major pandemic coverage):
COVID COVERAGE LINKS
Global Supercomputing Is Mobilizing Against COVID-19 (March 12, 2020)
Gordon Bell Special Prize Goes to Massive SARS-CoV-2 Simulations (November 19, 2020)
Supercomputer Research Leads to Human Trial of Potential COVID-19 Therapeutic Raloxifene (October 29, 2020)
AMD's Massive COVID-19 HPC Fund Adds 18 Institutions, 5 Petaflops of Power (September 14, 2020)
Supercomputer-Powered Research Uncovers Signs of Bradykinin Storm That May Explain COVID-19 Symptoms (July 28, 2020)
Researchers Use Frontera to Investigate COVID-19's Insidious Sugar Coating (June 16, 2020)
COVID-19 HPC Consortium Expands to Europe, Reports on Research Projects (May 28, 2020)
At SC20, an Expert Panel Braces for the Next Pandemic (December, 17, 2020)
What's New in Computing vs. COVID-19: Cerebras, Nvidia, OpenMP & More (May 18, 2020)
Billion Molecules Against COVID-19 Challenge to Launch with Massive Supercomputing Support (April 22, 2020)
Pandemic Wipes Out 2020 HPC Market Growth, Flat to 12% Drop Expected (March 31, 2020)
Folding@home Turns Its Massive Crowdsourced Computer Network Against COVID-19 (March 16, 2020)
2020 HPCwire Awards Honor a Year of Remarkable COVID-19 Research (December, 23, 2020)
HPCWIRE COVID-19 COVERAGE ARCHIVE
Making sense of the processor world is challenging. Microprocessors are still the workhorses in mainstream computing, with Intel retaining its giant market share despite AMD's encroachment. That said, the rise of heterogeneous computing and blended AI/HPC requirements has shifted focus to accelerators. Nvidia's A100 GPU (54 billion transistors on 826 mm² of silicon, the world's largest seven-nanometer chip) was launched this spring. Then at SC20 Nvidia announced an enhanced version of the A100, doubling its memory to 80GB; it now delivers 2TB/s of bandwidth.
The A100's most significant advantage, says Rick Stevens, associate lab director at Argonne National Laboratory, is its multi-instance GPU (MIG) capability.
"For many people the problem is achieving high occupancy – that is, being able to fill the GPU up – because that depends on how much work you have to do. [By] introducing this MIG, this multi-instance stuff that they have, they're able to virtualize it. Most of the real-world performance wins are actually kind of throughput wins by using the virtualization. What we've seen is our big performance improvement is not that individual programs run much faster – it's that we can run up to seven parallel things on each GPU. When you add up the aggregate performance, you get these factors of three to five improvement over the V100," said Stevens.
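Stevens' range is consistent with simple throughput arithmetic: even if a single MIG slice delivered only a fraction of a V100, seven concurrent slices multiply the aggregate. A back-of-envelope sketch (the per-slice figure is our assumption, purely for illustration):

```python
slice_vs_v100 = 0.5   # assumed: one A100 MIG slice at half a V100's throughput
n_slices = 7          # the A100 supports up to seven MIG instances

aggregate_vs_v100 = n_slices * slice_vs_v100
print(f"Aggregate throughput vs one V100: {aggregate_vs_v100:.1f}x")  # 3.5x
# Landing inside the "factors of three to five" Stevens describes.
```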
Meanwhile, Intel's Xe GPU line is slowly trickling to market, mostly in card form. At SC20 Intel announced plans to make its high-performance discrete GPUs available to early-access developers. Notably, the new chips have been deployed at ANL and will serve as a transitional development vehicle for the future (2022) Aurora supercomputer, subbing in for the delayed Intel Xe-HPC (Ponte Vecchio) GPUs that are the computational backbone of the system.
AMD, also at SC20, launched its latest GPU, the MI100. AMD says it delivers 11.5 teraflops peak double-precision (FP64), 46.1 teraflops peak single-precision matrix (FP32), 23.1 teraflops peak single-precision (FP32), 184.6 teraflops peak half-precision (FP16) floating-point performance, and 92.3 peak teraflops of bfloat16 performance. HPCwire reported: "AMD's MI100 GPU presents a competitive alternative to Nvidia's A100 GPU, rated at 9.7 teraflops of peak theoretical performance. However, the A100 is returning even higher performance than that on its FP64 Linpack runs." It will be interesting to see the specs of the GPU AMD eventually fields for use in its exascale system wins.
The stakes are high in what could become a GPU war. Today, Nvidia is the market leader in HPC.
Turning back to CPUs, which many in HPC/AI have begun to regard as the lesser half of CPU/GPU pairings. Perhaps that will change with the spectacular showing of Fujitsu's A64FX at the heart of Fugaku. Nvidia's proposed acquisition of Arm – not a done deal yet, given regulatory concerns – would likely inject fresh energy into what was already a surging Arm push into the datacenter. Of course, Nvidia has jumped into the systems business with its DGX line and presumably wants a home-grown CPU. The big mover of the last couple of years, AMD's Epyc microprocessor line, continues its steady incursion into Intel x86 territory.
There's not been much discussion around Power10 beyond IBM's summer announcement that it would offer a ~3x performance gain and ~2.6x core efficiency gain over Power9. The new executive director of the OpenPOWER Foundation, James Kulina, says attracting more chipmakers to build Power devices is a top goal. We'll see. RISC-V is definitely drawing interest, but exactly how it fits into the processor puzzle is unclear. Esperanto unveiled a machine learning chip with 1,100 low-power cores based on the open-source RISC-V ISA, and reported a goal of 4,000 cores on a single device. Europe is betting on RISC-V. However, at least near-term, RISC-V variants are seen as specialized chips.
The CPU waters are murkier than ever.
Sort of off in a land of their own are the AI chip/system players. Their proliferation continues, with the early movers winning important deployments. Some observers think 2021 will start sifting the winners from the losers. Let's not forget that last year Intel stopped development of its newly-acquired Nervana line in favor of its even more newly-acquired Habana products. It's still a high-risk, high-reward arena.
PROCESSOR COVERAGE LINKS
Intel Xe-HP GPU Deployed for Aurora Exascale Development
Is the Nvidia A100 GPU Performance Worth a Hardware Upgrade?
LLNL, ANL and GSK Provide Early Glimpse into Cerebras AI System Performance
David Patterson Kicks Off AI Hardware Summit Championing Domain Specific Chips
Graphcore's IPU Tackles Particle Physics, Showcasing Its Potential for Early Adopters
Intel Debuts Cooper Lake Xeons for 4- and 8-Socket Platforms
Intel Launches Stratix 10 NX FPGAs Targeting AI Workloads
Nvidia's Ampere A100 GPU: Up to 2.5X the HPC, 20X the AI
AMD Launches Three New High-Frequency Epyc SKUs Aimed at Commercial HPC
IBM Debuts Power10; Touts New Memory Scheme, Security, and Inferencing
AMD's Road Ahead: 5nm Epyc, CPU-GPU Coupling, 20% CAGR
AI Newcomer SambaNova GAs Product Lineup and Offers New Service
Japan's AIST Benchmarks Intel Optane; Cites Benefit for HPC and AI
Storage and memory don't get the attention they deserve. 3D XPoint memory (Intel and Micron), declining flash costs and innovative software are transforming this technology segment. Hard disk drives and tape aren't going away, but traditional storage management approaches, such as tiering based on media type (speed/capacity/cost), are under attack. Newcomers WekaIO, VAST Data and MemVerge are all-in on solid state, and a few leading-edge adopters (NERSC/Perlmutter) are taking the plunge. Data-intensive computing, driven by the data flood and AI compute requirements (gotta keep those GPUs busy!), is a big driver.
"Our storage systems typically see over an exabyte of I/O annually. Balancing this I/O-intensive workload with the economics of storage means that at NERSC, we live and breathe tiering. And this is a snapshot of the storage hierarchy we have on the floor today at NERSC. Although it makes for a pretty picture, we don't have storage tiering because we want to, and in fact, I'd go so far as to say it's the opposite of what we and our users really want. Moving data between tiers has nothing to do with scientific discovery," said NERSC storage architect Glenn Lockwood during an SC20 panel.
"To put some numbers behind this, last year we did a study that found that between 15% and 30% of that exabyte of I/O is not coming from our users' jobs, but instead coming from data movement between storage tiers. That is to say that 15% to 30% of the I/O at NERSC is a complete waste of time in terms of advancing science. But even before that study, we knew that both the changing landscape of storage technology and the emerging large-scale data analysis and AI workloads arriving at NERSC required us to completely rethink our approach to tiered storage," said Lockwood.
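Those percentages translate into startling absolute volumes when applied to the exabyte figure from the first quote:

```python
annual_io_bytes = 1e18                  # "over an exabyte of I/O annually"
for frac in (0.15, 0.30):
    wasted_pb = annual_io_bytes * frac / 1e15
    print(f"{frac:.0%} tier traffic -> {wasted_pb:.0f} PB/year of non-science I/O")
```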
Not surprisingly, Intel and Micron (Optane/3D XPoint) are trying to accelerate the evolution. Micron released what it calls a heterogeneous-memory storage engine (HSE) designed for solid-state drives, memory-based storage and, ultimately, applications requiring persistent memory. "Legacy storage engines born in the era of hard disk drives have historically failed to architecturally provide for the increased performance and reduced latency of next-generation nonvolatile media," said the company. Again, we'll see.
Software-defined storage leveraging newer media has all the momentum at the moment, with the established players (IBM, DDN, Panasas, etc.) mixing those capabilities into their product sets. WekaIO and Intel have battled it out for the top IO500 spot over the last couple of years, and Intel's DAOS (distributed asynchronous object store) is slated for use in Aurora.
"The concept of asynchronous IO is very interesting," noted Ari Berman, CEO of research consultancy BioTeam. "It's essentially a queue mechanism at the system write level, so system waits in the processors don't have to happen while a confirmed write-back comes from the disks. Asynchronous IO allows jobs to keep running while you're waiting on storage to happen, to a limit of course. That would really improve the data input-output pipelines in those systems. It's a very interesting idea. I like asynchronous data writes and asynchronous storage access. I can see corruption very easily creeping into those types of things and data without very careful sequencing. It will be interesting to watch. If it works, it will be a big innovation."
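To make Berman's description concrete, here is a minimal sketch of that queueing idea in plain Python. It illustrates only the general pattern, assuming a simple bounded queue and a single background writer; it is not how DAOS or any particular product is implemented.

    import queue
    import threading

    # Compute code enqueues writes and continues immediately; a background
    # thread drains the queue and performs the slow, durable write. The
    # bounded queue provides the "to a limit, of course" back-pressure.
    write_queue = queue.Queue(maxsize=1024)

    def writer_thread():
        while True:
            path, data = write_queue.get()
            if path is None:              # sentinel value: shut down
                break
            with open(path, "ab") as f:   # the slow, durable write happens here
                f.write(data)
            write_queue.task_done()

    threading.Thread(target=writer_thread, daemon=True).start()

    def async_write(path, data):
        # Returns as soon as the record is queued; blocks only when the
        # queue is full, not while the storage device is busy.
        write_queue.put((path, data))

    for step in range(10):
        async_write("/tmp/results.log", f"step {step}\n".encode())
        # ... computation continues here without waiting on the disk ...

    write_queue.join()                    # drain outstanding writes at the end
    write_queue.put((None, None))

Real implementations add ordering guarantees, fsync semantics, and error reporting, which is exactly where the "very careful sequencing" Berman warns about comes in.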
Change is afoot and the storage technology community is adapting. Memory technology is also advancing.
Micron introduced a 176-layer 3D NAND flash memory around SC20 that it says improves read and write latencies by more than 35 percent. JEDEC published the DDR5 SDRAM spec, the next-generation standard for random access memory (RAM), in the summer. Compared to DDR4, the DDR5 spec will deliver twice the performance and improved power efficiency, addressing ever-growing demand from datacenter and cloud environments, as well as artificial intelligence and HPC applications. At launch, DDR5 modules will reach 4.8 Gbps, a 50 percent improvement over the previous generation. Density goes up four-fold, with maximum density increasing from 16 gigabits per die to 64 gigabits per die in the new spec. JEDEC representatives indicated there will be 8 Gb and 16 Gb DDR5 products at launch.
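As a rough sanity check on those launch figures (assuming the standard 64-bit module data width; DDR5 splits this into two 32-bit subchannels, but the total width is unchanged):

    # Peak module bandwidth = transfer rate (GT/s) x bus width (bytes).
    def module_bandwidth_gbs(transfer_rate_gts, bus_width_bits=64):
        return transfer_rate_gts * bus_width_bits / 8

    ddr4 = module_bandwidth_gbs(3.2)   # DDR4-3200: 25.6 GB/s
    ddr5 = module_bandwidth_gbs(4.8)   # DDR5-4800: 38.4 GB/s
    print(ddr5 / ddr4 - 1)             # 0.5, the quoted 50% improvement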
There are always the wildcards. IBM's memristive technology is moving closer to practical use. One outlier is DNA-based storage. Dave Turek, a longtime IBMer, joined DNA storage start-up Catalog this year and says Catalog is working on proofs of concept with government agencies and a number of Fortune 500 companies. "Some of these are who's-who HPC players, but some are non-HPC players, many names you would recognize. We're at what I would say is the beginning of the commercial beginning." Again, we'll see.
STORAGE & MEMORY LINKS
SC20 Panel: OK, You Hate Storage Tiering. What's Next Then?
Intel's Optane/DAOS Solution Tops Latest IO500
Startup MemVerge on Memory-centric Mission
HPC Strategist Dave Turek Joins DNA Storage (and Computing) Company Catalog
DDN-Tintri Showcases Technology Integration with Two New Products
Intel Refreshes Optane Persistent Memory, Adds New NAND SSDs
Micron Boosts Flash Density with 176-Layer 3D NAND
DDR5 Memory Spec Doubles Data Rate, Quadruples Density
IBM Touts STT MRAM Technology at IEDM 2020
The Distributed File Systems and Object Storage Landscape: Who's Leading?
It's tempting to omit quantum computing this year. Too much happened to summarize easily, and the overall feel is of steady, carry-on progress from 2019. There was, perhaps, a stronger pivot (at least by press release count) toward seeking early applications for near-term noisy intermediate-scale quantum (NISQ) computers. Ion trap qubit technology got another important player in Honeywell, which formally rolled out its effort and first system. Intel also stepped out from the shadows a bit in terms of showcasing its efforts. D-Wave launched a giant 5000-qubit machine (Advantage), again using a quantum annealing approach that's different from universal gate-based quantum systems. IBM announced a stretch goal of achieving one million qubits!
Calling quantum computing a market is probably premature, but money is being spent. The Quantum Economic Development Consortium (QED-C) and Hyperion Research issued a forecast projecting that the global quantum computing (QC) market, worth an estimated $320 million in 2020, will grow at a 27% CAGR between 2020 and 2024, reaching approximately $830 million. Chump change? Perhaps, but it is real activity.
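The arithmetic behind that projection is straightforward compounding:

    # QED-C/Hyperion forecast quoted above: $320M in 2020 at a 27% CAGR.
    market_2024 = 320 * 1.27 ** 4
    print(round(market_2024))  # 832, i.e. roughly the $830 million cited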
IBM's proposed Quantum Volume (QV) metric has drawn support as a broad benchmark of quantum computer performance. Honeywell promoted the QV of 128 achieved by its launch system, and in December IBM reported that it, too, had achieved a QV of 128. The first QV IBM reported was 16, announced at the APS March Meeting in 2019. Just what a QV of 128 means for practical usefulness is unclear, but it represents steady progress, and even Intel agrees that QV is as good a measure as any at the moment. DoE is also working on benchmarks, focusing a bit more on performance on given workloads.
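For readers puzzling over those figures: roughly speaking, IBM defines Quantum Volume as 2^n, where n is the largest number of qubits on which the machine reliably runs randomized model circuits of equal width and depth (passing a heavy-output statistical test). A few lines of Python decode the headline numbers; the mapping, not the code, is the point.

    import math

    # Quantum Volume compresses to a single circuit size: QV = 2**n means
    # the machine reliably ran width-n, depth-n randomized model circuits.
    for qv in (16, 32, 64, 128):
        n = int(math.log2(qv))
        print(f"QV {qv}: width-{n}, depth-{n} model circuits")

    # So the QV 128 systems from Honeywell and IBM correspond to reliable
    # 7-qubit, depth-7 circuits: steady progress, but a long way from the
    # million-qubit ambitions mentioned above.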
"[One] major component of benchmarking is asking what kind of resources does it take to run this or that interesting problem. Again, these are problems of interest to DoE, so basic science problems in chemistry and nuclear physics and things like that. What we'll do is take applications in chemistry and nuclear physics and convert them into what we consider a benchmark. We consider it a benchmark when we can distill a metric from it. So the metric could be the accuracy, the quality of the solution, or the resources required to get a given level of quality," said Raphael Pooser, PI for DoE's Quantum Testbed Pathfinder project at ORNL, during an HPCwire interview.
Next year seems likely to bring more benchmarking activity around system quality, qubit technology, and performance on specific problem sets. Several qubit technologies still vie for sway: superconducting, trapped ion, optical, quantum dots, cold atoms, et al. The need to operate at near-zero kelvin temperatures complicates everything. Google claimed to achieve quantum supremacy last year; this year a group of researchers in China did so as well. The two groups used different qubit technologies (superconducting vs. optical), and China's effort tried to skirt the criticisms that were lobbed at Google's effort. Frankly, both efforts were impressive. Russia reported early last year that it would invest $790 million in quantum, with achieving quantum supremacy as one goal.
What's happening now is a kind of pell-mell rush among a larger and increasingly diverse quantum ecosystem (hardware, software, consultants, governments, academia). Fault-tolerant quantum computing still seems distant, but clever algorithms and error mitigation strategies that make productive use of NISQ systems, likely on narrow applications, look more and more promising.
The persistent questions are when all of these efforts will pay off and whether they will be as game-changing as many believe. With new money flowing into quantum, one has the sense there will be few abrupt changes in the next couple of years, barring untoward economic turns.
QUANTUM COVERAGE LINKS
IBM's Quantum Race to One Million Qubits
Google's Quantum Chemistry Simulation Suggests Promising Path Forward
Intel Connects the (Quantum) Dots in Accelerating Quantum Computing Effort
D-Wave Delivers 5000-qubit System; Targets Quantum Advantage
Honeywell Debuts Quantum System, Subscription Business Model, and Glimpse of Roadmap
Global QC Market Projected to Grow to More Than $800 million by 2024
ORNL's Raphael Pooser on DoE's Quantum Testbed Project
Rigetti Computing Wins $8.6M DARPA Grant to Demonstrate Practical Quantum Computing
Braket: Amazon's Cloud-First Quantum Environment Is Generally Available
IBM-led Webinar Tackles Quantum Developer Community Needs
Microsoft's Azure Quantum Platform Now Offers Toshiba's Simulated Bifurcation Machine
As always, there's personnel shuffling. Lately, hyperscalers have been taking HPC folks. Two long-time Intel executives, Debra Goldfarb and Bill Magro, recently left for the cloud: Goldfarb to AWS as director for HPC products and strategy, and Magro to Google as CTO for HPC. Going in the other direction, John Martinis left Google's quantum development team and recently joined Australian start-up Silicon Quantum Computing. Ginni Rometty, of course, stepped down as CEO and chairman at IBM. IBM's long-time HPC exec Dave Turek left to take a position with DNA storage start-up Catalog, and last January, IBMer Brad McCredie joined AMD as corporate VP for GPU platforms.
Continue reading here:
Farewell 2020: Bleak, Yes. But a Lot of Good Happened Too - HPCwire
Posted in Quantum Computing
Comments Off on Farewell 2020: Bleak, Yes. But a Lot of Good Happened Too – HPCwire
Quantum Computing And Investing – ValueWalk
Posted: at 2:33 pm
At a conference on quantum computing and finance on December 10, 2020, William Zeng, head of quantum research at Goldman Sachs, told the audience that quantum computing could have a revolutionary impact on the bank, and on finance more broadly. In a similar vein, Marco Pistoia of JP Morgan stated that new quantum machines will boost profits by speeding up asset pricing models and digging up better-performing portfolios. While there is little dispute that quantum computing has great potential to perform certain mathematical calculations much more quickly, whether it can revolutionize investing by so doing is an altogether different matter.
The hope is that the immense power of quantum computers will allow investment managers to earn superior returns by uncovering patterns in prices and financial data that can be exploited. The dark side is that quantum computers will open the door to finding patterns that either do not actually exist or, if they did exist at one time, no longer do. In more technical terms, quantum computing may allow for a new level of unwarranted data mining and lead to further confusion regarding the role of nonstationarity.
Any actual sequence of numbers, even one generated by a random process, will have certain statistical quirks. Physicist Richard Feynman used to make this point with reference to the first 767 digits of Pi, replicated below. Allegedly (but unconfirmed), he liked to reel off the first 761 digits and then say "9-9-9-9-9 and so on."[1] If you look only at the first 767 digits, the run of six straight nines is clearly an anomaly, a potential investment opportunity. In fact, there is no discernible pattern in the digits of Pi; Feynman was purposely making fun of data mining by stopping at the first 767 digits.
3 .1 4 1 5 9 2 6 5 3 5 8 9 7 9 3 2 3 8 4 6 2 6 4 3 3 8 3 2 7 9 5 0 2 8 8 4 1 9 7 1 6 9 3 9 9 3 7 5 1 0 5 8 2 0 9 7 4 9 4 4 5 9 2 3 0 7 8 1 6 4 0 6 2 8 6 2 0 8 9 9 8 6 2 8 0 3 4 8 2 5 3 4 2 1 1 7 0 6 7 9 8 2 1 4 8 0 8 6 5 1 3 2 8 2 3 0 6 6 4 7 0 9 3 8 4 4 6 0 9 5 5 0 5 8 2 2 3 1 7 2 5 3 5 9 4 0 8 1 2 8 4 8 1 1 1 7 4 5 0 2 8 4 1 0 2 7 0 1 9 3 8 5 2 1 1 0 5 5 5 9 6 4 4 6 2 2 9 4 8 9 5 4 9 3 0 3 8 1 9 6 4 4 2 8 8 1 0 9 7 5 6 6 5 9 3 3 4 4 6 1 2 8 4 7 5 6 4 8 2 3 3 7 8 6 7 8 3 1 6 5 2 7 1 2 0 1 9 0 9 1 4 5 6 4 8 5 6 6 9 2 3 4 6 0 3 4 8 6 1 0 4 5 4 3 2 6 6 4 8 2 1 3 3 9 3 6 0 7 2 6 0 2 4 9 1 4 1 2 7 3 7 2 4 5 8 7 0 0 6 6 0 6 3 1 5 5 8 8 1 7 4 8 8 1 5 2 0 9 2 0 9 6 2 8 2 9 2 5 4 0 9 1 7 1 5 3 6 4 3 6 7 8 9 2 5 9 0 3 6 0 0 1 1 3 3 0 5 3 0 5 4 8 8 2 0 4 6 6 5 2 1 3 8 4 1 4 6 9 5 1 9 4 1 5 1 1 6 0 9 4 3 3 0 5 7 2 7 0 3 6 5 7 5 9 5 9 1 9 5 3 0 9 2 1 8 6 1 1 7 3 8 1 9 3 2 6 1 1 7 9 3 1 0 5 1 1 8 5 4 8 0 7 4 4 6 2 3 7 9 9 6 2 7 4 9 5 6 7 3 5 1 8 8 5 7 5 2 7 2 4 8 9 1 2 2 7 9 3 8 1 8 3 0 1 1 9 4 9 1 2 9 8 3 3 6 7 3 3 6 2 4 4 0 6 5 6 6 4 3 0 8 6 0 2 1 3 9 4 9 4 6 3 9 5 2 2 4 7 3 7 1 9 0 7 0 2 1 7 9 8 6 0 9 4 3 7 0 2 7 7 0 5 3 9 2 1 7 1 7 6 2 9 3 1 7 6 7 5 2 3 8 4 6 7 4 8 1 8 4 6 7 6 6 9 4 0 5 1 3 2 0 0 0 5 6 8 1 2 7 1 4 5 2 6 3 5 6 0 8 2 7 7 8 5 7 7 1 3 4 2 7 5 7 7 8 9 6 0 9 1 7 3 6 3 7 1 7 8 7 2 1 4 6 8 4 4 0 9 0 1 2 2 4 9 5 3 4 3 0 1 4 6 5 4 9 5 8 5 3 7 1 0 5 0 7 9 2 2 7 9 6 8 9 2 5 8 9 2 3 5 4 2 0 1 9 9 5 6 1 1 2 1 2 9 0 2 1 9 6 0 8 6 4 0 3 4 4 1 8 1 5 9 8 1 3 6 2 9 7 7 4 7 7 1 3 0 9 9 6 0 5 1 8 7 0 7 2 1 1 3 4 9 9 9 9 9 9
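Anyone can reproduce Feynman's joke computationally. A short Python check using the arbitrary-precision mpmath library (an assumed dependency; any source of Pi's digits would do) locates the run of nines:

    from mpmath import mp

    mp.dps = 800                    # compute Pi to 800 significant digits
    decimals = str(mp.pi)[2:]       # decimal digits after the "3."

    pos = decimals.find("999999")   # first run of six consecutive 9s
    print(pos + 1)                  # 762: the start of the "Feynman point"
    print(decimals[pos:pos + 6])    # 999999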
When it comes to investing, there is only one sequence of historical returns. With sufficient computing power and repeated torturing of the data, anomalies are certain to be detected. A good example is factor investing. The publication of a highly influential paper by Professors Eugene Fama and Kenneth French identifying three systematic investment factors started an industry focused on searching for additional factors. Research by Arnott, Harvey, Kalesnik and Linnainmaa reports that by year-end 2018 an implausible 400 significant factors had been discovered. One wonders how many such anomalies quantum computers might find.
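A toy simulation shows how easily such "factors" emerge from pure noise. This is an illustration of the multiple-testing problem, not a description of any fund's methodology:

    import numpy as np

    # Test 400 made-up "factors" (pure noise) against 60 months of random
    # returns. At a 5% significance threshold, roughly 20 of them will
    # look "significant" by luck alone.
    rng = np.random.default_rng(0)
    months, n_factors = 60, 400

    returns = rng.normal(0.0, 0.04, months)              # fake market returns
    factors = rng.normal(0.0, 1.0, (n_factors, months))  # 400 random signals

    significant = 0
    for f in factors:
        r = np.corrcoef(f, returns)[0, 1]
        t = r * np.sqrt((months - 2) / (1 - r * r))      # t-stat of correlation
        if abs(t) > 2.0:                                 # ~5% two-sided cutoff
            significant += 1

    print(significant)  # on the order of 20 spurious "factors" found in noise

Scale that up to thousands of candidate signals and far more computing power, and 400 published factors start to look less like discovery and more like arithmetic.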
Factor investing is just one example among many. Richard Roll, a leading academic financial economist with in-depth knowledge of the anomalies literature, has also been an active financial manager. Based on his experience, Roll stated that his money management firms attempted to make money from numerous anomalies widely documented in the academic literature but failed to make a nickel.
The simple fact is that if you have machines that can look closely enough at any historical data set, they will find anomalies. For instance, what about the anomalous sequence 0123456789 in the expansion of Pi? That anomaly can be found beginning at digit 17,387,594,880.
The digits of Pi may be random, but they are stationary: the process that generates the first million digits is the same one that generates the million digits beginning at one trillion. The same is not true of investing. Consider, for example, giving a computer the sequence of daily returns on Apple stock from the day the company went public to the present. The computer could sift through the returns looking for patterns, but this is almost certainly a fruitless endeavor. The company that generated those returns is far from stationary. In 1978, Apple was run by two young entrepreneurs and had total revenues of $0.0078 billion. By 2019, the company was run by a large, experienced management team and had revenues of $274 billion, an increase of about 35,000 times. The statistical process generating those returns is almost certainly nonstationary due to fundamental changes in the company generating them. To a lesser extent, the same is true of nearly every listed company. The market is constantly in flux, and companies are constantly evolving as consumer demands, government regulation, and technology, among other things, continually change. It is hard to imagine that past patterns in stock prices, even ones that were more than data mining, would persist for long in the face of such nonstationarity.
In the finance arena, computers and artificial intelligence work by using their massive data processing power to find patterns that humans may miss. But in a nonstationary world, the ultimate financial risk is that, by the time those patterns are identified, they will be gone. As a result, computerized trading comes to resemble a dog chasing its tail, leading to excessive trading and ever-rising costs without delivering superior results on average. Quantum computing risks simply adding fuel to that fire. Of course, there are individual cases where specific quant funds make highly impressive returns, but that too could be an example of data mining: given the large number of firms in the money management business, the probability that a few do extraordinarily well is essentially one.
These criticisms are not meant to imply that quantum computing has no role to play in finance. For instance, it has great potential to improve the simulation analyses involved in assessing risk. The point here is that it will not be a holy grail for improving investment performance.
Despite the drawbacks associated with data mining and nonstationarity, there is one area in which the potential for quantum computing is particularly bright: marketing quantitative investment strategies. Selling quantitative investment has always been an art. It involves convincing people that the investment manager knows something that will make them money, but which is too complicated to explain to them and, in some cases, too complicated for the manager to understand. Quantum computing takes that sales pitch to a whole new level, because virtually no one will be able to understand how the machine decided that a particular investment strategy is attractive.
This skeptic's take is that quantum computing will have little impact on what is ultimately the source of successful investing: allocating capital to companies that have particularly bright prospects for developing profitable businesses in a highly uncertain and nonstationary world. Perhaps at some future date a computer will develop the business judgment to determine whether Tesla's business prospects justify its current stock price. Until then, being able to comb through historical data in search of obscure patterns at ever-increasing rates is more likely to produce profits through the generation of management fees than through the enhancement of investor returns.
[1] The Feynman story has been repeated so often that the sequence of 9s starting at digit 762 is now referred to as the Feynman point in the expansion of Pi.
See the article here:
Posted in Quantum Computing
Comments Off on Quantum Computing And Investing – ValueWalk