New MIUI 12 Super Live Wallpapers reveal remarkable animations of the Earth and Mars – Market Research News

Xiaomi has introduced its new MIUI 12 interface, which features many new additions and enhancements for a far more enjoyable user experience. Alongside the new features in the latest update, there are also new MIUI 12 live wallpapers, and they are remarkable, to say the least.

The wallpapers depict bird's-eye views of the Earth and Mars and have been aptly named the Earth Live wallpaper and the Mars Live wallpaper, respectively. Needless to say, the respective wallpapers show images of the Earth and Mars and are simply stunning.

All you need to do is install the two live wallpaper APK files on your phone. After that, set them as your lock screen and home screen wallpapers and, voila, your phone gets a striking makeover.

Then there is also the MIUI 12 Super Wallpaper which, like the other two wallpapers, is meant for use on Xiaomi devices. That said, it should work just as well on any Android phone. One of the most striking aspects of the MIUI 12 Super Wallpaper is that it comes with some impressive animations.

As Xiaomi explained, when a device with the MIUI 12 Super Wallpaper installed is unlocked, the Earth or Mars view shifts to reveal a landscape scene. All of this, viewed on an AMOLED display, looks truly captivating. Xiaomi also said the live wallpapers would improve battery life on AMOLED devices.

To apply the MIUI 12 Live Wallpaper on a Xiaomi device, install it and head to the Gallery section. There, select the live wallpaper and tap the three dots. In the menu that opens, choose Set Video Wallpaper. On any other Android device, you will need to download and install both the live wallpaper and the Google Wallpaper app from the Play Store. Next, launch the Google Wallpaper app and select any of the live wallpapers you just installed. Simply tap on it and you are done.

To apply the MIUI 12 Super Live Wallpaper, download and install the MIUI 12 Super Wallpaper APK, then download and install the Mi Wallpaper app. Next, download and install the Activity Launcher app from the Play Store. After that, select Super Wallpaper and choose the one you want to activate.
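For readers comfortable with sideloading, the same APKs can also be installed over USB from a computer. The Python sketch below simply shells out to adb; it assumes adb is on your PATH, USB debugging is enabled on the phone, and the APK filenames (placeholders here) match whatever you actually downloaded.

```python
# Minimal sketch: sideloading the downloaded wallpaper APKs with adb.
# Assumes adb is installed and the phone has USB debugging enabled.
import subprocess

# Hypothetical filenames; substitute the APKs you actually downloaded.
APKS = [
    "miui12_earth_live_wallpaper.apk",
    "miui12_mars_live_wallpaper.apk",
]

for apk in APKS:
    # "-r" reinstalls over an existing copy instead of failing.
    result = subprocess.run(
        ["adb", "install", "-r", apk],
        capture_output=True, text=True,
    )
    print(apk, "->", (result.stdout or result.stderr).strip())
```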

Visit link:

New MIUI 12 Super Live Wallpapers reveal remarkable animations of the Earth and Mars - Market Research News

Flood Risks for northern NZ this long weekend (+4 Day Rainfall Accumulation Map) – WeatherWatch.co.nz

It's not often we blame a high pressure system for the chance of flooding, but this weekend that may be the set up.

High pressure currently over the South Island will expand east of New Zealand this weekend, pulling down a very sub-tropical, moisture-rich airflow which will, in turn, be slow moving due to the high. "It's a bit like this rain band is building up behind a slow moving truck that it can't overtake," says WeatherWatch.co.nz head forecaster Philip Duncan. "Like slow traffic, the rain bands will get longer behind this slow moving high, ensuring some regions get saturated."

Following on from a drought, heavy rain can increase the risks of slips and flooding.

WeatherWatch.co.nz, using IBM Watson (the most powerful weather supercomputer used in New Zealand), estimates 125 to 150mm is possible in the ranges of the Coromandel Peninsula. "If the rain bands take any longer to move through, these numbers could actually increase," says Mr Duncan.

This narrow but heavy sub-tropical flow will feed into eastern Northland, northern Auckland, eastern Coromandel Peninsula and western Bay of Plenty, before spreading out east across the rest of Bay of Plenty, East Cape, Gisborne and maybe even Hawke's Bay.

Around the outer edges of this system the rain may be more patchy with long dry spells, drizzly areas and even the odd sunny spell.

Some good rain is expected in the Auckland water catchment dams.

To drill down locally, use our hourly and daily rainfall totals and risks in your local WeatherWatch.co.nz forecast or at RuralWeather.co.nz.

With the exception of the extreme rainfall numbers in some places, this rain event is precisely what farmers and growers in the dry north and east have been wanting for months. It's worth noting, though, that not everyone will get a soaking: with the bulk of the rain hugging the eastern coastline from Whangarei to Napier, those further west will see much lower totals.

Rain won't penetrate south of Banks Peninsula on the eastern side of the South Island, or beyond about Greymouth on the western side.

Visit link:

Flood Risks for northern NZ this long weekend (+4 Day Rainfall Accumulation Map) - WeatherWatch.co.nz

Virtual ICM Seminar: 'The Promises of the One Health Concept in the Age of Anthropocene' – HPCwire

May 27, 2020 The Interdisciplinary Centre for Mathematical and Computational Modelling (ICM) at the University of Warsaw invites HPC enthusiasts and anyone interested in challenging topics in computer and computational science to the next ICM Seminar in Computer and Computational Science, to be held on May 28, 2020 (16:00 CEST). The event is free.

On May 28, 2020, Dr. Aneta Afelt of the Interdisciplinary Centre for Mathematical and Computational Modelling at the University of Warsaw and Espace-DEV, IRD Institut de Recherche pour le Développement, will present a lecture titled "The Promises of the One Health Concept in the Age of Anthropocene."

The lecture will dive into the One Health concept. In May 2019 an article was published, "Anthropocene now: influential panel votes to recognize Earth's new epoch," situating in the stratigraphy of Earth's history a new geological epoch: the domination of human influence in shaping the Earth's environment. When humans are the central figure in an ecological niche, the result is massive subordination and transformation of the environment for their needs. Unfortunately, the outcome of such actions is a plundering of natural resources. The consequences are socially unexpected: a global epidemiological crisis. The current COVID-19 pandemic is an excellent example. It seems that one of the most important questions of the Anthropocene era is how to maintain stable epidemiological conditions now and in the future. The One Health concept proposes a new paradigm: a deep look at the sources of humanity's well-being, namely humanity's relationship with the environment. Humanity's health status is interdependent with the well-being of the environment. It is clear that socio-ecological niche disturbance results in the spread of pathogens. Can sustainable development of socio-ecological niches help? The lecture dives into the results!

To register, visit https://supercomputingfrontiers.eu/2020/tickets/neijis7eekieshee/

ICM Seminars is an extension of the international Supercomputing Frontiers Europe conference, which took place March 23-25 in virtual space.

"The digital edition of SCFE gathered on the order of 1,000 participants. We want to continue this formula of Open Science meetings despite the pandemic and use this forum to present the results of the most current research in the areas of HPC, AI, quantum computing, Big Data, IoT, computer and data networks and many others," says Dr. Marek Michalewicz, chair of the Organising Committee, SCFE2020 and ICM Seminars in Computer and Computational Science.

Registration for all weekly events is free. The ICM Seminars began with an inaugural lecture on April 1st by Scott Aaronson, David J. Bruton Centennial Professor of Computer Science at the University of Texas. Aaronson gave a presentation titled "Quantum Computational Supremacy and Its Applications."

For more information, visit https://supercomputingfrontiers.eu/2020/seminars/

About the Interdisciplinary Centre for Mathematical and Computational Modelling (ICM), University of Warsaw (UW)

Established by a resolution of the Senate of the University of Warsaw dated 29 June 1993, the Interdisciplinary Centre for Mathematical and Computational Modelling (ICM), University of Warsaw, is one of the top HPC centres in Poland. ICM is engaged in serving the needs of a large community of computational researchers in Poland through provision of HPC and grid resources, storage, networking and expertise. It has always been an active research centre with high quality research contributions in computer and computational science, numerical weather prediction, visualisation, materials engineering, digital repositories, social network analysis and other areas.

Source: ICM UW

Read the original post:

Virtual ICM Seminar: 'The Promises of the One Health Concept in the Age of Anthropocene' - HPCwire

Supercomputers hacked across Europe to mine cryptocurrency – ZDNet

Multiple supercomputers across Europe have been infected this week with cryptocurrency mining malware and have shut down to investigate the intrusions.

Security incidents have been reported in the UK, Germany, and Switzerland, while a similar intrusion is rumored to have also happened at a high-performance computing center located in Spain.

The first report of an attack came to light on Monday from the University of Edinburgh, which runs the ARCHER supercomputer. The organization reported "security exploitation on the ARCHER login nodes," shut down the ARCHER system to investigate, and reset SSH passwords to prevent further intrusions.

The bwHPC, the organization that coordinates research projects across supercomputers in the state of Baden-Württemberg, Germany, also announced on Monday that five of its high-performance computing clusters had to be shut down due to similar "security incidents."

Reports continued on Wednesday when security researcher Felix von Leitner claimed in a blog post that a supercomputer housed in Barcelona, Spain, was also impacted by a security issue and had been shut down as a result.

More incidents surfaced the next day, on Thursday. The first one came from the Leibniz Computing Center (LRZ), an institute under the Bavarian Academy of Sciences, which said it had disconnected a computing cluster from the internet following a security breach.

The LRZ announcement was followed later in the day by another from the Jülich Research Center in the town of Jülich, Germany. Officials said they had to shut down the JURECA, JUDAC, and JUWELS supercomputers following an "IT security incident." So did the Technical University of Dresden, which announced it had to shut down its Taurus supercomputer as well.

New incidents also came to light today, on Saturday. German scientist Robert Helling published an analysis on the malware that infected a high-performance computing cluster at the Faculty of Physics at the Ludwig-Maximilians University in Munich, Germany.

The Swiss National Supercomputing Centre (CSCS) in Switzerland also shut down external access to its supercomputer infrastructure following a "cyber-incident" and "until having restored a safe environment."

None of the organizations above published any details about the intrusions. However, earlier today, the Computer Security Incident Response Team (CSIRT) for the European Grid Infrastructure (EGI), a pan-European organization that coordinates research on supercomputers across Europe, released malware samples and network compromise indicators from some of these incidents.
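For cluster administrators who want to screen their own systems, published file-hash indicators like these can be checked with a short script. The Python below is a minimal sketch, not an official tool: the hash is a placeholder for the values EGI CSIRT released, and the directories scanned are an assumption based on where attackers commonly stage payloads.

```python
# Minimal sketch: compare files in common staging directories against
# published SHA-256 indicators of compromise (IOCs).
import hashlib
from pathlib import Path

# Placeholder value; substitute the hashes published by EGI CSIRT.
KNOWN_BAD_SHA256 = {"0" * 64}

def sha256_of(path: Path) -> str:
    # Stream in 1 MB chunks so large binaries don't fill memory.
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

for root in ("/tmp", "/var/tmp"):  # common payload staging spots
    for path in Path(root).rglob("*"):
        try:
            if path.is_file() and sha256_of(path) in KNOWN_BAD_SHA256:
                print("IOC match:", path)
        except OSError:
            continue  # unreadable file; skip it
```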

The malware samples were reviewed earlier today by Cado Security, a UK-based cyber-security firm. The company said the attackers appear to have gained access to the supercomputer clusters via compromised SSH credentials.

The credentials appear to have been stolen from university members given access to the supercomputers to run computing jobs. The hijacked SSH logins belonged to universities in Canada, China, and Poland.

Chris Doman, Co-Founder of Cado Security, told ZDNet today that while there is no official evidence to confirm that all the intrusions have been carried out by the same group, evidence like similar malware file names and network indicators suggests this might be the same threat actor.

According to Doman's analysis, once attackers gained access to a supercomputing node, they appear to have used an exploit for the CVE-2019-15666 vulnerability to gain root access and then deployed an application that mined the Monero (XMR) cryptocurrency.

Making matters worse, many of the organizations that had supercomputers go down this week had announced in previous weeks that they were prioritizing research on the COVID-19 outbreak, which has now most likely been hampered as a result of the intrusion and subsequent downtime.

These incidents aren't the first time that crypto-mining malware has been installed on a supercomputer. However, this marks the first time outside intruders appear to have done it. In previous incidents, it was usually an employee who installed the cryptocurrency miner for their own personal gain.

For example, in February 2018, Russian authorities arrested engineers from the Russian Nuclear Center for using the agency's supercomputer to mine cryptocurrency.

A month later, Australian officials began an investigation into a similar case at the Bureau of Meteorology, where employees used the agency's supercomputer to mine cryptocurrency.

More here:

Supercomputers hacked across Europe to mine cryptocurrency - ZDNet

Artificial Intelligence Equipped Supercomputer Mining for COVID-19 Connections in 18 Million Research Documents – SciTechDaily

By DOE/Oak Ridge National LaboratoryMay 19, 2020

Using ORNL's Summit supercomputer, scientists can comb through millions of medical journal articles looking for possible connections among FDA-approved drug therapies and known COVID-19 symptoms. Credit: Dasha Herrmannova/Oak Ridge National Laboratory, U.S. Dept. of Energy

Scientists have tapped the immense power of the Summit supercomputer at Oak Ridge National Laboratory to comb through millions of medical journal articles to identify potential vaccines, drugs, and effective measures that could suppress or stop the spread of COVID-19.

A team comprising researchers from ORNL and Georgia Tech is using artificial intelligence methods designed to unearth relevant information from about 18 million available research documents. They looked for connections among 84 billion concepts and cross-referenced keywords associated with COVID-19, such as "high fever," "dry cough," and "shortness of breath," with existing medical solutions.
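The article does not describe ORNL's actual pipeline, but the underlying idea of cross-referencing symptom keywords with candidate treatments can be sketched in a few lines. Everything in the Python below, including the keyword lists and the two-document corpus, is purely illustrative.

```python
# Illustrative sketch only (not ORNL's method): rank documents by
# co-occurrence of COVID-19 symptom keywords and candidate drug names.
SYMPTOMS = {"high fever", "dry cough", "shortness of breath"}
DRUGS = {"remdesivir", "dexamethasone"}  # hypothetical candidates

corpus = [
    "Patients with dry cough and high fever improved on remdesivir.",
    "Review of antihistamine therapies for seasonal allergies.",
]

def connection_score(doc: str) -> int:
    text = doc.lower()
    symptom_hits = sum(s in text for s in SYMPTOMS)
    drug_hits = sum(d in text for d in DRUGS)
    # A naive score: a document only counts if it mentions both.
    return symptom_hits * drug_hits

for doc in sorted(corpus, key=connection_score, reverse=True):
    print(connection_score(doc), doc)
```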

"Our goal is to assist doctors' and researchers' ability to identify information about drug therapies that are already approved by the U.S. Food and Drug Administration," said ORNL's Ramakrishnan "Ramki" Kannan.

A massive subset of 6 million documents dated between 2010 and 2015 took 80 minutes to process, and the entire 18 million will take less than a day to run on Summit. Results will be shared with medical researchers for feedback, which will inform adjustments to improve future calculations.

Read more here:

Artificial Intelligence Equipped Supercomputer Mining for COVID-19 Connections in 18 Million Research Documents - SciTechDaily

Educating the Next Generation of Supercomputer Users with Blue Waters – HPCwire

May 18, 2020 Groundbreaking research, like everything, requires a careful concoction of resources. First, it requires a researcher with advanced expertise in the given subject matter. Next, you must have access to tools and computing systems powerful enough to take an idea and turn it into tangible analysis. Beyond that, the ability to use those resources effectively is vital to producing results.

An equally important but often overlooked component of groundbreaking research, however, is the building of the educational pipeline: a workforce trained in the relevant and necessary skills to advance and sustain the research being conducted. This pipeline is crucial to guaranteeing that high-level research will continue to enhance discovery and competitiveness.

Through a Blue Waters internship program allocation for her student John McGarigal, Dr. Tulin Kaman, an Assistant Professor in Mathematical Sciences at the University of Arkansas, sought to do just that. By participating in the Blue Waters Internship Program, her students not only gain experience studying turbulent flows in fusion, but are also able to incorporate one of the world's most powerful advanced computing systems into their education.

"We do numerical simulations of realistic applications, which typically occur in supernova explosions. This occurs in a type of fusion where the confinement is inertial, gravitational and magnetic," said Kaman.

In order to perform these massive simulations, however, a large-scale computing system is necessary.

"The mathematical model and numerical simulations are extremely computationally intensive, and Blue Waters was a great resource for us to run these simulations on."

Beyond the computational power of Blue Waters, however, the educational component of the system was perhaps the most exciting aspect for Kaman. The Blue Waters Internship Program gave her an opportunity to bring her undergraduate students, including John McGarigal, directly into the supercomputing research ecosystem, allowing them vital hands-on experience in simulating turbulent flows.

"Every year, Blue Waters chose interns, and I had two University of Arkansas undergrad students that would be a good fit," said Kaman. "My goal was to motivate and train them in high-performance computing, and with the help of Blue Waters, we could actually introduce the use of petascale computing to them, keep them in this field, and give them motivation to succeed."

Via this internship, Kaman's students have gained a multitude of skills, all of which are necessary to conduct research at the petascale and, more importantly, will carry over to their future pursuits.

"Through the program, our goal was to teach [our interns] parallel programming models like MPI, distributed memory, OpenMP, etc., and to try to create new development codes, which they did," continued Kaman. "This year one of them had an internship at Oak Ridge National Lab in Tennessee, and the other has started an internship at Hewlett-Packard in Texas, where they plan to continue in the direction of high-performance computing."
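For a sense of what that training starts with, here is a minimal distributed-memory example using mpi4py. It is illustrative only; it assumes an MPI installation with the mpi4py package available, and the script name is hypothetical.

```python
# hello_mpi.py - a first distributed-memory exercise of the kind an
# HPC intern might write. Run with: mpiexec -n 4 python hello_mpi.py
from mpi4py import MPI

comm = MPI.COMM_WORLD      # communicator spanning every launched process
rank = comm.Get_rank()     # this process's ID within the communicator
size = comm.Get_size()     # total number of processes

print(f"Hello from rank {rank} of {size}")

# Each rank contributes a value; the sum arrives only on rank 0.
total = comm.reduce(rank, op=MPI.SUM, root=0)
if rank == 0:
    print(f"Sum of all rank IDs: {total}")
```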

From the vital experience gained in the Blue Waters Internship Program, Kaman's students acquired the skills necessary to flourish in research, development and professional careers in all sectors of society, and they are living examples of the importance of establishing an HPC-competent workforce pipeline.

About NCSA

The National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign provides supercomputing and advanced digital resources for the nation's science enterprise. At NCSA, University of Illinois faculty, staff, students, and collaborators from around the globe use advanced digital resources to address research grand challenges for the benefit of science and society. NCSA has been advancing one third of the Fortune 50 for more than 30 years by bringing industry, researchers, and students together to solve grand challenges at rapid speed and scale.

Source: NCSA

Continued here:

Educating the Next Generation of Supercomputer Users with Blue Waters - HPCwire

Argonne Gets New Supercomputing Cluster to Power Further COVID-19 Vaccine and Drug Research – OODA Loop

Argonne National Laboratory is reportedly adding to its research capabilities with a new machine focused on COVID-19 data analytics and research. On Wednesday, NVIDIA announced that the new system is a high-powered, artificial intelligence-focused supercomputer that will be used to fuel the search for new drugs and vaccines to combat the ongoing pandemic. NVIDIA also stated that it has developed new pre-trained AI models to help improve the detection of COVID-19 through medical imaging.

The latest AI supercomputer to launch at Argonne will be composed of 24 NVIDIA DGX A100 nodes, each delivering 5 petaflops of performance, ultimately adding 120 petaflops of computing power to the lab. The system will be used to better understand the virus and to support the recently established COVID-19 High-Performance Computing Consortium.

Read More: Argonne Gets New Supercomputing Cluster to Power Further COVID-19 Vaccine and Drug Research

Follow this link:

Argonne Gets New Supercomputing Cluster to Power Further COVID-19 Vaccine and Drug Research - OODA Loop

Danger zone! Brit research supercomputer ARCHER’s login nodes exploited in cyber-attack, admins reset passwords and SSH keys – The Register

Updated One of Britain's most powerful academic supercomputers has fallen victim to a "security exploitation" of its login nodes, forcing the rewriting of all user passwords and SSH keys.

The intrusion, which is understood to be under investigation by GCHQ offshoot the National Cyber Security Centre (NCSC), rendered the ARCHER high-performance computing (HPC) network unavailable to its users on Tuesday.

Sysadmins warned ARCHER users that their SSH keys may have been compromised as a result of the apparent attack, advising them to "change passwords and SSH keys on any other systems which you share your ARCHER credentials with".

In a statement posted to the project's status page on Wednesday, ARCHER admins said it appeared several academic high-performance computers were disrupted across Europe in addition to the Cray-built ARCHER.

Knowledgeable sources speculated to The Register that ARCHER is an obvious resource for research work by computational biologists as well as those modelling the potential further spread of the novel coronavirus and is therefore a target for hostile states looking to steal advances from British research into the virus, or to simply disrupt it.

American authorities are reportedly set to publicly blame China and Iran for trying to hack research institutions working to develop a vaccine, according to an unsourced claim in the New York Times. This appears to be linked to understated and unspecific warnings from NCSC earlier this month about advanced persistent threat (APT) hacker crews targeting counter-COVID-19 research.

Hosted by the University of Edinburgh, ARCHER is a Cray XC30 supercomputer with 118,080 Intel Xeon E5 CPU cores at its disposal. It was due to be retired and replaced this month, though the global pandemic has delayed its planned withdrawal. El Reg reported on ARCHER2 when it was confirmed in October 2019.

ARCHER is one of the most powerful supercomputers in the UK, although it is outclassed by the UK's most powerful publicly known super, an eight-petaFLOPS 241,920-core Cray-Intel machine operated by the Meteorological Office, as well as the European Centre for Medium-Range Weather Forecasts' two Cray XC-40s, the Atomic Weapons Establishment's in-house supercomputer and others. It is ranked 334th on the TOP500 list of the world's most powerful supercomputers.

The latest updates on the ARCHER status page said: "Unfortunately, due to the severity of the situation, the ARCHER Service will not be returned before Friday 15th May. We will review the situation with UKRI and NCSC on Friday and will then provide a further update to you."

Professor Alan Woodward of the University of Surrey told The Register: "To see a Cray being attacked is very unusual so I imagine it must be the computing infrastructure around it that has been attacked. Most users obviously don't sit at a terminal directly attached to the supercomputers, so if the means for remote access is rendered inoperable it means the supercomputers become just an expensive lump of metal and silicon.

"Looks like someone has somehow managed to gain a secure shell on an access node. Assuming that's true, it's going to be a real pain as youll have to set everyone up again."

An NCSC spokesman told The Register: "We are aware of this incident and are providing support. The NCSC works with the academic sector to help them improve their security practices and protect its institutions from threats."

Cray, ARCHER's operators, and counter-coronavirus research teams have been asked if they wish to comment. We will update this article as and when they respond.

A University of Edinburgh spokesperson has also been in touch with a statement.

More:

Danger zone! Brit research supercomputer ARCHER's login nodes exploited in cyber-attack, admins reset passwords and SSH keys - The Register

Cyber attack knocks UK research supercomputer ARCHER out of action indefinitely – NS Tech

ARCHER, one of the UK's most powerful research supercomputers, has been knocked offline indefinitely due to a "security exploitation" of its login nodes, in an attack which also affected the wider academic community in the UK and Europe. As a result, all of ARCHER's existing passwords and SSH keys are being rewritten, and a strong warning has been issued against users logging in with existing credentials.

A spokesperson for Edinburgh University, where ARCHER is based, said the institution was currently investigating the issue. "On the 11th May 2020 our technology partners were notified of a potential issue that indicated some user accounts may have been misused to gain unauthorised access to the service," they said. "Investigations by our technical teams confirmed that a small number of user accounts had been affected, so the decision was taken to disable access to allow further work to confirm the extent of the issue."

The university is working with the National Cyber Security Centre (NCSC) and its technology partners to forge a path to recovery and determine when systems can be brought back online. It said that there is nothing to suggest any research, client or personal data was impacted by the attack. A status update on Thursday said that it's hoped ARCHER will return to service early next week, but that this will be conditional on the results of diagnostic scans taking place and consultation with NCSC.

Attacks on supercomputers have been rare up until now, but that doesn't mean they are less susceptible than other computers. "A supercomputer is not as exotic as it sounds," says Antonios Michalas, assistant professor in the Department of Computing Sciences at Tampere University. "Currently, most of the existing supercomputers rely on traditional hardware, with the exception that they have many, many resources."

Because supercomputers aren't attached to a terminal, there's surrounding infrastructure that allows people to log in remotely. It appears that the attack wasn't levelled directly at ARCHER, but at its perimeter. "The fact that they are having to change all the passwords and all the SSH keys suggests that somebody somehow managed to get a Secure Shell, maybe through somebody having inadvertently given away the keys or the password," says Alan Woodward, a cyber security expert at the University of Surrey. Woodward says if the SSH key was generated from a password, the password could potentially have been obtained in a phishing attack or through a hacked device. "Most of these situations are not some terribly clever technical thing; actually the weak human is the link," he says.

ARCHER is used on a range of research projects, such as modelling weather patterns and biomedical data, simulating the Earth's climate and designing new materials. But its role in supporting a number of different COVID-19 research projects might have proven a particular draw to hackers.

"I am not sure if anyone can say for sure whether this is a targeted attack to either exfiltrate data relating to Covid-19 research, or it was an attack to slow the progress of research into Covid-19 by state actors, or whether it was simply an indiscriminate scan attack which happened upon the supercomputer," says Kevin Curran, professor of cybersecurity at Ulster University.

Curran believes we can expect more attacks on supercomputers carrying out biological modelling in future. "Organised cybercrime and nation-states are able to install malware (often through infected USB and other hardware interfaces) which can reside on air-gapped machines and also use internal communication chips (in the device) to send the data out to the spy's receiver outside," he said in an email. "Israeli researchers demonstrated how to steal data that bypasses all of these protections using the GSM network, electromagnetic waves and a basic low-end mobile phone. So it is very difficult to protect a targeted asset such as ARCHER."

Shadow digital, science and technology minister Chi Onwurah said: "Our research sector is vital to tackling the pandemic, and the ability to run calculations on the UK HPC system of models and forecasts is crucial to leading us all safely out of lockdown. We need urgent clarity on the causes of this breach and what impact it might have on ongoing research into the coronavirus and potential therapies."

She added: "Even short delays to modelling can have a large effect down the line, as this can hold up laboratory work, where delays can get compounded due to the strict scheduling required by social distancing."

ARCHER has resided at Edinburgh University since 2013, but is due to be replaced this year with the more powerful ARCHER2.

See the article here:

Cyber attack knocks UK research supercomputer ARCHER out of action indefinitely - NS Tech

ORNL Summit Supercomputer Resource Leveraged to Mine for COVID-19 Connections – HPCwire

May 18, 2020 Scientists have tapped the immense power of the Summit supercomputer at Oak Ridge National Laboratory to comb through millions of medical journal articles to identify potential vaccines, drugs and effective measures that could suppress or stop the spread of COVID-19.

A team comprising researchers from ORNL and Georgia Tech is using artificial intelligence methods designed to unearth relevant information from about 18 million available research documents. They looked for connections among 84 billion concepts and cross-referenced keywords associated with COVID-19, such as "high fever," "dry cough" and "shortness of breath," with existing medical solutions.

"Our goal is to assist doctors' and researchers' ability to identify information about drug therapies that are already approved by the U.S. Food and Drug Administration," said ORNL's Ramakrishnan "Ramki" Kannan.

A massive subset of 6 million documents dated between 2010 and 2015 took 80 minutes to process, and the entire 18 million will take less than a day to run on Summit. Results will be shared with medical researchers for feedback, which will inform adjustments to improve future calculations.

About Oak Ridge National Laboratory

Oak Ridge National Laboratory is the largest US Department of Energy science and energy laboratory, conducting basic and applied research to deliver transformative solutions to compelling problems in energy and security. ORNL's diverse capabilities span a broad range of scientific and engineering disciplines, enabling the Laboratory to explore fundamental science challenges and to carry out the research needed to accelerate the delivery of solutions to the marketplace.

Source: Oak Ridge National Laboratory

More here:

ORNL Summit Supercomputer Resource Leveraged to Mine for COVID-19 Connections - HPCwire

The Most Interesting Machine in the World – Alta Magazine

Quantum computing, magnificent in conception though embryonic in performance, is being touted as the next great information-technology revolution.

Enthusiasts are predicting that quantum machines will solve problems beyond the reach of conventional computers, transforming everything from medical research to the concepts of space and time.

Meanwhile, the actual quantum computers being tested in university and corporate laboratories are mostly exotic divas that run at temperatures colder than intergalactic space and crash in milliseconds if intruded on by the outside world.

"I worry a lot about the hype," John Preskill told me recently as we chatted in his office in the gleaming, glass-shrouded building at the California Institute of Technology in Pasadena where his Institute for Quantum Information does its weird work. A long-sighted physicist in the tradition of Caltech's Richard Feynman, Preskill is a leading advocate of quantum computing. But even he pooh-poohs the idea that quantum computers will soon replace our laptops.

"Everybody believes it, but nobody can prove it," he said. "Changing everything in 10 years is not realistic."

Such reservations haven't kept governments and the private sector from betting that visionaries like Preskill, and like China's Pan Jian-Wei, known in his homeland as the "father of quantum," will succeed. The Chinese government is reportedly investing $11 billion in developing quantum computers and quantum-ready networks. The U.S. government and the European Union are in for more than a billion dollars each. IBM has put a rudimentary quantum computer online, complete with tutorials on how to frame questions it can understand. (Sample instruction: "Apply a Hadamard gate to q[0] by dragging and dropping the H gate onto the q[0] line.") Amazon's cloud-computing services now include access to quantum computers operated by IonQ, D-Wave Systems, and the Berkeley chipmaker Rigetti Computing. Google claims to have attained "quantum supremacy," a term Preskill coined for the ability to solve problems no conventional computer can handle.

Google's facility sits inconspicuously in an aging industrial park near the Santa Barbara airport. It's identified only by a bumper sticker on the glass front door. Inside, I was shown five quantum computers, all humming away. Each was housed in a giant thermos hung on chains to minimize vibrations from the ground. Dozens of reedy silver coaxial cables fed into each computer, conveying microwave pulses through its quantum chip and back to the dozens of scientists hunched over display terminals in the next room. Google research scientist Erik Lucero told me that the team's goal is to make quantum computers practical, then give them to the world.

The world could use them.

Ordinary computer chips are approaching their theoretical limits. They've been getting smaller and faster for decades, but their millions of tiny transistors cannot be shrunk much more without running into interference from, ironically enough, the quantum fluctuations that pervade the universe on submicroscopic scales.

Conventional computers also raise environmental concerns. The global information-technology sector, growing by 3 percent a year, already spews as much greenhouse gas as the airlines do. Supercomputers that gobble up more electricity than 10,000 homes are starting to look as antiquated as steam locomotives, considering that a quantum chip about the size of a postage stamp could, in theory, do more in seconds than a supercomputer could accomplish in a thousand years. Quantum's greener.

Long-term prospects aside, there's a hardball motive for investing in quantum computing right now: if you don't, somebody else may get there first.

Consider encryption. Today's commercial and military encryption systems were designed to foil conventional, not quantum, computers. In the popular public-key encryption system, each financial transaction is identified by a public number, generated by multiplying two primes. Cracking the code requires determining which two prime numbers were multiplied, a task that would take a conventional computer billions of years to accomplish.
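The asymmetry is easy to see in miniature. The Python sketch below multiplies two small primes instantly and then recovers them by trial division; the primes here are toys chosen for illustration, and real public-key moduli, hundreds of digits long, put this brute-force search hopelessly out of reach for classical machines.

```python
# Toy illustration of the factoring asymmetry behind public-key crypto.
p, q = 1_000_003, 1_000_033   # small primes, for demonstration only
n = p * q                     # multiplying is instant: the public number

def factor(n: int) -> tuple[int, int]:
    # Trial division: cost grows with the smaller prime factor,
    # which is why real key sizes make this approach infeasible.
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1

print(factor(n))  # quick here; astronomically slow at real key sizes
```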

Such cryptography systems, immune to brute-force decoding because doing so would take too long, seemed pretty secure until Peter Shor came along.

Shor, a graduate of Marin County's Tamalpais High School and Caltech who went on to win the Gödel Prize in theoretical computer science, demonstrated in 1994 that a proper quantum computer could break public-key encryptions in a matter of seconds. As a recent National Academy of Sciences report rather dryly put it, Shor's algorithm "sparked strong commercial interest in deploying post-quantum cryptography well before such a quantum computer has been built."

The challenge is starkly clear. Build a fully functional quantum computer first, and you might crack the other side's codes before they can crack yours. Miss out, and you're toast.

The code-busting potential of quantum computing has not been lost on the Chinese government, which last November passed a law threatening to punish any private corporations employing ciphers the authorities can't break. Much of the money China has earmarked for quantum computer research is said to be going toward deploying computer networks designed to resist quantum intrusion. Chinese researchers are experimenting with quantum-encoding techniques to create messages that cannot be eavesdropped on without the recipient seeing evidence of it. In one such test, on September 29, 2017, Pan Jian-Wei and his colleagues dispatched a quantum-encrypted key from an orbiting satellite to Vienna and Beijing.

Conventional digital computers manipulate binary digits, or bits, the zeros and ones that, as Alan Turing proved in 1936, can in principle replicate anything in the universe. Visions of a universal computer, which a century earlier had so enchanted the mathematician Charles Babbage that he came to be regarded as a raving crank, grew into today's digital world with its five billion people using mobile phones.

Quantum computers, too, use bits to communicate with the outside world. But inside their quantum world, they employ what are called quantum bits, or qubits.

A single qubit, such as an isolated atom or electron, can generate only a single, on-or-off, zero-or-one state, just like each transistor on a conventional chip. The magic of qubits resides in their ability to be combined, or "entangled" in the jargon, with one another, so that many qubits start working together. Entangled qubits scale up exponentially: a 4-qubit quantum computer has not 4 but 16 times the power of a 1-qubit machine. A reliable 300-qubit quantum-computing chip could outperform a conventional computer the size of the observable universe.
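That exponential scaling falls out of the arithmetic of quantum states: describing n entangled qubits requires 2^n complex amplitudes. The NumPy sketch below, included purely as an illustration, makes the growth concrete.

```python
# Sketch: an n-qubit state is a vector of 2**n complex amplitudes.
import numpy as np

for n in (1, 2, 3, 4, 300):
    dim = 2 ** n
    if n <= 4:
        state = np.zeros(dim, dtype=complex)
        state[0] = 1.0  # the |00...0> basis state
        print(f"{n} qubits -> {dim} amplitudes")
    else:
        # 2**300 amplitudes could never be stored classically, which is
        # why a reliable 300-qubit chip would outrun any conventional machine.
        print(f"{n} qubits -> 2**{n} (about {float(dim):.1e}) amplitudes")
```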

The current state of the art is somewhere between IBM's 50-qubit Q System One, a black-lacquer showpiece encased in a nine-foot borosilicate glass cube, and a 72-qubit machine being tested by Google. The Google machine said to have attained quantum supremacy employs a 53-qubit chip. (It was built to run 54 qubits, but one never worked, so the researchers went with what they had.)

Most such machines are what Preskill calls noisy intermediate-scale quantum systems, or NISQs. They're noisy, he notes, in that researchers have imperfect control over their qubits. They're intermediate because properly controlling their 50 or so qubits would produce more power than any existing supercomputer but still fall short of quantum computing's potential. Until the noise can be significantly reduced, Preskill predicts, quantum computers with 50 to 100 qubits "may be able to perform tasks which surpass the capabilities of today's classical digital computers" but "will not change the world right away."

NISQ qubits are typically made in superconducting circuits, each a tiny oval racecourse interrupted by a single barrier called a Josephson junction. To build such a NISQ, pack a bunch of Josephson junctions close together, to encourage them to entangle, and chill them to nearly absolute zero in your laboratory thermos. Electrons will circle each racecourse ceaselessly, going in both directions at once, quantum-leaping through the barriers to create a single entity with an enormous calculating potential.

Once that's happening, hit your supercooled chip with a shaped microwave pulse. The pulse excites the quantum system, which responds by exploring its vast internal space of possible futures, canceling out those that exclude one another and delivering the result as an output pulse. Repeat the process, sifting out noise, until the computation is complete.

In a typical quantum computer, such Q-and-A events can take place a trillion times a second.

Entanglement is fragile. Anything from heat to cosmic rays to an overzealous input pulse can wreck it. But it's so promising that it's been called "a physical resource, like energy," and its exploitation "an industry."

Preskill characterizes his research as exploring the "entanglement frontier."

The fact that much more goes on inside a quantum system than can ever be detected was established in the mid-1920s by the physicist Werner Heisenberg. Dubbed "uncertainty," it was long regarded as a limitation on human knowledge. The uncertainty principle means, for instance, that the more one learns about a quantum particle's velocity, the less can be known about its location. This is the basis of the joke in which Heisenberg, pulled over for speeding by a cop who tells him, "You were going 90 miles an hour," replies, "Thanks a lot. Now I have no idea where I am."

But by the 1980s, as personal computers were becoming commonplace, scientists started to think about the other side of the Heisenberg coin. They speculated that the vast internal states of quantum systems might be put to work for computing. It wouldn't matter that a quantum chip's internal deliberations cannot be observed; what mattered was that they might deliver accurate results to the outer world. Since quanta are how nature works, the answers would be coming, so to speak, from the horse's mouth. Suddenly, Heisenberg's quantum uncertainty began to look less like a limitation than a resource.

Richard Feynman had started exploring the prospect of quantum computing decades earlier. "When our computers get faster and faster and more and more elaborate," he predicted in 1959, "we will have to make them smaller and smaller. But there is plenty of room to make them smaller."

Preskill, Caltech's Richard P. Feynman Professor of Theoretical Physics, has something of Feynman's sense of humor (responding to a Twitter poll, he said he became a scientist because "I don't mind being confused most of the time") and something of his showmanship. Preskill kicked off a black-tie celebration of Feynman's legacy a few years back by singing an ode to quantum computing that he'd written to the tune of South Pacific's "Some Enchanted Evening":

Quantum's inviting
Just as Feynman knew.
The future's exciting
If we see it through!
Once we have dreamt it
We can make it so.
Once we have dreamt it
We can make it so!

Preskill readily rattles off potential practical benefits of quantum computing, from more efficient solar cells to quantum-entangled space telescopes orbiting the sun, but a scientist of his stature doesn't devote decades to a subject just to stimulate spin-offs. Preskill wants to use quantum computers to simulate nature itself, investigating realms of reality beyond the reach of observation and experiment.

Quantum physics, discovered by Max Planck in 1900 and largely defined by 1930, revealed that the fundamental building blocks of nature are not particles or waves but quanta. (Quanta are the irreducible packets of information that can be extracted from any process.)

Generations of theoretical work and laboratory experiments have confirmed the validity of the quantum approach. As the Caltech physicist Sean Carroll writes, nature is "quantum from the start. Quantum mechanics isn't just an approximation of the truth: It is the truth."

There is, however, a conspicuous gap in quantum theory: gravity, the force that gathered together the incoherent masses emerging from the big bang to make galaxies, stars, and the planet we live on. Exquisitely accurate quantum theories account for the behavior of the other three fundamental forces (electromagnetism and the strong and weak nuclear forces), but gravity is much too weak to play a significant role in most laboratory experiments. Using existing technology to probe quantum gravity would require constructing a particle collider the size of the solar system. Experiments conducted at the edge of a black hole might be fruitful, but the nearest black hole is 3,000 light-years from Earth.

"Quantum gravity is hard," Preskill notes, "because you can't do experiments."

It might be possible, though, to use quantum computers to simulate how gravity works. Einstein having shown that gravity curves space, and that space and time are two aspects of the same phenomenon, quantum simulations could lay bare the nature of space and time.

Simulations using conventional computers are already widely successful. Formula One race drivers put in long hours on simulators before getting to the track, and commercial pilots use simulators to acquaint themselves with new models of aircraft. But a conventional computer can't even simulate the behavior of a hundred atoms for a millionth of a second, much less that of quanta roiling at the edge of a black hole. The only known way to get quantum is to go quantum.

Quantum computers have the advantage of working in the same way as the systems they'd be simulating. As Feynman argued in 1982, "Nature isn't classical, dammit, and if you want to make a simulation of nature, you'd better make it quantum mechanical."

"By golly it's a wonderful problem," he added, "because it doesn't look so easy."

The scientific potential of using quantum computers to simulate quantum gravity can be summarized by a single, rather astounding fact: any quantum system can simulate any other quantum system, provided it has at least as many qubits as the system being simulated.

As Preskill puts it, a quantum computer using enough entangled qubits could "simulate efficiently any physical process that occurs in nature." He expects such simulations to reveal the deeper quantum process that generates space and time. "Space-time comes from the emergent properties of this underlying system," he asserts.

What is understood can be controlled, although this isn't always obvious at first. Einstein discovered that enormous amounts of energy are locked inside atoms, but he thought it unlikely that the energy could ever be extracted to generate power. Today, nuclear power generates roughly 14 percent of the world's electricity. Electrons were once regarded as so utterly exotic that physicists at a 1911 annual dinner toasted, "To the electron! May it never be of any use to anybody!" Yet so many uses were found that the global electronics industry is currently valued at over a trillion dollars.

What, then, might an understanding of quantum space-time enable humans to do?

Preskill expects that it might become possible to create new worlds.

"I really believe this is going to happen," he said in his Caltech office, leaning back and smiling pleasantly, as one might expect of a would-be creator of universes.

Timothy Ferris is an emeritus professor at UC Berkeley and the author of a dozen books, among them Seeing in the Dark and Coming of Age in the Milky Way. He produced the Golden Record, an artifact of human music and other sounds of Earth launched aboard the twin Voyager interstellar spacecraft now exiting our solar system.

Quantum computers work by accessing the complex internal states of quantum entities, which in the most promising current systems are supercooled electrons.

Entangling multiple quantum systems increases their power exponentially. Two qubits can have four states.

With three qubits, it's eight.

With four qubits, it's sixteen.

A 100-qubit quantum computer, if one can be built, would outperform a conventional computer the size of planet Earth.

A 300-qubit machine might do better than an ordinary computer made out of every atom in the observable universe. That kind of computing power could model complex molecules and other quantum systems, making it possible to learn how quantum gravity might behave.

Google's quantum-supremacy experiment used a 53-qubit chip to do, in three minutes and 20 seconds, what a conventional computer might accomplish in days, months, or thousands of years.

To function properly, a quantum chip's entangled electrons or other particles must be kept coherent by isolating them from interference from the outer world. Most of the quantum computers currently at the performance forefront are kept cold and dark to discourage outside interference that would otherwise cause them to decohere, wiping out their calculations.

Running a quantum computer involves sending microwave pulses to the chip, each pulse exciting the entangled quantum system into producing an answering pulse that delivers the results of its calculations, without decohering it. In practice this means hitting the chip with millions of pulses per second, riding on the edge of decoherence, and doing the same calculation multiple times to enable a signal to emerge from the noise. Much better control over the quantum systems will be required to reduce the noise before quantum computers become fully operational.
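The payoff of that repetition can be seen with a toy model. The Python sketch below, illustrative only, treats each run as a coin weighted toward the correct answer; majority voting over many repetitions makes the right result overwhelmingly likely.

```python
# Sketch: why repeating a noisy calculation lets the signal emerge.
# Model each run as returning the right answer with probability 0.7.
import random

random.seed(1)
TRUE_ANSWER = 1
P_CORRECT = 0.7  # per-run reliability, an illustrative figure

def one_noisy_run() -> int:
    return TRUE_ANSWER if random.random() < P_CORRECT else 1 - TRUE_ANSWER

for repeats in (1, 11, 101):
    votes = sum(one_noisy_run() for _ in range(repeats))
    majority = 1 if votes * 2 > repeats else 0
    print(f"{repeats:4d} repeats -> majority vote says {majority}")
```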

Computer science is no more about computers than astronomy is about telescopes.

Edsger Dijkstra

In 1488, the artist Leonardo da Vinci sketched a flying machine. His 1505 Codex on the Flight of Birds advanced the idea that human flight should be possible. After all, birds can fly. Should it not be possible for some kind of device, perhaps with larger wings and some kind of superhuman energy to power it, to allow humans to fly, too?

But despite Leonardo's profound insight and engineering genius, nobody could get the idea to work for some 400 years, not until 1903, when two bicycle mechanics constructed a flying machine.

Some 30 years later, Charles Lindbergh flew across the Atlantic. And some 30 years after that, thousands of people were crossing oceans and continents, sipping cocktails, and complaining about the movie.

Once the tipping point was reached, progress accelerated dramatically. Yet in the 1890s, a few years before the Wright brothers, several inventors had come close to creating flying machines. They could glide. Balloons could float. But actual human flight was just out of reach.

Something similar is happening today in the field of quantum computing. It seems like it might be possible to radically improve on what can be computed. After all, quantum systems exist in nature. Should it not be possible for us to build some kind of machine to harness the quantum facts of nature, and thereby vastly transform what can be constructed and what can be calculated?

The question becomes this: Are we in the Leonardo era of quantum computing, fantasizing about a mere possibility, or are we, as in the 1890s, in the antechamber of the future, where, within a few more years, and with some clever engineering, a revolution will occur in our daily lives?

Will Hearst

Visit link:

The Most Interesting Machine in the World - Alta Magazine

How the UK can lead the quantum revolution – Verdict

They say technology moves in dog years, and it's not hard to see why. Moore's Law states that the number of transistors on a computer chip doubles every two years through ever-smaller circuitry, producing greater performance and energy efficiency.

The first computers were entirely mechanical, constructed from cogs and wheels, and took up most of a room, and originally entire buildings. Software as we now know it would come decades later. The team behind the equipment that took man to the moon used punch cards to code programmes, inadvertently further developing software as a discipline in the process.

After this very analogue world came a digital revolution that paved the way for computers and smartphones becoming commonplace. If we treat Moore's Law as the existing gospel, or at least a very true long-term guide, then following the claims by IBM and Google we have arrived at a pivotal moment where all eyes are now falling upon the technology that will spark the next leap forward: quantum computing.

Last week, the global science community marked International Day of Light, in recognition of the fundamental role light plays in humanity's most advanced technologies, which will transform industry and society alike. One such monumental event is the coming quantum revolution, of which light is a critical enabler.

At the moment, digital computing functions through units of information called bits: binary code written as 0 or 1, signalling off or on. Quantum technology is completely different. At its heart, it's about going beyond binary and manipulating particles in their quantum state, where they can be on or off and every other combination all at the same time; in the terminology of quantum physics this is named superposition.

It's these quantum bits, known as qubits, that create near limitless potential and will form the structure for the next age of computing.

To put that into perspective, the processing abilities of a quantum computer made up of more than just 50 qubits could surpass the capabilities of the most powerful supercomputer on Earth right now. Or, as Nobel Laureate Bill Phillips said, quantum computing is as big of a departure as the first computer was from the abacus.

Contrary to popular belief, lasers are not all lightsabers and Bond weaponry. It's by using the world's purest light that scientists have been able to supercool atoms to temperatures below those found in deep space. It's in this state that we can use their pure quantum nature to power computers, clocks, compasses and the most accurate devices humanity has ever built.

In practice, this could well mean solving equations and creating models of the universe, our climate and the physical world at a level of complexity that humans simply cannot comprehend. The greatest challenges humanity faces in the 21st century require new science; quantum computing, sensing and timekeeping will dramatically revolutionise many of today's industries and also create new ones.

This is about much more than having faster processing power; quantum technology has the potential to fundamentally reinvent the way we tackle computing and interact with the world around us.

Science and technology are having their shining moment once more; the public's and governments' appreciation of the importance and power of science has been renewed by Covid-19. We are recognising the benefit of science and technology not only for tackling the immediate and long-term effects of the virus, but also as a means for the nation's economic recovery from this crisis.

And there are many who believe Britain could be a real crucible for the quantum leap, myself included.

For starters, the UK has already been behind some of the first commercial applications of quantum technology. My company has worked with Imperial College London to develop a quantum compass, a device that could re-invent the world of global navigation.

For the most part, GPS provides a reliable and effective form of navigation using satellites. But it's far from perfect: it is open to spoof-signalling from pirates, it relies on signals that bounce off buildings, and it is not accessible everywhere. The quantum compass removes these challenges by using sensors to measure the properties of supercool atoms, without the need to receive communications from satellites.

Last year, the UK government announced a $194m investment, part of a $1bn programme, to commercialise quantum technology: to take it out of the laboratory and deploy it in real-world applications. This puts the UK alongside the likes of the US and China in leading the way.

Beyond fresh investment, the UK uniquely benefits from the presence of several of the world's best universities focusing on quantum technology, in close proximity to a burgeoning community of commercial partners. As much as the right policies are needed, taking frontier science out of the laboratory and into the real world requires a critical mass of the right partnerships.

The UK has a major role to play in the coming quantum age, and can stand at the frontier. With the Covid-19 crisis turning our eyes to radical solutions for the world's problems and what's next for technology, we should focus on the immense potential of a quantum future and take full advantage of the excellence that resides within these shores.

Read more: UK quantum computing investment to triple over next five years

More:

How the UK can lead the quantum revolution - Verdict

Exascale: Cleaner-burning Gasoline Engines, Cities Powered by Wind, Nuclear Reactors That Fit on a Tabletop – HPCwire

When the US Department of Energy (DOE) boots up the world's first generation of exascale supercomputers next year, researchers hope to find some of the most elusive questions of modern science suddenly closer to being solved.

The two machines, Frontier at Oak Ridge National Laboratory (ORNL) in Tennessee and Aurora at Argonne National Laboratory (ANL) near Chicago, promise the fastest computing horsepower in history at more than 1.5 exaflops apiece. They'll represent the culmination of a five-year effort across six national laboratories with a price tag of roughly $1.8 billion.

The quest for exascale officially began in 2015, when the White House laid out marching orders for the National Strategic Computing Initiative, a whole-of-government effort designed to create "a cohesive, multi-agency strategic vision and Federal investment strategy, executed in collaboration with industry and academia, to maximize the benefits of HPC for the United States."

That initiative gave rise to the DOE's Exascale Computing Initiative, focused on making the computing leap, and the Exascale Computing Project (ECP), focused on building a comprehensive software ecosystem consisting of target applications, an exascale computing software stack, and accelerated hardware technology innovations primed to take full advantage of the newfound processing power.

To prepare the nation's first exascale-ready applications, two dozen teams of scientists and engineers have worked around the clock since, stringing together computer code and writing new algorithms to tackle everything from energy science and production to investigating cures for cancer to predicting natural disasters.

"We're on track, and we'll be ready to go on Day One," said Doug Kothe, the ECP's director. "I think these applications we're developing are going to be the scientific and engineering tools of the trade for decades to come."

An exaflop amounts to 1 quintillion (that's 10^18, or a billion billion) calculations per second, five times faster than the highest speeds available on today's top-performing supercomputer, Summit at ORNL, which clocks in at 200 petaflops: 200 quadrillion calculations per second, or 200 x 10^15.

For perspective, the average human brain consists of about 100 billion neurons. Multiply those brain cells by 15 million, and they'll approach the problem-solving muscle of one such exascale machine.
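
Those figures are easy to sanity-check. A few lines of Python (purely illustrative; the numbers are the ones quoted above) confirm the arithmetic:

    # Sanity check of the speed figures quoted above.
    exaflop = 10**18                 # one exaflop: a billion billion calc/s
    summit = 200 * 10**15            # Summit: 200 petaflops

    print(exaflop / summit)          # 5.0 -> an exaflop is five times Summit

    neurons = 100 * 10**9            # ~100 billion neurons per human brain
    brains = 15 * 10**6              # the 15 million brains mentioned above
    print(neurons * brains)          # 1.5 x 10**18: about one 1.5-exaflop machine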

Fire up those circuits, and the eureka moments can't help but follow, said Kothe.

"It's a real game-changer," he said. "I think it's going to be a translational moment once we pick up that extra computing speed and capacity that gets us over the remaining obstacles. You can't plan for scientific breakthroughs, but if you have that tremendous technology at your fingertips, they will happen."

Those breakthroughs could include new ways to appease the world's ravenous hunger for energy: to build cleaner-burning combustion engines, harness wind power on an unprecedented scale, even shrink a nuclear reactor to the size of a desktop. Insights gained could help slash pollution and double or triple the efficiency of existing fossil-fuel resources to ease the transition to a greener energy economy.

"It's about leap-frogging," said Tom Evans, a distinguished researcher at ORNL. "In some of these fields, the leap could be very large. We're not machining screws in any of these projects. These are ambitious questions. Our need for energy is only going to grow, so let's go where the existing science can't take us."

Exascale computing's promise rests on the ability to synthesize massive amounts of data into detailed simulations so complex that previous generations of computers couldn't handle the calculations. The faster the computer, the more possibilities and probabilities can be plugged into the model to be tested against what's already known: how a satellite might react under various conditions in space over time, how cancer cells might respond to new treatments, how a 3D-printed design might hold up under strain.

The process helps researchers target their experiments and fine-tune designs while saving the time and expense of real-world testing. Scientists at ORNL, for example, recently used simulations on Summit to trim a list of more than 8,000 potential drug compounds that might fight the coronavirus down to the 77 likeliest candidates.

"I wouldn't call it a crystal ball, but it's almost like a time machine," said Steve Hamilton, an ORNL scientist working on an application to design smaller, modular nuclear reactors. "Think of it as a virtual experiment. We're learning in minutes, hours, or days what might otherwise take years to discover."

Hamilton hopes to use that virtual laboratory to perfect designs for the next generation of nuclear fission reactors: big enough to power a small community but small enough to fit inside the average living room, with such built-in safety features as auto-shutoff or a removable core. The designs could be modular, built out of 3D-printed parts and assembled onsite.

"If we can build a virtual reactor and run simulations of how it would behave if it were built, we don't have to do as many physical experiments," Hamilton said. "But because these are new designs, we don't have as much experimental data to fall back on, and we want the modeling to be as accurate as possible."

The details of that modeling extend to mapping the behavior of radioactive isotopes constantly colliding and to tracing the steady flow of coolant through the reactor core, for simulations that add up to half a million or more lines of computer code.

"If something is meters across, we want to simulate it down to the millimeter," Hamilton said. "What we can do right now on Summit, the fastest computer in the world, is along the lines of modeling a single state of the reactor, basically a snapshot of a moment in time. With exascale, we hope to simulate an entire reactor cycle, about a couple years of use, in the space of about 24 hours of wall-clock time. We're talking about the difference between a snapshot and a timeline."
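
Taken at face value, Hamilton's snapshot-to-timeline claim implies a striking ratio of simulated time to wall-clock time. A rough, hedged calculation, reading "a couple years" as two:

    # Back-of-envelope: two years of reactor operation in 24 hours of compute.
    simulated_hours = 2 * 365 * 24              # ~17,520 hours of reactor life
    wall_clock_hours = 24
    print(simulated_hours / wall_clock_hours)   # 730.0 simulated hours per wall-clock hour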

The work wont stop when the model is finished.

"This is a demo," Hamilton said. "We're just setting the table. It's going to be up to the private players to take these tools and lessons and then apply them. We won't be building the reactors, but we want to build tools that will help us assess how these designs work in the real world."

Fission reactors typically take decades to build and permit, owing to the massive designs used by previous generations to power cities and whole chunks of states. Going smaller could shrink those costs and speed the journey from blueprint to reality.

"The current reactors use these giant stainless-steel components made at only one or two facilities in the world, things that typically have to be imported," Hamilton said. "By going with a smaller design, we're hoping to decrease manufacturing costs by making something that can be manufactured in a variety of places. The reactor could be maybe the size of an office or living room, minus the containment structure and all the other components."

Success could pave the way for even smaller reactors: micro-powerplants with nuclear cores that could power a remote military base or a mobile disaster response. Modular cores could be added and removed like batteries. An earthquake, tidal wave or other natural disaster threatens the reactor, as at Fukushima, Japan, in 2011? Pop out the core and haul it to safety.

Some companies are already exploring those possibilities. NuScale Power plans to bring a modular reactor capable of generating up to 720 megawatts online in Utah as early as 2026. Idaho National Laboratory plans to provide low-enriched uranium to fuel a 1.5-megawatt reactor being built by the company Oklo that could begin operation by 2024.

"From the perspective of these companies, they need to manage to their own timelines and not be completely dependent on a project that DOE's scientists are managing," Hamilton said. "They're moving ahead. If we're able to work with them, we can use the exascale modeling to evaluate the improvements and improve the economic viability of these reactors. Nuclear fission provides about 20 percent of our power nationwide right now. If these reactors become more viable, maybe we can increase that share."

At the other end of the nuclear spectrum, researchers hold out similar hopes of modeling a reactor that could generate the power of a star from a few drops of seawater.

After a half-century of study, trial and error has yet to yield a successful commercially controlled thermonuclear fusion reaction. Scientists joke that the big breakthrough is always 20 years away.

The power and speed of exascale could make the difference through 3D modeling and evaluation aided by artificial intelligence, said Amitava Bhattacharjee, a professor of astrophysical sciences at the Princeton Plasma Physics Laboratory.

"I can think of no higher aspiration," he said. "The fuel for fusion is virtually limitless. The energy released would be clean and sustainable and much greater than the amount needed to get the nuclear reactions going. The challenge is that we're trying to duplicate the nuclear reaction of the sun under controlled laboratory conditions and within a confined device. These are complex and expensive experiments. It is important to develop high-fidelity and predictive computer simulations that can optimize the design and performance of such experiments, and even choose between them through careful validation studies."

The simulated approach has worked for other industries. Aircraft manufacturers once relied on hands-on testing to try out new designs before adopting virtual models.

"We previously used wind tunnels all over the country to design and test airframes," Evans said. "Now we use them much more sparingly and efficiently. We'll still need ground experiments, but we'll need far fewer of them to validate the designs because the simulations have gotten us 80 percent of the way there."

The bigger the idea, the bigger the simulation. Humans learned centuries ago how to harness the energy of a passing breeze to cross the ocean or power a grist mill.

But the idea of the modern wind farm came about only recently, and it is an ideal challenge for exascale, said Mike Sprague, a senior scientist at the National Renewable Energy Laboratory.

"The industry knows how to build a single turbine," Sprague said. "That's based on experience and hands-on knowledge. What we don't have yet are computer models that can predict accurately what wind turbines are going to do when you put them together in a wind-farm setting where these blades are rotating, turning, yawing, and creating a wake that affects turbines downwind. We don't really understand the complex dynamics that are going on and how they interplay."

Sprague's project means breaking every inch of each turbine, every potential movement of each blade, and every gust of wind at each potential speed into bricks of equations and building a virtual wind farm that could be applied to all circumstances. Offshore wind farms of floating turbines at sea add even more variables to the mix.
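
The wake interactions Sprague describes are the reason a farm is more than the sum of its turbines. As a toy illustration, the classic Jensen (Park) wake model, a textbook approximation and not the ECP code, estimates how much slower the wind is directly downwind of a turbine; the rotor diameter, thrust coefficient and decay constant below are typical illustrative values:

    # Jensen (Park) wake model: fractional wind-speed deficit downwind of a
    # turbine. A turbine extracting energy leaves a slower wake that widens
    # and weakens with distance.
    def jensen_wake_deficit(x_downwind_m, rotor_diameter_m=120.0,
                            thrust_coeff=0.8, decay_k=0.05):
        spread = 1.0 + 2.0 * decay_k * x_downwind_m / rotor_diameter_m
        return (1.0 - (1.0 - thrust_coeff) ** 0.5) / spread**2

    # Deficit at typical turbine spacings of 5 and 10 rotor diameters:
    for spacing in (5, 10):
        print(spacing, round(jensen_wake_deficit(spacing * 120.0), 3))
    # -> roughly 25% slower at 5 diameters, 14% at 10: downwind turbines
    #    see markedly less wind, which is what farm-scale models must capture.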

The most recent simulation by Sprague and his team of an operating turbine took 6 billion equations. It lasted about 17 seconds.

He believes a full exascale simulation could reveal how to position turbines to maximize energy production, cut costs, and make the best use of available terrain.

"We want to be able to build up a grid in a virtual wind plant and basically watch a wind front move through at 10 miles per second, for example, and see how the turbines react," Sprague said. "We need to understand exactly what's going on, and I think we're definitely going to get there with exascale. Then we can walk into the ground experiment to validate what we think we know."

Wind generated about 7.3 percent of the electricity in the United States last year, according to the US Energy Information Administration. Sprague believes exascale could lead to innovations that could push wind to a greater share of that market.

"It's not going to be the lightbulb, not on its own," Sprague said. "But it's a gateway to reducing the cost of energy and making wind competitive with fossil fuels. The private players can take this foundation and build on it to try to make that vision a reality."

Even if breakthroughs in nuclear and wind power materialize, dependence on fossil fuels won't disappear overnight. Other exascale projects focus on finding cleaner approaches to the internal combustion engines that have powered the world for the past century.

Exascale simulations could enable improvements to fossil-fuel engines that would bridge the gap and reduce pollution during the transition, said Jacqueline Chen, a senior scientist at Sandia National Laboratories. She hopes to develop science-based combustion models that can be used to design high-efficiency, low-emission engines, based on the unprecedented level of detail provided by exascale.

"It's a stopgap," Chen said. "Combustion's going to be around for quite some time still, especially for aviation. These models could be used by the fuel and trucking industries to develop clean-burning, fuel-efficient combustion engines that would be competitive for the next 20 to 30 years while we're still trying to electrify the powertrain. Even a couple points of increased fuel efficiency go a long way, and efficiencies of 50 percent or higher equate to a huge amount of savings."

But some of the dynamics of the fossil-fuel combustion process, which generated more than 60 percent of the electricity in the United States last year, remain as slippery as those of the atomic fuel cycle. Modeling an engine, like modeling a nuclear reactor, requires mapping legions of moving, reacting particles.

"These can be sorbent or oxygen-carrier particles," said Madhava Syamlal, a senior fellow for computational engineering at the National Energy Technology Laboratory. "As they're moving around inside the reactor, there are chemical reactions, heat and mass transfer, happening the whole time. We want to capture all those processes and model them, but there are some very complex reactor geometries involved. We need to track where those particles are and get a complete picture at the scale of each of the individual particles."

Syamlal's work of the past 30 years focuses on modeling gas-solid reactors, also known as fluidized bed reactors or chemical looping reactors, which avoid direct contact between fossil fuels and air in an effort to capture carbon from emissions and avoid pumping hydrocarbon pollutants into the atmosphere. Current technology can't accommodate the number of calculations needed to model a pilot-scale chemical-looping reactor.

"If we're able to do this simulation, it will be incredible," Syamlal said. "We'll be able to move technology development ahead at least 5 years, and we can apply the capability to a whole variety of industrial processes. Today on a state-of-the-art computer, we can model about 5 million particles. We've been able to run simulations on Summit at about 40 million particles. Our goal is to simulate a reactor that contains 5 billion particles by the end of this project."

"Existing codes would take 2 or more years to run such a simulation. We hope to develop a code that can use an exascale machine to increase the speed and resolution by a thousand-fold and get the results in a couple of days."
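
Syamlal's numbers are internally consistent: the jump from today's particle counts to the exascale goal is the same thousand-fold factor as the hoped-for speed-up. A quick check, using only the figures quoted above:

    # Particle counts and speed-up quoted above, checked.
    today = 5 * 10**6                # ~5 million particles, state of the art
    on_summit = 40 * 10**6           # ~40 million demonstrated on Summit
    goal = 5 * 10**9                 # 5 billion targeted for exascale

    print(goal // today)             # 1000 -> the thousand-fold jump in size
    print(2 * 365 / 1000)            # 0.73 -> a 2-year run at 1000x speed
                                     #   finishes in under a day, the same
                                     #   order as "a couple of days"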

The possibilities don't stop there. Some researchers hope to use exascale simulations to explore foundational principles of science, from designing enhanced particle accelerators to reenacting the Big Bang.

"We've never observed a supernova directly," Evans said. "We can simulate that with exascale and see if it correlates or matches with what we've observed from Earth." Some questions will still be too complex.

And not everyone expects exascale to provide all the answers.

"Designing a fusion reactor is so complicated, with so many layers of science and engineering, a whole-device model could easily go beyond resources offered at the exascale level," Bhattacharjee said. "Our hunger for more powerful computers is not likely to stop. But exascale will get us a long way there. Things are hard to predict. We need to understand all the pieces of the puzzle and how they come together, because there will be surprises in the whole, which will be more than the sum of the parts. Be prepared for surprises."

Some of those surprises could be long in coming. Others might come quicker.

Exascale's computing speed could deliver immediate benefits in such sectors as the nation's utility grid. Simulations could predict scenarios for massive power failures like those seen during the California wildfires, balance demand, pinpoint weak spots in the delivery system and devise workarounds to keep the electric currents flowing to consumers. Power companies could build parallel digital models to run alongside the grid in real time and help promptly identify the cause when the wires stop humming and the lights blink out.

"You could think of it as a digital twin," said Kothe. "We want to be able to ask what-if scenarios to prepare for emergencies and manage the delivery. Solar and wind power are going to be more intermittent. We need to plan for ebbs and flows of those energy sources on the grid and be able to counter with other sources. What if we lose 10 or 20 sources at once? This is where the efficacy of exascale simulation comes in."
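
Kothe's what-if question is easy to caricature in code. The toy model below invents all of its capacities and demand figures for illustration (a real digital twin would model power flow, not just totals), but it shows how losing 10 or 20 sources at once flips a comfortable reserve margin into a shortfall:

    # Toy what-if scenario for a grid "digital twin" (illustrative only).
    def reserve_margin(sources_mw, demand_mw):
        # Spare capacity as a fraction of demand; negative means shortfall.
        return (sum(sources_mw) - demand_mw) / demand_mw

    fleet = [80.0] * 50              # 50 hypothetical 80 MW generators
    demand = 3500.0                  # hypothetical demand in MW

    print(round(reserve_margin(fleet, demand), 3))   # 0.143 -> healthy margin
    for lost in (10, 20):            # "What if we lose 10 or 20 sources?"
        print(lost, round(reserve_margin(fleet[lost:], demand), 3))
    # -> -0.086 and -0.314: both cases leave the grid short of demand.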

The average power plant won't have the luxury of an exascale computer onsite. But the apps built to run on machines like Frontier at ORNL and Aurora at ANL will be designed to scale down to the industrial and consumer level, eliminating the shortcuts relied on by simpler models and drawing on exascale findings to close the circuit and produce reliable conclusions.

"When you need to talk about operational decisions, you're talking about a matter of minutes or seconds," Kothe said. "This will give answers you can bank on when you need them."

Read more:

Exascale: Cleaner-burning Gasoline Engines, Cities Powered by Wind, Nuclear Reactors That Fit on a Tabletop - HPCwire

Total partners with Cambridge Quantum Computing on CO2 capture – Green Car Congress

Total is stepping up its research into Carbon Capture, Utilization and Storage (CCUS) technologies by signing a multi-year partnership with UK start-up Cambridge Quantum Computing (CQC). This partnership aims to develop new quantum algorithms to improve materials for CO2 capture.

Total's ambition is to be a major player in CCUS, and the Group currently invests up to 10% of its annual research and development effort in this area.

To improve the capture of CO2, Total is working on nanoporous adsorbents, considered to be among the most promising solutions. These materials could eventually be used to trap the CO2 emitted by the Group's industrial operations or those of other players (cement, steel, etc.). The CO2 recovered would then be concentrated and reused or stored permanently. These materials could also be used to capture CO2 directly from the air (Direct Air Capture, or DAC).

The quantum algorithms which will be developed in the collaboration between Total and CQC will simulate all the physical and chemical mechanisms in these adsorbents as a function of their size, shape and chemical composition, and therefore make it possible to select the most efficient materials to develop.

Currently, such simulations are impossible to perform with a conventional supercomputer, which justifies the use of quantum calculations.
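
The reason such simulations defeat classical machines is the exponential growth of the quantum state space. The sketch below is a generic illustration of that scaling, with a small random Hermitian matrix standing in for a molecular Hamiltonian; it is not Total's or CQC's actual adsorbent chemistry:

    import numpy as np

    # Exactly representing n interacting quantum degrees of freedom takes
    # 2**n amplitudes, so the cost doubles with every particle added.
    for n in (10, 20, 30, 40):
        print(n, "sites ->", 2**n, "amplitudes")

    # Toy electronic-structure problem: ground-state energy of a small
    # random Hermitian matrix used as a stand-in Hamiltonian.
    dim = 2**4
    rng = np.random.default_rng(seed=0)
    m = rng.standard_normal((dim, dim))
    hamiltonian = (m + m.T) / 2              # symmetric => real eigenvalues
    print("ground-state energy:", np.linalg.eigvalsh(hamiltonian)[0])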

Excerpt from:

Total partners with Cambridge Quantum Computing on CO2 capture - Green Car Congress

Supercomputer-Powered Text Mining Tool Combs Through COVID-19 Studies – HPCwire

The COVID-19 pandemic is producing massive amounts of data and that data is producing a positive avalanche of academic literature. To help sift through those tens of thousands of research papers and synthesize COVID-19 knowledge, researchers at Lawrence Berkeley National Laboratory have produced a text mining tool powered by supercomputing and machine learning.

The tool, called COVIDScholar, uses natural language processing (NLP) to scan academic papers on COVID-19 and make the results easily searchable. It was developed following the White House Office of Science and Technology Policy's mid-March call to action on AI tools for data and text mining against COVID-19. Within a week, the Berkeley Lab researchers had an early version of the tool operational.

"Our objective is to do information extraction so that people can find non-obvious information and relationships," said Gerbrand Ceder, a Berkeley Lab scientist who is helping to lead the project, in an interview with Berkeley Lab. "That's the whole idea of machine learning and natural language processing that will be applied on these datasets."

Smart big data analysis tools like these are necessary to make sense of the COVID-19 literature, which quickly reached overwhelming levels. "There's no doubt we can't keep up with the literature, as scientists," said Kristin Persson, another Berkeley Lab scientist leading the project. "We need help to find the relevant papers quickly and to build correlations between papers that may not, on the surface, look like they're talking about the same thing."

Within a month, the team had collected over 61,000 research papers in the field, with around 200 more appearing every day. COVIDScholar incorporates automated scripts that pull those papers, standardize them and index them for searching. "Within 15 minutes of a paper appearing online, it will be on our website," said Amalie Trewartha, one of the lead developers of the tool.

On the surface, COVIDScholar is an advanced search engine: it returns results, sorted into subcategories, and recommends similar articles. But soon, its functionality will run much deeper. "We're ready to make big progress in terms of the natural language processing for automated science," said John Dagdelen, another of the lead developers. "You can use the generated representations for concepts from the machine learning models to find similarities between things that don't actually occur together in the literature, so you can find things that should be connected but haven't been yet."
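
Dagdelen's point about connecting concepts that never co-occur can be sketched with embedding vectors and cosine similarity. The three-dimensional vectors below are invented for illustration; COVIDScholar's actual models, vocabulary and scores are not shown in this article:

    import numpy as np

    # Cosine similarity between learned concept vectors: terms can score as
    # related even if no single paper mentions both.
    def cosine(u, v):
        return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

    embeddings = {                   # made-up vectors, for illustration only
        "spike protein": np.array([0.9, 0.1, 0.3]),
        "ACE2 receptor": np.array([0.8, 0.2, 0.4]),
        "lockdown policy": np.array([0.1, 0.9, 0.2]),
    }

    query = embeddings["spike protein"]
    for term, vec in embeddings.items():
        print(term, round(cosine(query, vec), 3))
    # -> "ACE2 receptor" scores ~0.98, "lockdown policy" ~0.27.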

To run COVIDScholar, the researchers turned to supercomputers at the National Energy Research Scientific Computing Center (NERSC). NERSC's current flagship supercomputer is Cori, a Cray XC40 system rated at 14 Linpack petaflops. (Edison, its previous XC30-based flagship, was retired around this time last year.)

"It couldn't have happened somewhere else," Trewartha said. "We're making progress much faster than would've been possible elsewhere. It's the story of Berkeley Lab, really. Working with our colleagues at NERSC, in Biosciences, at UC Berkeley, we're able to iterate on our ideas quickly."

Read this article:

Supercomputer-Powered Text Mining Tool Combs Through COVID-19 Studies - HPCwire

ORNL’s Summit Ranks on Graph500 List Using Only a Fraction of its Computing Power – HPCwire

April 30, 2020 – For the first time ever, and using only a fraction of its processing power, an Oak Ridge National Laboratory (ORNL) supercomputer has entered the Graph500 ranking, a list published twice a year that benchmarks the speed at which a computer performs graph operations.

The Summit supercomputer, located at the Oak Ridge Leadership Computing Facility (OLCF), a US Department of Energy (DOE) Office of Science User Facility at ORNL, placed fourth in the ranking, as announced during the 2019 International Conference for High Performance Computing, Networking, Storage, and Analysis (SC19).

"Graphs are present in everything we do," said Ramki Kannan, team lead for Computational Artificial Intelligence and Machine Learning in the Computer Science and Mathematics Division (CSMD) and one of the researchers working on the benchmark. "Relationships between the elements of transportation systems, the reach certain people can have in social networks, and the steps involved in commercial transactions are all examples of graph problems that are part of our daily lives," he said.

Graph theory is also applicable to many other areas, such as in the mapping of biological networks of viruses and bacteria. An example of this would be using graphs to help understand the spread of COVID-19 by contact-tracing.

Fewer resources, big results

To enter the ranking, supercomputers must solve a specific type of synthetically generated graph, called a Kronecker graph. The graphs generated by this model satisfy many of the properties found in real-world networks, including networks that are hierarchically organized into communities. In Kronecker graphs, every node is described by a sequence of attributes, and the probability of a relationship between two nodes depends on those characteristics.

The graph Summit had to solve to rank on Graph500 included over 1 trillion vertices with approximately 16 trillion relationships among them.
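
For the curious, Graph500 builds that input with a Kronecker (R-MAT-style) generator: each edge picks one quadrant of the adjacency matrix per bit of the vertex ID, with skewed probabilities, which is what produces the community structure described above. A minimal sketch, using the Graph500 reference probabilities (A=0.57, B=0.19, C=0.19, D=0.05) but otherwise illustrative:

    import random

    # One Kronecker/R-MAT edge: choose a quadrant per bit of the vertex ids.
    def kronecker_edge(scale, a=0.57, b=0.19, c=0.19):
        src = dst = 0
        for _ in range(scale):
            r = random.random()
            src_bit = r > a + b                            # lower half
            dst_bit = (r > a + b + c) or (a < r <= a + b)  # right half
            src = (src << 1) | src_bit
            dst = (dst << 1) | dst_bit
        return src, dst

    # Graph500 pairs a scale with an edge factor of 16; scale 40 would give
    # the trillion-vertex problem described above. A toy run at scale 8:
    scale, edgefactor = 8, 16
    edges = [kronecker_edge(scale) for _ in range(edgefactor * 2**scale)]
    print(len(edges), "edges over", 2**scale, "vertices")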

ORNL's Summit used a total of 86,016 CPU cores to solve the problem, which amounts to 45 percent of its CPU capacity, and a far smaller fraction of its total compute power, since GPUs were not used for this benchmark.

In contrast, the supercomputer that ranked first in solving this task, China's Sunway TaihuLight, used 10,599,680 CPU cores to achieve the same result.

According to the ORNL Graph500 team, which also included scientists Hao Lu of the National Center for Computational Sciences (NCCS), Kamesh Madduri and Piyush Sao of CSMD, and visualization specialist Michael Matheson and software engineer Drew Schmid of NCCS, the result achieved by Summit is a good example of computing power used efficiently.

"Most teams look at this ranking and think, 'How can we optimize and grow our systems to rank first?' But we looked at it from a different perspective. We wanted to know how fast we could solve this graph problem by using fewer resources. The fact that we made it in fourth place under that philosophy is a truly remarkable thing," Kannan said.

Although other teams had been preparing to submit their metrics for years, the ORNL researchers had only 9 months to measure and submit Summit's performance.

Summit's place in the ranking is based on a metric called Giga Traversed Edges Per Second (GTEPS), but the system ranks first in other common metrics such as GTEPS per node, Mega TEPS per core, and GTEPS relative to peak node memory bandwidth, according to observers of the ranking.
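
The headline metric itself is simple: edges traversed during a breadth-first search, divided by the search time, scaled to billions. A two-line sketch with placeholder numbers (not Summit's actual submission figures):

    # GTEPS = edges traversed in one BFS / (seconds * 10**9).
    def gteps(edges_traversed, bfs_seconds):
        return edges_traversed / bfs_seconds / 1e9

    # Placeholder example: 16 trillion edges traversed in 60 seconds.
    print(gteps(16 * 10**12, 60.0))   # ~266.7 GTEPS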

Graph500 complements the Top500 list, a biannual ranking of the world's fastest supercomputers. Summit has placed first in the Top500 since June 2018.

About Oak Ridge National Laboratory

UT-Battelle LLC manages Oak Ridge National Laboratory for DOE's Office of Science, the single largest supporter of basic research in the physical sciences in the United States. DOE's Office of Science is working to address some of the most pressing challenges of our time. For more information, visit https://energy.gov/science.

Source: Andrea Schneibel, Oak Ridge Leadership Computing Facility (OLCF)

View post:

ORNL's Summit Ranks on Graph500 List Using Only a Fraction of its Computing Power - HPCwire

University of Delaware Biologists Are Using Supercomputer Simulations to Analyze the Coronavirus – HPCwire

April 28, 2020 – Two University of Delaware researchers have been awarded a National Science Foundation grant to study the novel coronavirus that causes COVID-19. They're using the kinds of high-tech supercomputing tools that previously led them to new insights into other viruses, such as those that cause AIDS and Hepatitis B.

Juan Perilla and Jodi Hadden-Perilla were funded through NSF's Rapid Response Research (RAPID) program.

The researchers will use computer simulations to analyze the molecular structure of the virus that has led to the current pandemic. Learning more about the structure is essential to understanding viral entry into and infection of human cells, a first step in developing novel drugs and vaccines to combat the disease, Perilla and Hadden-Perilla said.

"If you understand how something works, you can understand how to make it stop working," Hadden-Perilla said of the need to analyze how the virus functions and how it infects people. "We need to know the atomic structure so researchers can determine ways to target it as they work to develop treatments and vaccines."

Using an infrastructure that connects them to their lab computers and supercomputing resources, the researchers are focusing on using supercomputers to perform molecular simulations at the atomic level.

These simulations allow scientists to study the way molecules move to learn how they carry out their functions. Computer simulations are the only method that can reveal the motion of molecular systems down to the atomic level; they're sometimes referred to as "computational microscopes."
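
At its core, the "computational microscope" integrates Newton's equations of motion for every atom, millions of times over. The sketch below shows one velocity-Verlet step, a standard textbook integrator, on a toy harmonic potential; real studies like this one rely on established molecular dynamics packages and physical force fields rather than anything this simple:

    import numpy as np

    # Toy force: a spring pulling each particle toward the origin.
    def force(x, k=1.0):
        return -k * x

    # One velocity-Verlet step for particle positions x and velocities v.
    def verlet_step(x, v, dt, mass=1.0):
        a = force(x) / mass
        x_new = x + v * dt + 0.5 * a * dt**2
        a_new = force(x_new) / mass
        v_new = v + 0.5 * (a + a_new) * dt
        return x_new, v_new

    x = np.random.default_rng(1).standard_normal((4, 3))  # 4 toy "atoms"
    v = np.zeros((4, 3))
    for _ in range(1000):                                 # short trajectory
        x, v = verlet_step(x, v, dt=0.01)
    print(x.shape)                     # final positions of the 4 particles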

Viruses aren't static, Perilla noted, so simulations of the coronavirus are key to understanding its components and functions.

Perilla and Hadden-Perilla said their team's work could have an immediate impact on the pandemic. To enhance that potential, the researchers plan to disseminate their results broadly and quickly, estimating that they could have the basics of their model, a first step in the process, in place within a few weeks.

About The National Science Foundation

The National Science Foundation (NSF) is an independent federal agency that supports fundamental research and education across all fields of science and engineering. In fiscal year (FY) 2019, its budget was $8.1 billion. NSF funds reach all 50 states through grants to nearly 2,000 colleges, universities and other institutions. Each year, NSF receives more than 50,000 competitive proposals for funding and makes about 12,000 new funding awards.

Source: The National Science Foundation

Read more:

University of Delaware Biologists Are Using Supercomputer Simulations to Analyze the Coronavirus - HPCwire

LUMI: the EuroHPC pre-exascale system of the North – insideHPC

Tomasz Malkiewicz from CSC

In this video from Supercomputer Frontiers Europe 2020, Tomasz Malkiewicz from CSC presents: LUMI: the EuroHPC pre-exascale system of the North.

The EuroHPC initiative is a joint effort by the European Commission and 31 countries to establish a world-class supercomputing ecosystem in Europe (read more at https://eurohpc-ju.europa.eu/). One of its first concrete efforts is to install the first three precursor-to-exascale supercomputers. Finland, together with eight other countries from the Nordics and central Europe, will collaboratively host one of these systems in Kajaani, Finland. This system, LUMI, will be one of the most powerful and advanced computing systems on the planet at the time of its installation. The vast consortium of countries with an established tradition in scientific computing and strong national computing centers will be a key asset for the success of the infrastructure. In this talk we discuss the LUMI infrastructure and its great value and potential for the research community.

LUMI supercomputer highlights:

Timeline:

Machine room construction: September 2019 – October 2020
System procurement: November 2019 – July 2020
System installations: Q4/2020
Operations: Q1/2021 – Q4/2026

Go here to see the original:

LUMI: the EuroHPC pre-exascale system of the North - insideHPC

‘I love you’: How a badly-coded computer virus caused billions in damage and exposed vulnerabilities which remain 20 years on – CNN

Skinny, with a mop of black hair falling to his eyebrows, he appeared to barely register the journalists' shouted questions, his only movement the occasional dabbing of sweat from his face with a white towel. Seated to his right, de Guzman's lawyer Rolando Quimbo had to lean in close to hear the 23-year-old's mumbled response, which he then repeated in English for the waiting press.

"He is not really aware that the acts imputed to him were indeed done by him," the lawyer said. "So if you ask me whether or not he was aware of the consequences I would say that he is not aware."

Twenty years on, the ILOVEYOU virus remains one of the farthest reaching ever. Tens of millions of computers around the world were affected. The fight to contain the malware and track down its author was front page news globally, waking up a largely complacent public to the dangers posed by malicious cyber actors. It also exposed vulnerabilities which we are still dealing with to this day, despite two decades of advances in computer security and technology.

This account of the virus is based on interviews with law enforcement and investigators involved in the original case, contemporaneous CNN reporting and reports by the FBI, Philippines police and the Pentagon.

On the afternoon of May 4, 2000, Michael Gazeley was in his office at Star Computer City, a warren of IT companies and shops selling electronics and gadgets overlooking Hong Kong's Victoria Harbor.

That connectivity cut both ways, however, as Gazeley was reminded that afternoon.

All the phones in his office started ringing at once. First were his clients, then came non-customers, all calling frantically in the hope that Network Box could help stop a virus that was screaming through their systems, destroying and corrupting data as it went.

They all told the same story: someone in the office had received an email with the subject "ILOVEYOU" and the message, "kindly check the attached LOVELETTER coming from me." When they opened what appeared to be a text file (actually an executable program masquerading as one), the virus quickly took control, sending copies of itself to everyone in their email address book. Those recipients, thinking the email was either some weird joke or a serious declaration of love, opened the attachment in turn, spreading it even further.

Office email servers were soon clogged as thousands of love letters went back and forth, disseminating the virus to more people. And it turned out to be much worse than just a self-propelling chain letter: at the same time as it was replicating itself, the ILOVEYOU virus destroyed much of the victim's hard drive, renaming and deleting thousands of files.
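
The core trick, an executable hiding behind a double file extension, is straightforward to screen for today. The sketch below is a generic, illustrative filter rather than a reconstruction of any mail gateway of the era; the test filename is the attachment name widely reported at the time:

    # Flag attachments whose "text file" name hides an executable extension.
    EXECUTABLE_EXTS = {".vbs", ".exe", ".scr", ".js", ".bat"}

    def looks_disguised(filename):
        parts = filename.lower().rsplit(".", 2)
        # "LOVE-LETTER-FOR-YOU.TXT.vbs" -> ["love-letter-for-you", "txt", "vbs"]
        return len(parts) == 3 and "." + parts[-1] in EXECUTABLE_EXTS

    print(looks_disguised("LOVE-LETTER-FOR-YOU.TXT.vbs"))  # True
    print(looks_disguised("notes.txt"))                    # False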

Many of the increasingly panicked callers Gazeley was fielding inquiries from did not have backups, and he had the awkward job of explaining to them that many of their files, everything from spreadsheets and financial records to photos and mp3s, were likely lost for good.

"This wasn't something that people were used to as a concept, they didn't realize that email could be so dangerous," said Gazeley, recounting the first calls.

Two years earlier, Hollywood star Meg Ryan asked "is it infidelity if you're involved with somebody on email?" as the movie "You've Got Mail" introduced people to the idea of cyber-romance and that email could be used for something other than boring office work.

From Hong Kong, where the virus crippled the communications and ravaged file systems of investment banks, public relations firms and the Dow Jones newswire, the love bug spread westward as the May 4 workday started.

Graham Cluley was on stage at a security conference in Stockholm, Sweden, when the virus hit Europe. He had just finished describing an unrelated virus which targeted a now-defunct operating system, hijacking users' accounts to broadcast messages to their coworkers, including "Friday I'm in LOVE." This, Cluley cracked, was likely to cause severe embarrassment for most people, but could potentially lead to some office romance.

As the conference broke for coffee, attendees' mobile phones and pagers began going off wildly. Several guests approached Cluley, asking if the virus he'd described was spread via email. He assured them it wasn't and, anyway, it was limited to a niche system that most people didn't use.

"They said, 'Well, that's weird because we're suddenly getting loads of emails with the subject line "I love you,"'" Cluley said in an interview from his home in the United Kingdom.

When Cluley turned on his own phone, he was bombarded with notifications of missed calls, voice mails and text messages. Back home, Cluley's employer, the anti-virus firm Sophos, had been getting "absolutely hammered" with phone calls from clients begging for help and journalists trying to understand what the hell was going on.

Cluley raced to the airport to catch a flight to London, and even traded phone batteries with a generous taxi driver as the constant stream of messages drained his Nokia cellphone of power. When he landed in the United Kingdom, a car was waiting to whisk him to a TV studio to discuss what had by now become one of the biggest tech stories in the world.

Unlike today, when many email services are run via centralized servers (think Outlook.com or Gmail), companies in 2000 were running email off the same servers on which they hosted their websites. This could be janky, slow and startlingly insecure.

Back then, Cluley said, "many companies didn't have filters in place at their email gateways to try and stop spam, let alone viruses."

From there, almost every major military base in the country, barring a handful that didn't use Outlook, watched as their email services were crippled and forced offline for hours as the problem was fixed.

Across the Potomac River, at the FBI's Washington, DC, headquarters, Michael Vatis was scrambling to get a handle on the crisis.

As anti-virus companies slowly began rolling out patches, stemming the damage and enabling companies to come back online, attention within the FBI turned to tracking down those responsible. The investigation was led by the New York field office, which soon found evidence pointing back east, beyond Hong Kong, to the Philippines.

"In a very short period of time, we ended up identifying individuals in the Philippines and seeking the assistance of Philippine law enforcement," said Vatis, now a partner at the New York law firm Steptoe. "And a very short time after that, the Philippine authorities ultimately made an arrest."

Both the technical fix and first break in the case came so fast because, for all its rapid dissemination around the world, the ILOVEYOU virus was clumsily coded and startlingly unsophisticated. It mashed together several existing pieces of malware and did little to hide its workings.

"Every single victim of the love bug got a copy of the love bug's code, the actual source code," said Cluley, the Sophos analyst. "So it was simple to write an antidote. It was no more complex than any of the other thousands and thousands of viruses we'd seen that day. But of course, this one was particularly successful at spreading itself."

As well as containing the blueprint for defeating it, the code also included some lines pointing to the identity of its author. It contained two email addresses, spyder@super.net.ph and mailme@super.net.ph, both of which were based in the Philippines. There was also a reference to GRAMMERSoft Group, which it said was based in the country's capital.

Without the servers to send information to (and it appears the virus's author was never able to access what was sent to the server, or at least act upon it), ILOVEYOU became purely an engine of chaos and destruction. It churned through email inboxes around the world and deleted files, while not actually serving the apparent original purpose of scraping passwords.

Ramones, a curly-haired 27-year-old who worked at a local bank, seemed like an unlikely computer hacker, and investigators wondered if they had arrested the wrong guy. Attention turned to the apartment's two other residents: Ramones' girlfriend, Irene de Guzman, and her brother, Onel.

Onel de Guzman, who was not in the apartment when it was raided and could not be found, was a student at AMA Computer College. The college was home to a self-described hacking group, the now-defunct GRAMMERSoft, which specialized in helping other students cheat on their homework. While police could not initially prove that de Guzman was a member, officials at the school shared with them a rejected final thesis he had written, which contained the code for a program bearing a startling resemblance to ILOVEYOU.

In the draft thesis, de Guzman wrote that the goal of his proposed program was to "get Windows passwords" and "steal and retrieve internet accounts [from] the victim's computer." At the time, dial-up internet access in the Philippines was paid for by the minute, in contrast to the blanket-use fees in much of Europe and the United States. De Guzman's idea was that users in the developing world could piggyback on the connections of those in richer countries and "spend more time on [the] internet without paying."

Reading his proposal, de Guzman's teacher was outraged, and wrote "we don't produce burglars" and "this is illegal" in the margins. But while the thesis would cost de Guzman his degree, his teacher's argument about illegality would be proven incorrect.

After several days out of the public eye, de Guzman appeared at the press conference in Quezon, flanked by his lawyer and sister. Asked whether he might have been responsible for the virus, he responded through his lawyer: "It is possible."

"He did not even know that the actions on his part would really come to the results which have been reported," his lawyer said. To a ripple of laughter from reporters, the lawyer added, after a mumbled consultation with de Guzman: "The internet is supposed to be educational so it should be free."

Asked what he felt about the damage caused by the virus, de Guzman said "nothing, nothing."

While Philippines lawmakers did rush through a law criminalizing computer hacking soon after the ILOVEYOU incident, it could not be applied retroactively.

Two decades on, this reaction still annoys Cluley, the Sophos investigator. "It's the kind of thing that has you thumping your head against a wall in frustration," he said. "This was when malware was just beginning to get a little nastier and a little more malicious and more financially motivated."

"This wasn't the message we wanted to give young people, that this was all right."

"It had an enormous effect," said Vatis, the former NIPC director. "It was really worldwide front page news for at least several days in a way that computer attacks had not been in the past."

While previous attacks had caused more direct damage, and those in the future would be more sophisticated and far more effective in their goal, they were also much more limited in scope. Other viruses have targeted specific locations, businesses or governments. ILOVEYOU could affect just about anyone running Windows Outlook.

"It hit home in a way that other previous attacks did not," Vatis said. "It made people aware that this is not just something that happens to defense agencies or owners of websites, this is something that can happen to any Joe or Jane sitting at home on the computer or in the office, and it can shut you down and really disrupt your ability to operate."

And while email clients have gotten better at filtering out malicious-seeming messages, the main weakness that ILOVEYOU exploited remains impossible to fix.

"You can update your operating systems or you can have the best email filters in the world, but you can't patch the human brain," said Cluley.

"Humans are always the weak link," Vatis said. "It's almost always easier to exploit a human through some social engineering gambit than it is to crack, you know, some technological defensive measure."

One thing that has changed somewhat since ILOVEYOU is how prepared most companies are for such an incident. Most at least have some kind of anti-virus protection, and back up their data. But all the experts who tackled ILOVEYOU two decades ago agreed that there remains a startling degree of complacency over potentially devastating cyber attacks.

"What's frightening is that 20 years after, there are still plenty of organizations who don't take this seriously until they are hit," said Gazeley, the Hong Kong cybersecurity expert. "So many people still don't plan ahead."

What largely prevents such an attack is that most companies and individuals outsource running email servers to those who know how to do it best, primarily Microsoft and Google, and rely on them to filter incoming messages, cut out spam and warn of potential attacks.

Were a worm like ILOVEYOU to find a way past those filters, and spread fast enough to prevent the companies rolling out a patch, the possibility of it doing major damage remains. There is no reason to expect that the average user has grown any less complacent today. With email providers doing most of the work in spotting dodgy messages, they may actually be more so.

Vatis said that the potential effect on online communications of such a worm could be "devastating," as could the knock-on effect on the global economy as companies go offline or lose business all at once. He compared the situation to people who avoid getting vaccinated for the flu every year.

"That's not a problem for society as a whole until the vaccination rate drops below a certain percentage," he said. "And then you have a lot of people getting really sick."

Link:

'I love you': How a badly-coded computer virus caused billions in damage and exposed vulnerabilities which remain 20 years on - CNN

Pete Lau on the wonders of 5G – Fortune India

Where would the smartphone be 10 years down the line?

The smartphone of the future, whatever the device, will be one touch-point of many. And each person will have an account, which is unique. People ask: will the phone be the centre of the future, or the smart display? And my answer is it's the account, which lives in the cloud and functions on the cloud, and is, therefore, your connection for interaction with everything. So, your personal supercomputer and your super assistant are that account functionality in combination with the cloud.

What about privacy?

Everything will be cloud-based. Security will also be cloud-based, or cloud-centric. It [security] would be a challenge, but privacy would be the biggest challenge, as you have so many touch points and incredible computing power. It is not just a challenge for the smartphone industry but for the entire technology or mobile Internet industry of connectivity. If we look at, for example, China, transactions for a store or a hotel can be done through a smartphone by using facial recognition. That is an example of something that is not even device-based, but actually cloud-database-powered face recognition. And the future will only consist of more such things. So, privacy gets the highest priority.

There are no set standards for 5G technology among companies and countries. Is that a challenge?

I believe working out a common standard would be inevitable, even though I am not an expert in network technology.

How will you develop the 5G ecosystem? Whats your plan for India in light of the current situation in the telecom sector?

To answer the second question first, what we see in India is that the carriers, the government, and the relationship between them, and what is evolving, is behind some markets. But globally, the 5G transition is moving forward; for example, at the flagship level there aren't any chipset offerings that are not 5G. So this transition is going to be brought forward, and I believe India will be able to follow quickly. We saw this in 2014, when we were starting in the Indian market with 4G, with the quick transition into 4G in India. Regarding the 5G ecosystem, OnePlus is not just looking to build an ecosystem, but is focussing on the foundation of a seamlessly connected user experience; it is really about getting the platform right for whatever we do and whatever we create.

How will the consumption of data change because of 5G? How will data change the behaviour of users?

I see data consumption in the next five years going up significantly. With the realisation of seamless connectivity, people's behaviour will change, and their consciousness of data consumption will decline, perhaps quickly reaching the point of something that people don't even care about. Seamless connectivity will also enable data consumption to be perhaps 10-100 times what it is now in the next five years. There's not so much a specific consciousness of the fact that you are consuming so much data. It's just the reality of the way things will be.

So consuming data would become part of the system?

Indeed. Because people won't have that focus or care or concern for data; perhaps children born in 2020 would never know what data or the concept of data consumption actually is.

Why should a customer move to 5G?

I see it as the human pursuit of speed, something that's unending. Many think 4G LTE speed is good enough. But there will be a transition when the current speeds won't be enough. For instance, we could compare 4G LTE to what was available 10 years ago: services available now [say, video streaming on mobile phones] versus the technology available then. As a joke, you can say it will make people lazy. But the thing is, it [5G] will allow people to focus on what matters most to them. A whole lot of things can be handled by the super assistant we talked about. For example, a business trip: the whole process of planning, tickets, check-ins, locations, and time, all of that can be taken care of by it. Life for humanity will be more convenient.

What are the industries 5G would disrupt? Or will it disrupt the way a human functions?

From my perspective, it's definitely impacting everything. We can already see that with the transition to 4G, the number of industries impacted was significant. Some of the reports looking at the impact of 5G across industries also showcase what the technology will become in 5-10 years.

You recently completed six years in India. How has the countrys smartphone market changed, and how have you helped change it?

Six years ago, when we came to India, there were many brands in the market, both local and international. But if we look at the number of brands now, it is fewer. In 2014, the average price of a device sold was under $100, from what I remember. And 4G was not very well dispersed. For most new players, the takeaway was to launch a product that hit that price point of $100 or less, maybe around ₹5,000. But we were launching a product that was already over ₹20,000. People would ask: who are you selling that to? The results show the demand, from the users' perspective, for that type of product was there, and will continue to be there. Six years later we've seen tremendous change. Brands are much more focussed and the average quality of the offering is much higher. If you look at our contribution to the industry, it would be around quality standards. By consistently following this higher standard of quality, we created confidence in the industry that with a quality product, users will know, and understand, and rally behind that product.

(This interview was originally published in the March 2020 issue of the magazine.)

See the original post:

Pete Lau on the wonders of 5G - Fortune India