Lost Eight-Billion Light Years of the Universe's Evolution – The Daily Galaxy – Great Discoveries Channel

Posted on Jun 18, 2020 in Astronomy, Black Holes, Physics

"It's likely there are another two million gravitational wave events from merging black holes (a pair of merging black holes every 200 seconds and a pair of merging neutron stars every 15 seconds) that scientists are not picking up," says Rory Smith at OzGrav (ARC Centre of Excellence for Gravitational Wave Discovery), about a new method of detection being tested that means we may be able to look more than 8 billion light years further than we are currently observing. "This will give us a snapshot of what the early universe looked like while providing insights into the evolution of the universe," adds Smith.
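
As a rough sanity check, the quoted cadences can be converted into yearly counts. The short script below is an illustrative back-of-the-envelope calculation (not taken from the paper) showing that one binary black hole merger every 200 seconds plus one neutron star merger every 15 seconds adds up to roughly two million events per year.

    # Back-of-the-envelope check of the quoted merger cadences.
    SECONDS_PER_YEAR = 365.25 * 24 * 3600

    bbh_interval_s = 200   # one binary black hole merger every 200 seconds (quoted)
    bns_interval_s = 15    # one binary neutron star merger every 15 seconds (quoted)

    bbh_per_year = SECONDS_PER_YEAR / bbh_interval_s
    bns_per_year = SECONDS_PER_YEAR / bns_interval_s

    print(f"Black hole mergers per year:   {bbh_per_year:,.0f}")   # ~158,000
    print(f"Neutron star mergers per year: {bns_per_year:,.0f}")   # ~2,100,000
    print(f"Total:                         {bbh_per_year + bns_per_year:,.0f}")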

Smith and his colleagues at Australia's Monash University have developed a method to detect the presence of these weak or "background" events that to date have gone unnoticed, without having to detect each one individually. The method is currently being test-driven by the Advanced LIGO-Virgo gravitational-wave detector network that recorded data from 35 merging black holes and neutron stars in 2019.

The paper, recently published in the Royal Astronomical Society journal, details how researchers will measure the properties of a background of gravitational waves from the millions of unresolved black hole mergers.

Evolutionary Rosetta Stones: Binary Black Hole Mergers

Binary black hole mergers release huge amounts of energy in the form of gravitational waves and are now routinely being detected by the Advanced LIGO-Virgo detector network. According to co-author Eric Thrane from OzGrav-Monash, these gravitational waves generated by individual binary mergers carry information about spacetime and nuclear matter in the most extreme environments in the Universe. "Individual observations of gravitational waves trace the evolution of stars, star clusters, and galaxies," he said.


"By piecing together information from many merger events, we can begin to understand the environments in which stars live and evolve, and what causes their eventual fate as black holes," says Thrane. "The further away we see the gravitational waves from these mergers, the younger the Universe was when they formed. We can trace the evolution of stars and galaxies throughout cosmic time, back to when the Universe was a fraction of its current age."

Vast Amounts of Missed Data

The researchers measure population properties of binary black hole mergers, such as the distribution of black hole masses. The vast majority of compact binary mergers produce gravitational waves that are too weak to yield unambiguous detections, so vast amounts of information are currently missed by our observatories.


Moreover, inferences made about the black hole population may be susceptible to a selection bias due to the fact that we only see a handful of the loudest, most nearby systems. "Selection bias means we might only be getting a snapshot of black holes, rather than the full picture," Smith warned.
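
To see why a detection threshold skews the observed sample toward loud, nearby systems, consider the toy Monte Carlo below. It only illustrates the selection effect described above; the distance scaling, threshold and event count are invented for the example and have nothing to do with the analysis in the paper.

    import random

    # Toy model: signal loudness (SNR) falls off as 1/distance, and only
    # events above a detection threshold are "observed".
    random.seed(1)

    N_EVENTS = 100_000
    SNR_THRESHOLD = 8.0      # illustrative detection cut
    REFERENCE_SNR = 40.0     # SNR of a fiducial source at distance = 1 (made up)

    detected_distances, all_distances = [], []
    for _ in range(N_EVENTS):
        # Uniform-in-volume distances out to 10 (arbitrary units): p(d) ~ d^2
        distance = 10.0 * random.random() ** (1.0 / 3.0)
        snr = REFERENCE_SNR / max(distance, 1e-9)
        all_distances.append(distance)
        if snr > SNR_THRESHOLD:
            detected_distances.append(distance)

    mean_all = sum(all_distances) / len(all_distances)
    mean_det = sum(detected_distances) / len(detected_distances)
    print(f"Detected {len(detected_distances)} of {N_EVENTS} simulated mergers")
    print(f"Mean distance of all events:      {mean_all:.2f}")
    print(f"Mean distance of detected events: {mean_det:.2f}  (much closer)")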

The analysis developed by Smith and Thrane is being tested using real-world observations from the LIGO-Virgo detectors, with the program expected to be fully operational within a few years, according to Smith.

The Daily Galaxy, Sam Cabot, via Phys.org and Australian Research Council Centre of Excellence for Gravitational Wave Discovery

Image credit: The OzStar Super Computer

See more here:

Lost Eight-Billion Light Years of the Universe's Evolution - The Daily Galaxy - Great Discoveries Channel

Supercomputer predicts final Premier League table – and it is good news for Watford and West Ham – Hertfordshire Mercury

Premier League football returns this week for Watford and their top flight rivals.

When the action finally resumes on Wednesday, it will have been three months since the campaign was suspended due to the coronavirus crisis and there is a very short timeframe for the season to be concluded, with 92 matches still to be played, all of which will be televised live and take place behind closed doors.

But there are still some big issues to be determined in the remaining games, with several clubs battling to avoid relegation, including Watford, West Ham and Brighton, who currently sit just outside the bottom three, with Bournemouth, Aston Villa and Norwich City currently in the relegation zone.

Renowned data analysts FiveThirtyEight have assessed the remaining fixtures and crunched the data to predict how the final rankings will look - and it's great news for Watford and West Ham.
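
FiveThirtyEight's own club predictions are built on its Soccer Power Index ratings, but the general idea of simulating the remaining fixtures can be sketched with a toy Poisson-goals Monte Carlo like the one below. The team strengths and fixtures are placeholders and the points totals are only approximate figures at the time of the suspension; none of this is FiveThirtyEight's actual data or model.

    import math
    import random
    from collections import defaultdict

    random.seed(42)

    # Placeholder attack strengths (expected goals per match) and approximate
    # points at the time of the suspension -- illustrative inputs only.
    strength = {"Watford": 1.1, "West Ham": 1.2, "Brighton": 1.0,
                "Bournemouth": 1.0, "Aston Villa": 1.0, "Norwich": 0.8}
    points_now = {"Watford": 27, "West Ham": 27, "Brighton": 29,
                  "Bournemouth": 27, "Aston Villa": 25, "Norwich": 21}
    # A few hypothetical remaining fixtures between these sides (home, away).
    fixtures = [("Watford", "Norwich"), ("West Ham", "Aston Villa"),
                ("Brighton", "Bournemouth")]

    def poisson(lam):
        """Draw a Poisson-distributed goal count (Knuth's algorithm)."""
        threshold, k, p = math.exp(-lam), 0, 1.0
        while True:
            p *= random.random()
            if p <= threshold:
                return k
            k += 1

    N_SIMS = 10_000
    expected = defaultdict(float)
    for _ in range(N_SIMS):
        pts = dict(points_now)
        for home, away in fixtures:
            home_goals = poisson(strength[home] * 1.1)   # small home advantage
            away_goals = poisson(strength[away])
            if home_goals > away_goals:
                pts[home] += 3
            elif away_goals > home_goals:
                pts[away] += 3
            else:
                pts[home] += 1
                pts[away] += 1
        for team, value in pts.items():
            expected[team] += value / N_SIMS

    for team, value in sorted(expected.items(), key=lambda item: -item[1]):
        print(f"{team:12s} expected points: {value:.1f}")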


According to their calculations, Bournemouth, Aston Villa and Norwich are predicted to suffer relegation to the Championship, with West Ham, Watford and Brighton narrowly avoiding the drop.

Meanwhile, it's great news for Leicester, Chelsea and potentially Manchester United when it comes to the top four battle.

Man City's appeal against a two-year ban from UEFA competitions was heard last week, with a decision expected in July. If the ban is upheld, fifth place could secure Champions League football - which would benefit United, according to these predictions.

You can see the full predictions below.

2019/20 predicted Premier League table

1. Liverpool - 101 points

2. Man City - 81 points

3. Leicester - 67 points

4. Chelsea - 64 points

5. Manchester United - 62 points

6. Wolves - 57 points

7. Sheffield United - 55 points

8. Tottenham - 55 points

9. Arsenal - 53 points

10. Everton - 51 points

11. Burnley - 49 points

12. Crystal Palace - 48 points

13. Southampton - 44 points

14. Newcastle - 44 points

15. Brighton - 39 points

16. Watford - 39 points

17. West Ham - 38 points

18. Bournemouth - 36 points

19. Aston Villa - 34 points

20. Norwich - 30 points

Excerpt from:

Supercomputer predicts final Premier League table - and it is good news for Watford and West Ham - Hertfordshire Mercury

Using nature to beat COVID-19 – whnt.com

Posted: Jun 18, 2020 / 10:38 PM CDT / Updated: Jun 18, 2020 / 10:38 PM CDT

HUNTSVILLE, Ala. – Whittling 50,000 down to 125 sounds pretty daunting, especially when you're talking about analyzing chemical compounds. Researchers are taking that first number of naturally occurring compounds and finding which of them have an effect on the COVID-19 virus.

In this case, the original big number became the promising 125. "We have been looking at what chemicals out there in nature (in plants, for instance) could be used to help us kill the virus or cure the virus," said Dr. Jerome Baudry, the leader of the effort at the Baudry Lab at UAH.

Ultimately, the Baudry Lab, in partnership with researchers at the University of Mississippi, will be testing some 400,000 naturally occurring compounds. The first tests were done in just days, and there is a good reason for that.

"Ten years ago we probably would have been able to do it, but it would have taken months, and we were able to do it in a few days thanks to this supercomputer that we've had at our disposal," said Dr. Baudry.

The Hewlett Packard Cray Sentinel supercomputer is located in Texas. It made it possible to test the 50,000 compounds in multiple ways against proteins in the virus. The goal was to find which compounds inhibited the virus from doing what it wants to do. "We're finding a way to use today's technology, and do something special."

"Link it to this very old and ancestral knowledge of humanity about what plants exist, and what they do. Will they kill you? Will they save you?" said Dr. Baudry.

It's the second question that is most on the mind of Dr. Baudry and all the people helping with this project. As the compounds are tested, the ones that appear to have an effect on the virus will be sent to a lab and tested on live virus. Make no mistake, this is a project that isn't looking for help years from now.

"If we are really efficient, I would say within a year. Probably a year is an optimistic, but safe estimate. It may take more than that. Normally it takes more than ten years to find a drug. We have to cut that down by an order of magnitude, and we think we can," said Dr. Jerome Baudry.

So, the first results are in, and everyone is excited, but the work is not nearly over. No one is sure which plants, fungi, or even bacteria will provide the magic compound. Dr. Baudry said it may very well be a combination, something like the combination of drugs that fight HIV and AIDS.
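
The articles do not spell out the lab's exact pipeline, but the general shape of this kind of virtual screen, as described above (score each compound against one or more viral protein targets, then keep only the best-scoring candidates for lab testing), can be sketched as follows. The scoring function, cutoff, compound names and target names here are all stand-ins, not the Baudry Lab's actual tools.

    # Minimal sketch of a virtual-screening filter: score every compound
    # against each viral protein target and keep the most promising ones.
    # The scoring function below is a placeholder for a real docking engine.
    from typing import Callable, List, Tuple

    def screen_compounds(
        compounds: List[str],
        targets: List[str],
        score: Callable[[str, str], float],  # lower score = stronger predicted binding
        cutoff: float,
    ) -> List[Tuple[str, float]]:
        """Return compounds whose best score against any target beats the cutoff."""
        hits = []
        for compound in compounds:
            best = min(score(compound, target) for target in targets)
            if best <= cutoff:
                hits.append((compound, best))
        return sorted(hits, key=lambda item: item[1])

    # Toy usage: made-up compounds, made-up protein targets, a fake scorer.
    if __name__ == "__main__":
        import random
        random.seed(0)
        fake_score = lambda c, t: random.uniform(-12.0, -4.0)  # kcal/mol-like numbers
        compounds = [f"compound_{i}" for i in range(5)]
        targets = ["spike_RBD", "main_protease"]
        for name, s in screen_compounds(compounds, targets, fake_score, cutoff=-10.0):
            print(f"{name}: best docking-style score {s:.1f}")

Each compound-target scoring job is independent of the others, which is why this kind of screen parallelizes so well across a large supercomputer.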

View post:

Using nature to beat COVID-19 - whnt.com

Have Some Dad Fun: Tell Him That Self-Driving Cars On Earth Might Discover Intelligent Life On Other Planets And See What He Says – Forbes

Talking with your dad and having some fun.

Dads know everything.

Well, almost.

Ask him if there is intelligent life beyond our planet.

Nobody yet knows, though there are plenty of efforts to find out.

The Search for Extraterrestrial Intelligence (SETI) has been going on for many years, seeking to answer that very question.

The most common means for conducting this vexing search consists of examining electromagnetic pulses coming from outer space. By intercepting such pulses as they are radiating across space, we hope to spot anything that might be a telltale clue of intelligent life that is beaming out those rays.

It could be that some intelligent creatures are purposely trying to send us a message, doing so from far away, and they are hoping that we are astute enough to detect the message. In that sense, the communique could be a purposeful one.

Or, it might be that there are intelligent creatures inadvertently broadcasting electromagnetic exhaust or spillover from the machines they've made and by how they live and travel on their own planet.

In that case, we might get lucky and detect the leakage, remarkably discovering the intelligent life and yet perhaps it has not yet discovered us.

What are the odds of making such an incredible discovery?

You might have heard of the famous Drake equation, a formula that was devised in the early 1960s by scientist Frank Drake to help estimate the odds of there being intelligent life in our galaxy. His equation is relatively simple and yet powerful enough to have been long-lasting. Many have expanded upon his equation.
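
For reference, Drake's formula just multiplies a handful of factors together, N = R* x fp x ne x fl x fi x fc x L, and the snippet below evaluates it with one set of illustrative inputs. The parameter values are placeholders, since most of them are unknown and hotly debated.

    def drake_equation(r_star, f_p, n_e, f_l, f_i, f_c, lifetime):
        """N = R* * fp * ne * fl * fi * fc * L (detectable civilizations in the galaxy)."""
        return r_star * f_p * n_e * f_l * f_i * f_c * lifetime

    # Illustrative inputs only -- every one of these numbers is debated.
    n = drake_equation(
        r_star=1.5,     # star formation rate (stars per year)
        f_p=1.0,        # fraction of stars with planets
        n_e=0.2,        # habitable planets per planetary system
        f_l=0.5,        # fraction of those where life arises
        f_i=0.1,        # fraction where intelligence evolves
        f_c=0.1,        # fraction that release detectable signals
        lifetime=1000,  # years a civilization keeps signaling
    )
    print(f"Estimated detectable civilizations: {n:.1f}")   # 1.5 with these inputs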

In any case, he had tried to estimate the odds that there are detectable civilizations in the Milky Way galaxy.

By monitoring narrow-bandwidth radio signals and doing copious analysis of the signals, maybe we can ferret out that intelligent life is close to home (somewhere in our own galaxy). Various scientists have played with Drake's equation, and some say that the probability of there being other intelligent life in our galaxy and that we are able to detect them is near to zero (so close to zero that we should assume it is zero), while others claim that it is definitely a non-zero chance and we have a reasonable basis to keep looking.

Recently, the Astrobiological Copernican Limit theory has been proposed, which postulates that there are perhaps 36 such civilizations in our galaxy, but there is controversy associated with this latest approach and some still argue that the actual number remains at or much closer to zero.

We can continue to look even if the odds are slim.

People are undertaking the off-world search for various reasons.

One reason is out of pure curiosity.

Another is that if there is intelligent life, maybe we can learn something from them that will help us.

Yet another reason is the sci-fi portrayal that maybe an intelligent life will ultimately come to take over our planet, and thus we ought to find them before they start their invasion.

As part of the search, computers can be used to examine the radio signals coming from outer space that scientists are in the process of collecting. It is a tedious effort by computers and involves mathematically looking for patterns within the radio signals.

By and large, the radio waves are just noise, random bits of this or that, and the assumption is that if there is a distinct pattern within the signals, it could mean that those are emanating as a purposeful signal.

Supercomputers of massive computational capability have been and are continuing to be used to examine the voluminous radio signal data.

It is a never-ending task.

Years ago, some enterprising searchers realized that it might be possible to harness everyday desktop computers and laptops to also aid in the electronic hunt.

A screen saver program was developed that could be easily loaded onto a PC and be used as an active participant in the search. Essentially, via the Internet, segments of radio signal data could be downloaded and the computer would crunch away, ultimately reporting its analysis back to the master cloud-based search system.

If you could get lots and lots of home computers doing this, and if you carefully coordinated the data being parceled out, you could do as much or even more than a supercomputer might be able to do.
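
The description above (download a segment of data, crunch it locally, report the result back to a central server) is the classic volunteer-computing pattern behind SETI@home and BOINC. A very stripped-down, toy sketch of that client loop might look like this; the data source, the analysis and the reporting are simulated stand-ins, and a real client also handles scheduling, checkpointing and result validation.

    import random
    import time

    # Toy volunteer-computing client loop in the spirit of SETI@home/BOINC.

    def fetch_work_unit(n_samples: int = 4096) -> list:
        """Simulate downloading a chunk of noisy radio-signal samples."""
        return [random.gauss(0.0, 1.0) for _ in range(n_samples)]

    def analyze(signal: list) -> dict:
        """Crude 'pattern' check: flag work units with an unusually strong sample."""
        peak = max(abs(x) for x in signal)
        return {"peak": peak, "interesting": peak > 4.5}

    def report_result(result: dict) -> None:
        """Stand-in for uploading the result to the master server."""
        print(f"reporting result: {result}")

    def run_client(max_units: int = 3, host_is_idle=lambda: True) -> None:
        """Process a few work units whenever the host machine is idle."""
        done = 0
        while done < max_units:
            if not host_is_idle():       # only donate cycles when the PC is idle
                time.sleep(1.0)
                continue
            report_result(analyze(fetch_work_unit()))
            done += 1

    if __name__ == "__main__":
        run_client()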

Some liken this to the democratization of the search for intelligent life, while others say it is merely a practical way to leverage the millions upon millions of everyday desktop and laptop computers that now exist on our planet.

Those that download and employ the software are willingly allowing their computers to be used in the search effort. Much of the time your desktop computer is likely idle and has nothing especially important to do.

Why not let it participate in a larger than life kind of effort, quietly aiming to discover intelligent life elsewhere?

You might say that you don't want to know whether there is other intelligent life, and therefore decide to not be part of the search. Sure, that's fine.

Or, you might not want your computer to be used for anything other than for your own purposes. That's fine too.

Those that relish conspiracy theories are apt to even believe that if their computer happens to be the one that detects intelligent life, those intelligent beings might decide that the owner of that particular computer is the first to go.

Ironically, you could have done yourself in by simply participating in the search process.

For those of you that have already participated in the search, you likely made use of SETI@home, which has been provided and maintained by the University of California Berkeley (here's the link).

Earlier this year, they announced that the SETI@home software was going into hibernation and that they would no longer be distributing new tasks. Meanwhile, the SETI@home message boards are continuing to operate, and they are working fervently on the back-end data analysis.

As they say, maybe they will find ET.

Here's an intriguing question: Could the advent of true self-driving cars potentially help us in the search for discovering intelligent life on other planets?

Let's unpack the matter and see.

The Levels Of Self-Driving Cars

It is important to clarify what I mean when referring to true self-driving cars.

True self-driving cars are ones where the AI drives the car entirely on its own and there isn't any human assistance during the driving task.

These driverless cars are considered Level 4 and Level 5, while a car that requires a human driver to co-share the driving effort is usually considered at Level 2 or Level 3. The cars that co-share the driving task are described as being semi-autonomous, and typically contain a variety of automated add-ons that are referred to as ADAS (Advanced Driver-Assistance Systems).

There is not yet a true self-driving car at Level 5; we don't yet know whether this will be possible to achieve, nor how long it will take to get there.

Meanwhile, the Level 4 efforts are gradually trying to get some traction by undergoing very narrow and selective public roadway trials, though there is controversy over whether this testing should be allowed per se (we are all life-or-death guinea pigs in an experiment taking place on our highways and byways, some point out).

Since semi-autonomous cars require a human driver, computer processing capabilities are typically less powerful than the computers used on truly autonomous cars. As will be explained shortly, the powerful computers employed in true self-driving cars will be the key to the suggestion that driverless cars can help find intelligent life outside of our planet.

For semi-autonomous cars, it is important that I mention a disconcerting aspect: despite those human drivers who keep posting videos of themselves falling asleep at the wheel of a Level 2 or Level 3 car, do not be misled into believing that you can take your attention away from the driving task while driving a semi-autonomous car.

You are the responsible party for the driving actions of the car, regardless of how much automation might be tossed into a Level 2 or Level 3.

Self-Driving Cars And The Search For Intelligent Life

For Level 4 and Level 5 true self-driving cars, since they are going to be equipped with quite powerful computers, we can consider how those self-driving cars can be an aid in the search for intelligent life.

The AI software will be running on the on-board in-car computers and has the revered life-or-death role of driving the car.

There isn't a human driving the car.

Occupants inside a self-driving car are passengers.

While a self-driving car is in motion, the AI is dutifully churning away and examining the sensory data to figure out what the driving scene consists of. The AI must interpret the data and make decisions about what the self-driving car should do next. This is a computationally intensive task and requires some rather impressive computing capabilities to be included in the self-driving car.

To get software updates for the AI system, there is an on-board electronic communication capability known as OTA (Over-The-Air). The OTA is also used to take the collected data from the on-board sensors and push it up into the cloud, allowing cloud-based servers to use the data to uncover additional Machine Learning (ML) and Deep Learning (DL) improvements about the driving task.

At some point, if the driverless car is an EV (Electric Vehicle), it likely needs to be parked and plugged into a charger to get the electrical power pumped back up. While the self-driving car is sitting there, presumably the AI has nothing much to do. The computers on-board the driverless car are relatively idle at that time.

This brings us to a crucial point to be considered.

You could potentially use those idle computing cycles to search for intelligent life.

One means of leveraging the topnotch processors of a self-driving car would be to engage them in the same kind of radio signal processing that your desktop computer can do. A downloaded and bona fide variant of a SETI program could be residing in the computer memory of the self-driving car and be activated when the car is parked and doing nothing else of merit.

Via OTA, radio signal data would be downloaded into the on-board computer memory, and once the analysis is done, the results could be pushed back into the cloud.

Might as well use the self-driving car on-board computers for something that can possibly help mankind.

Now, it might seem puzzling to think that a solitary self-driving car is going to somehow demonstrably help in this matter. Keep in mind that there are about 250 million conventional cars in the United States today. Eventually, inexorably, it is assumed that those conventional cars will be retired and gradually be replaced by true self-driving cars.

Some argue that we might not need the same number of driverless cars, meaning that we might end up with some lesser number of driverless cars to provide the equivalent transport volume as today's 250 million conventional cars.

Meanwhile, an equally compelling argument is that we might end up with more driverless cars than the number of today's conventional cars, doing so because of the principle of induced demand. Induced demand is the concept that once you start something new, it can bring forth added demand that was previously being suppressed.

If people that today are mobility disadvantaged opt to use driverless cars, and if we all become expectant of near-instantaneous mobility-on-demand, the number of driverless cars needed to fulfill societal needs could well exceed the number of today's conventional cars.

Anyway, putting aside this unresolved debate about the count, perhaps we can all agree that there is likely to be at least some hundreds of millions of driverless cars in our future.

If all those millions upon millions of self-driving cars were using their top-end computers to analyze the radio signals, during idle moments, it would be a huge boost in the extraterrestrial search effort.

It could be a resounding game-changer in the search for intelligent life.

Fleet owners of driverless cars could build a SETI search capability into their fleets.

As a passenger, you might be utterly unaware that the fleet is supporting the intelligent life search effort. Or, the fleet owner might intentionally want you to know about the search activities, using their largess as a kind of marketing ploy to lure you to using their set of self-driving cars.

I've so far emphasized that the on-board computers would only be leveraged when the self-driving car is parked and has no other task at hand, but this is not the only circumstance that might allow for doing the radio signal analyses.

As a human driver, you know that there are times while driving a car that involve sitting still and idling, such as when you are waiting at a red light, or waiting for a pedestrian crossing in the crosswalk.

During those idle moments, while the vehicle is still in traffic, the on-board self-driving computer could spare a few cycles and digest further the proffered radio signal data.

We can up the ante.

Your driverless car is on the freeway and zipping along. Assume that there is no other significant traffic nearby. The driving scene is barren of anything other than simply driving straight ahead. In theory, the on-board self-driving computers could potentially do some alien outer space life searching during those moments too.
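
Pulling those scenarios together, a minimal sketch of the gating logic implied above, where the search workload only gets whatever compute the driving task is not using, might look like the following. The vehicle states and budget fractions are hypothetical illustrations, not anything from an actual automaker's software stack.

    from enum import Enum, auto

    class VehicleState(Enum):
        DRIVING_COMPLEX_SCENE = auto()   # normal driving: search gets nothing
        CRUISING_EMPTY_FREEWAY = auto()  # straight-ahead driving, no traffic nearby
        WAITING_IN_TRAFFIC = auto()      # stopped at a red light or crosswalk
        PARKED_CHARGING = auto()         # plugged in, AI otherwise idle

    # Hypothetical fraction of on-board compute the background SETI-style job may use.
    BACKGROUND_BUDGET = {
        VehicleState.DRIVING_COMPLEX_SCENE: 0.0,
        VehicleState.CRUISING_EMPTY_FREEWAY: 0.05,
        VehicleState.WAITING_IN_TRAFFIC: 0.1,
        VehicleState.PARKED_CHARGING: 0.9,
    }

    def background_compute_budget(state: VehicleState, driving_stack_healthy: bool) -> float:
        """Return the share of on-board compute the search workload may use right now."""
        if not driving_stack_healthy:
            return 0.0   # any doubt about the driving software: everything goes to driving
        return BACKGROUND_BUDGET[state]

    if __name__ == "__main__":
        for state in VehicleState:
            print(f"{state.name:24s} -> {background_compute_budget(state, True):.2f}")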

Conclusion

Two birds with one stone.

You can have self-driving cars and meanwhile also be undertaking monumental search efforts to discover intelligent life on other planets.

It seems like a great combo deal.

There are though some potential drawbacks.

First, some might argue that any spare moments of the on-board self-driving computers ought to go toward the number one priority of driving the car.

Even though a car is perhaps sitting at a red light, there is still the opportunity to be continually examining and re-examining the driving scene. The argument can be made that the on-board computers in a self-driving car should be exclusively used toward the driving task, at all times, including even when the self-driving car is parked (reviewing the totality of the driving efforts of the day, finding improvements in how to do a better job at driving in the future).

Another concern is that the SETI-like program used to search for intelligent life might somehow go awry. Suppose the specialized search software causes the on-board computers to get into a locked-up loop and those self-driving computers are unable to be switched over into the driving mode.

Not a good outcome.

Worse too, suppose someone attaches a computer virus to the extraterrestrial search program. A fleet owner that has downloaded the search software is providing a goldmine form of access to the nefarious computer virus maker. In a Trojan horse manner, the evildoer virus could be easily pushed out to millions of self-driving cars, doing so under the guise of trying to help mankind.

You can see why there are some that eschew the idea of using self-driving cars to aid in the intelligent life search.

Should though all that state-of-the-art computing power inside the self-driving car be doing nothing of consequence when there is otherwise idle time?

Some ask whether we can just make sure to put in place enough safeguards to ensure that the search for intelligent life by self-driving cars is intelligently and safely devised.

Right now, the automakers and tech firms are struggling with simply trying to get self-driving cars to drive properly, let alone be worried about the search for intelligent life. You likely won't see anyone directly considering this topic for years to come, only once the advent of true self-driving cars seems more assured.

One final thought.

Suppose that the sooner we could find intelligent life, the sooner we might learn of tech advances that we haven't yet conceived of. Perhaps any delay in using self-driving cars for finding intelligent life might postpone our discovering that we can beam humans, just like in Star Trek, being able to do away with automobiles of any kind.

Well, all in all, consider asking your dad about the matter and see what he says.

I'm sure he'll know best.

Read this article:

Have Some Dad Fun: Tell Him That Self-Driving Cars On Earth Might Discover Intelligent Life On Other Planets And See What He Says - Forbes

Sugar-Coating Disguise Allows for Coronavirus Infection – UC San Diego Health

According to Mary Poppins, a spoonful of sugar helps the medicine go down. In the case of coronavirus, a cloak of sugar helps the virus infect. This sugary-coating disguise, made of molecules called glycans, tricks the human immune system into identifying the microbe as harmless. The resulting recognition failure keeps the body from generating the defensive antibodies needed to destroy the invading coronavirus.

Rommie Amaro, professor of chemistry and biochemistry, UC San Diego

Using the National Science Foundation-funded Frontera supercomputer at the Texas Advanced Computing Center (TACC), Professor of Chemistry and Biochemistry Rommie Amaro, along with her UC San Diego colleagues and researchers from Maynooth University in Dublin, Ireland, led by Elisa Fadda, has uncovered the atomic makeup of the coronavirus's sugary cloak. The simulation and modeling reveal that glycans also prime the coronavirus for infection by changing the shape of its spike protein. Scientists hope this basic research will add to the arsenal of knowledge needed to defeat the COVID-19 virus.

"The more we know about it, the more of its abilities that we're going to be able to go after and potentially take out," Amaro said. "It is of such great importance that we learn as much as we can about the virus. And then hopefully we can translate those understandings into things that will be useful either in the clinic or the streets; for example, if we're trying to reduce transmission for what we know now about aerosols and wearing masks. All these things will be part of it. Basic research has a huge role to play in the war against COVID-19. And I'm happy to be a part of it. It's a strength that we have Frontera and TACC in our arsenal."

Glycans coat each of the 65-odd spike proteins that adorn the coronavirus. The sugar-like molecules account for about 40 percent of the spike protein by weight. The spike proteins are critical to cell infection because they lock onto the cell surface, giving the virus entry into the cell.

Amaro, along with her UC San Diego colleagues Lorenzo Casalino, Zied Gaieb, Abigail Dommer, Emilia Barros and Bryn Taylor, explained that even to make an initial connection, one of the pieces of the spike protein in its receptor binding domain has to lift up. It is one of the things Frontera (part of the COVID-19 HPC Consortium along with the San Diego Supercomputer Center at UC San Diego) helped reveal: that in the open conformation, there are two glycans that basically prop up the spike protein.

"That was really surprising to see. It's one of the major results of our study. It suggests that the role of glycans in this case is going beyond shielding to potentially having these chemical groups actually being involved in the dynamics of the spike protein," said Amaro, a corresponding author of the study published online June 12, 2020, by bioRxiv.org, a preprint repository.

Glycan shield in the SARS-CoV-2 spike. (A) Molecular representation of the Open system. Glycans at several frames (every 20 ns) are represented with blue lines, and the receptor binding domain within chain A is highlighted with a cyan transparent surface. (B-C) Plot of the surface area covered by glycan shielding at multiple probe radii from 1.4 Å (water molecule) to 15 Å for the head (B) and stalk (C). The area of the protein covered by the glycans is depicted in blue, while the grey line is the accessible area of the protein without glycans. Highlighted in green is the area that remains accessible in the presence of glycans, which is also graphically depicted on the structure in the panels located above the plots. Credit: Lorenzo Casalino (UC San Diego), et al.

"When that receptor binding domain lifts up into the open conformation, it actually lifts the important bits of the protein up over the glycan shield," Amaro said, adding that this contrasts with the closed conformation, where the shield covers the spike protein. "Our analysis gives a potential reason why it does have to undergo these conformational changes, because if it just stays in the down position those glycans are basically going to block the binding from actually happening," she said, adding that the shifts in the conformations of the glycans triggered changes in the spike protein structure.
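
The shielding analysis shown in the figure above (comparing the protein's accessible surface area with and without the glycans at a given probe radius) reduces to a simple ratio once the per-frame areas have been computed with a standard SASA tool. The small helper below assumes those area arrays already exist and is only an illustrative sketch, not the authors' analysis code.

    def glycan_shielding_fraction(area_without_glycans, area_with_glycans):
        """
        Fraction of the bare protein surface that the glycans cover,
        averaged over simulation frames.

        area_without_glycans : per-frame accessible surface area of the protein alone
        area_with_glycans    : per-frame protein area still accessible with glycans present
        (both in the same units, computed beforehand with a SASA tool)
        """
        shielded = [
            (bare - covered) / bare
            for bare, covered in zip(area_without_glycans, area_with_glycans)
            if bare > 0.0
        ]
        return sum(shielded) / len(shielded)

    # Toy numbers only -- real values would come from the trajectory analysis.
    bare = [920.0, 905.0, 915.0]
    with_glycans = [610.0, 600.0, 640.0]
    print(f"Average shielded fraction: {glycan_shielding_fraction(bare, with_glycans):.2f}")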

Amaro compared the action of the glycan to pulling the trigger of a gun. "When that bit of the spike goes up, the finger is on the trigger of the infection machinery. That's when it's in its most dangerous mode: it is locked and loaded," Amaro said. "When it gets like that, all it has to do is come up against an ACE2 receptor in the human cell, and then it's going to bind super tightly and the cell is basically infected."

The research team used computational methods to build data-centric models of the SARS-CoV-2 virus, and then used computer simulations to explore different scientific questions about the virus. They started with various experimental datasets that revealed the structure of the virus. This included cryo-EM structures from the Jason McLellan Lab of The University of Texas at Austin; and from the lab of David Veesler at the University of Washington.

"Their structures are really amazing because they give researchers a picture of what these important molecular machines actually look like," Amaro said.

SARS-CoV-2 virus spike protein system overview. (A) Sequence of the full-length spike protein contains the N-terminal domain (NTD), the receptor binding domain (RBD), the furin cleavage site, the fusion peptide (FP), the central helix (CH), the connecting domain (CD), the heptad repeat 2 (HR2) domain, the transmembrane domain (TD) and the cytoplasmic tail (CT). (B) Assembly of head, stalk, and cytoplasmic tail (CT) sections into a full-length model of the spike protein. (C) Equilibrated, fully glycosylated and palmitoylated model of the Open system. (C-E) Magnified view of the N-/O-glycans (C, D) and S-palmitoylation of the cytoplasmic tail (E). Image by Lorenzo Casalino, et al.

Unfortunately, even the most powerful microscopes on Earth still can't resolve movement of the protein at the atomic scale.

"What we do with computers is that we take the beautiful and wonderful and important data that they give us, but then we use methods to build in missing bits of information," Amaro said. "What people really want to know (vaccine and drug developers, for example) are the vulnerabilities that are present in this shield."

The computer simulations allowed Amaro and colleagues to create a cohesive picture of the spike protein that includes the glycans.

"The reason why the computer resources at TACC are so important is that we can't understand what these glycans look like if we don't use simulation," Amaro said.

In order to animate the dynamics of the 1.7 million atom system under study, a lot of computing power was needed, said Amaro.

That's really where Frontera has been fantastic, because we need to sample relatively long dynamics, microsecond to millisecond timescales, to understand how this protein is actually working. We've been able to do that with Frontera and the COVID-19 HPC Consortium, Amaro said. Now we're trying to share our data with as many people as we can, because people want a dynamical understanding of what's happeningnot only with other academic groups, but also with different pharmaceutical and biotech companies that are conducting neutralizing antibody development, she said, adding that basic research is making a difference in winning the war against the SARS-Co-V-2 virus.

This research was supported by NIH (GM132826), NSF (RAPID MCB-2032054), an award from the RCSA Research Corp., a UC San Diego Moores Cancer Center 2020 SARS-CoV-2 seed grant, the Visible Molecular Cell Consortium and the Irish Research Council.

Original post:

Sugar-Coating Disguise Allows for Coronavirus Infection - UC San Diego Health

Neocortex Will Be First-of-Its-Kind 800,000-Core AI Supercomputer – HPCwire

Pittsburgh Supercomputing Center (PSC, a joint research organization of Carnegie Mellon University and the University of Pittsburgh) has won a $5 million award from the National Science Foundation to build an AI supercomputer designed to accelerate AI research in pursuit of science, discovery and societal good. The new machine, called Neocortex, couples two Cerebras CS-1 AI servers with a shared-memory HPE Superdome Flex server. PSC will make Neocortex available to researchers across the Extreme Science and Engineering Discovery Environment (XSEDE) later this year.

Each Cerebras CS-1 is powered by one Cerebras Wafer Scale Engine (WSE) processor, which contains 400,000 AI-optimized cores implemented on a 46,225 square millimeter wafer with 1.2 trillion transistors. A front-end HPE Superdome Flex server will handle pre- and post-processing of data flowing in and out of the WSE processors. The HPE Superdome Flex is provisioned with 32 Intel Xeon CPUs, 24 terabytes of memory, 205 terabytes of flash storage, and 24 network interface cards.

The Superdome Flex connects to each CS-1 server via 12 x 100 gigabit Ethernet links, providing 1.2 terabits per second of bandwidth between the machines. "That's enough bandwidth to transfer 37 HD movies every second," said Nick Nystrom, chief scientist, Pittsburgh Supercomputing Center. The Neocortex team is considering implementing the network on a single switch to explore allowing the two CS-1s to interface directly at 1.2 terabits per second.

The WSE processor inside the CS-1 provides 9 petabytes per second of on-die memory bandwidth, equivalent to about a million HD movies per second, by Nystrom's math.
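
Both movie comparisons are plain unit conversions. The quick check below assumes roughly 4 GB per HD movie (an assumption, not a figure from PSC), which reproduces the 37-movies-per-second number for the link; the "about a million movies per second" on-die figure implies a somewhat larger assumed movie size.

    # Sanity-check the "HD movies per second" comparisons with unit conversion.
    link_bandwidth_bits = 1.2e12     # 12 x 100 GbE = 1.2 terabits per second
    on_die_bandwidth_bytes = 9e15    # 9 petabytes per second on the WSE

    hd_movie_bytes = 4e9             # assume ~4 GB per HD movie (illustrative)

    movies_over_link = (link_bandwidth_bits / 8) / hd_movie_bytes
    movies_on_die = on_die_bandwidth_bytes / hd_movie_bytes

    print(f"Movies/s over the 1.2 Tb/s link: {movies_over_link:.1f}")   # ~37.5
    print(f"Movies/s of on-die bandwidth:    {movies_on_die:,.0f}")     # ~2.25 million
    # The quoted "about a million movies per second" corresponds to assuming
    # a larger file per movie (roughly 9 GB each).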

Neocortex (named after the region of the brain responsible for higher-order brain functions, including language processing) is the first CS-1 installation funded by the NSF and the first publicly announced CS-1 cluster. Cerebras debuted its Wafer Scale Engine last August at Hot Chips and the CS-1 system unveiling followed at SC19 in November. The Department of Energy was the flagship customer; single-node CS-1 systems are deployed at Argonne National Lab and Lawrence Livermore National Lab.

Describing the impetus for the technology partnering, Nystrom said that PSC saw the opportunity to bring together the best of two worlds: the extreme deep learning capability of the CS-1 server, and the extreme shared memory of the Superdome Flex from HPE.

"With shared memory, you don't have to break your problem across many nodes. You don't have to write MPI, and you don't have to distribute your data structures. It's just all there at high speed," he added.

Both Cerebras and PSC expressed their expectation that the system will be able to take on a new class of problems, beyond what is available with traditional GPUs.

"We're just scratching the surface of sort of a new class of AI models; we know of additional models that have been difficult to get running on graphics processing units and we are extremely eager to be partnering with pioneering researchers to show the world what these models might be able to do," said Andrew Feldman, Cerebras cofounder and CEO. His list of target examples includes models with separable convolutions or models with native and induced sparsity, both coarse and fine grained, graph neural networks with irregular sparse connections, complex sequential models, and very large models where parallelism is desirable.

Even with current best-in-class PSC machines, like the GPU-based Bridges and Bridges-AI, research is constrained, said Paola Buitrago, principal investigator and PSC director of artificial intelligence and big data, noting there is clearly a need for more compute, and fast interconnect and storage.

"Artificial intelligence in 2012 started this kind of renaissance, thanks to neural networks being implemented on GPUs," Buitrago shared in an interview with HPCwire. "GPUs absolutely do well with matrix operations, which is one of the main operations in our neural networks, but they weren't designed for AI. Now with the Cerebras technology, we see a machine that is specifically designed for AI and for the potential optimizations in deep learning. We are excited to explore how it can speed up and transform what is currently happening in deep learning, allowing us to explore more and more ambitious science and reducing the time to curiosity."

Buitrago expects Neocortex to be more powerful than the PSC Bridges-AI system by a few orders of magnitude. Providing further characterization of the system's potential, Cerebras' Feldman said the tuned system cluster with Cerebras wafer-scale cores and the pre-processing machine from HPE will have the power of 800-1,500 traditional GPUs, or about 20 racks' worth of graphics processing with a single rack of Cerebras.

Naturally, PSC will be putting Neocortex through its paces to see if this claim bears out. "The Neocortex group at PSC has identified a number of benchmarks as being important to the community. These were selected to demonstrate the capability of the system when it hits the ground, and the system will, of course, continue to mature over time," said Nystrom, adding they will be evaluating the system with all the big complex networks that are very challenging right now, including LSTM.

"In addition to LSTM, we expect Neocortex will be very good at graph convolutional networks, important in all kinds of science," said Nystrom. "And then over time across CNNs. So we'll be using those initially, and we'll be engaging early users to demonstrate scientific impact. That's very important to the National Science Foundation."

Buitrago said that their users who are bound by current hardware are in large part working on natural language processing and working with transformer-type networks, including BERT and Megatron-LM, where the models are quite big with hundreds of millions and billions of parameters, adding, "that's a specific use case that we will be enabling with the Neocortex system."

The number of applications that need AI is growing, encompassing virtually all fields of science, many drawing on computer vision, text processing, and natural language processing. "We want to explore use cases that come specifically from science streaming needs," said Buitrago. "So we are working with cosmology researchers, people doing image analysis for healthcare where they need to [handle] the high resolution images and also images in more than two dimensions, and seeing how to address what are the best solutions for those specific use cases."

The project partners are particularly enthused about harnessing AI for social good. Drug discovery, more accurate weather prediction, improved materials for increased solar energy generation and understanding large plant genomes to boost crop yields are just a few of the areas PSC expects will benefit from Neocortex as well as the upcoming Bridges-2 system.

Both Neocortex and Bridges-2 (also built with HPE) will be deployed in the fall. "We're launching two supercomputers in the same season," Nystrom declared. "PSC has never done that before."

As with Bridges-2, 90 percent of time on Neocortex will be allocated through XSEDE. "We'll have a long early user period, but there's also discretionary capacity for industry to work with us too, to use the world's most advanced AI capability to develop their capacity for industrial competitiveness and for translational research," said Nystrom.

There's also a concerted focus, via the NSF-funded OpenCompass program, to collect and document best practices for running artificial intelligence at scale and communicate those to the open science community. This dovetails with a mission of PSC to support non-traditional users (from history, philosophy, etc.) and users who are just getting started with AI.

Neocortex will support the most popular deep learning frameworks and will be federated with PSC's new Bridges-2 supercomputer, creating a singularly powerful and flexible ecosystem for high performance AI, data analytics, modeling and simulation.

Both Neocortex and Bridges-2 will be available at no cost for research and education, and at cost-recovery rates for industry users.

PSC will present a tutorial on AI hardware at PEARC (July 26-30) and will be talking more about the Neocortex system and what to expect. More details will be forthcoming at https://pearc.acm.org/pearc20/.

Go here to read the rest:

Neocortex Will Be First-of-Its-Kind 800,000-Core AI Supercomputer - HPCwire

Take a Virtual Tour of Hawk, the New HLRS Supercomputer – HPCwire

Hawk, the latest and greatest supercomputer at the High-Performance Computing Center of the University of Stuttgart (HLRS), was inaugurated just a few months ago. Now, HLRS is offering an inside look at the supercomputer through an immersive virtual tour that allows anyone to walk around the large installation.

The walkthrough (Hawk-through?) is available here. Users can click or use their arrow keys to move themselves around the virtual space in their browser (or even on a virtual reality headset). The space is peppered with digitally added information points, and hovering over any of those will offer information about various aspects of Hawk's hardware, facilities and operations. The tool also includes a measurement mode, a top-down floor plan view and a dollhouse view that allows for a holistic view of the entire 3D-mapped structure and its 44 cabinets.

"We provide our computational resources to scientists all over Europe, Germany, and other industry users from certain domains like automotive and aeronautics," said Bastian Koller, managing director of HLRS, in one of the informational videos provided by the walkthrough. "In AMD we found a good partner [...], which goes beyond just buying simple hardware bits from them, but [also] being able to address the challenges and the problems of our customers in the best possible way by providing us [with the] best technology at this point in time." Koller expects that the collaboration will continue well into the future, helping HLRS to address issues like energy efficiency in supercomputing.

Hawk is an HPE Apollo 9000 system with 5,632 nodes spread across its 44 cabinets, each node carrying dual AMD Epyc Rome 7742 CPUs. The system also boasts 1.44 total petabytes of memory, a Mellanox InfiniBand HDR200 interconnect and 25 petabytes of disk storage. Overall, Hawk delivers around 26 peak petaflops, enough to make it about 3.5 times more powerful than its predecessor at HLRS (Hazel Hen) and place it 35th on the most recent Top500 list of the world's most powerful publicly ranked supercomputers.
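
The quoted peak of roughly 26 petaflops is consistent with the node count and CPU choice. The sketch below multiplies it out, assuming 64 cores per Epyc 7742, a 2.25 GHz base clock, and 16 double-precision FLOPs per core per cycle (two 256-bit FMA pipes); these are the standard published figures for that part rather than numbers taken from HLRS.

    # Rough peak-FLOPS estimate for Hawk from published node/CPU counts.
    nodes = 5632
    sockets_per_node = 2                 # dual AMD Epyc Rome 7742
    cores_per_socket = 64
    flops_per_core_per_cycle = 16        # two 256-bit FMA units, double precision
    clock_hz = 2.25e9                    # 7742 base clock

    cores = nodes * sockets_per_node * cores_per_socket
    peak_flops = cores * flops_per_core_per_cycle * clock_hz

    print(f"Total cores:      {cores:,}")                          # 720,896
    print(f"Peak performance: {peak_flops / 1e15:.1f} petaflops")  # ~26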

"Hawk expands the University of Stuttgart's already excellent research infrastructure with an additional flagship system," said Wolfram Ressel, rector of the University of Stuttgart, when Hawk was inaugurated in February. "It will enable cutting edge academic and industrial research in a wide range of contexts where simulation and big data play important roles. In this way the new high-performance computer also makes an important contribution to realizing the University of Stuttgart's vision, 'Intelligent systems for a sustainable society.'"

To visit the virtual tour of Hawk, click here.

Here is the original post:

Take a Virtual Tour of Hawk, the New HLRS Supercomputer - HPCwire

Premier League relegation battle: Who will survive? Aston Villa, Bournemouth, Brighton and West Ham among – talkSPORT.com

A festival of football beckons for fans when the Premier League season restarts in just a few days.

Players will no doubt be delighted to get back playing the game they love, but the pressure will be straight back on for them, especially for those fighting to keep their clubs in the division.

It's going to be a real fight for survival, with games coming thick and fast over the next six weeks.

Norwich, Aston Villa, Bournemouth, Watford, West Ham and Brighton are all in danger of relegation, with less than a quarter of the campaign left to go.

Southampton and Newcastle could also be dragged into the battle for survival but look to have just enough points on the board and probably need only one win or two to guarantee safety.

Here, talkSPORT.com takes an in-depth look at what each of the relegation-threatened sides must do to remain in the Premier League

What happened before football stopped

The Canaries have it all to do when the season restarts as they're currently six points off safety.

They've never really hit form and had only won one of their last six league matches before the season was halted.

Remaining games

Norwich's remaining PL fixtures 19/20

Southampton (H) June 19

Everton (H) June 24

Arsenal (A) July 1

Brighton (H) TBC

Watford (A) TBC

West Ham (H) TBC

Chelsea (A) TBC

Burnley (H) TBC

Manchester City (A) TBC

Key player: Teemu Pukki

If Norwich are to stay up, they'll need Pukki to rediscover the form he had at the start of the season, which saw him score six goals in his first five league matches.

At that point, Norwich were twelfth on six points having even claimed a win against Manchester City.


Mission

The players will fight until it's mathematically impossible, but as well as needing to be near-perfect in their final nine matches, Norwich will need to rely on other results to stay up.

That's why the club should perhaps start preparing for life in the Championship; a few good results after the restart could restore some morale and set them up well for next season.

What's being said

Not many are backing Norwich to stay up, while manager Daniel Farke has even admitted that the club will need a little miracle to survive.

talkSPORT Super Computer predicts

20th (Relegation)


What happened before football stopped

Defence has been Villa's main issue this season, conceding 56 goals in the league, which is more than any other club.

Dean Smith's side are in horrid form, losing their last four, including a dreadful 4-0 defeat at Leicester in March, which was the last Premier League match to have been played before the season stopped.

Remaining games

Aston Villa's remaining PL fixtures 19/20

Sheffield United (H) June 17

Chelsea (H) June 21

Newcastle (A) June 24

Wolves (H) June 27

Liverpool (A) TBC

Manchester United (H) TBC

Crystal Palace (H) TBC

Everton (A) TBC

Arsenal (H) TBC

West Ham (A) TBC

Key player: Jack Grealish

The captain has easily been Villa's best player this season and the fans will be looking to him to steer the side away from danger.

A mention should also go to midfielder John McGinn. It looked like his season was over when he sustained an ankle injury in December, but the stoppage has allowed him time to recover and he'll surely play a part in the remaining ten matches Villa have left.


Mission

Beat Sheffield United in their first match of the restart and Villa are out of the relegation zone; that's how close it is.

They have some tough games later on though, including matches against Chelsea, Liverpool and Man United. Their final match at West Ham could be crucial.

What's being said

Danny Murphy told talkSPORT.com: "Villa's a difficult one because they've got some talented players. The club's huge and has got a great fanbase. I don't know if they've got enough firepower though."

talkSPORT Super Computer predicts

18th (Relegation)


What happened before football stopped

They've never been ones to challenge for the European spots, but how much they've struggled this season will have surprised many.

Bournemouth are currently 18th and were in bad form before the season's stoppage as they were without a win in their last four, losing three of them.

Remaining games

Bournemouth's remaining PL fixtures 19/20

Crystal Palace (H) June 20

Wolves (A) June 24

Newcastle (H) July 1

Manchester United (A) TBC

Tottenham (H) TBC

Leicester (H) TBC

Manchester City (A) TBC

Southampton (H) TBC

Everton (A) TBC

Key player: Aaron Ramsdale

Goals have been hard to come by for the Cherries this term; they've scored 29 goals in as many games. And with Ryan Fraser set not to play beyond the terms of his contract, their options going forward will be looking quite bare.

This is why it may be wise for Eddie Howe to build a team that's tough to break down during the final few matches, and at the heart of this will be goalkeeper Ramsdale. He may only have four clean sheets this season but he's had a number of great games, which has led to calls for him to be picked for England.


Mission

What's being said

Murphy told talkSPORT.com: "Bournemouth have had a terrible season by their standards. I think they may stay in the relegation zone."

talkSPORT Super Computer predicts

17th


What happened before football stopped

Watford were languishing at the foot of the table with eight points from 15 matches when Nigel Pearson was put in charge.

The former Leicester boss has steadied the ship but the Hornets are by no means out of the woods yet and were in bad form before the season was halted, winning just one of their last seven league matches. Only goal difference is keeping them above the relegation zone.

Remaining games

Watford's remaining PL fixtures 19/20

Leicester (H) June 20

Burnley (A) June 25

Southampton (H) June 28

Chelsea (A) TBC

More here:

Premier League relegation battle: Who will survive? Aston Villa, Bournemouth, Brighton and West Ham among - talkSPORT.com

The World Health Organization’s truth-cleansing pandemic | TheHill – The Hill

You thought the World Health Organization's job was to direct and coordinate authority on global pandemics? Forget it. Last month, the WHO produced its "Manifesto for a healthy recovery from COVID-19." Far from addressing its own lamentable failure to halt the spread of the virus, the document is little more than a demand for a global Green New Deal dolled up in the garb of public health.

The pandemic, WHO's director-general, Dr. Tedros Ghebreyesus, tells us, is a reminder of the intimate and delicate relationship between people and planet. Efforts to make the world safer from another one are doomed unless they address "the critical interface between people and pathogens." Human pressure on the environment, the WHO claims, increases the risk of new infectious diseases. Recovery plans from the pandemic should therefore lessen our impact on the environment, so as to reduce the risk at source, as if new deadly viruses are randomly transmitted from wild animals to people wandering through forests, rather than in Chinese wet markets or, in some instances, even cultivated in research labs.

Arguing for a quick energy transition, the WHO says the costs of renewable energy are dropping. Exactly why, say, burning coal carries a higher risk of unleashing the next pandemic than cutting down forests (from whence the COVID-19 virus supposedly came) in order to make way for wind farms, the WHO doesn't say. As Michael Moore's movie "Planet of the Humans" vividly shows, wind and solar require enormous land-takes and have huge environmental impacts.

But the WHO's recovery manifesto isn't about science and rationality. It's the soul of Thomas Malthus entering public health. Restoring a pristine environment is the goal, humanity becomes the problem, and industrialization, harnessing nature for the purpose of human flourishing, is the original sin. The WHO's message that environmental degradation caused the pandemic is exactly what influential audiences in the West want to hear.

"Right now, our relationship with nature is broken," the World Economic Forum says. "The best way to avoid future pandemics?" it asks, then answers: "Protect the natural world." Is this science or superstition? People in 14th century Europe lived far closer to nature than us. They also had much shorter lives and experienced the world's worst-ever pandemic. "It traveled at terrifying speed, so fast it would strike a village or a town almost as soon as news arrived that the pestilence was near," Ben Gummer, a former British government minister and author of "The Scourging Angel: The Black Death in the British Isles," writes in a recent essay on why life after COVID-19 will be much the same as life before it.

Using a supercomputer to explain the spread of the disease (it covered 3,000 miles in 18 months), Gummer concluded that the Black Death was transmitted between people and not by rats, most likely by touch and breath. Also similar to COVID-19 are the narratives used to explain the pestilence: "a dangerous imbalance in nature, a corruption that reflected the sinfulness of men and women, something that could only be put right by God's divine justice," and the purgative means was this cleansing pandemic.

Sound familiar? Whether you are a member of the global metropolitan elite or a credulous boomer rube, Gummer writes, there is a meta-explanation to suit your taste. An age informed by science should be able to focus exclusively on scientific explanations. But science and reason are nothing compared to the enduring Malthusian substructure of sin, punishment and redemption that underpins the modern environmental movement's belief in pestilence and catastrophe as nature's just punishment for a sinful civilization.

Cui bono: who benefits? Dr. Tedros is circling the wagons and doing his alleged allies in Beijing a favor. By blaming the pandemic on humanity's (for that, read the West's) willful violation of nature, it lets Beijing off the hook for covering up the early spread of the virus and blanks out the much-debated possibility that the novel coronavirus had been cultured in a laboratory. Dr. Tedros knows what he's doing. The elites in the West are being played like a Stradivarius so the lessons from the pandemic go unexamined.

Either Dr. Tedros goes, or it's time to defund the WHO.

Rupert Darwall is a senior fellow at RealClearFoundation, a nonprofit organization in partnership with RealClear Media Group that reports and analyzes public policy and civic issues. He is the author of numerous books including "The Climate Noose" (2020) and "Green Tyranny: Exposing the Totalitarian Roots of the Climate Industrial Complex" (2017). A strategy consultant and policy analyst, he was a special adviser to the United Kingdom's chancellor of the exchequer under Prime Minister John Major.

Read more:

The World Health Organization's truth-cleansing pandemic | TheHill - The Hill

Supercomputer Research Redesigns Drugs Without the Side Effects – HPCwire

We've all heard the commercials: a drug promises amazing results for treating a disease, and then the remainder of the commercial is filled with a mind-numbingly long list of potential side effects. Side effects plague prescription drugs, sometimes prompting drug approval agencies to reject the drug or making patients wonder if the cure can be worse than the disease. Now, researchers from Stanford University have leveraged the Summit supercomputer at Oak Ridge National Laboratory (ORNL) to work on redesigning those drugs without the side effects.

"What if we could redesign drugs to keep their benefits while eliminating their unwanted side effects?" said Ron Dror, the associate professor of computer science at Stanford University whose lab is leading the research, in an interview with Stanford University's Tom Abate. Dror's lab targeted drugs that work with G protein-coupled receptors (GPCRs), proteins that are found in all human cells and which serve as the attachment points for a wide range of drugs, everything from psychedelics like LSD to blood pressure medications.

Drugs that attach to GPCRs cause multiple simultaneous reactions in the protein, which is responsible for many side effects. ORNL is no stranger to GPCRs: just a few months ago, it highlighted research exploring applications of machine learning in drug design for GPCRs. Dror's lab, on the other hand, set out to use supercomputing power to simulate a GPCR attached to a series of different molecules, aiming to pin down how each molecule changed the ordering of the GPCR's atoms.
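
The study itself relies on large-scale molecular dynamics, but the basic idea of quantifying how differently two drug molecules rearrange the receptor can be illustrated with a plain root-mean-square-deviation (RMSD) comparison between atomic coordinate sets. The coordinates below are invented, the helper skips the structural superposition a real analysis would do first, and none of this is the Dror lab's code.

    import math

    def rmsd(coords_a, coords_b):
        """RMSD between two equally sized lists of (x, y, z) atom positions."""
        assert len(coords_a) == len(coords_b)
        total = sum(
            (ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
            for (ax, ay, az), (bx, by, bz) in zip(coords_a, coords_b)
        )
        return math.sqrt(total / len(coords_a))

    # Invented coordinates for a handful of receptor atoms (angstrom-like units).
    receptor_unbound = [(0.0, 0.0, 0.0), (1.5, 0.0, 0.0), (3.0, 0.5, 0.0)]
    receptor_with_drug_a = [(0.1, 0.0, 0.0), (1.6, 0.1, 0.0), (3.1, 0.6, 0.1)]
    receptor_with_drug_b = [(0.4, 0.3, 0.2), (2.0, 0.5, 0.3), (3.6, 1.1, 0.5)]

    for name, coords in [("drug A", receptor_with_drug_a), ("drug B", receptor_with_drug_b)]:
        print(f"{name}: RMSD from unbound receptor = {rmsd(receptor_unbound, coords):.2f}")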

To run these detailed simulations, the researchers turned to ORNL's Summit supercomputer, currently the most powerful publicly ranked supercomputer in the world according to the most recent Top500 list. Summit's 4,608 nodes (each powered by two IBM Power9 CPUs and six Nvidia Volta GPUs) deliver 148 Linpack petaflops.

The results were promising, and based on what they found, the researchers designed a set of molecules that (in the simulations, at least) produced the desired atomic reordering without also inducing the atomic shifts that produce undesirable side effects. While there is a long road between this research and any drug that may be approved for human use, the results are a promising milestone on the path toward a new era of drug design.

"In addition to revealing how a drug molecule could cause a GPCR to trigger only beneficial effects, we've used these findings to design molecules with desired physiological properties, which is something that many labs have been trying to do for a long time," Dror said. "Armed with our results, researchers can begin to imagine new and better ways to design drugs that retain their effectiveness while posing fewer dangers."

The research discussed in this article was published as "Molecular Mechanism of Biased Signaling in a Prototypical G Protein-Coupled Receptor" in the February 2020 issue of Science. The article, which is accessible here, was written by Carl-Mikael Suomivuori, Naomi R. Latorraca, Laura M. Wingler, Stephan Eismann, Matthew C. King, Alissa L. W. Kleinhenz, Meredith A. Skiba, Dean P. Staus, Andrew C. Kruse, Robert J. Lefkowitz and Ron O. Dror.

Continue reading here:

Supercomputer Research Redesigns Drugs Without the Side Effects - HPCwire

‘Supercomputer’ predicts where Tottenham will finish in Premier League table – The Spurs Web

Tottenham Hotspur have been backed to finish in eighth place after Sportradar and The Sun teamed up to predict the remaining results of the Premier League season.

The sports data provider Sportradar has revealed its take on the remaining 92 games of the Premier League season to give its final standings in the top flight.

After running the remaining fixtures through an artificial intelligence system, it is claimed that Spurs will finish in the same position in which they currently find themselves.

However, the prediction sees North London rivals Arsenal finish below them in ninth on goal difference.

Third and fourth place remain with Leicester City and Chelsea, while Manchester United are tipped for fifth, which could earn them a Champions League spot if Manchester City's European ban stands.

Sheffield United and Wolves would both take Europa League places if City's ban does remain, with Spurs possibly earning European football depending on who wins the FA Cup later this season.

Spurs Web Opinion

It is very tight towards the middle of the top half of the table. One bad result could prove disastrous in our aim to climb into a Champions League spot, but one bad result for one of our rivals could prove huge. No computer can predict the future, unfortunately.

The rest is here:

'Supercomputer' predicts where Tottenham will finish in Premier League table - The Spurs Web

11 Raspberry Pi projects for everyone: From beginners to pros – Android Authority

The Raspberry Pi has grown from being a curiosity, maybe even a novel idea, to being a key item for anyone wanting to learn programming, electronics, robotics, IoT, and more. The little board is accessible both in terms of price and learning curve, and has a big enough fan base that there are thousands of ready-made projects just waiting for you to try.

We take a look at the best Raspberry Pi projects for beginners, advanced users, and even children, so you'll be sure to find a project that interests you!

The Raspberry Pi is a Single Board Computer (SBC) that allows makers, enthusiasts, and hobbyists to develop and tinker with software and hardware to create all kinds of projects, from simple electronic circuits (like a flashing LED) to full-scale robots with computer vision and machine learning! The original Raspberry Pi was released in 2012. Since then there have been several models and variations. Today the main choice is between the Raspberry Pi Zero W and the Raspberry Pi 4.

The former is an inexpensive board with a single-core 32-bit CPU, which costs just $10 and has 512MB of RAM. The latter is more performance-driven by comparison, and more expensive. It has a 64-bit quad-core CPU and comes with at least 2GB of RAM, with options available for 4GB and 8GB.

Both models support Wi-Fi, Bluetooth, USB 2.0, and HDMI. The Raspberry Pi 4 is able to drive two 4K monitors, offers Gigabit wired Ethernet, and includes two USB 3.0 ports. The key to the success of the Pi is not only the price and features but also its 40 General Purpose Input and Output (GPIO) pins. These pins allow programs running on the Pi to read or write digital signals, which means it can read data from sensors (like a temperature sensor) or control peripherals like an LCD display or a stepper motor. When this is coupled with the Pi's camera support, you have a board that can interact with its environment via sensors, displays, motors, cameras, and more.
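
To give a feel for what that GPIO programming looks like in practice, here is a minimal sketch using the gpiozero Python library that ships with Raspberry Pi OS; the pin numbers and wiring are arbitrary examples rather than anything prescribed above.

```python
# Minimal GPIO sketch (illustrative only): light an LED while a button is held.
# Assumes an LED (plus resistor) wired to BCM pin 17 and a push button on BCM pin 2.
from gpiozero import LED, Button
from signal import pause

led = LED(17)
button = Button(2)

# Writing a digital output in response to a digital input.
button.when_pressed = led.on
button.when_released = led.off

pause()  # keep the script alive so the callbacks can run
```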

The Raspberry Pi is a great way to start learning new software skills as well as hardware skills. For the beginner it is important to start doing both, not to neglect one or the other. So here are some Raspberry Pi projects that are ideal for beginners, combining software skills with hardware know-how.

1. About me: In this project, you will learn how to write a Python program to tell people about yourself. You will learn the very basics of Python, as well as create some ASCII art!

2. Introduction to Physical Computing: Learn how to use the GPIO pins on your Raspberry Pi to interface with electronic components, such as LEDs and switches. Learn how to wire a variety of electronic components to the Raspberry Pi, plus how to interact with them using Python. The project covers LEDs, Passive Infrared motion sensors, switches, buzzers, and more.

3. Time-lapse animations with a Raspberry Pi: Learn how to write a small script to capture multiple images with a Pi camera over a long period of time. You can then unlock the power of time-lapse photography by combining them into an animated GIF. Along the way, you will learn how to use the Pi camera, advance your Python skills, and learn how to use ImageMagick to create animated GIFs.

4. Raspberry Pi Supercomputer Cluster: Supercomputers are expensive, use lots of electricity and need heavy-duty cooling. However, using Raspberry Pi boards you can build a supercomputer cluster and program it just like the real deal, but without needing a direct connection to a power station! With this project, you will learn the fundamentals of distributed computing and gain an understanding of how supercomputers are built and programmed to solve some of the world's most complex problems (see the short MPI sketch just after this list).

5. Use any Raspberry Pi to build a NAS (a step-by-step guide): Just about any Single Board Computer (SBC) like a Raspberry Pi, Orange Pi, ODROID or NVIDIA Jetson can be used to create Network Attached Storage (NAS). Really the only prerequisites are that the board can run Linux, has a USB port, and has networking. After that, it just comes down to performance. This project will take you step by step through all the stages of using a Raspberry Pi to share its attached storage over your local area network. If you are interested in the various performance levels that can be achieved using RAID and the Raspberry Pi 4, then check out Build a Raspberry Pi NAS with 4 Hard Drives and RAID.

6. Raspberry Pi 4 as a Network Router: The Raspberry Pi 4 is very versatile. Among its many talents is the ability to forward network traffic from one network interface to another. In this video, I will show you how to create a router between two wired Ethernet networks and how to make a Wi-Fi router.

7. Flight Tracking Using a Raspberry Pi: Most commercial aircraft send out ADS-B messages with the plane's location, velocity, altitude, and call sign. Using a Raspberry Pi and a DVB-T USB dongle you can receive these messages and track flights in your area. You can also upload this data to services like Flightradar24, which helps make real-time flight data available to millions of aviation enthusiasts, and also gets you a free Flightradar24 Business Plan subscription (a $499.99/year value).

Write a C# app on the Raspberry Pi and run it on a Windows PC: .NET Core is a cross-platform version of .NET that is free and open source. It supports Windows on x86, x64 and ARM, as well as Linux on x64 and ARM. That means you can write and compile a C# program on the Raspberry Pi, copy it to a Windows PC, and it will run!

MQTT with a Raspberry Pi and an Arduino: MQTT allows data to be sent from IoT devices to smartphones or up to the cloud. MQTT (MQ Telemetry Transport) can be used on microcontrollers like the Arduino or on boards like the Raspberry Pi. Here is a full overview and demo using Android, Mosquitto on a Raspberry Pi, and an Arduino (a minimal Python publish/subscribe sketch follows this list).

8. Lost in Space: Scratch is a block-based visual programming language aimed mainly at children. It teaches the principles of programming using its block interface. In this project, you will learn how to program your own animation using loops.

9. Getting Started with Minecraft Pi: If you like Minecraft, then the good news is that there is a free version of this popular sandbox open-world building game for the Raspberry Pi. Plus, it comes with a programming interface! As a result, you can write commands and scripts in Python to build things in the game automatically. A great way to combine programming with gaming fun!

10. Minecraft Selfies: Learn how to use the Pi Camera to take selfies, and then, using Python, see how you can render the image using blocks in Minecraft. As part of the project, you will learn how to convert images to RGB values, and how to iterate over multiple lists and compare values.

11. GPIO soundboard: Build a button-controlled soundboard that plays different noises when the buttons are pressed. You will learn how to play sounds in Python, and learn how to use the Python GPIO library to detect the button presses.
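
For project 4, the kind of program you would run across a Raspberry Pi cluster is an MPI job. The sketch below is a hypothetical example using the mpi4py package (assuming MPI and mpi4py are installed on every node); it splits a simple estimate of pi across however many processes you launch.

```python
# pi_mpi.py -- run with something like: mpiexec -n 4 python3 pi_mpi.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()   # this process's ID
size = comm.Get_size()   # total number of processes in the cluster job

# Midpoint-rule integration of 4/(1+x^2) over [0,1], split between ranks.
N = 1_000_000
local_sum = 0.0
for i in range(rank, N, size):
    x = (i + 0.5) / N
    local_sum += 4.0 / (1.0 + x * x)

# Rank 0 collects the partial sums and prints the estimate.
pi_estimate = comm.reduce(local_sum / N, op=MPI.SUM, root=0)
if rank == 0:
    print(f"pi is approximately {pi_estimate:.6f}")
```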

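And for the MQTT project above, a minimal publish/subscribe sketch with the paho-mqtt Python package might look like the following; the broker hostname and topic are made-up examples, assuming a Mosquitto broker running on the Pi.

```python
# Minimal MQTT sketch (illustrative only): subscribe to a topic and publish one reading.
import paho.mqtt.client as mqtt

BROKER = "raspberrypi.local"        # hypothetical hostname of the Pi running Mosquitto
TOPIC = "home/sensors/temperature"  # hypothetical topic an Arduino might publish to

def on_message(client, userdata, message):
    print(f"{message.topic}: {message.payload.decode()}")

client = mqtt.Client()              # paho-mqtt 1.x style constructor
client.on_message = on_message
client.connect(BROKER, 1883, keepalive=60)
client.subscribe(TOPIC)
client.publish(TOPIC, "21.5")       # send one example reading
client.loop_forever()               # process network traffic and dispatch callbacks
```
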
If you need some more general background information on the Raspberry Pi then check out these tutorial videos:

How to Use the Raspberry Pi Imager (Including a Helpful Tip): Raspbian has been renamed Raspberry Pi OS, and there is a new way to make SD cards for the Pi, the Raspberry Pi Imager. Here is a quick how-to tutorial, including an important tip that saved me loads of time and stress.

Two Monitors on a Raspberry Pi 4 Demo and How To: One of the great features of the Raspberry Pi 4 is its support for two screens. When I did my Raspberry Pi 4 review, I didn't have much of an opportunity to show how the dual-display support works. This video fixes that!

Intro to Docker using a Raspberry Pi 4: The Raspberry Pi 4 is an Arm-based Single Board Computer that comes with up to 4GB of RAM. That makes it a great platform for Docker. Here is an introduction to Docker containers using the Raspberry Pi 4.

If you need help with vi or with the Linux command line, then you will also find these videos useful:

Understanding Vi and Vim (Vi IMproved) in 10 Minutes: If you ever need to edit a file from the Linux command line, then the chances are you might need vi or vim. It is a quirky text editor, but powerful once you get to understand its ways! Here is a 10-minute tutorial.

Linux Directories Explained, including /etc /home /var /proc /usr: If you are new to Linux, then the directory structure can be confusing, but within a few minutes you can understand the essentials.

10 Linux Terminal Commands for Beginners: The Linux command line can be quite daunting. What do commands like ls, cd, pwd and less mean? When you see that blinking cursor, what is the first thing you should type? Find out more in my Linux terminal command and utilities tutorial.

If you aren't sure which Raspberry Pi board you should buy, then we have reviewed many of the popular models, including the Raspberry Pi Zero, the Raspberry Pi 3, and the Raspberry Pi 4.

Raspberry Pi 4 Review (dual displays and up to 4GB of RAM): The Raspberry Pi 4 Model B is here, and it is a significant upgrade over the Raspberry Pi 3. This new board uses a quad-core Cortex-A72 based processor. It supports dual 4K monitors and has built-in USB 3.0, Gigabit Ethernet and Bluetooth 5.0. Here is my full review.

Raspberry Pi 3 Model A+ Review: The new Raspberry Pi 3 Model A+ has a 1.4GHz quad-core Cortex-A53 based CPU plus 5GHz wireless networking, improved thermals and a small form factor. All this for $25. Here is my full review.

Raspberry Pi 3 Model B+ review: The Raspberry Pi 3 Model B+ (also known as the Raspberry Pi 3+) costs the same as the previous model, but has a slightly faster CPU, dual-band 802.11ac wireless, Bluetooth 4.2 and faster Ethernet. Here is my full review and hands-on.

Raspberry Pi Zero W review: The Raspberry Pi Foundation recently launched the Raspberry Pi Zero W, a new variant that adds built-in Wi-Fi and Bluetooth. Here is my video review, and there is also a Raspberry Pi Zero W written review as well.

Raspberry Pi Zero review: The new Raspberry Pi Zero is small, elegant and cheap. At just $5 this board brings you desktop Linux with 512MB of RAM and a VideoCore IV GPU. Don't forget to check out the Raspberry Pi Zero written review.

Here is the original post:

11 Raspberry Pi projects for everyone: From beginners to pros - Android Authority

Where to Invest $5,000 Right Now – The Motley Fool

The stock market has been in manic-depressive mode as of late, surging early in June on the euphoria of the country's economic reopening, as well as a surprise jobs gain in the June 5 labor report. However, the market gave almost all of those gains back late last week, after some gloomy commentary from Federal Reserve officials, who now plan to keep interest rates at zero until at least 2022.

What does all of the volatility mean? That you can pick up shares of great companies to buy and hold for the long term. And while stocks relating to the reopening economy have shown a big rally lately, I think it may be time to refocus on stronger technology-related companies that help power the new, more digitized economy. And with lower interest rates here until at least 2022, these companies, which can grow even in a depressed economy, should fetch a premium down the road.

In that light, here are three rock-solid companies that play into these long-term trends. Got an extra $5,000? Then you should think about scooping up shares of these three top companies today.

Image source: Getty Images.

With businesses reeling from COVID-19 and many companies allowing work-from-home for the foreseeable future, securing enterprise communications among a distributed workforce is more important than ever. Thus, cybersecurity solutions are at a premium as never before.

Not only is the cybersecurity sector poised for long-term growth, but CrowdStrike (NASDAQ:CRWD) also appears to have a novel solution poised to take market share within the industry. CrowdStrike combines its software-based Falcon agents, which can be deployed to any "end point" in an enterprise's IT stack over the cloud, with a centralized artificial intelligence-based Threat Graph that uses all agent data to continuously improve algorithms for the entire customer base. Thus, the more customers CrowdStrike gets, the better its threat detection algorithms, which helps attract more customers, and on and on.

As proof of CrowdStrike's effectiveness, look no further than its blockbuster recent results, reported on June 2. Total revenue was up a whopping 85%, with core subscription revenue up 89%. Annual recurring revenue was up 88%, and the company's subscription customer count more than doubled, up 105%.

Also, unusually for a cloud-based software-as-a-service company, CrowdStrike is generating some serious cash flow, although GAAP net profits are still negative. Operating loss improved from $25.8 million in the year-ago quarter to $22.6 million in the first quarter, while operating cash flow surged to $98.6 million from just $1.6 million a year ago, and free cash flow increased to $87 million, up from a free cash flow loss of $16.1 million a year ago.

Even if COVID-19 cases surge in a second wave and the economy stagnates, enterprises are still going to need cutting-edge solutions to secure their infrastructure and avoid the costly breaches we've seen over the past few years. In addition, CrowdStrike's growth and margin expansion are some of the best you'll find in the entire market, making the stock a buy even after a strong recent run.

Another company poised to grow no matter what the economy is doing is European semiconductor equipment manufacturer ASML Holdings (NASDAQ:ASML). Unlike many other companies in the semiconductor and memory space, ASML has seen its stock rocket higher, to even exceed where it was to start the year.

That's because ASML has a differentiated offering, having cracked the code on Extreme Ultraviolet Lithography technology. EUV is a mission-critical technology needed to produce more advanced semiconductor chips and DRAM memory at scale over the next decade, and ASML has a monopoly on it.

While the chip sector, and therefore semicap equipment companies, have traditionally been cyclical, and thus wouldn't be a great place to invest in a recession, things may be different this time around. Leading-edge semiconductors are crucial to making the digital economy run, powering cloud computing, artificial intelligence, 5G communications, and the Internet of Things. While ASML's first quarter revenue was affected by COVID-19, that was entirely due to supply issues, not demand. Management noted on the earnings release, "The demand outlook is currently unchanged and we have not encountered any push-outs or cancellations this year."

Furthermore, leading-edge semiconductors are now seen as strategically important to companies and countries alike. In fact, the U.S. Congress just announced a bipartisan bill to subsidize the semiconductor industry to the tune of $22.8 billion, as it aims to build semiconductor manufacturing capacity within the United States.

The building of additional, and perhaps redundant, semiconductor manufacturing plants in the U.S. would only mean additional demand for companies like ASML, and maybe especially ASML, since EUV is so crucial to the production of leading-edge semiconductors. So despite its strong run, ASML still looks like a strong pick to play these future technologies today.

If we're all stuck at home, streaming shows on our phones, ordering items on e-commerce websites, and accessing our work on cloud data centers, what do all of those things need? Servers, and lots of them. Super Micro Computer (NASDAQ:SMCI) makes servers for enterprise, cloud, and consumer customers all across the world. In contrast to the more standardized server offerings from Dell Technologies (NYSE:DELL) or HP Enterprise (NYSE:HPE), Super Micro makes more customized server solutions for specific end-use cases.

As servers become more important for different types of workloads, from artificial intelligence applications and 5G base stations to on-premises and cloud data centers, Super Micro could benefit. Also important, Super Micro actually does a fair amount of manufacturing in the U.S., which is somewhat rare, along with significant operations in Taiwan. Finally, as ESG concerns take hold of the corporate world, Super Micro's emphasis on environmentally friendly "green" computing should also resonate with customers going forward.

Of these three stocks, Super Micro is the value stock of the bunch. Currently, it trades at only 16.5 times earnings, but even that overstates the company's multiple. Super Micro has $301 million in cash versus just $33 million in debt, yielding $268 million in net cash, or 18.4% of Super Micro's market cap. In addition, Super Micro is still incurring some extra costs related to remediating an accounting snafu from a few years ago, which has since been resolved. Should those costs fall off going forward and the company begin returning that excess cash to shareholders, Super Micro would actually be trading at something more like a low-teens multiple.
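
For what it's worth, the arithmetic behind that ex-cash multiple claim can be checked in a few lines; the cash, debt, P/E and 18.4% figures come from the article, while the implied market cap and earnings are derived, not quoted.

```python
# Back-of-the-envelope check of the "low-teens multiple" argument (illustrative only).
cash, debt = 301e6, 33e6
net_cash = cash - debt                 # $268M, as stated
market_cap = net_cash / 0.184          # ~$1.46B implied by "18.4% of market cap"
earnings = market_cap / 16.5           # implied by the 16.5x earnings multiple
ex_cash_multiple = (market_cap - net_cash) / earnings
print(round(ex_cash_multiple, 1))      # ~13.5x once net cash is backed out
```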

With COVID-19 still out there accelerating all of these digital trends, investors should look to buy stocks of companies that play to the digital future on these big market pullbacks. As such, CrowdStrike, ASML, and Super Micro Computer all look like solid additions to your portfolio today.

Read the original:

Where to Invest $5,000 Right Now - The Motley Fool

Man City 7-3 Arsenal, Liverpool collapse and more silliness – Football365.com

The sight of football at the end of this interminable tunnel means the triumphant return of the supercomputers. Poor Liverpool.

Sun down
There is something ironic about this opening paragraph appearing in The Sun:

Graham Potter has urged people to educate themselves more on the subject of racism.

And the nation's best-selling newspaper continues to display wilful ignorance about the role it has played in embedding a lack of cultural education and understanding deep into society.

And how about this for a start to the third paragraph?

Racism has come back under the spotlight

How good of you to make legitimate concerns over systemic and extensive discrimination sound like the latest fad that will soon go out of fashion.

Campbell soup
Mediawatch tends to wince a little when the Daily Mail afford Martin Samuel ("calling someone a f***ing black c*** is ultimately meaningless as the words may be offensive but they do not go anywhere") a platform to discuss race.

But here we are:

Clearly, when just six of 91 League managers are black, there are issues. Yet, individual cases have individual complexities. Raheem Sterling cited Sol Campbell and Ashley Cole this week, juxtaposing their stunted progress with that of Frank Lampard and Steven Gerrard.

Cole, however, made his last appearance for Derby County in the play-off final on May 27, 2019. Last October, he went to work as a coach with Chelsea's Under-15 academy.

Gerrard played his final game for Los Angeles Galaxy on November 6, 2016, and turned down a job at Milton Keynes Dons that month. He then began as a youth coach at Liverpool in February 2017 and was given the Under-18 team the following season. He didn't take the Rangers job until June 2018, meaning he was exactly where Cole is now at the same stage of his coaching career. There was no fast track, no golden ticket.

Aside from the job interview with a literal League One club before he had actually retired. That sounds an awful lot like a fast track and a golden ticket. Mediawatch must have missed the part where Ashley Cole turned Shrewsbury Town down while he was still an active player.

As for Campbell, he has battled to shrug off a reputation as a loner, difficult to know and a mystery to many of his teammates.

Are you genuinely suggesting someone with all the necessary coaching qualifications and a storied playing career was ignored by Football League clubs until seven years after his retirement because he is a loner? That is the only reason you can think of as to why he has managed a League Two and a League One club while Gerrard and Lampard strolled into much bigger jobs at a younger age?

But don't worry, Sol, because while you might not land Tottenham after Jose Mourinho, you should get a better opportunity soon.

And if you don't, it's because you're a loner who is difficult to know. Coming from Martin Samuel, that is very much a case of the pot calling the kettle, erm…

OK, computer
The sight of football at the end of this interminable tunnel means the actual return of the only thing anyone is really interested in: supercomputers.

Thursday brought a lazy effort from the Daily Star, featuring Sheffield United overtaking Manchester United and Norwich picking up 16 points from a possible 27.

The Sun do it properly. They have teamed with Sportradar and their Simulated Reality technology to predict absolutely everything.

How does this all work? Well, "Simulated Reality football matches reflect team form and normal match conditions", which is a neat trick when Premier League games behind closed doors after a solid three-month break are unprecedented.
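
For readers wondering what actually sits behind claims like that, season "simulations" of this kind are typically built on something like a Poisson goals model run thousands of times per fixture. The toy sketch below illustrates the general idea only; it is not Sportradar's actual method, and the team strengths are made-up numbers.

```python
# Toy season-simulation sketch (illustrative only, made-up team strengths).
import math
import random

attack = {"Man City": 2.4, "Arsenal": 1.5}    # hypothetical goals scored per game
defence = {"Man City": 0.9, "Arsenal": 1.6}   # hypothetical goals conceded per game

def poisson(mean):
    """Simple Poisson sampler (Knuth's method)."""
    limit, k, p = math.exp(-mean), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

def simulate_match(home, away):
    """Draw one scoreline from team 'form'."""
    home_goals = poisson(0.5 * (attack[home] + defence[away]))
    away_goals = poisson(0.5 * (attack[away] + defence[home]))
    return home_goals, away_goals

# Repeat the fixture many times and tally the outcomes.
results = [simulate_match("Man City", "Arsenal") for _ in range(10_000)]
city_wins = sum(h > a for h, a in results) / len(results)
print(f"Man City win probability: {city_wins:.0%}")
```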

But credit to them: they have listed every single predicted result. Like Tottenham drawing 3-3 with Everton, or Aston Villa hammering Wolves 3-0.

The supercomputer also comes up with such entirely believable conclusions as:

1) Villa randomly thrashing Wolves in between eight defeats and a draw with West Ham on the final day.

2) Liverpool earning 14 more points from their final nine games; Norwich get 12.

3) Liverpool losing as many games (three) in their final nine fixtures as they have in their previous 75. They both falter just a little and well and truly limp over the line.

4) It being a disaster for Arsenal that they finish level on points with Tottenham, for whom finishing eighth would presumably be a boost.

5) The suggestion that West Ham stay in good form by following up their win against Tottenham with a 1-0 defeat of Chelsea.

6) Manchester City beating Arsenal 7-3. Mind you

But which three teams were relegated?

The current bottom three, because supercomputers rarely change positions of actual importance.

Who got into next season's Champions League?

The only teams to change places in the top half are Sheffield United and Wolves, who just go from seventh to sixth and vice versa. So yeah.

Who ended up in the Europa League spots?

As above. It's almost as if supercomputers base their predictions on recent precedent, extrapolating previous results over the entire season, thus not actually changing anything of consequence.

Well, we have got all the answers and we are sure our results will be the biggest talking point for action-starved football supporters since the coronavirus lockdown began.

Finally, something to agree on. This is literally the biggest talking point for action-starved football supporters since March 11. Nothing has happened in the 93 days since; certainly nothing as important as a predicted Premier League table that alters the positions of four mid-table clubs and shuffles the current bottom three around while keeping them in the relegation places.

Simulated Reality can even tell you how many shots on and off target each team had in every game and how many corners.

It truly is crazy how far playing FIFA can get you. And how much sh*t can be hidden behind the guise of those meddling supercomputers.

Rash decision
Marcus Rashford: Manchester United striker helps £20m children's meal fund - Sky News.

'I'll keep fighting' - Marcus Rashford on meals campaign - BBC Sport.

Marcus Rashford confirms he will supply 3MILLION meals to vulnerable people as Man Utd star promises to keep fighting - The Sun.

Marcus Rashford helps raise £20m for kids' meal fund as Manchester United star knows the problems for real - talkSPORT.

And still nothing from Simon 'the players didn't want to help in the first place' Jordan. Weird.

No way, Jose

Football365's shithouse headline of the day
Man Utd can sign Sancho on two conditions - German football expert

Because only a German football expert could possibly know that this transfer will only happen this summer if Sancho absolutely wants to leave and brings a club who pays the transfer fee.

Recommended reading of the day
Sachin Nakrani chats with Gabriel Clarke.

Miguel Delaney talks to Dimitar Berbatov.

Originally posted here:

Man City 7-3 Arsenal, Liverpool collapse and more silliness - Football365.com

NASA Supercomputer Used to Fight COVID-19 – Voice of America

A consortium of U.S. government agencies and private industry is using the U.S. space agency NASA's supercomputer to help fight the COVID-19 pandemic, examining everything from how the virus interacts with cells in the human body, to genetic risk factors, to screening for potential therapeutic drugs.

The consortium was organized by the White House Office of Science and Technology Policy and includes industry partners IBM, Hewlett Packard Enterprise, Amazon, Microsoft and others, as well as the Department of Energy's National Laboratories, the National Science Foundation, and several universities.

The consortium pairs supercomputing resources with proposals for using high-end computing power for COVID-19 studies. The agency's supercomputer is housed at NASA's Ames Research Center in northern California, and, while it is usually used for Earth and space-related projects, it has time reserved for national priorities.

Supercomputers are suited for processing large amounts of data and are invaluable for NASA's usual projects, such as running simulations used to hunt for planets outside our solar system, study the behavior of black holes, or design aeronautic or aerospace vehicles.

Likewise, it is well suited for running simulations to help researchers understand COVID-19. The computer-run simulations help researchers understand how the coronavirus behaves at the cellular and molecular level.

The NASA computer so far is being used to study genetic risk factors that may lead to acute respiratory distress syndrome (ARDS); develop 3D molecular geometry to search for possible drug therapies against the virus; research the coronavirus protein shell and how it may be susceptible to drugs or vaccines; and identify COVID-19-related biomarkers and how they interact with the human body to cause reactions.

Follow this link:

NASA Supercomputer Used to Fight COVID-19 - Voice of America

Supercomputer Simulations Explain the Asteroid that Killed the Dinosaurs – HPCwire

The supercomputing community has cataclysms on the mind. Hot on the heels of supercomputer-powered research delving into the fate of the Neanderthals, a team of researchers used supercomputers at the DiRAC (Distributed Research using Advanced Computing) high-performance computing facility to simulate a different extinction event: the asteroid that killed the dinosaurs.

Over the last half-century, researchers have become increasingly confident that a massive asteroid impact killed off the vast majority of the dinosaurs, causing vast climatic changes that rendered much of life on Earth impossible. Scientists believe that the impact occurred around what is now the Yucatán Peninsula in Mexico, creating a 93-mile-wide, 12-mile-deep crater (the Chicxulub crater) that remains gouged into the continental crust to this day.

Rewinding 66 million years is no easy task. The researchers, hailing from Imperial College London, the University of Freiburg and the University of Texas at Austin, said that they used supercomputing resources at DiRAC to run the first ever 3D numerical simulations to reproduce the whole Chicxulub impact event, from the moment the asteroid struck the ground until the final crater was formed. Previous simulations, they said, had only covered the first few seconds of the impact and, worse, had operated on a 2D plane, and thus were only able to consider head-on collisions by the asteroid.

Thankfully, the DiRAC resources allowed for a much more robust analysis. The simulations revealed that the asteroid likely struck Earth at an angle of around 60 degrees, at which point billions of tons of sulfur exploded into the atmosphere, blocking the sun. According to the researchers, this was more or less the worst-case scenario for the dinosaurs, causing the maximum negative effect possible. The researchers also gathered new insights about the formation of the ring of mountains within the crater and the uplift of dense mantle rocks miles beneath the Earth's surface.

"When you study a complex problem such as crater formation, a key challenge is the number of variables you have to consider," said Mark Wilkinson, director of DiRAC and a professor at the University of Leicester. "DiRAC's computing services allow researchers to reduce the time-to-science (the time it takes to make a breakthrough) by providing access to both the computers themselves and technical support teams who give guidance on how to use them. To date, DiRAC has provided about two million core hours of computing time to this project and it's great to see that they have already made such exciting new discoveries."

To read the study, which was published as "A steeply-inclined trajectory for the Chicxulub impact" in the May 2020 issue of Nature Communications, click here.

Header image: a painting by Donald E. Davis depicting the devastating asteroid impact via Wikimedia Commons.

The rest is here:

Supercomputer Simulations Explain the Asteroid that Killed the Dinosaurs - HPCwire

COVID-19 HPC Consortium Expands to Europe, Reports on Research Projects – HPCwire

The COVID-19 HPC Consortium, a public-private effort delivering free access to HPC processing for scientists pursuing coronavirus research (some utilizing AI-based techniques), has expanded to more than 56 research teams and extended to supercomputing centers and programs in Europe.

IBM, which in March helped form the consortium with the White House Office of Science & Technology Policy and the U.S. Department of Energy, has issued an update on the program's growth and on its research work. Members of PRACE, the Partnership for Advanced Computing in Europe, have pledged to lend their supercomputing platforms to the effort, including the Swiss National Supercomputing Centre's Piz Daint, the sixth-ranked supercomputer in the world according to the Top500 list. And UK Research and Innovation (UKRI) will make available three of its supercomputers, including ARCHER, a 2.55-petaflops system based at the University of Edinburgh. Other systems include the UKRI Science and Technology Facilities Council's (STFC) DIRAC supercomputing facility and the Biotechnology and Biological Sciences Research Council's (BBSRC) Earlham Institute, Norwich, UK.

"COVID-19 is a global problem, so it's important that we bring the tools to solve it to as many places across the globe as we can," Dave Turek, IBM VP of Technical Computing, told us. "That's why, even though the consortium originated in the U.S., we're focused on adding members from other regions to enable supercomputing-fueled discovery where researchers need it."

The consortium's aggregate supercomputing power, now at 430 petaflops (IBM's Summit, at Oak Ridge National Lab and the world's no. 1 supercomputer, is a 148.6-petaflops machine per the Linpack benchmark), supports research work in bioinformatics, epidemiology and molecular modeling involving up to trillions of pieces of data.

Projects include deep learning-based COVID-19 drug discovery. Innoplexus, of Frankfurt, Germany, is using the consortium's compute power to train and improve the generative process, and the company reports it has identified five potentially promising molecules.

Researchers at Utah State University, in collaboration with Lawrence Livermore National Lab and the University of Illinois, are mapping how virus-laden droplet clouds are transported and settle within hospitals and other indoor environments. The research involves complex multiphase turbulence simulations.

At the University of Utah, researchers using the IBM Longhorn supercomputer are studying how the potential energy generated by atoms can give an overall molecule a positively or negatively charged force field that attracts or repels other molecules. Using AMBER molecular simulation software developed by one of the researchers, IBM said the scientists can measure experimental results to within one hundred-millionth of a centimeter, a measure that is imperceptible to all but the strongest microscopes, a capability used to combat the Ebola outbreak in 2014. The researchers have generated more than 2,000 molecular models of COVID-19-relevant compounds, ranked based on the molecules' force-field energy estimates.

India-based Novel Techsciences is working to identify phytochemicals from among India's 3,000 medicinal plants and anti-viral plant extracts that, it's hoped, can act as natural drugs against the SARS-CoV-2 protein targets. Other work will be done to identify plant-derived compounds that could help tackle the multi-drug resistance that may arise as the coronavirus evolves, according to IBM.

And at NASA, researchers are examining genetic traits for COVID-19 susceptibility, defining risk groups with genome analysis and supercomputer-enhanced DNA sequencing, IBM said. A goal of the work is to identify patients suited for clinical trials of vaccines and antivirals. The virus, the researchers state, seems to cause pneumonia, triggering an inflammatory response in the lungs called acute respiratory distress syndrome (ARDS), IBM reported. The researchers want to identify patients who are more prone to developing ARDS for clinical trials.

Beyond providing HPC access, the consortium's primary function is matchmaking: pairing researchers whose projects are suitable for supercomputing with systems that can run them. The platform providers themselves handle on-boarding and technical support.

Related Coverage:

DOE COVID Consortium Drives Faster, More Collaborative Science

DOE Expands on Role of COVID-19 Supercomputing Consortium

Here is the original post:

COVID-19 HPC Consortium Expands to Europe, Reports on Research Projects - HPCwire

Sandia to Receive Fujitsu A64FX-based System – HPCwire

ALBUQUERQUE, N.M., May 26, 2020 - This spring, Sandia National Laboratories anticipates being one of the first Department of Energy laboratories to receive the newest A64FX Fujitsu processor, a Japanese Arm-based processor optimized for high-performance computing.

Arm-based processors are used widely in small electronic devices like cell phones. More recently, Arm-based processors were installed in Sandia's Astra supercomputer, where they are the front line in a DOE effort to keep the market of supercomputer chip providers competitive.

"Being early adopters of this technology benefits all parties involved," said Scott Collis, director of Sandia's Center for Computing Research.

Penguin Computing Inc. will deliver the new system, the first Fujitsu PRIMEHPC FX700 with A64FX processors.

"This Fujitsu-Penguin computer offers the potential to improve algorithms that may not perform well on GPU (graphics processing unit) accelerators," Collis said. "In these cases, code performance is often limited by memory speed, not the speed of computation. This system is the first that closely couples efficient and powerful Arm processors to really fast memory to help break down this memory-speed bottleneck."

Said Ken Gudenrath, Penguin's director of interactions with DOE, "Our goal is to provide early access to upcoming technologies."

Sandia will evaluate Fujitsu's new processor and compiler using DOE mini- and proxy-applications and share the results with Fujitsu and Penguin. Mini- and proxy-apps are small, manageable versions of applications used for initial testing and collaborations. They are also open source, which means they can be freely modified to fit particular problems.

Said James Laros, program lead of Sandia's advanced-architectures technology-prototype program, Vanguard, which is tasked with exploring emerging techniques in supercomputing, "This acquisition furthers the labs' research and development in Arm-based computing technologies and builds upon the highly successful Astra platform, the world's first petascale Arm-based supercomputer."

Processor maximizes green computational power

The 48-core A64FX processor was designed for Japan's soon-to-be-deployed Fugaku supercomputer, which incorporates high-bandwidth memory. It also is the first to fully utilize the wide vector lanes that were designed around Arm's Scalable Vector Extension (SVE). These wide vector lanes make possible a type of data-level parallelism in which a single instruction operates on multiple data elements arranged in parallel.
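
As a rough software analogy for that idea (this is a conceptual Python/NumPy sketch, not Fujitsu's compiler or toolchain), data-level parallelism means expressing work over whole arrays rather than element by element, which is what a wide vector unit does in hardware across its lanes.

```python
# Conceptual illustration of data-level parallelism (not A64FX-specific code).
import numpy as np

a = np.arange(1_000_000, dtype=np.float64)
b = np.ones_like(a)

# Scalar thinking: one element per step.
c_scalar = [a[i] + b[i] for i in range(len(a))]

# Vector thinking: one operation over all elements at once, the way a single
# vector instruction covers many data lanes in parallel.
c_vector = a + b

assert np.allclose(c_scalar, c_vector)
```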

"The new processor's efficiency and increased performance per watt provide researchers with significantly greater fractions of usable peak performance," said Sandia manager Robert Hoekstra. The Japanese supercomputing team at the RIKEN Center for Computational Science has partnered with Fujitsu and focused on increasing vectorization and memory bandwidth to maximize the computational power of the system. The result is that an early A64FX-based system sits atop the Green500 list of the most efficient supercomputers.

In addition to expanding Sandia's efforts to develop new suppliers by advancing Arm-based technologies for high-performance computing, this acquisition also supports DOE's collaboration with the Japanese supercomputing community. Cooperation with the RIKEN center is part of a memorandum of understanding signed in 2014 between DOE and the Japanese Ministry of Education, Culture, Sports, Science and Technology. Both organizations have agreed to work together to improve high performance computing, including collaborative development of computing architectures.

About Sandia National Laboratories

Sandia National Laboratories is a multimission laboratory operated by National Technology and Engineering Solutions of Sandia LLC, a wholly owned subsidiary of Honeywell International Inc., for the U.S. Department of Energy's National Nuclear Security Administration. Sandia Labs has major research and development responsibilities in nuclear deterrence, global security, defense, energy technologies and economic competitiveness, with main facilities in Albuquerque, New Mexico, and Livermore, California.

Source: Sandia National Laboratories

Read more here:

Sandia to Receive Fujitsu A64FX-based System - HPCwire

Research Computing Team Studies Supercomputer Reliability – HPCwire

May 26, 2020 - Researchers running demanding computations, especially for projects like infectious disease modeling that need to be re-run frequently as new data becomes available, rely on supercomputers to run efficiently with as few software failures as possible. The more jobs that fail, the less science gets done.

Understanding why some jobs fail and what can be done to make supercomputers more reliable is the focus of a recent project led by Saurabh Bagchi, a professor of electrical and computer engineering, and ITaP senior research scientist Carol Song.

The project, which began almost five years ago and was supported by three awards from the National Science Foundation (award numbers 1405906, 1513051, and 1513197) totaling over $1.1 million, analyzed data from supercomputer systems at Purdue, as well as the University of Illinois at Urbana-Champaign and the University of Texas at Austin. At Purdue, the Conte and Halstead community clusters were studied.

Among the conclusions Bagchi and Song have drawn:

Bagchi says these are practical takeaways that supercomputer systems administrators can implement to make applications run on their computers more reliably.

In addition to their own data analysis, Bagchi and Song's NSF grant funded the development of an open access repository known as FRESCO, where systems data from Purdue's clusters and UT-Austin's Stampede supercomputer is stored, along with the team's conclusions and actionable suggestions for the people who run computer clusters. They've also included simple scripts that will let anyone run their own data analysis on the data from the three schools. A similar repository houses the data from the Blue Waters supercomputer located at the National Center for Supercomputing Applications at the University of Illinois.

"We really want the computing community to benefit from this resource," says Bagchi of the open source repositories.

Rajesh Kalyanam, a software engineer on Song's team, developed the technical infrastructure to collect data from supercomputers, and Stephen Harrell, a former ITaP scientific applications analyst, helped get the data from the Purdue clusters onto the FRESCO repository.

"FRESCO not only serves the computer systems researchers designing more dependable systems, it also has the potential to help researchers develop and test new big data algorithms, as well as train students in applying data science methods on real-world datasets," says Song. "We in ITaP Research Computing are collaborating with faculty on both fronts."

The team has published their findings in a paper to be presented at the upcoming Dependable Systems and Networks conference, which will be held virtually in June. That paper's first author is Rakesh Kumar, one of Bagchi's former graduate students who is now employed at Microsoft. Ravishankar Iyer, the George and Ann Fisher Distinguished Professor of Engineering and professor of electrical and computer engineering at the University of Illinois, is the lead investigator from Illinois. Other researchers on the team include Ashraf Mahgoub from Purdue; Saurabh Jha, Zbigniew Kalbarczyk and William T. Kramer from the University of Illinois; and Todd Evans and Bill Barth from the University of Texas.

Source: Adrienne Miller, Information Technology at Purdue (ITaP)

View original post here:

Research Computing Team Studies Supercomputer Reliability - HPCwire

Calculating Your Way to Antivirals | In the Pipeline – Science Magazine

My intent is to start mixing in some non-coronavirus posts along with my pandemic science coverage (you know, like the blog used to be way back earlier in the year!). Today's subject might be a good transitional one: it's an article in the New England Journal of Medicine on coronavirus drug discovery, but the points it raises are generally applicable.

"How to Discover Antiviral Drugs Quickly" is the attention-getting title. The authors are all from Oak Ridge, not known as a center of drug discovery, but the connection is the massive computational resource available there. Their Summit supercomputer is recognized as currently the most powerful in the world, which is a moving target, of course: Oak Ridge itself is expecting an even larger system (Frontier) next year, and other labs in China, etc., are not sitting around idly, either.

The authors note that "The laborious, decade-long, classic pathway for the discovery and approval of new drugs could hardly be less well suited to the present pandemic." I don't think anyone would argue with that, but it slides past a key point: it could hardly be less well suited to any other disease we're trying to treat, either. Right? Is there any therapeutic area that's best served by these timelines as opposed to something quicker? So this is not a problem peculiar to the coronavirus situation, although it does make for a more dramatic disconnect than usual.

Docking and Screening

The paper makes the case for high-throughput ensemble docking of virtual compound libraries. Many readers here will be familiar with the concept, and some of you are very familiar indeed. If this isn't your field, the idea is that you take a three-dimensional representation of a candidate molecule and calculate its interactions (favorable and unfavorable) with a similar three-dimensional representation of a protein binding site for it. You're going to be adding those up, energetically, and looking for the lowest-energy states, which indicate the most favorable binding. If that sounds straightforward, that's because I have grievously oversimplified that description. Let's talk about that.
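
As a purely conceptual illustration of that bookkeeping (a toy sketch, not real docking software; the scoring function, compound names and receptor states below are all made up), ensemble docking amounts to scoring every ligand conformer against every receptor conformation and keeping the best value:

```python
# Toy ensemble-docking sketch (illustrative only; the "energy" is a stand-in).
import itertools
import random

def pose_energy(ligand_conformer, receptor_conformation):
    """Stand-in for a physics-based score (hydrogen bonds, electrostatics,
    desolvation, sterics...). Here it is just a deterministic pseudo-random number."""
    rng = random.Random(hash((ligand_conformer, receptor_conformation)))
    return rng.uniform(-12.0, 5.0)   # roughly kcal/mol-shaped numbers, purely illustrative

def ensemble_dock(ligand_conformers, receptor_conformations):
    """Best (lowest) score over every conformer/conformation pairing."""
    return min(pose_energy(l, r)
               for l, r in itertools.product(ligand_conformers, receptor_conformations))

# Rank a tiny hypothetical "virtual library" by its best ensemble score.
library = {"compound_A": ["A1", "A2"], "compound_B": ["B1", "B2", "B3"]}
receptor_ensemble = ["site_open", "site_closed"]
ranked = sorted(library, key=lambda c: ensemble_dock(library[c], receptor_ensemble))
print(ranked)   # compounds ordered from best to worst predicted binding
```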

Among the biggest complications is that both the molecules of interest and their binding site can generally adopt a number of different shapes. That's true even when they're by themselves: some of the bonds can rotate (to one degree or another) at room temperature without much of an energetic penalty, and taken together that gives you a whole ensemble of reasonable structures, each with a somewhat different shape. A real kicker is that the relative favorability of these depends first on the compound's (or the binding site's) interactions with itself: it could swivel around to the point, perhaps, where it starts to bang into itself, or you could rotate a bond to where nearby groups start to clash a bit, or you could cause a favorable interaction (or break one up) with such movements. And these energetic calculations are also affected by each partner's interaction with solvent water molecules, which are numerous, highly mobile, and interacting with each other at the same time. Finally, the relative energies of each partner will be affected by the other partner. As a target molecule approaches a binding site, a dance begins, with the two partners shifting positions in response. You can have situations (for example) where there might be a favorable binding arrangement at the end of such a process, but no good way to get to it by any step-by-step route. The whole field of molecular dynamics is an attempt to figure out this process frame by frame, and if you thought getting a static picture was computationally intensive, MD will eat all the computing cycles you can throw at it. (Here's an older post on that topic, but many of its issues are still relevant). One thing that becomes clear is that there may well be some arrangements of either partner along the way that would be considered unfavorable if you calculated them alone in a vacuum or surrounded by solvent, but which make perfect energetic sense when they're interacting with the other partner nearby.

Practitioners in this area will also appreciate that all those energetic calculations that the last long paragraph relied on are not so straightforward, either. Binding energy involves both an enthalpic term and an entropic one, and these can work in the same direction or can largely cancel each other out (a common situation). Even such an apparently straightforward step as displacing a water molecule from a protein's binding site (to make room for a candidate small molecule) can be enthalpically favorable or unfavorable and entropically favorable or unfavorable, too. These calculations involve (among other things) the interactions of hydrogen bonds (very important), of charged or potentially charged groups such as acids and amines, of local concentrations of electron density such as pi-electron clouds and around electronegative atoms, and of plain noncharged alkyl groups that can attract each other weakly or strongly repel each other if they're jammed together too closely.
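
To make that enthalpy/entropy point concrete (a toy calculation with made-up numbers, not data from the paper), the free energy of binding is delta_G = delta_H - T*delta_S, so a big enthalpy gain can be mostly cancelled by an entropy penalty and end up worse than a modest, entropy-friendly interaction:

```python
# Toy enthalpy/entropy compensation example (illustrative numbers only).
T = 298.15  # kelvin

def binding_free_energy(delta_H, delta_S):
    """delta_H in kcal/mol, delta_S in kcal/(mol*K); returns delta_G in kcal/mol."""
    return delta_H - T * delta_S

# Strong enthalpy, large entropy penalty: delta_G is only about -2.5 kcal/mol.
print(binding_free_energy(-10.0, -0.025))
# Weaker enthalpy, favorable entropy: delta_G is about -5.5 kcal/mol, the better binder.
print(binding_free_energy(-4.0, +0.005))
```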

There's a lot going on, and dealing with all of these things computationally is always going to involve a list of tradeoffs and approximations, no matter what your hardware resources. Skilled molecular modelers will know their way around these, realize the weaker points in their calculations, and adjust their methods as needed to try to shore these up. Less skilled ones (and let me tell you, I am one of those) might be more likely to take some software's word for it, whether that's a good idea or not. These various software approaches all have their strong points and weak ones, which might reveal themselves to the trained eye as the molecules (and the relative importance of their interacting modes) vary across a screen.

Now, all this is to point out that while speeding up the calculations is a very worthy goal, speeding up calculations that have underlying problems or unappreciated uncertainties in them will not improve your life. The key is, as always, to validate your results by experiment, and to their credit, the Oak Ridge authors specifically make note of this. This is a good way to expose weaknesses in your approach that you wouldn't have appreciated any other way, which sends you back for another round of calculation (improved, one hopes).

Virtual screening of this sort has been a technique in drug discovery for many years now, and its usefulness varies. Sometimes it really does deliver an enhanced hit rate compared to a physical assay screen, and sometimes it doesn't (and sometimes you never really know, because you're doing the virtual one because the real-world screen isn't feasible at all). It's definitely improved over the years, though: the methods for calculating the energies involved are better, and we can evaluate far more shapes and conformations more quickly. But it's important to realize that the larger the screen, the more work needs to be done up front to set it up properly; here's a post on a paper that goes into that topic.

What Screening Gets You

And now we come to the bad news section, when we ask: how much time does one save in a drug-development process through improvements in high-throughput screening? Unfortunately, the answer is, mostly, not all that much. The laborious parts come after the screen is done, and they're pretty darn laborious. Hits that come out of a screen have to be modified by medicinal chemists for potency, selectivity (against the things you know you should worry about, anyway), metabolic stability and clearance, toxicology (insofar as you understand it), and other factors besides, not all of which will be working in the same direction. Some of these things can be helped a bit by computational approaches, sometimes. But not all, and definitely not always.

And all this is before you even think about going into clinical trials. But those are the really hard part, where we have, for new investigational drugs, a 90% failure rate. None of the most common reasons for those failures are addressed by the supercomputer screen that started off the project. One big problem is that you may have picked the wrong target, and another big one is that your drug may end up doing something else to patients that you didn't want. Neither of those problems are amenable yet to calculation, especially not the kind that the NEJM paper is talking about. You have to pick a target before you start your screen, of course, and you get ambushed later by toxicology that you never even knew was coming. It's not that we don't want a computational way to avoid such nasty surprises (that would be terrific), but nothing like that is on the horizon yet. Billions of dollars, big ol' stacks of cash, are waiting for the people who figure out how to do such things. But no one can do them for you at the moment.

Now, I understand that the early computational screens against coronavirus proteins were for repurposing existing drugs, which is indeed a really good idea: it's the way to get something into the front lines the quickest. But the Oak Ridge folks ran that screen back in February (and good for them for doing it). The last paragraph of the current article is a bit vague, but as it ascends into the clouds it seems to be aiming for something more than repurposing. That, though, will subject it to just those problems mentioned in the last paragraph. Virtual screening gets a lot of effort thrown at it because, honestly, it's just a lot more amenable to a computational approach, so far, than the really hard parts are. People can do it, so they do.

In the end, though, screening is just not a rate-determining step. Making it faster is no bad thing, but it's like cutting a couple of minutes off your trip to the airport to catch a six-hour flight.

Read more from the original source:

Calculating Your Way to Antivirals | In the Pipeline - Science Magazine