Researchers say deep learning will power 5G and 6G cognitive radios – VentureBeat

For decades, amateur two-way radio operators have communicated across entire continents by choosing the right radio frequency at the right time of day, a luxury made possible by having relatively few users and devices sharing the airwaves. But as cellular radios multiply in both phones and Internet of Things devices, finding interference-free frequencies is becoming more difficult, so researchers are planning to use deep learning to create cognitive radios that instantly adjust their frequencies to achieve optimal performance.

As explained by researchers with Northeastern University's Institute for the Wireless Internet of Things, the increasing variety and density of cellular IoT devices are creating new challenges for wireless network optimization: a given swath of radio frequencies may be shared by a hundred small radios designed to operate in the same general area, each with individual signaling characteristics and variations in adjusting to changed conditions. The sheer number of devices reduces the efficacy of fixed mathematical models in predicting which spectrum fragments may be free at a given split second.

That's where deep learning comes in. The researchers hope to use machine learning techniques embedded within the wireless devices' hardware to improve frequency utilization, such that the devices can develop AI-optimized spectrum usage strategies by themselves. Early studies suggest that deep learning models average 20% higher classification accuracy than traditional systems when dealing with noisy radio channels, and will be able to scale to hundreds of simultaneous devices rather than dozens. Moreover, the deep learning architecture developed for this purpose will be usable for multiple other tasks as well.
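
To make the core decision concrete, here is a toy, purely hypothetical sketch of what a cognitive radio ultimately has to do: pick the least-interfered channel from recent measurements. The article's systems learn this behavior with deep networks rather than a fixed rule, and the channel numbers and power figures below are invented for the example.

```python
# Hypothetical sketch: pick the quietest channel from recent
# interference measurements (in dB; more negative = quieter).
# A deep-learning cognitive radio would learn this policy instead
# of using a fixed rule like this one.

def least_interfered_channel(interference_db):
    """Return the channel with the lowest average measured interference."""
    averages = {ch: sum(samples) / len(samples)
                for ch, samples in interference_db.items()}
    return min(averages, key=averages.get)

# Invented measurements: interference power per channel over 3 samples.
measurements = {
    0: [-70, -68, -71],   # busy channel
    1: [-95, -97, -96],   # quiet channel
    2: [-80, -82, -81],
}
print(least_interfered_channel(measurements))  # → 1
```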

One key challenge in implementing deep learning for this application is the massive amount of data that will need to be processed rapidly to do continuous analysis. Deep learning can rely on tens of millions of parameters, and here might require processing over a hundred megabytes of measurement data per second at a millisecond timescale. This is beyond the capability of even the most powerful embedded devices currently available, the researchers note, and low latency demands that the results not be processed in the cloud.
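
A rough back-of-the-envelope calculation using the article's figures shows what those constraints mean per decision window:

```python
# Rough arithmetic on the article's figures: ~100 MB/s of raw radio
# measurements, with a decision needed roughly every millisecond.
rate_bytes_per_s = 100 * 10**6                     # ~100 MB/s
bytes_per_millisecond = rate_bytes_per_s // 1000   # one decision window
print(bytes_per_millisecond)   # 100000, i.e. ~100 KB to digest per window
```

That is roughly 100 KB of data to ingest and classify every millisecond, which is why the models must run on-device rather than round-tripping to the cloud.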

So the goal will be to shrink deep learning models to the point where they can run on small devices, and to use complex testing facilities, known as wireless data factories, to improve the software as hardware improves, including raising its resilience against adversarial attacks. The researchers expect to use the learning in both 5G millimeter wave and future 6G terahertz hardware, which are expected to become even more ubiquitous than 4G devices over the next two decades, despite their ultra-high-frequency signals' susceptibility to physical interference.


AI used to predict Covid-19 patients’ decline before proven to work – STAT

Dozens of hospitals across the country are using an artificial intelligence system created by Epic, the big electronic health record vendor, to predict which Covid-19 patients will become critically ill, even as many are struggling to validate the tool's effectiveness on those with the new disease.

The rapid uptake of Epic's deterioration index is a sign of the challenges imposed by the pandemic: normally, hospitals would take time to test the tool on hundreds of patients, refine the algorithm underlying it, and then adjust care practices to implement it in their clinics.

Covid-19 is not giving them that luxury. They need to be able to intervene to prevent patients from going downhill, or at least make sure a ventilator is available when they do. Because it is a new illness, doctors don't have enough experience to determine who is at highest risk, so they are turning to AI for help, in some cases cramming a validation process that often takes months or years into a couple of weeks.


"Nobody has amassed the numbers to do a statistically valid test of the AI," said Mark Pierce, a physician and chief medical informatics officer at Parkview Health, a nine-hospital health system in Indiana and Ohio that is using Epic's tool. "But in times like this that are unprecedented in U.S. health care, you really do the best you can with the numbers you have, and err on the side of patient care."

Epic's index uses machine learning, a type of artificial intelligence, to give clinicians a snapshot of the risks facing each patient. But hospitals are reaching different conclusions about how to apply the tool, which crunches data on patients' vital signs, lab results, and nursing assessments to assign a 0 to 100 score, with a higher score indicating an elevated risk of deterioration. It was already used by hundreds of hospitals before the outbreak to monitor hospitalized patients, and is now being applied to those with Covid-19.
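
Epic's actual model is proprietary, so the following is only a hypothetical illustration of the general shape of such a tool: folding a handful of inputs into a single 0-100 risk score. Every weight and threshold below is invented for the example and has no clinical meaning.

```python
# Purely hypothetical illustration; Epic's deterioration index is
# proprietary and its real inputs, weights, and thresholds are unknown
# here. This only shows the general shape: vitals in, 0-100 score out.

def toy_risk_score(heart_rate, resp_rate, spo2):
    """Combine a few vitals into a 0-100 scale; higher means more risk."""
    score = 0.0
    score += max(0, heart_rate - 90) * 0.8   # penalty for fast heart rate
    score += max(0, resp_rate - 20) * 2.0    # penalty for fast breathing
    score += max(0, 94 - spo2) * 3.0         # penalty for low oxygen sat
    return min(100.0, score)

print(toy_risk_score(heart_rate=110, resp_rate=28, spo2=88))  # 50.0
```

A score built this way lands in a middle zone for moderately abnormal vitals, which is exactly the region hospitals like Parkview are trying to interpret.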


At Parkview, doctors analyzed data on nearly 100 cases and found that 75% of hospitalized patients who received a score in a middle zone between 38 and 55 were eventually transferred to the intensive care unit. In the absence of a more precise measure, clinicians are using that zone to help determine who needs closer monitoring and whether a patient in an outlying facility needs to be transferred to a larger hospital with an ICU.

Meanwhile, the University of Michigan, which has seen a larger volume of patients due to a cluster of cases in that state, found in an evaluation of 200 patients that the deterioration index is most helpful for those who scored on the margins of the scale.

For about 9% of patients whose scores remained on the low end during the first 48 hours of hospitalization, the health system determined they were unlikely to experience a life-threatening event and that physicians could consider moving them to a field hospital for lower-risk patients. On the opposite end of the spectrum, it found 10% to 12% of patients who scored on the higher end of the scale were much more likely to need ICU care and should be closely monitored. More precise data on the results will be published in coming days, although they have not yet been peer-reviewed.

Clinicians in the Michigan health system have been using the score thresholds established by the research to monitor the condition of patients during rounds and in a command center designed to help manage their care. But clinicians are also considering other factors, such as physical exams, to determine how they should be treated.

"This is not going to replace clinical judgment," said Karandeep Singh, a physician and health informaticist at the University of Michigan who participated in the evaluation of Epic's AI tool. "But it's the best thing we've got right now to help make decisions."

Stanford University has also been testing the deterioration index on Covid-19 patients, but a physician in charge of the work said the health system has not seen enough patients to fully evaluate its performance. "If we do experience a future surge, we hope that the foundation we have built with this work can be quickly adapted," said Ron Li, a clinical informaticist at Stanford.

Executives at Epic said the AI tool, which has been rolled out to monitor hospitalized patients over the past two years, is already being used to support care of Covid-19 patients in dozens of hospitals across the United States. They include Parkview, Confluence Health in Washington state, and ProMedica, a health system that operates in Ohio and Michigan.

"Our approach as Covid was ramping up over the last eight weeks has been to evaluate: does it look very similar to (other respiratory illnesses) from a machine learning perspective, and can we pick up that rapid deterioration?" said Seth Hain, a data scientist and senior vice president of research and development at Epic. "What we found is yes, and the result has been that organizations are rapidly using this model in that context."

Some hospitals that had already adopted the index are simply applying it to Covid-19 patients, while others are seeking to validate its ability to accurately assess patients with the new disease. It remains unclear how the use of the tool is affecting patient outcomes, or whether its scores accurately predict how Covid-19 patients are faring in hospitals. The AI system was initially designed to predict deterioration of hospitalized patients facing a wide array of illnesses. Epic trained and tested the index on more than 100,000 patient encounters at three hospital systems between 2012 and 2016, and found that it could accurately characterize the risks facing patients.

When the coronavirus began spreading in the United States, health systems raced to repurpose existing AI models to help keep tabs on patients and manage the supply of beds, ventilators and other equipment in their hospitals. Researchers have tried to develop AI models from scratch to focus on the unique effects of Covid-19, but many of those tools have struggled with bias and accuracy issues, according to a review published in the BMJ.

The biggest question hospitals face in implementing predictive AI tools, whether to help manage Covid-19 or advanced kidney disease, is how to act on the risk scores they provide. Can clinicians take actions that will prevent the deterioration from happening? If not, does it give them enough warning to respond effectively?

In the case of Covid-19, the latter question is the most relevant, because researchers have not yet identified any effective treatments to counteract the effects of the illness. Instead, they are left to deliver supportive care, including mechanical ventilation if patients are no longer able to breathe on their own.

Knowing ahead of time whether mechanical ventilation might be necessary is helpful, because doctors can ensure that an ICU bed and a ventilator or other breathing assistance is available.

Singh, the informaticist at the University of Michigan, said the most difficult part about making predictions based on Epic's system, which calculates a score every 15 minutes, is that patients' ratings tend to bounce up and down in a sawtooth pattern. A change in heart rate could cause the score to suddenly rise or fall. He said his research team found that it was often difficult to detect, or act on, trends in the data.

"Because the score fluctuates from 70 to 30 to 40, we felt like it's hard to use it that way," he said. "A patient who's high risk right now might be low risk in 15 minutes."
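
One standard way to damp that kind of sawtooth is to average each score with its recent history. This is a generic signal-processing technique, not something the article attributes to Epic or the Michigan team, but it shows why a single 15-minute reading is hard to act on while a short rolling window is steadier.

```python
# Generic smoothing sketch (not from Epic's tool): average each score
# with up to the previous few 15-minute readings to damp the sawtooth.

def rolling_mean(scores, window=4):
    """Average each score with up to window-1 preceding readings."""
    smoothed = []
    for i in range(len(scores)):
        chunk = scores[max(0, i - window + 1): i + 1]
        smoothed.append(sum(chunk) / len(chunk))
    return smoothed

raw = [70, 30, 40, 65, 35]   # hypothetical 15-minute deterioration scores
print(rolling_mean(raw))     # the 70-to-30 swing shrinks to 70-to-50
```

The trade-off is lag: smoothing hides the very jumps that sometimes precede a real decline, which is part of why Singh's team looked for threshold scores instead.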

In some cases, he said, patients bounced around in the middle zone for days but then suddenly needed to go to the ICU. In others, a patient with a similar trajectory of scores could be managed effectively without need for intensive care.

But Singh said that in about 20% of patients it was possible to identify threshold scores that could indicate whether a patient was likely to decline or recover. In the case of patients likely to decline, the researchers found that the system could give them up to 40 hours of warning before a life-threatening event would occur.

"That's significant lead time to help intervene for a very small percentage of patients," he said. As to whether the system is saving lives, or improving care in comparison to standard nursing practices, Singh said the answers will have to wait for another day. "You would need a trial to validate that question," he said. "The question of whether this is saving lives is unanswerable right now."


One Supercomputer's HPC And AI Battle Against The Coronavirus – The Next Platform

Normally, supercomputers installed at academic and national laboratories get configured once, acquired as quickly as possible before the money runs out, installed and tested, qualified for use, and put to work for a four or five or possibly longer tour of duty. It is a rare machine that is upgraded even once, much less a few times.

But that is not the case with the Corona system at Lawrence Livermore National Laboratory, which was commissioned in 2017, when North America had a total solar eclipse, hence its nickname. While this machine, procured under the Commodity Technology Systems (CTS-1) contract not only to do useful work but also to assess the CPU and GPU architectures provided by AMD, was not named after the coronavirus pandemic now spreading around the Earth, it is being upgraded one more time to be put into service as a weapon against the SARS-CoV-2 virus, which causes the COVID-19 illness that has infected at least 2.75 million people (confirmed by test, with the actual number very likely higher) and killed at least 193,000 people worldwide.

The Corona system was built by Penguin Computing, which has a long-standing relationship with Lawrence Livermore National Laboratory, Los Alamos National Laboratory, and Sandia National Laboratories, the so-called Tri-Labs that are part of the US Department of Energy and that coordinate on their supercomputer procurements. The initial Corona machine installed in 2018 had 164 compute nodes, each equipped with a pair of Naples Epyc 7401 processors, which have 24 cores each running at 2 GHz with an all-core turbo boost of 2.8 GHz. The Penguin Tundra Extreme servers that comprise this cluster have 256 GB of main memory and 1.6 TB of PCI-Express flash. When the machine was installed in November 2018, half of the nodes were equipped with four of AMD's Radeon Instinct MI25 GPU accelerators, each with 16 GB of HBM2 memory, 768 gigaflops of FP64 performance, 12.29 teraflops of FP32 performance, and 24.6 teraflops of FP16 performance. The 7,872 CPU cores in the system delivered 126 teraflops at FP64 double precision all by themselves, and the Radeon Instinct MI25 GPU accelerators added another 251.9 teraflops at FP64 double precision. The single precision performance for the machine was obviously much higher, at 4.28 petaflops across both the CPUs and GPUs. Interestingly, this machine was equipped with 200 Gb/sec HDR InfiniBand switching from Mellanox Technologies, which was obviously one of the earliest installations of this switching speed.
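
Those figures hang together arithmetically. The only inferred number below is the 8 double-precision operations per core per cycle implied by the quoted core count, clock, and FP64 total; everything else comes straight from the text.

```python
# Sanity check of the article's Corona figures.
nodes, sockets, cores_per_socket = 164, 2, 24
total_cores = nodes * sockets * cores_per_socket
print(total_cores)                      # 7872 CPU cores, as stated

# 7,872 cores at 2 GHz doing 8 FP64 ops/cycle (inferred) gives ~126 TF.
cpu_tf = total_cores * 2e9 * 8 / 1e12
print(round(cpu_tf))                    # 126

# Half the nodes (82) held four MI25s at 768 GF FP64 each.
gpu_tf = 82 * 4 * 0.768
print(round(gpu_tf, 1))                 # 251.9, matching the article
```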

In November last year, just before the coronavirus outbreak (or at least we think that was before the outbreak; that may turn out not to be the case), AMD and Penguin worked out a deal to install four of the much more powerful Radeon Instinct MI60 GPU accelerators, based on the 7 nanometer Vega GPUs, in the 82 nodes in the system that didn't already have GPU accelerators. The Radeon Instinct MI60 has 32 GB of HBM2 memory, 6.6 teraflops of FP64 performance, 13.3 teraflops of FP32 performance, and 26.5 teraflops of FP16 performance. Now the machine has 8.9 petaflops of FP32 performance and 2.54 petaflops of FP64 performance, a much more balanced 64-bit to 32-bit ratio that makes these nodes more useful for certain kinds of HPC and AI workloads. That turns out to be very important to Lawrence Livermore in its fight against the COVID-19 disease.

To find out more about how the Corona system and others are being deployed in the fight against COVID-19, and how HPC and AI workloads are being intertwined in that fight, we talked to Jim Brase, deputy associate director for data science at Lawrence Livermore.

Timothy Prickett Morgan: It is kind of weird that this machine was called Corona. Foreshadowing is how you tell the good literature from the cheap stuff. The doubling of performance that just happened late last year for this machine could not have come at a better time.

Jim Brase: It pretty much doubles the overall floating point performance of the machine, which is great because what we are mainly running on Corona is both the molecular dynamics calculations of various viral and human protein components and then machine learning algorithms for both predictive models and design optimization.

TPM: That's a lot more oomph. So what specifically are you doing with it in the fight against COVID-19?

Jim Brase: There are two basic things we're doing as part of the COVID-19 response, and this machine is almost entirely dedicated to this, although several of our other clusters at Lawrence Livermore are involved as well.

We have teams that are doing both antibody and vaccine design. They are mainly focused on therapeutic antibodies right now. They are basically designing proteins that will interact with the virus or with the way the virus interacts with human cells. That involves hypothesizing different protein structures and computing what those structures actually look like in detail, then computing, using molecular dynamics, the interaction between those protein structures and the viral proteins or the viral and human cell interactions.

With this machine, we do this iteratively to basically design a set of proteins. We have a bunch of metrics that we try to optimize on (binding strength, the stability of the binding, stuff like that) and then we do detailed molecular dynamics calculations to figure out the effective energy of those binding events. These metrics determine the quality of the potential antibody or vaccine that we design.

TPM: To wildly oversimplify, this SARS-CoV-2 virus is a ball of fat with some spikes on it that wreaks havoc as it replicates using our cells as raw material. This is a fairly complicated molecule at some level. What are we trying to do? Stick goo to it to try to keep it from replicating or tear it apart or dissolve it?

Jim Brase: In the case of antibodies, which is what we're mostly focusing on right now, we are actually designing a protein that will bind to some part of the virus, and because of that the virus then changes its shape, and the change in shape means it will not be able to function. These are little molecular machines that depend on their shape to do things.

TPM: There's not something that will physically go in and tear it apart, like a white blood cell eats stuff.

Jim Brase: No. That's generally done by biology, which comes in after this and cleans up. What we are trying to do is make what we call neutralizing antibodies. They go in and bind, and then the virus can't do its job anymore.

TPM: And just for a reference, what is the difference between a vaccine and an antibody?

Jim Brase: In some sense, they are the opposite of each other. With a vaccine, we are putting in a protein that actually looks like the virus but doesn't make you sick. It stimulates the human immune system to create its own antibodies to combat that virus. And those antibodies produced by the body do exactly the same thing we were just talking about. Producing antibodies directly is faster, but the effect doesn't last. So it is more of a medical treatment for somebody who is already sick.

TPM: I was alarmed to learn that for certain coronaviruses, immunity doesn't really last very long. With the common cold, the reason we get them is not just because they change every year, but because if you didn't have a bad version of it, you don't generate a lot of antibodies and therefore you are susceptible. If you have a very severe cold, you generate antibodies and they last for a year or two. But then you're done and your body stops looking for that fight.

Jim Brase: The immune system is very complicated, and for some things it creates antibodies that remember them for a long time. For others, it's much shorter. It's sort of a combination of what we call the antigen (the virus or whatever triggers it) and the immune system's memory function together that causes the immunity not to last as long. It's not well understood at this point.

TPM: What are the programs you're using to do the antibody and protein synthesis?

Jim Brase: We are using a variety of programs. We use GROMACS, we use NAMD, we use OpenMM. And then we have some specialized homegrown codes that operate on the data coming from these programs. But it's mostly the general, open source molecular mechanics and molecular dynamics codes.

TPM: Let's contrast this COVID-19 effort with something like the SARS outbreak in 2003. Say you had the same problem. Could you have done the things you are doing today with SARS-CoV-2 back then with SARS? Was it even possible to design proteins, and do enough of them, to actually have an impact to get the antibody therapy or develop the vaccine?

Jim Brase: A decade ago, we could do single calculations. We could do them one, two, three. But what we couldn't do was iterate it as a design optimization. Now we can run enough of these fast enough that we can make this part of an actual design process, where we are computing these metrics, then adjusting the molecules. And we have machine learning approaches now that we didn't have ten years ago that allow us to hypothesize new molecules; then we run the detailed physics calculations against those, and we do that over and over and over.

TPM: So not only do you have a specialized homegrown code that takes the output of these molecular dynamics programs, but you are using machine learning as a front end as well.

Jim Brase: We use machine learning in two places. Even with these machines (and we are using our whole spectrum of systems on this effort) we still can't do enough molecular dynamics calculations, particularly the detailed molecular dynamics that we are talking about here. What does the new hardware allow us to do? It basically allows us to do a higher percentage of detailed molecular dynamics calculations, which give us better answers, as opposed to more approximate calculations. So we can decrease the granularity size, and we can compute whole molecular dynamics trajectories as opposed to approximate free energy calculations. It allows us to go deeper on the calculations, and do more of those. So ultimately, we get better answers.

But even with these new machines, we still can't do enough. If you think about the design space on, say, a protein that is a few hundred amino acids in length, with 20 different amino acids possible at each of those positions, you have on the order of 20^200 possible proteins to evaluate by brute force. You can't do that.
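
That 20^200 figure is easy to verify, and a one-liner shows why brute force is hopeless:

```python
# Scale of the design space Brase describes: 20 amino acid choices at
# each of ~200 positions in a candidate protein.
positions = 200
candidates = 20 ** positions
print(len(str(candidates)))   # 261 digits; no machine can enumerate that
```

For comparison, the number of atoms in the observable universe is usually estimated at around 10^80, a number with only 81 digits.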

So we try to be smart about how we select where those simulations are done in that space, based on what we are seeing. And then we use the molecular dynamics to generate datasets that we then train machine learning models on so that we are basically doing very smart interpolation in those datasets. We are combining the best of both worlds and using the physics-based molecular dynamics to generate data that we use to train these machine learning algorithms, which allows us to then fill in a lot of the rest of the space because those can run very, very fast.
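
The workflow Brase describes (expensive physics runs at selected points, with a cheap learned model interpolating between them) can be sketched in miniature. In this toy version a simple function stands in for a molecular dynamics calculation and piecewise-linear interpolation stands in for the trained machine learning model; the real pipeline uses MD codes and deep networks.

```python
# Minimal surrogate-model sketch. "expensive_physics" is a stand-in for
# a molecular dynamics run; "surrogate" is a stand-in for the machine
# learning model trained on the sampled results.
import bisect

def expensive_physics(x):
    return x * x   # placeholder for a costly simulation

# Sample the design space sparsely with the expensive code...
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [expensive_physics(x) for x in xs]

def surrogate(x):
    """Cheap piecewise-linear interpolation over the sampled data."""
    i = bisect.bisect_right(xs, x)
    i = min(max(i, 1), len(xs) - 1)
    x0, x1, y0, y1 = xs[i - 1], xs[i], ys[i - 1], ys[i]
    return y0 + (y1 - y0) * (x - x0) / (x1 - x0)

# ...then evaluate anywhere in between, essentially for free.
print(surrogate(2.5))   # 6.5, vs. the true value 6.25
```

The surrogate is only approximately right between samples, which is why the real workflow keeps going back to the physics code at the points that matter most.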

TPM: You couldn't do all of that stuff ten years ago? And SARS did not create the same level of outbreak that SARS-CoV-2 has.

Jim Brase: No, these are all fairly new ideas.

TPM: So, in a sense, we are lucky. We have the resources at a time when we need them most. Did you have the code all ready to go for this? Were you already working on this kind of stuff and then COVID-19 happened or did you guys just whip up these programs?

Jim Brase: No, no, no, no. We've been working on this kind of stuff for a few years.

TPM: Well, thank you. I'd like to personally thank you.

Jim Brase: It has been an interesting development. It has been in both the biology space and the physics space, and those two groups have set up a feedback loop back and forth. I have been running a consortium called Advanced Therapeutic Opportunities in Medicine, or ATOM for short, to do just this kind of stuff for the last four years. It started up as part of the Cancer Moonshot in 2016 and focused on accelerating cancer therapeutics using the same kinds of ideas, where we are using machine learning models to predict the properties, using mechanistic simulations like molecular dynamics combined with data, but then also using it the other way around. We also use machine learning to actually hypothesize new molecules: given a set of molecules that we have right now, whose computed properties aren't quite what we want, how do we tweak those molecules a little bit to adjust their properties in the directions that we want?

The problem with this approach is scale. Molecules are atoms that are bonded with each other. You could just take out an atom, add another atom, change a bond type, or something. The problem is that every time you do that randomly, you almost always get an illegal molecule. So we train these machine learning algorithms (these are generative models) to generate legal molecules that are close to a set of molecules that we have, but a little bit different, and with properties that are probably a little bit closer to what we want. And so that allows us to smoothly adjust the molecular designs to move towards the optimization targets that we want. If you think about optimization, what you want are things with smooth derivatives. If you do this in sort of the discrete atom-bond space, you don't have smooth derivatives. But if you do it in what we call learned latent spaces, which we get from generative models, then you can actually have a smooth response in terms of the molecular properties. And that's what we want for optimization.

The other part of the machine learning story here is these new types of generative models: variational autoencoders, generative adversarial models, the things you hear about that generate fake data and so on. We're actually using those very productively to imagine new types of molecules with the kinds of properties that we want for this. And so that's something we were absolutely doing before COVID-19 hit. We have taken projects like the ATOM cancer project and other work we've been doing with DARPA and other places focused on different diseases, and refocused those on COVID-19.

One other thing I wanted to mention is that we haven't just been applying this to biology. A lot of these ideas are coming out of physics applications. One of our big things at Lawrence Livermore is laser fusion. We have 192 huge lasers at the National Ignition Facility to try to create fusion in a small hydrogen-deuterium target. There are a lot of design parameters that go into that. The targets are really complex. We are using the same approach. We're running mechanistic simulations of the performance of those targets, and we are then improving those with real data using machine learning. So we now have a hybrid model that has physics in it and machine learning data models, and we are using that to optimize the designs of the laser fusion target. That's led us to a whole new set of approaches to fusion energy.

Those same methods are the things we're also applying to molecular design for medicines. And the two actually go back and forth and sort of feed on each other and support each other. In the last few weeks, some of the teams that have been working on the physics applications have jumped over onto the biology side and are using some of the same complex workflows that we're using on these big parallel machines, which they developed for physics, and applying those to some of the biology applications, helping to speed up the applications on this new hardware that's coming in. So it is a really nice synergy going back and forth.

TPM: I realize that machine learning software uses the GPUs for training and inference, but is the molecular dynamics software using the GPUs, too?

Jim Brase: All of the molecular dynamics software has been set up to use GPUs. The code actually maps pretty naturally onto the GPU.

TPM: Are you using the CUDA variants of the molecular dynamics software, and I presume that it is using the Radeon Open Compute, or ROCm, stack from AMD to translate that code so it can run on the Radeon Instinct accelerators?

Jim Brase: There has been some work to do, but it works. It's getting to be pretty solid now. That's one of the reasons we wanted to jump into the AMD technology pretty early, because, you know, any time you do first-in-kind machines it's not always completely smooth sailing all the way.

TPM: It's not like Lawrence Livermore has a history of using novel designs for supercomputers. [Laughter]

Jim Brase: We seldom work with machines that are not Serial 00001 or Serial 00002.

TPM: Whats the machine learning stack you use? I presume it is TensorFlow.

Jim Brase: We use TensorFlow extensively. We use PyTorch extensively. We work with the DeepChem group at Stanford University that does an open chemistry package built on TensorFlow as well.

TPM: If you could fire up an exascale machine today, how much would it help in the fight against COVID-19?

Jim Brase: It would help a lot. There's so much to do.

I think we need to show the benefits of computing for drug design, and we are concretely doing that now. Four years ago, when we started up ATOM, everybody thought this was nuts: the general idea that we could lead with computing rather than experiment, and do the experiments to focus on validating the computational models rather than the other way around. Everybody thought we were nuts. As you know, with the growth of data, the growth of machine learning capabilities, more accessibility to sophisticated molecular dynamics, and so on, it's much more accepted that computing is a big part of this. But we still have a long way to go.

The fact is, machine learning is not magic. It's a fancy interpolator. You don't get anything new out of it. With the physics codes, you actually get something new out of it. So the physics codes are really the foundation of this. You supplement them with experimental data, because they're not necessarily right, either. And then you use the machine learning on top of all that to fill in the gaps, because you haven't been able to sample that huge chemical and protein space adequately to really understand everything at either the data level or the mechanistic level.

So that's how I think of it. Data is truth, sort of, though what you also learn is that it is not always the same as you go through this. But data is the foundation. Mechanistic modeling allows us to fill in where we just can't measure enough data: it is too expensive, it takes too long, and so on. We fill in with mechanistic modeling, and then above that we fill in with machine learning. We have this stack of experimental truth, mechanistic simulation that incorporates all the physics and chemistry we can, and then machine learning to interpolate in those spaces to support the design operation.

For COVID-19, there are a lot of groups doing vaccine designs. Some of them are using traditional experimental approaches, and they are making progress. Some of them are doing computational designs, and that includes the national labs. We've got 35 designs done, and we are experimentally validating those now and seeing where we are with them. It will generally take two to three iterations of design, then experiment, then adjusting the designs back and forth. And we're in the first round of that right now.

One thing we're all doing, at least on the public side of this, is putting all this data out there openly. So the molecular designs that we've proposed are openly released. Then the validation data that we are getting on those will be openly released. This is so our group, working with other lab groups, university groups, and some of the companies doing this COVID-19 research, can all contribute. We are hoping that by being able to look at all the data that all these groups are producing, we can learn faster how to narrow in on the vaccine designs and the antibody designs that will ultimately work.

The rest is here:
One Supercomputers HPC And AI Battle Against The Coronavirus - The Next Platform

RMACC’s 10th High Performance Computing Symposium to Be Held Free Online – HPCwire

BOULDER, Colo., April 22, 2020 – The Rocky Mountain Advanced Computing Consortium (RMACC) will hold its 10th annual High Performance Computing Symposium as a multi-track on-line version on May 20-21. Registration for the event will be free to all who would like to attend.

The on-line Symposium will include presentations by two keynote speakers and a full slate of tutorial sessions. Another longtime Symposium tradition, a poster competition for students to showcase their own research, also will be continued. Competition winners will receive an all-expenses-paid trip to SC20 in Atlanta.

Major sponsor support is being provided by Intel, Dell and HPE with additional support from ARM, IBM, Lenovo and Silicon Mechanics.

Links to the Symposium registration, its schedule, and how to enter the poster competition can be found at www.rmacc.org/hpcsymposium.

The keynote speakers are Dr. Nick Bronn, a Research Staff Member in IBM's Experimental Quantum Computing group, and Dr. Jason Dexter, a working group coordinator for the groundbreaking black hole imaging studies published by the Event Horizon Telescope.

Dr. Bronn serves at IBM's TJ Watson Research Center in Yorktown Heights, NY. He has been responsible for qubit (quantum bit) device design, packaging, and cryogenic measurement, working towards scaling up larger numbers of qubits on a device and integration with novel implementations of microwave and cryogenic hardware. He will speak on the topic "Benchmarking and Enabling Noisy Near-term Quantum Hardware."

Dr. Dexter is a member of the astrophysical and planetary sciences faculty at the University of Colorado Boulder. He will speak on the role of high performance computing in understanding what we see in the first image of a black hole. Dr. Dexter is a member of both the Event Horizon Telescope and VLTI/GRAVITY collaborations, which can now image black holes.

Their appearances, along with the many tutorial sessions, continue the RMACC's annual tradition of showcasing cutting-edge HPC achievements in both education and industry.

The largest consortium of its kind, the RMACC is a collaboration among 30 academic and government research institutions in Arizona, Colorado, Idaho, Montana, Nevada, New Mexico, Utah, Washington and Wyoming. The consortium's mission is to facilitate widespread effective use of high performance computing throughout the 9-state intermountain region.

More about the RMACC and its mission can be found at the website: www.rmacc.org.

About RMACC

Primarily a volunteer organization, the RMACC is a collaboration among 30 academic and research institutions located in Arizona, Colorado, Idaho, Montana, Nevada, New Mexico, Utah, Washington and Wyoming. The RMACC's mission is to facilitate widespread effective use of high performance computing throughout this 9-state intermountain region.

Source: RMACC

Read the rest here:
RMACC's 10th High Performance Computing Symposium to Be Held Free Online - HPCwire

Cryptocurrency predictions: what coins to choose in 2020? – Capital.com

Overview

The cryptocurrency sector has been in existence since 2009, the year Bitcoin (BTC), the top crypto based on market capitalisation, came into being.

Eleven years down the line, cryptocurrencies are now a part of modern society, with their number surpassing 3,000.

Cryptocurrency projections are crucial in distinguishing the profitable cryptocurrencies, based on optimal return on investment (ROI). As a result, cryptocurrency price predictions are fundamental in determining which crypto coins to invest in during 2020.

The coronavirus pandemic has been wreaking havoc so far this year, and this has triggered shocks in the financial markets. On March 12, commonly referred to as Black Thursday, the global markets nosedived, and cryptocurrencies were not spared.

For instance, Bitcoin lost 50 per cent of its value as its price fell to $3,800. Since then, it has witnessed a steady increase and is currently trading at $6,856.

2020, therefore, seems to be a bright year for cryptocurrencies based on the strides being made by Bitcoin. For instance, payment giant Visa (V) recently partnered with Fold, a crypto startup, to provide a credit card through which consumers will be rewarded with Bitcoin for spending at major companies and outlets like Uber (UBER), Amazon (AMZN), Airbnb, Nike (NKE), Domino's (DPZ) and Starbucks (SBUX). This move seeks to propel Bitcoin's mainstream adoption.

Cryptocurrency forecasts have been rife, and 2019 proved to be a great year in the crypto space. For instance, cryptocurrency adoption skyrocketed compared to the previous year.

Despite cryptocurrencies performing commendably in 2019, some stood out as better than others, and their bullish run continues in 2020.

Bitcoin continued its dominance as one of the best crypto performers in 2019. It started the year at $3,742 and reached an all-time high of $12,407 in June 2019. Its overall performance was outstanding because it closed the year at $7,293, up by 95 per cent.

One of the factors that prompted Bitcoins commendable performance was that crypto exchanges boosted its adoption through margin trading. As a result, BTCs trading volume was elevated.

At the time of writing, Bitcoin's price stood at $7,161.

Bitcoin Cash proved to be another significant player in the crypto space in 2019. BCH is a hard fork of Bitcoin, meaning it was crafted out of the leading cryptocurrency. Its price stood at $135 in mid-January 2019, and an all-time high of $480 was witnessed on June 26.

By the end of December 2019, BCH was hovering around $210, representing a 55.5 per cent hike. Presently, Bitcoin Cash is trading at $233.

Being one of the best crypto performers in 2019, NEO acts as both a cryptocurrency and an open-source platform that developers use to create decentralised applications or Dapps. Neo is currently ranked at position twenty-two among the leading cryptocurrencies according to CoinMarketCap.

NEO emerged as one of the best performing cryptos based on the blockchain ecosystem it offers developers. At the start of 2019, it was trading at $7.50, but it closed the year at $9, up by 20 per cent. Presently, its price stands at $7.45.

The coronavirus pandemic has compromised the start of 2020, as curbing it has necessitated measures such as lockdowns, social distancing and quarantines.

There is no doubt that the Covid pandemic has made the global economy plummet, as many sectors have ground to a standstill. As a result, governments have found themselves forced to adopt solutions such as quantitative easing and zero interest rates.

Cryptocurrency predictions show that this may be an advantage for the crypto space, as printing money typically devalues a currency and inflation becomes inevitable. Cryptocurrency predictions for 2020 remain quite positive, in spite of the terrible circumstances.

Now, you may be asking yourself which crypto coins to invest in for 2020 amid the tough times being witnessed across the globe. To answer this question, we have compiled below a list of digital coins with the best cryptocurrency forecasts.

Coronavirus affects markets

Bitcoin has been setting the ball rolling in the crypto space, as it has proven to be one of the best crypto performers over the years. A BTC bullish run is expected in 2020 based on the much-anticipated Bitcoin halving event that is approximately four weeks away. The event happens every four years: mining rewards are reduced by 50 per cent, decreasing the rate at which new Bitcoin is supplied.

During this year's halving event, the mining reward will be reduced from 12.5 BTC to 6.25 BTC. Judging from the previous two halving events, in 2012 and 2016, Bitcoin's price is speculated to increase as supply falls and demand rises.
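The reward arithmetic described above can be sketched in a few lines of Python. The 210,000-block interval and 50 BTC initial reward are public protocol constants; the block heights below are only there to illustrate the 2020 event.

```python
# Sketch of Bitcoin's halving schedule: the block reward starts at
# 50 BTC and halves once every 210,000 blocks.
HALVING_INTERVAL = 210_000
INITIAL_REWARD = 50.0

def block_reward(height: int) -> float:
    """Mining reward in BTC at the given block height."""
    halvings = height // HALVING_INTERVAL
    return INITIAL_REWARD / (2 ** halvings)

# The 2020 halving occurs at block 630,000, the start of the fourth epoch:
assert block_reward(629_999) == 12.5  # just before the event
assert block_reward(630_000) == 6.25  # just after
```

Each epoch simply divides the previous reward by two, which is why the supply growth rate drops by half at every halving.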

Following the Bitcoin halving in 2016, a bull run was experienced, and BTC set a new record in December 2017 after hitting $20,000. Investors are optimistic a similar trend will be witnessed in 2020.

One of the bedrocks of any asset is scarcity, and this is reinforced by Bitcoin halving events as supply is slashed. It is the reason why pundits are betting on BTC's price skyrocketing in 2020.

Additionally, some analysts are optimistic that the Federal Reserve's (Fed's) decision to print more dollars, to the tune of $6 trillion, to keep the American economy afloat will be advantageous to Bitcoin: as the dollar depreciates, it will attract more interest from people.

These cryptocurrency projections seem to make Bitcoin an ideal investment vehicle in 2020.

Over the years, Ethereum has almost played catch-up with Bitcoin. Along with Bitcoin, ETH is one of the best crypto performers. After all, it is favoured for its blockchain ecosystem that enables developers to create smart contracts and dapps.

Currently, the price of ETH stands at $170, and it has witnessed a spike in the first quarter of 2020, hitting a high of $281 in February.

Based on the latest cryptocurrency price predictions, Ethereum will be among the best performing cryptos in 2020, as investors are banking on its transition from the Proof of Work (PoW) network to a Proof of Stake (PoS) blockchain that will make generation of the currency less cumbersome.

Ripple, the third-largest cryptocurrency, currently standing at $0.1875, is a notable investment in 2020 because of its robust network. It has emerged as a formidable force in payment, banking and international commerce systems.

Ripple funded MoneyGram, a remittance giant, with a whopping $11 million in 2019 so that it would utilise Ripple's blockchain-based payment network. Cryptocurrency forecasts show Ripple's services are being sought after by reputable companies, making it an ideal investment asset in 2020.

NEO breaks many stereotypes as it is the first open-source cryptocurrency. As a result, it has the capability of transforming existing financial networks by intertwining real and digital assets.

These cryptocurrency projections place it as one of the best crypto performers of 2020.

When you ask yourself which crypto coins to invest in for 2020, cryptocurrency price predictions can be part of the research you do to make your decision. Bitcoin, Ethereum, Ripple and Neo are touted to be among the best performing cryptos this year.

If you think you are not ready to make long-term investment commitments, but still want to try to profit from the crypto volatility, you can do so through contracts for difference (CFD).

You can learn more about CFD trading with free online courses and find out how to trade crypto CFDs by reading our comprehensive guide.

Join Capital.com to always stay up to date with the latest crypto price news and spot the best trading opportunities.


Original post:
Cryptocurrency predictions: what coins to choose in 2020? - Capital.com

This AI-focused cryptocurrency is up 300%, but on-chain fundamentals spell trouble – CryptoSlate

While the global financial markets were melting down as coronavirus spread throughout the planet, there was one cryptocurrency that managed to weather the storm.

Numeraire, an altcoin brought to life to incentivize scientists to predict financial models using encrypted data, was able to enter a parabolic advance that has seen its price skyrocket over 3x in the past three months. The AI-centric cryptocurrency went from trading at a low of $6 in late January to recently hitting a high of $24.

The bull run that NMR experienced in such a short period of time is quite impressive given the current economic environment. However, everything could be part of a pump-and-dump scheme, according to Santiment.

The behavior analytics firm argued that Teeka Tiwari and his Palm Beach Research Group could be behind Numeraire's sudden bullish impulse.

Dino Ibisbegovic, head of content and SEO at Santiment, explained:

As part of his Palm Beach Confidential programme, last month Teeka picked 5 coins that he believes will make you a multimillionaire. According to multiple people, NMR was included here as well. Teeka's coin picks often have a way of becoming a self-fulfilling prophecy, as PBC hopefuls swarm to buy the suggested tokens in bulk, making their charts look like it's 2017 all over again.

Even though it is impossible to determine how much of the pump can be attributed to Tiwari, on-chain data reveals that one massive whale was preparing for this price action.

Indeed, a few weeks before Numeraire began surging, there was a significant spike in the token's exchange inflow. One address sent 100,000 NMR, worth roughly $620,000 at the time, to a Bittrex wallet on Jan. 26.

"This transfer represents one of the biggest single-day moves in Numeraire's history," affirmed Ibisbegovic.

Now that the AI-focused altcoin is up more than 300 percent, there are no fundamental metrics that support a further increase in its price.

Ibisbegovic pointed out that the number of active addresses is declining, as is network growth. The downtrend in these on-chain indexes can be considered a negative sign.

Ibisbegovic said:

Unless the coin's on-chain activity makes a strong U-turn in the very near future, it's going to be increasingly difficult for the coin to support a sustained rally.

Given the state of Numeraire's network, the entities behind the exponential upswing could now be preparing to dump their tokens, as Marcel Burger, founder of digital assets consultancy BurgerCrypto.com, stated:

[It's] time to get rid of some NMR here. Was a nice ride. Happy to buy back in at lower levels again. Numerai is still one of those projects I really like, but thanks to Palm Beach Confidential pumping it, I'd rather sell here to buy back lower.

Even if NMR continues surging, market participants must remain cautious about what is happening behind closed doors to avoid getting caught on the wrong side of the trend.

Numeraire, currently ranked #75 by market cap, is up 5.33% over the past 24 hours. NMR has a market cap of $54.33M with a 24 hour volume of $995.55K.


See the article here:
This AI-focused cryptocurrency is up 300%, but on-chain fundamentals spell trouble - CryptoSlate

Cryptocurrency Market Update: COVID-19 crisis can take BTC to the moon – FXStreet

Cryptocurrency experts believe that Bitcoin is well-positioned for a strong rally both ahead of and after the halving. The highly anticipated event will take place in the middle of May and will reduce Bitcoin's block reward. Halving is scheduled to take place every 210,000 blocks, or roughly every four years, until the maximum supply of 21 million bitcoins has been created by the network.
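Those two protocol parameters are also what produce the 21 million cap: summing the coins minted in each 210,000-block epoch, with the reward halved in whole satoshis each time (the reward eventually truncates to zero), converges just below 21 million BTC. A minimal sketch:

```python
HALVING_INTERVAL = 210_000               # blocks per halving epoch
INITIAL_REWARD_SATS = 50 * 100_000_000   # 50 BTC expressed in satoshis

def max_supply_btc() -> float:
    """Total BTC ever minted: add up each epoch's reward until the
    integer halving truncates the reward to zero."""
    reward, total = INITIAL_REWARD_SATS, 0
    while reward > 0:
        total += reward * HALVING_INTERVAL
        reward //= 2  # halve in whole satoshis
    return total / 100_000_000

print(max_supply_btc())  # just under 21,000,000
```

The geometric series 50 + 25 + 12.5 + … per block would sum to exactly 100 BTC per block-slot across all epochs, which is why the cap sits fractionally below the round 21 million figure once satoshi truncation is accounted for.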

According to Mati Greenspan, the founder of financial advisory outfit Quantum Economics, Bitcoin will outperform traditional assets due to the upcoming halving and flow of money injected in the financial system.

Bitcoin has been the best performing asset by far over the last year and over the last decade. With all the money being injected into the system at this time and the upcoming halving, I don't see any reason it wouldn't continue to outperform, he said as cited by Forbes.

BTC/USD has been trading above $7,500 since Friday, April 24. The price of the first digital coin has stayed mostly unchanged both on a day-to-day basis and since the beginning of Saturday. As the upside momentum has faded, BTC/USD bulls need to take out $7,600 to get the recovery back on track.

ETH/USD recovered from the intraday low of $186.04 to trade at $194.40 by press time. The second-largest coin has gained over 3% both on a day-to-day basis and since the beginning of the day amid short-term bullish sentiment and high volatility. The next critical resistance is created by the psychological $200.00 level.

XRP/USD broke above the upper range of the recent consolidation channel $0.1800-$0.1900 and settled at $0.1952 at the time of writing. The coin is moving within the short-term bearish trend in sync with the market. The volatility is high.

Original post:
Cryptocurrency Market Update: COVID-19 crisis can take BTC to the moon - FXStreet

Ethereum, XRP, Litecoin Turn Bullish on Bitcoins Strong Performance – Crypto Briefing

Investors appear to be growing optimistic about the future of the cryptocurrency market, which could lead to more upside price momentum.

The cryptocurrency market is back in the spotlight after a bullish impulse pushed up prices for most digital assets. Indicators show further upward potential for Ethereum, XRP, and Litecoin, though important resistance levels are holding them down.

Since the Mar. 12 crash, Ethereum has been making a series of higher highs and higher lows. The bullish momentum has taken its price up more than 117%.

The smart contract giant surged from a low of $90 to a recent high of $190.

Despite the substantial price recovery over the last month, the TD sequential indicator estimates that Ether may have more upward potential.

This technical index presented a buy signal the moment the current green two candlestick began trading above the preceding green one candlestick. If the bullish formation is validated by a further spike in demand, ETH could enter an upward countdown all the way up to a green nine candlestick.

Such a positive scenario seems likely given the amount of interest returning to the cryptocurrency industry, especially as Bitcoins halving event approaches.

Nonetheless, IntoTheBlock's In/Out of the Money Around Price model suggests that for Ether to continue reaching higher highs, it would first need to move past the $200 resistance level. Approximately 1.2 million addresses bought nearly 7.8 million ETH around this price level.

An increase in the buying pressure behind Ether could allow it to break above this massive supply wall. If this happens, the bulls will likely take control of ETH's price action, validating the outlook presented by the TD sequential indicator.

Under such circumstances, the next levels of resistance to watch out for are provided by the 127.2% and 161.8% Fibonacci retracement levels. These resistance barriers sit around $223 and $260, respectively.
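For readers unfamiliar with the notation, Fibonacci extension levels are projected from a completed price swing as low + ratio × (high − low). A quick sketch, using the $90 low and $190 high mentioned earlier for ETH as illustrative inputs only (the article does not state which swing points it used, so these outputs will not match its $223/$260 targets exactly):

```python
def fib_extensions(swing_low: float, swing_high: float,
                   ratios=(1.272, 1.618)) -> dict:
    """Project Fibonacci extension levels above a completed upswing."""
    span = swing_high - swing_low
    return {r: swing_low + r * span for r in ratios}

# Illustrative swing from the ETH recovery discussed above
levels = fib_extensions(90, 190)
print(levels)  # levels near $217 and $252 for this particular swing
```

Analysts pick different swing highs and lows, which is why quoted extension targets vary from chart to chart.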

Although everything seems to indicate that Ethereum has more room to go up, the global economic environment suggests otherwise. Thus, an important support level to pay close attention to sits around the 78.6% Fibonacci retracement level and the rising trendline.

A daily candlestick close below $172 may invalidate the bullish outlook and increase the odds of a further decline towards $155 or $142.

For the first time since December 2018, the TD sequential setup suggests that it is time to buy XRP based on the 1-month chart. This technical indicator presented a bullish signal in the form of a red nine candlestick that has morphed into a green one candle due to the price action seen this month.

If May's candlestick manages to move above April's monthly close, the bullish formation would likely be validated. This would indicate that XRP may surge for one to four monthly candlesticks or begin a new upward countdown.

Adding credence to the bullish outlook, the parabolic stop and reverse, or SAR, presented a buy signal on XRPs 1-day chart. Every time the stop and reversal points move below the price of an asset, it is considered to be a positive sign.

The parabolic SAR flip estimates that the direction of the trend for the cross-border remittances token changed from bearish to bullish.

Now, XRP would have to close above its 75-day exponential moving average to continue advancing. By turning this resistance level into support, the odds of a move towards the 200-day exponential moving average, which sits around $0.23, increase substantially.
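The exponential moving averages referenced throughout these charts weight recent prices more heavily than a simple average. A minimal sketch using the conventional smoothing factor k = 2 / (period + 1); note it seeds with the first price, whereas charting platforms often seed with a simple moving average, so values can differ slightly:

```python
def ema(prices: list, period: int) -> float:
    """Return the latest exponential moving average of a price series."""
    k = 2 / (period + 1)   # conventional smoothing factor
    value = prices[0]      # seed with the first observation
    for price in prices[1:]:
        value = price * k + value * (1 - k)
    return value

# A flat series leaves the EMA unchanged; rising prices pull it upward.
print(ema([10, 10, 10, 10], 3))  # 10.0
```

Longer periods (75-day, 200-day) make k smaller, so those averages react more slowly and act as the stronger support/resistance lines cited above.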

It is worth mentioning that XRP has been in a multi-year downtrend since the January 2018 peak. Since then, this altcoin has been making a series of lower lows and lower highs. As a result, until it closes above the Feb. 15 high of $0.35, every bullish signal must be taken with caution.

Like the altcoins previously mentioned, Litecoin is also signaling that it is ready to resume its uptrend and climb higher. However, the 50-day exponential moving average is holding strong, preventing LTC from achieving its upside potential.

Since the beginning of the month, this barrier has been able to reject the price of Litecoin twice. Considering that resistance weakens the more times it is tested, sooner or later it could turn into support.

Breaking above the 50-day exponential moving average might send LTC towards the 100 or 200-day exponential moving average. These resistance levels sit at $49 and $54, respectively.

On the downside, however, investors should pay close attention to the 23.6% Fibonacci retracement level, since failing to hold it could jeopardize the bullish outlook.

An increase in the selling pressure behind LTC that allows it to close below this support level could trigger a sell-off among market participants. Such a bearish impulse would likely send Litecoin down to try to find support around $39.

Regardless of the havoc that the pandemic has caused in the global financial markets, investors appear to be growing optimistic about what the cryptocurrency industry has to offer. Now, even Bloomberg analysts are bullish on Bitcoin, stating that this year it could transition toward a quasi-currency like gold.

The TIE has also seen an impressive rise in the number of Bitcoin tweets mentioning the halving. The cryptocurrency insights provider affirmed that halving mentions on Twitter surged over 63%. Meanwhile, the overall conversations on social media about BTC are up 6%.

As the block rewards reduction event approaches, the focus appears to be shifting towards Bitcoin. Nonetheless, the high levels of correlation in the cryptocurrency market suggest that an increase in the price of the flagship cryptocurrency could see the entire market following suit.


Here is the original post:
Ethereum, XRP, Litecoin Turn Bullish on Bitcoins Strong Performance - Crypto Briefing

Analyst Says One Cryptocurrency Is Defying Laws of Physics As Bitcoin (BTC) Whales Rise in Record Numbers – The Daily Hodl

Tezos (XTZ) continues to impress analysts as its price recovers rapidly from the crypto-wide crash in March.

The digital asset is trading at $2.37 at time of writing, up from around $1.36 in mid-March. A pseudonymous analyst who goes by the name Teddy tells his 30,000 followers on Twitter that Tezos is now flat out defying the laws of gravity.

Fellow analyst and Cointelegraph contributor Michaël van de Poppe says the asset is currently battling an important line of resistance. If it can break through, a move to $2.60 could be in the cards. If not, a correction is likely in store.

Tezos is a smart contract proof-of-stake blockchain designed to allow holders of the coin to help power the network and earn rewards in return. XTZ is currently the 10th largest crypto asset by market cap, according to CoinMarketCap.

Weiss Ratings also recently ranked Tezos first among 120 cryptos in terms of technology, beating out leading cryptocurrencies like Bitcoin, Ethereum and XRP.

As Tezos and other assets continue to recover from March's Covid-19-related plummet, crypto whales appear to be accumulating Bitcoin at levels not seen since 2017.

The number of trading entities holding at least 1,000 BTC increased prior to the price drop last month and then accelerated during and after the crash, according to crypto analytics firm Glassnode.

Researchers say this is an optimistic sign that the largest crypto investors on the ledger believe BTC is poised for further growth. Analyst Cole Carner agrees.

Meanwhile, analyst Josh Rager says BTC is still significantly correlated with the stock market, and a plunge in traditional markets could trigger another move to the downside.

BTC still moving steady with stock market. Again, not a 1:1 correlation, but no reason to draw $2k to $3k BTC meme charts. Unless the S&P 500 tanks, Bitcoin will hold above $6k. Potential future pullbacks in both markets, but until stocks drop, I'm not bearish on Bitcoin price.

Originally posted here:
Analyst Says One Cryptocurrency Is Defying Laws of Physics As Bitcoin (BTC) Whales Rise in Record Numbers - The Daily Hodl

Taxes Revolutionizing the Cryptocurrency Industry, Singapore Sets New Rules – Coin Idol

Apr 25, 2020 at 09:29 // News

As cryptocurrencies and blockchain technology gain root in the world, governments are becoming more involved in the industry through taxation.

Various countries have expressed various moods toward cryptocurrencies. The truth is, cryptocurrencies have taken the world by storm since first launching in 2009 and growing a hundredfold in a short span. Many countries are still in the process of developing laws that touch every aspect of the crypto industry, including taxation. Given the volatile and virtual nature of cryptocurrencies, it has been a bit of a challenge for policymakers to cover them in full.

For instance, the United States now wants every citizen with savings in their cryptocurrency wallets to pay income tax. The US Internal Revenue Service (IRS) sent at least 10,000 emails to cryptocurrency users urging them to clear off their taxes, as coinidol.com, a world blockchain news outlet, has reported.

Recently, Singapore also joined the league of countries imposing some sort of fiscal policy to guide the crypto industry. The Inland Revenue Authority of Singapore (IRAS), in a 14-page statement titled "Income Tax Treatment of Digital Tokens" released on its official site on April 17, 2020, clarified a number of income tax treatments for transactions involving digital tokens. The same document also touched on Initial Coin Offering (ICO) operations.

The good news is that the countries regulating cryptocurrency operations through fiscal policies are not banning the use of such tokens outright, but want some return. Most countries are friendly to cryptocurrency in one way or another. According to the list of the most tax-friendly countries for cryptocurrency compiled by the international law firm Law & Trust, Australia has the best taxation climate for the growth of the industry, followed by Argentina, Belarus, Bulgaria and the UK.

Generally, governments are concerned about the issue of cryptocurrency taxation. Without a proper framework, cryptocurrency might be used for concealing profits and avoiding taxes, fuelling the growth of the shadow economy.

The implications of such moves for the cryptocurrency world can be either negative or positive. Supportive fiscal policies will see a boom in crypto investment, while stricter rules will lead to reduced investment. For instance, Singapore's waiver on some cryptocurrency transactions is hoped to attract more interest in the crypto industry. Moreover, stricter management of transactions by the ministry could also help reduce cybercrime involving cryptocurrencies.

Notwithstanding the recent government interventions, cryptocurrencies are here to stay and the growth of digital currencies is inevitable.

Original post:
Taxes Revolutionizing the Cryptocurrency Industry, Singapore Sets New Rules - Coin Idol