Could artificial intelligence have predicted the COVID-19 coronavirus? – Euronews

The use of artificial intelligence is now the norm in many industries, from safety systems in autonomous vehicles to algorithms that improve advertising campaigns. But could its use in healthcare also help us predict the outbreak of a virus such as the COVID-19 coronavirus?

Since the first cases were seen at the end of December 2019, coronavirus has spread from Wuhan, China, to 34 countries around the world, with more than 80,000 cases recorded. A hospital was built in Wuhan in 10 days to provide the 1,000 beds needed for those who had fallen ill with the virus; 97 per cent of reported cases are in China.

The World Health Organisation (WHO) has said the world should prepare for a global coronavirus pandemic. The virus can be spread from person to person via respiratory droplets expelled when an infected person coughs or sneezes. According to the WHO: "Common signs of infection include respiratory symptoms, fever, cough, shortness of breath and breathing difficulties. In more severe cases, infection can cause pneumonia, severe acute respiratory syndrome, kidney failure, and even death."

AI developers have suggested that the technology could have been used to flag irregular symptoms before clinicians realised there was a developing problem. AI could alert medical institutions to spikes in the number of people suffering from the same symptoms, giving them two to four weeks' advance warning, which in turn could allow them time to test for a cure and keep the public better informed.
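That kind of early warning comes down to spotting an unusual jump in the daily count of a given symptom. A minimal sketch of such a spike check is below; the counts, window and threshold are illustrative assumptions, not any system health agencies actually run.

```python
# Minimal sketch of the kind of spike detection described above: flag days where
# reports of a given symptom run unusually far above their recent baseline.
# The counts, window and threshold are illustrative assumptions only.
from statistics import mean, stdev

def flag_spikes(daily_counts, window=14, z_threshold=3.0):
    """Return indices of days whose count exceeds the trailing mean by z_threshold sigmas."""
    alerts = []
    for i in range(window, len(daily_counts)):
        baseline = daily_counts[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and (daily_counts[i] - mu) / sigma > z_threshold:
            alerts.append(i)
    return alerts

# Example: a fortnight of ordinary counts followed by a sudden surge.
counts = [12, 9, 11, 10, 13, 12, 8, 10, 11, 9, 12, 10, 11, 13, 42]
print(flag_spikes(counts))  # -> [14], the surge day
```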

As the virus continues to spread, AI is now being used to help predict where in the world it will strike next. The technology sifts through news stories and air traffic information, in order to detect and monitor the spread of the virus.

Read more:
Could artificial intelligence have predicted the COVID-19 coronavirus? - Euronews

Artificial Intelligence, is the Future of Human Resources. – 401kTV

Artificial intelligence (AI) takes the lead over intelligent automation (IA). Intelligent automation is the combination of robotic process automation and artificial intelligence to automate processes, according to a recent article on the topic in HR Dive, a publication for human resources professionals. Organizations that embrace intelligent automation may experience a return on investment of 200% or more, according to an Everest Group report cited by HR Dive. However, that doesn't mean organizations can expect a reduction in headcount, according to the report. In fact, projections of a reduction in workforce thanks to intelligent automation may be highly exaggerated, the Everest Group noted.

The Everest Group identified eight companies it called Pinnacle Enterprises: companies distinguished by their advanced intelligent automation capabilities and their superior outcomes. These companies generated about 140% ROI and reported more than 60% cost savings, thanks to artificial intelligence and intelligent automation. The companies the Everest Group identified as Pinnacle Enterprises also experienced a 67% improvement in operational metrics, compared to the 48% improvement reported by other organizations. The Pinnacle Enterprises also experienced improvements in their top lines, time-to-market, and customer and employee experiences as a result of using artificial intelligence and intelligent automation in their businesses, according to the Everest Group report.

Technology, particularly artificial intelligence and now intelligent automation, is infiltrating businesses little by little, particularly in the human resources space. In fact, artificial intelligence in HR has been cited as a top employee benefits trend for 2020. It's a trend employers would do well to pay attention to, especially since cost savings and ROI seem to be significant potential positive outcomes of adopting such technologies.

Technology such as artificial intelligence and intelligent automation makes human resources more efficient. According to a Hackett Group report from 2019, HR departments in organizations that leverage automation technology can do more with fewer resources, an important distinction for a department that's often considered the heart of an organization and that typically has more work than staff to complete it. In addition, the use of artificial intelligence and intelligent automation is a hallmark of a distinguished organization. Per the Hackett Group data, cited by HR Dive, world-class HR organizations leverage [artificial intelligence] and, as a result, have costs that are 20% lower than those of non-digital organizations and provide required services with 31% fewer employees.

Despite the apparent benefits, not everyone is a fan of automated technologies such as artificial intelligence and intelligent automation. Professors at the Wharton School of the University of Pennsylvania and ESSEC Business School, an international higher education institution located in France, Singapore and Morocco, cautioned employers about the potential downsides of using artificial intelligence and intelligent automation in human resources functions. Specifically, they warned that artificial intelligence could create problems for human resources because it is unable to measure some HR functions and infrequent employee activities that generate little data, can provoke negative employee reactions, and is constrained by ethical and legal considerations. However, human resources professionals are finding some success in using artificial intelligence and intelligent automation to perform functions such as searching through resumes for keywords and assisting with other recruiting tasks.

Despite the concerns of some, it's likely that artificial intelligence and intelligent automation will continue to command a presence in human resources. As such, automation will prompt organizations to make a heftier investment in talent, noted a study by MIT Sloan Management Review and Boston Consulting Group's BCG GAMMA and BCG Henderson Institute. The study found that employers who successfully embrace artificial intelligence and intelligent automation will build technology teams in-house and rely less on external vendors. They'll also poach artificial intelligence talent from other companies and upskill current employees to be on the front lines of the automation movement. Artificial intelligence and intelligent automation are here to stay, and they're only getting more pervasive, especially in human resources and employee benefits. Employers should be ready.

Steff C. Chalk is Executive Director of The Retirement Advisor University, a collaboration with UCLA Anderson School of Management Executive Education. Steff also serves as Executive Director of The Plan Sponsor University and is current faculty of The Retirement Adviser University.

See the original post:
Artificial Intelligence, is the Future of Human Resources. - 401kTV

Artificial intelligence and machine learning for data centres and edge computing to feature at Datacloud Congress 2020 in Monaco – Data Economy

Vertiv EMEA president Giordano Albertazzi looks back on data center expansion in the Nordics and the region's role as an efficient best execution venue for the future.

At the start of the new year it's natural to look to the future. But it's also worth taking some time to think back to the past.

Last year was not only another period of strong data center growth globally, and in the Nordic region specifically, but also the end of a decade of sustained digital transformation.

There have been dramatic shifts over the last ten years, but the growth in hyperscale facilities is one of the most defining, and one with which the Nordic region is very well acquainted.

According to figures from industry analyst Synergy Research, the total number of hyperscale sites has tripled since 2013 and there are now more than 500 such facilities worldwide.

And it seems that growth shows no signs of abating. According to Synergy, in addition to the 504 current hyperscale data centers, a further 151 are at various stages of planning or building.

A good number of those sites will be located in the Nordics if recent history is anything to go by. The region has already seen significant investment from cloud and hyperscale operators such as Facebook, AWS and Apple. Google was also one of the early entrants and invested $800 million in its Hamina, Finland facility in 2010. It recently announced plans to invest a further $600 million in an expansion of that site.

I was lucky enough to speak at the recent DataCloud Nordics event at the end of last year. My presentation preceded that of Peter Harden, Google Cloud's country manager for Denmark and Finland, who described the company's growth plans for the region. Hamina is one of Google's most sustainable facilities, thanks in no small part to its Nordic location, which enables 100% renewable energy and innovative seawater cooling.

Continuing that theme of sustainability, if the last decade has been about keeping pace with data demand, then the next ten years will be about continued expansion but, importantly, efficient growth in the right locations, using the right technology and infrastructure. The scale of growth being predicted, billions of new edge devices for example, will necessitate a sustainable approach.

That future, we at Vertiv and others believe, will be based around putting workloads where they make most sense from a cost, risk, latency, security and efficiency perspective. Or, as industry analyst 451 Research puts it, the Best Execution Venue (BEV), a slightly unwieldy term but an accurate one. BEV refers to the specific IT infrastructure an app or workload should run on (cloud, on-premises or at the edge, for example) but could equally apply to the geographic location of data centers.

In that BEV future, the Nordics will become increasingly important for hosting a variety of workloads, but the sweet spot could be those that are less latency-sensitive, high-performance compute (HPC) for example, and can therefore benefit from stable, renewable and cheap power as well as an abundance of free cooling. Several new sub-sea cables coming online in the near future will also address some of the connectivity issues the region has faced.

A recent study by the Nordic Council of Ministers estimates that approximately EUR 2.2 billion was invested in the Nordics in initiated data centre construction works over the last 12 to 18 months (as of 2018), mainly in hyperscale and cloud infrastructure. This number could exceed EUR 4 billion annually within the next five to seven years because of increasing market demand and a pipeline of planned future projects.

Vertiv recently conducted some forward-looking research that appears to reinforce the Nordics' future potential. Vertiv first conducted its Data Center 2025 research back in 2014 to understand where the industry thought it was headed. In 2019, we updated that study to find out how attitudes had shifted in the intervening five years, a halfway point, if you will, between 2014 and 2025.

The survey of more than 800 data center experts covers a range of technology areas, but let's focus on a few that are important and relevant to the Nordics.

We mentioned the edge a little earlier when talking about BEV. Vertiv has identified four key edge archetypes that cover the edge use cases our experts believe will drive edge deployments in the future. According to the 2025 research, of those participants who have edge sites today, or expect to have edge sites in 2025, 53% expect the number of edge sites they support to grow by at least 100%, with 20% expecting an increase of 400% or more.

So along with providing a great venue for future colo and cloud growth, the Nordics, like other regions, is also likely to see strong edge growth. That edge demand will require not only new data center form-factors such as prefabricated modular (PFM) data center designs but also monitoring and management software and specialist services.

Another challenge around edge compute, and the core for that matter, is energy availability and, increasingly, access to clean, renewable energy.

The results of the 2025 research revealed that respondents are perhaps more realistic and pragmatic about the importance of and access to clean power than back in 2014. Participants in the original survey projected 22% of data center power would come from solar and an additional 12% from wind by 2025. That's a little more than one-third of data center power from these two renewable sources, which seemed like an unrealistic projection at the time.

This year's numbers for solar and wind (13% and 8% respectively) seem more realistic. However, importantly for Nordic countries with an abundance of hydropower, participants in this year's survey expect hydro to be the largest energy source for data centers in 2025.

The Data Center 2025 research also looked at one of the other big drivers for building capacity in the Nordics: access to efficient cooling.

According to the 2025 survey, around 42% of respondents expect future cooling requirements to be met by mechanical cooling systems. Liquid cooling and outside air also saw growth, from 20% in 2014 to 22% in 2019, likely driven by the more extreme rack densities being observed today. This growth in the use of outside air obviously benefits temperate locations like the Nordics.

In summary, if the last ten years have been about simply keeping up with data center demand, the next ten years will be about adding purposeful capacity in the most efficient, sustainable and cost-effective way: the right data center type, thermal and power equipment, and location for the right workloads.

If the past is anything to go by, the Nordics will have an important role to play in that future.

See the original post:
Artificial intelligence and machine learning for data centres and edge computing to feature at Datacloud Congress 2020 in Monaco - Data Economy

Artificial Intelligence learns the value of teamwork to form efficient football teams – University of Southampton

Published: 27 February 2020

Machine learning experts from the University of Southampton are optimising football team selection by using AI to value teamwork between pairs of players.

The new approach uses historic performance data to identify which player combinations are most important to a team, generating insights that can help select teams' most efficient line-ups and identify suitable transfer targets.

The study, led by PhD student Ryan Beal in the Agents, Interaction and Complexity (AIC) Group, has developed a number of teamwork metrics that can accurately predict team performance statistics, including passes, shots on target and goals.

Researchers presented their findings and hosted an AI in Team Sports workshop at this month's Association for the Advancement of Artificial Intelligence (AAAI) Conference in New York.

"We have tested our methods from games in the 2018 FIFA World Cup and the last two seasons of the English Premier League," Ryan says. "We found that we could select teams using the AI in a similar fashion to human managers and then also suggest changes that would improve the team.

"When looking at the results for the Premier League, the teamwork analysis identified Aymeric Laporte as one of the key players for Manchester City. He has been injured for much of this season which may explain their downturn in form compared to last season."

The Southampton team have used a number of machine learning techniques to assess teamwork values from the historic data and found that teams with higher teamwork levels are more likely to win. They then trained an optimisation method to assess the teamwork between pairs of players and compute a number of new metrics that they compare in their latest paper.
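The paper's own metrics aren't reproduced here, but the underlying idea of scoring pairs of players and then choosing the line-up with the highest total pairwise value can be sketched roughly as follows; the pair scores and the exhaustive search are illustrative assumptions, not the Southampton team's published method.

```python
# Illustrative sketch only: score player pairs from historic data (here, made-up
# pass-completion counts) and exhaustively pick a small line-up that maximises the
# total pairwise score. This is NOT the published method, just the general idea.
from itertools import combinations

# Hypothetical pairwise teamwork scores, e.g. successful passes between two players.
pair_score = {
    ("A", "B"): 30, ("A", "C"): 12, ("A", "D"): 5,
    ("B", "C"): 25, ("B", "D"): 8,  ("C", "D"): 20,
}

def team_value(team):
    """Sum of pairwise scores over all pairs in the selected team."""
    return sum(pair_score.get(tuple(sorted(p)), 0) for p in combinations(team, 2))

def best_team(players, size):
    """Pick the line-up of the given size with the highest total pairwise value."""
    return max(combinations(players, size), key=team_value)

team = best_team(["A", "B", "C", "D"], 3)
print(team, team_value(team))  # -> ('A', 'B', 'C') with value 67
```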

"While this work could be used as a tool to assist football managers, we think that the approach could also be extended into other domains where teamwork between humans is important, such as emergency response or in security," Ryan says.

Ryan has also presented his work to sporting industry experts at the StatsBomb Innovation in Football Conference at Stamford Bridge in October.

Ryan's work is supported by UK Research and Innovation (UKRI) and AXA Research Fund. The work was done in collaboration with Narayan Changder (NIT Durgapur), Professor Tim Norman and Professor Gopal Ramchurn.

Team sport performance is one of several innovative AI research topics being explored in the AIC Group. In 2018, Gopal and Dr Tim Matthews revealed how machine learning algorithms can accurately predict team and player performance to finish in the top 1% of the Fantasy Premier League game, outperforming close to six million human players.

Link:
Artificial Intelligence learns the value of teamwork to form efficient football teams - University of Southampton

Artificial Intelligence will take over Liverpool’s World Museum this summer – The Guide Liverpool

27/02/2020

In its first UK showing outside London, AI: More than Human surveys the creative and scientific developments within artificial intelligence (AI) through extraordinary international artworks and a plethora of interactive and immersive playful experiences.

Taking visitors on a unique and unexpected journey, this exhibition will explore the complex relationship between humans and technology in the past, the present and what we can expect in the future.

AI: More than Human will examine what it means to be human when sophisticated technology such as AI is changing so much around us, and asks big questions: What is consciousness? Will machines ever outsmart a human? How can humans and machines work collaboratively? Today, we are on the cusp of our next great era as a species, the augmented age, and this exhibition will connect the visitor to a world far beyond our natural senses.

Anne Fahy, Head of World Museum, said: "World Museum explores millions of years of Earth's history and the human activity that has shaped it. In this fascinating exhibition, we see just how long and how important the epic story of AI has been to human development."

Featuring remarkable work by leading scientists, researchers and artists, AI: More Than Human is an unmissable taste of the breadth of creativity that is being generated and inspired by algorithms and machines. It demonstrates the opportunity for us to push our creative boundaries, and the potential of exciting collaborations between humans and machines. It allows visitors to consider their own relationship with AI, and what the future may look like.

With so many opportunities for visitors to interact with the works, it is an exhibition for curious minds of all ages who want to play with, experience and understand this ubiquitous technology.

Neil McConnon, Head of Barbican International Enterprises, said: "Barbican is delighted to have an opportunity to collaborate with World Museum, Liverpool to stage AI: More than Human. We hope it inspires, informs and provides a space to discover and reflect on the multitude of complex implications of AI on our lives, both liberating and daunting."

AI: More Than Human is divided into four sections, starting with The Dream of AI, which looks at the origins and history of AI. This section focusses on the religious traditions of Judaism and Shintoism, the sciences of Arabic alchemy and early mathematics, and Gothic philosophies. It looks at how these beliefs and philosophies continue to influence our perception of and interaction with technology today, and how our fascination with creating beings goes far back to ancient times.

Section 2, Mind Machines, explains how AI has developed through history, charting the groundbreaking work of some of AI's founding figures, such as Ada Lovelace, Charles Babbage and Alan Turing. Mind Machines documents pioneering computing moments, including when AI was used to beat a professional chess player and even a human contestant on the US game show Jeopardy. A special commission explores the story of DeepMind's AlphaGo, the first computer program to defeat a professional human player at Go, the Chinese strategy game with origins going back 3,000 years. In addition to these immersive and interactive installations, this section presents artworks that use and respond to the ways AI sees images or understands language and movement, such as Anna Ridler's Myriad and Mosaic Virus, and Mario Klingemann's piece Circuit Training.

Another exhibition highlight is Sony's 2018 robot puppy aibo, which gradually develops a unique personality from its database of memories; visitors are encouraged to contribute to those memories by interacting and playing with aibo.

Section 3, Data Worlds, examines the capability of AI to change society, as well as looking at ethical issues such as bias, truth, control and privacy. It looks at AI's role in fields such as healthcare, journalism and retail. It features Learning to See by artist Memo Akten, who has worked with Nexus Studios to create an interactive work that invites visitors to manipulate everyday objects to illustrate how a neural network (a series of algorithms) can be fooled into seeing the world as a painting. Within this section, scientist, activist and founder of the Algorithmic Justice League Joy Buolamwini examines racial and gender bias using facial analysis software as part of Gender Shades, a project to reveal how prejudice can find its way into technology.

The final section, Endless Evolution, looks to the future of the human race, and where artificial life fits in. It features work by Massive Attack, who encoded their seminal album Mezzanine into synthetic DNA; Justine Emard's beguiling Co(AI)xistence, exploring communication between human and machine; and Yuri Suzuki's Electronium, enabling visitors to compose with AI. This final section explores the seemingly endless possibilities of AI to shape our lives.

Threaded throughout the exhibition are specially commissioned installations where visitors are encouraged to interact and engage. These include 2065, an open-world video game set on a virtual island by Lawrence Lek; Universal Everything's Future You, an uncanny installation where visitors can interact with an AI version of themselves; Es Devlin's PoemPortraits, which brings together art, design, poetry and machine learning; and Chris Salter's Totem, a 14-metre light installation that gives the feeling of a living, breathing entity.

AI has evolved to provide many benefits in every aspect of life, from fashion to art, music, medicine and even human rights. Our relationship with it is becoming more complex, and through this playful, interactive exhibition we aim to explore how pervasive artificial intelligence has become. Fusing innovation, art and technology, visitors are invited to immerse themselves in this multi-sensory exhibition.

Continue reading here:
Artificial Intelligence will take over Liverpool's World Museum this summer - The Guide Liverpool

Learn how to start using artificial intelligence in your newsroom (before it is too late) – Journalism.co.uk

The upcoming Newsrewired conference, taking place on 4 June at MediaCityUK, will feature a workshop where delegates will learn how to start implementing artificial intelligence (AI) in their everyday journalistic work.

The session will be led by Charlie Beckett, a professor in the Department of Media and Communications and founding director of Polis, the London School of Economics' international journalism think-tank.

Professor Beckett is the author of the study "New powers, new responsibilities. A global survey of journalism and artificial intelligence". He said that newsrooms have between two and five years to develop a meaningful strategy or risk falling behind their competitors.

"This is a marathon, not a sprint but theyve got to start running now.

"Youve got two years to start running and at least working out your route and if youre not active within five years, youre going to lose the window of opportunity. If you miss that, youll be too late," he said in an article for Journalism.co.uk.

"It's really clear if you look at other industries that AI is shaping customer behaviour. People expect personalisation, be that in retail or housing, for production, supply or content creation. They use AI because of the efficiencies that it generates and how it enhances the services or products it offers."

Charlie Beckett is currently leading the Polis Journalism and AI project. He was director of the LSE's Truth, Trust and Technology Commission, which reported on the misinformation crisis in 2018.

He is the author of "SuperMedia" (Wiley Blackwell, 2008), which set out how journalism is being transformed by technological and other changes. His second book, "WikiLeaks: News In The Networked Era" (Polity, 2012), described the history and significance of WikiLeaks and the wider context of new kinds of disruptive online journalism.

He was an award-winning filmmaker and editor at LWT, BBC and ITN.

To take advantage of our early-bird offer, book your ticket before 28 February 2020 and save £50.

If you like our news and feature articles, you can sign up to receive our free daily (Mon-Fri) email newsletter (mobile friendly).

View post:
Learn how to start using artificial intelligence in your newsroom (before it is too late) - Journalism.co.uk

Air Travelers Can't See All of It, but More Tech Is Moving Them Along – The New York Times

The time an airplane spends waiting for a gate after landing or waiting in line to take off could also be reduced. A group at SITA focused on airport management systems is helping to design technology that can synthesize data from many sources, including changing aircraft arrival times, weather conditions at destination airports and logistical issues to improve runway schedules and gate assignments.

Artificial intelligence software can also make a difference with rebooking algorithms, Mr. Etzioni said. When weather or mechanical issues disrupt travel, the airline's speed in recomputing, rerouting and rescheduling matters, he said.

The data streams get even more complex when the whole airport is considered, Ms. Stein of SITA said. A number of airports are creating a digital twin of their operations using central locations with banks of screens that show the systems, people and objects at the airport, including airplane locations and gate activity, line lengths at security checkpoints, and the heating, cooling and electrical systems monitored by employees who can send help when needed. These digital systems can also be used to help with emergency planning.

The same types of thermal, audio and visual sensors that can be used to supply data to digital twins are also being used to reduce equipment breakdowns. Karen Panetta, the dean of graduate engineering at Tufts University and a fellow at the Institute of Electrical and Electronics Engineers, said hand-held thermal imagers used before takeoff and after landing can alert maintenance crews if an area inside the airplane's engine or electrical system is hotter than normal, a sign something may be amiss. The alert would help the crew schedule maintenance right away, rather than be forced to take the aircraft out of service at an unexpected time and inconvenience passengers.
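In essence, that check compares a thermal reading against a normal baseline for the component and raises an alert when the reading runs too far above it; the baselines and margin below are invented purely for illustration.

```python
# Minimal sketch of the "hotter than normal" check described above.
# Baseline temperatures and the alert margin are illustrative assumptions.
BASELINE_C = {"engine_bearing": 85.0, "generator_winding": 70.0}

def needs_maintenance(component: str, reading_c: float, margin_c: float = 15.0) -> bool:
    """Flag a component whose thermal reading exceeds its normal baseline by more than margin_c."""
    return reading_c > BASELINE_C[component] + margin_c

print(needs_maintenance("engine_bearing", 112.0))     # True  -> schedule maintenance
print(needs_maintenance("generator_winding", 74.0))   # False -> within normal range
```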

At the moment, people, rather than technology, evaluate most of the data collected, Dr. Panetta said. But eventually, with enough data accumulated and shared, more A.I. systems could be built and trained to analyze the data and recommend actions faster and more cost effectively, she said.

Air travel isn't the only segment of the transportation industry to begin using artificial intelligence and machine learning systems to reduce equipment failure. In the maritime industry, a Seattle company, ioCurrents, digitally monitors shipping vessel engines, generators, gauges, winches and a variety of other mechanical systems onboard. The data is transmitted in real time to a cloud-based A.I. analytics platform, which flags potential mechanical issues for workers on the ship and on land.

A.I. systems like these and others will continue to grow in importance as passenger volume increases, Ms. Stein said. Airports can only scale so much, build so much and hire so many people.

Link:
Air Travelers Can't See All of It, but More Tech Is Moving Them Along - The New York Times

How Much $100 of Bitcoin Could Be Worth When the Last Coin is Mined – Bitcoinist

Everybody knows that Bitcoin mining will eventually cease and the last coin will be mined; the date for this is expected to be around the year 2140. So what could an investment of $100 now be worth in 120 years' time?

To estimate the price of Bitcoin well into the future, we need to look at growth models for the cryptocurrency. The two most well-known are Parabolic Trav's parabolic supertrend price growth model and Plan B's stock-to-flow (S2F) price model.

We'll also have to take into account that even if hyperbitcoinization does occur, and Bitcoin becomes global money used by everyone with no other kind of currency in existence, there are still limits to Bitcoin's price growth.

Hal Finney predicted back in 2009 that Bitcoin could reach a price of $10 million per coin. In Finney's estimate, he simply took an estimate of world household wealth and divided it by 21 million coins.

He arrived at $10 million per coin. Decrypt revisited the idea, recalculated with updated numbers and came to a price of $18 million per coin. Bitcoin's parabolic growth can only continue until there is no more wealth whose value can be converted to satoshis.

Parabolic Trav's parabolic supertrend model closely correlates with Plan B's stock-to-flow model. While many investors discount the idea of parabolic growth, Bitcoin has already grown 2,232,111,011.11% from the moment Martti Malmi sold the first Bitcoins for fiat currency in 2009 to its all-time high of $20K in 2017.

Bitcoin follows an S-curve of technological adoption, because while it is a currency it is also new technology, which is being adopted by new users at S-curve adoption rates.

See the similarities between Trav's parabolic price model and the S-curves of new technology adoption? They are both parabolic. We may see BTC follow the steeper curve exhibited by smartphones and the internet.

Stock-to-flow is how many years it would take to produce the current total supply of an asset. Gold's stock-to-flow is 62: it would take 62 years of mining to produce the current world supply.

Plan B's S2F model further supports this parabolic growth, showing the impact Bitcoin's halvings have on price. Bitcoin's S2F is currently around 25, but the halving will roughly double it to 50, much closer to gold's.

Plan B's chart overlays the parabolic price increase with the reduction in block rewards that occurs every 210,000 blocks (roughly every four years).

Plan B's model predicts a trillion-dollar Bitcoin market cap after the upcoming halving, or a projected price of roughly $55,000 per BTC.
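As a rough sanity check on those figures: stock-to-flow is simply current supply divided by annual production, the halving roughly doubles it by cutting the flow, and the ~$55,000 figure is just a $1 trillion market cap divided by the circulating supply. The supply and issuance numbers in this sketch are approximate.

```python
# Back-of-envelope stock-to-flow arithmetic; supply and issuance figures are approximate.
blocks_per_year = 6 * 24 * 365          # ~52,560 blocks at one block every ten minutes
supply = 18_200_000                     # approximate circulating BTC in early 2020

flow_before = 12.5 * blocks_per_year    # annual issuance before the May 2020 halving
flow_after = 6.25 * blocks_per_year     # annual issuance after it

print(round(supply / flow_before))      # ~28: in the region of the ~25 quoted above
print(round(supply / flow_after))       # ~55: roughly doubled, approaching gold's 62

# Implied price if the market cap reaches $1 trillion, as the model projects.
print(round(1_000_000_000_000 / supply))  # ~54,945 USD per BTC, i.e. the ~$55,000 figure
```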

Following Plan B's model, Digitalek.net projects the price of BTC in 2025 to be $1,215,730.50 per coin.

Credit Suisse estimates global household wealth to be $360 trillion in USD. Dividing this number by 21 million Bitcoin puts us at a price of $17,142,857 per BTC.

However, Chainalysis estimates that as many as 4 million BTC have been lost, so let's calculate with 17 million BTC. Using Finney's calculation with 17 million BTC instead of 21 million puts us at $21,176,470.58 per coin.

Assuming hyperbitcoinization occurs by 2140, $100 of BTC at today's price of $8,880 would buy about 0.01126 BTC (roughly 1.1 million satoshis). Those same satoshis could have a projected value of $238,373.77 by the time the last Bitcoin is mined in 2140.
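The arithmetic behind these projections is straightforward division, spelled out below using only the figures quoted in this article.

```python
# Reproducing the divisions quoted above: estimated global wealth spread over the coin supply,
# and the resulting value of a $100 position. All inputs are the article's own figures.
global_wealth = 360e12          # Credit Suisse estimate of global household wealth, USD
full_supply = 21_000_000        # maximum BTC supply
reachable_supply = 17_000_000   # supply after subtracting ~4M coins estimated lost

print(round(global_wealth / full_supply))        # ~17,142,857 USD per BTC
price_per_btc = global_wealth / reachable_supply
print(round(price_per_btc, 2))                   # ~21,176,470.59 USD per BTC

position_btc = 100 / 8_880                       # $100 bought at the quoted $8,880 price
print(round(position_btc, 5))                    # ~0.01126 BTC (about 1.13 million satoshis)
print(round(position_btc * price_per_btc, 2))    # ~238,474 USD, close to the $238,373.77 quoted
                                                 # above; the small difference is rounding
```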

How much do you think 1 BTC will be worth in 2140? Let us know in the comments!

Images via Shutterstock, charts by Market Realist, Planb, HCBurger1, Tradingview @ParabolicInvestor

Read the rest here:
How Much $100 of Bitcoin Could Be Worth When the Last Coin is Mined - Bitcoinist

World’s Top Crypto Miners Race to Roll Out Top-of-Line Machines Ahead of Bitcoin Halving – CoinDesk

Two of the largest bitcoin mining equipment manufacturers are in a neck-and-neck race to roll out top-of-the-line machines ahead of bitcoin's (BTC) halving event in less than three months.

On Thursday, Beijing-based mining giant Bitmain launched its latest AntMiner S19 and S19 Pro models, boasting computing power as high as 110 terahashes per second (TH/s) and an energy cost of 29.5 watts per terahash (W/T).

Going by the firm's specifications, the two models would currently be the most profitable bitcoin mining devices if available, closely followed by the WhatsMiner M30S from Bitmain's Shenzhen-based rival, MicroBT, according to a miner profitability index from f2pool.

The launch comes on the back of a heated battle between Bitmain and MicroBT, which has gained a significant share of the mining equipment business after selling about 600,000 units of its M20 series in 2019, chipping away at Bitmain's long-time market dominance.

MicroBT, which launched its flagship M30 models in December, has been taking pre-orders for its latest and most powerful product line since last week, with deliveries of sample units starting as early as next month.

According to MicroBT's major distributor Pangolin Miner, the M30S, priced at $2,430 apiece, touts a computing power of 86 TH/s with an energy cost of 38 W/T and uses 8-nanometer chips supplied by Samsung. The firm said some devices will ship from March to May, but large pre-orders would have to wait until as late as June.

On the other hand, prices and the pre-order/delivery dates for Bitmain's S19 models have not yet been announced. Adding to the uncertainty is whether Bitmain can deliver production on a large scale, since the latest models adopt 7-nm chips that come in limited supplies from its vendor, Taiwan Semiconductor Manufacturing Company.

It also remains to be seen how the industry will react to the releases of top-notch but more expensive mining equipment, as bitcoin's price has retracted from its recent growth momentum above $10,000.

Currently, Bitmain's older model, the AntMiner S9, is still one of the most widely used miners, generating a daily gross margin of about 30 percent at bitcoin's current price, based on f2pool's index.

Further, the coronavirus outbreak in China has affected the country's manufacturing and logistics businesses, causing delays for those that were looking to expand or upgrade existing mining facilities.

In fact, data from mining pool BTC.com shows bitcoin's mining difficulty, a measure of how hard it is to compete for mining rewards, has stagnated for a month and is currently around the same level seen on Jan. 28.

But with bitcoin's halving event approaching in May, a programmed-in change that will reduce the network's mining rewards from 12.5 BTC per block to 6.25, older models like the S9 will become unprofitable unless bitcoin's price increases significantly. As such, miners may have to either upgrade or get out of the industry.

The leader in blockchain news, CoinDesk is a media outlet that strives for the highest journalistic standards and abides by a strict set of editorial policies. CoinDesk is an independent operating subsidiary of Digital Currency Group, which invests in cryptocurrencies and blockchain startups.

Read the original here:
World's Top Crypto Miners Race to Roll Out Top-of-Line Machines Ahead of Bitcoin Halving - CoinDesk

Are Miners Prepared for the Halving of Bitcoin? – Cointelegraph

Anyone following crypto news has undoubtedly seen numerous articles that forecast Bitcoin's (BTC) valuation following the upcoming halving, slated to take place in May of this year. And although the price of Bitcoin is clearly important to the industry and investors at large, planning for the halving is particularly critical to cryptocurrency miners.

Once the halving occurs, the unfortunate truth is that the profitability of all but the most efficient mining operations will be greatly challenged. To stay in the green, many will either be forced to upgrade their equipment or to shut down their mining operations altogether.

However, careful planning can mitigate these risks, and there are several steps miners should take to set themselves up for sustained profitability in the wake of the halving. To understand all the factors at play, it's important to review what makes mining profitable in the first place. This includes:

The hash rate is the estimated number of terahashes per second that the Bitcoin network is performing. It is a general measure of the network's processing power and of how many times the network can attempt to add a block to the Bitcoin blockchain every second.

The hash rate is a good indicator of the network's health, and while it can't be precisely measured, it can be estimated from the current difficulty and Bitcoin's block confirmation times.
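For readers who want the arithmetic, one common estimate follows from how difficulty is defined: a block at the current difficulty takes on the order of difficulty * 2^32 hashes on average, so dividing by the average time between blocks gives an approximate network hash rate. The difficulty figure below is illustrative.

```python
# Estimating network hash rate from difficulty and average block time.
# A block at difficulty D takes roughly D * 2**32 hashes on average,
# so hash rate ~= D * 2**32 / seconds_per_block. Inputs below are illustrative.
def estimated_hashrate(difficulty: float, avg_block_seconds: float = 600.0) -> float:
    """Approximate network hash rate in hashes per second."""
    return difficulty * 2**32 / avg_block_seconds

d = 15_500_000_000_000  # illustrative difficulty, around early-2020 levels
hs = estimated_hashrate(d)
print(f"{hs / 1e18:.0f} EH/s")  # roughly 111 EH/s at this difficulty
```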

Mining Bitcoin is not easy, and it has only gotten harder as more miners have joined the network. The difficulty of mining a block correlates with the overall network hash rate, and thus with the competition. The more people trying to solve a block, the more difficult it is to do so.

Miners can increase their chances by employing high-powered application-specific integrated circuits that are efficient and always running. The ultimate goal is to solve a block that is worth more than it costs to solve. Miners can also improve odds by joining a mining pool, in which profits are shared with the other members of the pool and vice versa.

The halving is not expected to have a big impact on mining difficulty. Difficulty may adjust slightly to make up for no-longer-profitable miners leaving the network, which will allow the remaining miners to mine more profitably and drive the hash rate, price and difficulty forward in general.

The electrical efficiency of mining devices has a massive impact on overall profitability. If miners are expending excess energy and paying more in electrical costs than they receive as a result of solving a block, they're going to end up in the red.

Related: Bitcoin Mining's Electricity Bill: Is It Worth It?

A more efficient device will lead to greater profits in less time while also expending less energy, thus reducing costs. Such efficient machines are going to be needed to correct for the reduction in block reward following the halving. Machines such as the Antminer S9 are going to become essentially obsolete and will need to be replaced with newer, more efficient miners like the Antminer S17.

The power cost also has a big impact on profitability and is directly related to the power consumption as well as to the cost of electricity at the mining operation. As more efficient machines are needed to keep up with the reduced revenue following the halving, miners will need to run operations in a place with low energy costs.

Mining colocation centers offer high power capacity and low energy costs, along with several other benefits, such as 24/7 security and equipment oversight. It's strongly suggested that miners consider state-of-the-art facilities throughout the country to help them make the most of their operations at a fraction of the cost and consumption.

This is what halving is all about. The current block reward of 12.5 BTC will be halved to 6.25 in the spring, and the revenue of all miners on the network will be cut in half, as well. The only way to make up for this is to increase mining power and reduce operational costs.

The price of Bitcoin has historically responded well to previous halvings for those miners capable of remaining in the market after the fact. However, this has been the subject of considerable debate in the crypto community, and although opinions vary, the outlook is bullish.

The bottom line is that when the Bitcoin block reward halves, so will the total revenue generated by all miners. If the hash rate, power consumption and power cost all stay the same as they were before, it's likely that a mining operation will be unprofitable if the hardware hasn't been upgraded to remain competitive.
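To make that bottom line concrete, here is a rough daily-profit sketch: a machine's share of the network hash rate times the day's block rewards, minus electricity. The hardware specs, network hash rate, BTC price and power cost are illustrative assumptions, not a profitability forecast.

```python
# Rough daily-profit sketch for a single machine before and after the halving.
# Hardware specs, network hash rate, BTC price and power cost are illustrative assumptions.
def daily_profit(miner_ths, miner_watts, network_ehs, block_reward_btc,
                 btc_price_usd, power_cost_per_kwh):
    """Expected daily profit in USD: share of block rewards minus electricity."""
    share = miner_ths / (network_ehs * 1_000_000)              # convert EH/s to TH/s
    revenue = share * 144 * block_reward_btc * btc_price_usd   # ~144 blocks per day
    electricity = miner_watts / 1000 * 24 * power_cost_per_kwh
    return revenue - electricity

# An older ~14 TH/s, ~1,400 W machine vs a newer ~56 TH/s, ~2,500 W unit,
# at an assumed 110 EH/s network, $8,800 per BTC and $0.05 per kWh.
for name, ths, watts in [("older ~14 TH/s unit", 14, 1400), ("newer ~56 TH/s unit", 56, 2500)]:
    before = daily_profit(ths, watts, 110, 12.5, 8800, 0.05)
    after = daily_profit(ths, watts, 110, 6.25, 8800, 0.05)
    print(f"{name}: ${before:.2f}/day before halving, ${after:.2f}/day after")
```

With these assumed inputs, the older machine is marginal before the halving and loses money after it, while the newer unit stays profitable, which is the dynamic described above.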

When getting into this space, it's essential to keep emotions out of the equation. It's best to rely on the numbers and objectively analyze trends and key indicators to set oneself up for the best chance at success. This is certainly easier said than done in today's environment, as social media, family and friends have made it easier than ever to be influenced by outside sources. Still, it is important to understand that long-term trends are more indicative of where the market is headed than random fluctuations.

Looking back at the last two halving events in 2012 and 2016, both led to new market highs for Bitcoin's price within a year to a year and a half. No doubt the upcoming halving will impact the market, and although we can't know for certain what will happen, if the demand for Bitcoin remains the same and scarcity is greater, the expected response would be a price increase. By how much, it is hard to say.

For those committed to the long-term play, good planning and investing in the latest hardware seem prudent. Going a bit further, for those who don't host their own mining equipment, analyzing hosting options and locking in competitive pricing now in a multi-year contract can help manage costs in the coming months.

The best piece of advice for miners: assess your needs today, well in advance of the halving. As Benjamin Franklin once famously noted, "If you fail to plan, you are planning to fail."

The views, thoughts and opinions expressed here are the author's alone and do not necessarily reflect or represent the views and opinions of Cointelegraph.

Dave Perrill is the CEO at Compute North. A 25-year veteran of the IT and information security industry, Perrill has been keenly immersed in the cryptocurrency mining industry and blockchain technology since its formative days. He founded and subsequently sold two technology companies, including an Internet Service Provider/Managed Security Provider, SecureConnect, which was acquired by Trustwave Holdings in 2012. He also has extensive experience in networking, data center engineering, scaling large IT systems and security.

Excerpt from:
Are Miners Prepared for the Halving of Bitcoin? - Cointelegraph