Recognizing Lessons Learned From the First DNSSEC Key Rollover, a Year Later – CircleID

A year ago, under the leadership of the Internet Corporation for Assigned Names and Numbers (ICANN), the internet naming community completed the first-ever rollover of the cryptographic key that plays a critical role in securing internet traffic worldwide. The ultimate success of that endeavor was due in large part to outreach efforts by ICANN and Verisign which, when coupled with the tireless efforts of the global internet measurement community, ensured that this significant event did not disrupt internet name resolution functions for billions of end users.

At the 2019 Internet Measurement Conference (IMC) in Amsterdam last month, naming community leaders, including two Verisign technologists, presented a thorough examination of the 2018 Domain Name System Security Extensions (DNSSEC) root zone Key Signing Key (KSK) rollover. The multidisciplinary team's work on the subject, "Roll, Roll, Roll Your Root: A Comprehensive Analysis of the First Ever DNSSEC Root KSK Rollover," earned IMC's Distinguished Paper Award.

DNSSEC uses digital signatures based on public-key cryptography to make internet communications more secure. DNS-based communications protected by DNSSEC are much harder to falsify, so DNSSEC has been instrumental in helping to prevent so-called "man-in-the-middle" attacks, which rely on spoofing DNS data.
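
To make the mechanism concrete, here is a minimal sketch (not from the article) of DNSSEC validation using the open-source dnspython library: it fetches the root zone's DNSKEY RRset and checks the RRSIG signature covering it. For simplicity, the RRset is validated against itself; a real validating resolver would instead check the key set against a configured trust anchor such as the root KSK.

```python
# Minimal DNSSEC validation sketch using dnspython (illustrative only).
import dns.dnssec
import dns.message
import dns.name
import dns.query
import dns.rdatatype

# Ask a root server for the root zone's DNSKEY RRset, with the DNSSEC OK bit set.
query = dns.message.make_query(".", dns.rdatatype.DNSKEY, want_dnssec=True)
response = dns.query.tcp(query, "198.41.0.4")  # a.root-servers.net

# The answer section holds the DNSKEY RRset and the RRSIG RRset covering it.
rrsets = {rrset.rdtype: rrset for rrset in response.answer}
dnskey = rrsets[dns.rdatatype.DNSKEY]
rrsig = rrsets[dns.rdatatype.RRSIG]

# Raises dns.dnssec.ValidationFailure if the signature does not verify.
dns.dnssec.validate(dnskey, rrsig, {dns.name.from_text("."): dnskey})
print("Root DNSKEY RRset signature verified")
```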

Within any encryption protocol, it's important to update cryptographic keys periodically. In smaller, more self-contained environments this process can be relatively simple, but in the case of DNSSEC, the sheer scale of the DNS, the critical global importance of its infrastructure, and the tens of millions of globally distributed parties that rely on it made this key rollover uniquely challenging.

Through Verisign's role as the root zone maintainer and in operating two of the world's 13 authoritative root servers, we were honored to play a part in the rollover process, and perhaps even more importantly, to play a role in the critical measurement, analysis and study that allowed the rollover to take place without disrupting the security, stability and availability of the global DNS.

Verisign and others in the DNS community continue to study the successes and unexpected effects of the rollover (some of which we discussed in a blog post published earlier this year), with the goal of applying these insights to future rollovers.

KSK rollover experts from Verisign joined with other leaders in the naming community to discuss their findings with the larger internet research community at IMC Amsterdam 2019. IMC is one of the world's premier events focused on internet measurement. The Distinguished Paper Awards recognize important work in the area of internet measurement.

"Roll, Roll, Roll Your Root: A Comprehensive Analysis of the First Ever DNSSEC Root KSK Rollover" provides an in-depth analysis of events occurring before, during and after the 2018 KSK rollover from multiple perspectives, including those of root operators, resolver operators and end users. The paper's authors, Moritz Müller, Matthew Thomas (Verisign), Duane Wessels (Verisign), Wes Hardaker, Taejoong Chung, Willem Toorop and Roland van Rijswijk-Deij, identified several key challenges that will require careful consideration during the next KSK rollover.

Overall, the paper confirmed that both effective measurement and real-time observation were critical to the success of the 2018 KSK rollover and will be critical to any future efforts. The challenges encountered during the KSK rollover process would have been far more difficult to surmount without the active engagement of the global internet measurement community and without trust anchor telemetry. Looking forward to future rollovers, the paper recommends adding extended error codes for DNSSEC failures, introducing a standby key, and exploring out-of-band distribution of trust anchors via operating system updates.

You can read the paper and learn more about the KSK Rollover from our Labs page, and from ICANN.

Impacts of the Blockchain will be Gigantic on the Financial Industry Says Deltec Bank Bahamas – Press Release – Digital Journal

The cryptocurrency Bitcoin brought blockchain technology into the spotlight. Now, this technology is being utilized in many different industries. This post will look specifically at how the blockchain movement is impacting the financial industry.

What is Blockchain Technology?

According to Deltec Bank, Bahamas, blockchain technology uses distributed databases and cryptography to record transactions. These records are interlinked and form part of a continually expanding system that acts as a decentralized ledger.
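
To illustrate the "interlinked records" idea in code, here is a toy sketch (hypothetical, not any bank's or vendor's implementation): each block stores the hash of the previous block, so altering any past record invalidates every hash that follows it.

```python
# Toy hash-chained ledger; illustrative only, not a production blockchain.
import hashlib
import json


def block_hash(block: dict) -> str:
    """Deterministic SHA-256 hash of a block's contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()


def append_block(chain: list, transaction: str) -> None:
    """Link each new block to the hash of the block before it."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "transaction": transaction})


def verify_chain(chain: list) -> bool:
    """Every block must reference the hash of its predecessor."""
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )


ledger = []
append_block(ledger, "Alice pays Bob 10")
append_block(ledger, "Bob pays Carol 4")
print(verify_chain(ledger))                        # True
ledger[0]["transaction"] = "Alice pays Bob 1000"   # tamper with history
print(verify_chain(ledger))                        # False: tampering detected
```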

Don and Alex Tapscott, authors of Blockchain Revolution, describe the technology as "an incorruptible digital ledger of economic transactions that can be programmed to record not just financial transactions but virtually everything of value."

Because there is no central authority and the ledger must be shared, a blockchain network is more transparent. Everyone who is part of the network is accountable to the others, which greatly increases its security: any nefarious action would be spotted and halted.

How the Blockchain Movement is Affecting Banking

Blockchain technology has the potential to disrupt the banking system as we currently know it. Created for cryptocurrencies, blockchain will very likely replace traditional systems used for carrying out financial transactions.

Banks are starting to realize that changes to the payment system are imminent. These banks fear that avoiding the use of blockchain technology would end their existence. Thus, the need to embrace the blockchain movement has become apparent.

According to Hackernoon, some banks are already implementing the technology.

Banks that follow suit and find ways to implement blockchain technology in their own systems will be ready to face the coming changes in the financial industry.

How the Blockchain Movement Impacts Banks

The blockchain movement is going to revolutionize banking. Here is a preview of what to expect.

Increased Security

Greater transparency leads to greater security. In addition, faster transactions leave attackers less time to interfere with them.

Decreased Expenses

According to Asia Blockchain Review, this technology will save banks about $15-20 billion in infrastructure costs alone by 2022. This reduction in cost is due in part to smart contracts, which result in fewer interactions with intermediaries.

Quicker Transactions

Instead of the typical 1-3 days required to verify and complete traditional transfers, blockchain technology expedites the process so that transfers can be verified in a matter of minutes. This will increase customer satisfaction with banking services.

Better Data

Current banking systems allow data to reside in multiple locations, and that information can be altered at one institution but not at the others where it is stored. This may result in outdated information.

Blockchain technology provides a solution. Stored data can only be accessed and modified when certain rules have been followed, maintaining the integrity of the data.

It is obvious that the blockchain movement will have a major impact on banks, which is why innovative banks are preparing for the future right now.

These banks are implementing blockchain technology in order to survive the upcoming changes in the financial industry. It will also help them to stand out from competitors and increase consumer satisfaction.

Disclaimer: The author of this text, Robin Trehan, has an undergraduate degree in economics, a master's in international business and finance, and an MBA in electronic business. Trehan is Senior VP at Deltec International (http://www.deltecbank.com). The views, thoughts, and opinions expressed in this text are solely those of the author and do not necessarily reflect the views of Deltec International Group, its subsidiaries and/or employees.

About Deltec Bank

Headquartered in The Bahamas, Deltec is an independent financial services group that delivers bespoke solutions to meet clients' unique needs. The Deltec group of companies includes Deltec Bank & Trust Limited, Deltec Fund Services Limited, Deltec Investment Advisers Limited, Deltec Securities Ltd. and Long Cay Captive Management.

Media Contact
Company Name: Deltec International Group
Contact Person: Media Manager
Email: Send Email
Phone: 242 302 4100
Country: Bahamas
Website: https://www.deltecbank.com/

Cardano and Coti Release AdaPay – Altcoin Buzz

The Cardano Foundation, in collaboration with the enterprise-grade fintech platform Coti, announced the launch of the long-awaited AdaPay.

Cardano is making things happen. According to reports, the new payment solution AdaPay allows merchants to accept payments in ADA with near-instant settlement. Merchants can settle into 35 fiat currencies, paid directly into their bank accounts.

With AdaPay, merchants can now start accepting ADA for their business by joining the COTIpay platform on COTI's website. Reports claim merchants can integrate using either an AdaPay button or a QR-based point-of-sale system.

To enable merchants to manage real-time transactions in ADA, AdaPay makes use of Coti's Universal Payment Solution (UPS), extending the utility of the token.

AdaPay reportedly combines all the existing support systems of traditional payment processors with the added value of digital assets.

A video showing the payment procedure is also available.

COTI CEO Dr. Nir Haloani commented on the development, expressing satisfaction with the partnership and pointing out that the decision stands as one of COTI's foremost steps in expanding its network. He believes the ways people exchange value are changing for the better and that the future looks more promising for payments.

Following the news, the token ADA rose by about 0.2% within 24 hours.

Interestingly, Cardano and Coti announced their plans for AdaPay earlier in October.

Cardano (ADA) can be used to send and receive digital funds. Its creators claim this digital cash represents the future of money, enabling fast, direct transfers secured through cryptography.

Cardano also offers scalability and security through a layered architecture. The cryptocurrency takes a unique approach in the space, as it is built on scientific philosophy and peer-reviewed academic research.

As a reminder, IOHK CEO Charles Hoskinson announced that ADA holders can now enjoy staking rewards. He made these comments during the Cardano anniversary event in Plovdiv, Bulgaria.

The 10 Hottest AI And Machine Learning Startups Of 2019 – CRN: The Biggest Tech News For Partners And The IT Channel

AI Startup Funding In 2019 Set To Outpace Previous Year

Investors just can't get enough of artificial intelligence and machine-learning startups, if the latest data on venture capital funding is any indication.

Total funding for AI and machine-learning startups for the first three quarters of 2019 was $12.1 billion, surpassing last year's total of $10.2 billion, according to the PwC and CB Insights' MoneyTree report.

With global spending on AI systems set to grow 28.4 percent annually to $97.9 billion, according to research firm IDC, these startups see an opportunity to build new hardware and software innovations that can fundamentally change the way we work and live.

What follows is a look at the 10 hottest AI and machine-learning startups of 2019, whose products range from new AI hardware and open-source platforms to AI-powered sales applications.

Here’s why machine learning is critical to success for banks of the future – Tech Wire Asia

HSBC Bank and others use machine learning to win big. Source: Shutterstock

MACHINE learning is a popular buzzword today, and has been heralded as one of the greatest innovations conceived by man.

A branch of artificial intelligence (AI), machine learning is increasingly embedded in daily life, such as automatic email reply predictions, virtual assistants, and chatbots.

The technology is also expected to revolutionize the world of finance. While the industry has been slower than others to embrace the technology, the impact of ML is already visibly significant.

Most recently, HSBC said that the bank was using the technology to combat financial crime.

"We have 608 million transactions every month. Hence, with AI and machine learning we are able to identify a good transaction done by an innocent person versus a transaction conducted by criminals," said HSBC Hong Kong Financial Crime Threat Mitigation Regional Head Paul Jevtovic.

Like HSBC, several other banks are beginning to deploy ML at scale. Here are the top three use cases in the banking and financial services space:

Prior to the advent of ML, decisions were made by rule-based systems, where the same criteria were applied across a broad customer segment, subjecting everyone to a one-size-fits-all solution.

With ML, bankers can approach customers in a more personalized way. ML algorithms can analyze the volumes of consumer data held by banks, tracing each customer's digital footprint in a unified, omni-faceted view.

This footprint includes their financial status across multiple accounts, financial investments, and banking transactions.

With the relevant data and armed with the right analytical tools, ML can provide valuable insights that allow banks to create tailor-made solutions based on a specific customers behavior, preferences, and requirements.

With the wide tracking of a customer's digital footprint that ML offers, banks can assess a potential borrower's ability to repay more quickly and accurately than with traditional methods.

Leveraging ML can help reduce bias and quickly differentiate applicants who are more creditworthy from those who carry a higher default risk, even without an elaborate credit history. ML can also help banks forecast potential issues and rectify them before they occur.
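
As an illustration of the kind of supervised model described above, here is a minimal, hypothetical credit-risk sketch using scikit-learn. The features and data are invented; real scoring models use far richer data and are subject to fairness and regulatory review.

```python
# Hypothetical credit-risk classifier sketch; invented data, illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row: [monthly income, debt-to-income ratio, months of account history]
X = np.array([
    [5200, 0.15, 48],
    [3100, 0.55, 10],
    [7400, 0.20, 72],
    [2600, 0.70, 6],
    [4800, 0.30, 36],
    [2900, 0.65, 8],
])
y = np.array([0, 1, 0, 1, 0, 1])  # 1 = defaulted, 0 = repaid

model = LogisticRegression(max_iter=1000).fit(X, y)

applicant = np.array([[4000, 0.40, 24]])
print("estimated default probability:", model.predict_proba(applicant)[0, 1])
```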

With the assurance that risks are being mitigated, banks can focus on issues that can add value to their customers, increase productivity, and provide greater support to their employees.

ML can be greatly leveraged for fraud detection. Fraud is a pain point for many financial institutions, one which could potentially cause a bank to go out of business.

With ML, anomalies in customers' behavior can be quickly detected. By flagging and blocking suspicious transactions, banks can catch fraud in real time, protecting customers and themselves.
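
Here is a minimal sketch of that anomaly-detection idea, assuming scikit-learn and invented transaction features; production fraud systems combine many more signals and models.

```python
# Unsupervised anomaly detection on transactions; invented data, illustrative only.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Normal behavior: modest purchases at familiar hours [amount, hour of day]
normal = np.column_stack([rng.normal(40, 15, 500), rng.normal(14, 3, 500)])

detector = IsolationForest(contamination=0.01, random_state=0).fit(normal)

candidates = np.array([
    [35.0, 13.0],    # typical purchase
    [4900.0, 3.0],   # large amount at 3 a.m., suspicious
])
print(detector.predict(candidates))  # 1 = normal, -1 = flagged as anomalous
```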

ML is undoubtedly one of the greatest technological feats of the 21st century. With its laser precision in predicting behaviors and anticipating risks, we can be sure that the role of ML will only be more prominent in the future of banking.

Regardless of size, financial institutions or businesses looking to engage financial services must be aware of the uses of ML in banking. Should they wish to stay relevant, they must start exploring the technology now.

Synthesis-planning program relies on human insight and machine learning – Chemical & Engineering News

Computer-aided synthesis planning (CASP) programs aim to replicate what synthetic chemists do when tackling a synthesis: start with a target molecule and then work backwards to trace a synthetic route, including an efficient and achievable series of reactions and reagents. Work in this field stretches back 50 years, but successful examples have emerged only in the last several years. These either rely on chemistry rules written by human chemists, or on machine-learning algorithms that have assimilated synthesis knowledge from databases of reactions.

Now researchers report that one CASP program that combines human knowledge and machine learning performs better than those using only artificial intelligence, particularly for synthetic routes involving rarely used reactions (Angew. Chem. Int. Ed. 2019, DOI: 10.1002/anie.201912083).

The program is an update to Chematica, which was developed by Bartosz A. Grzybowski of the Ulsan National Institute of Science and Technology and the Polish Academy of Sciences, and is marketed by MilliporeSigma as Synthia. Grzybowski says the program now includes almost 100,000 rules that he and colleagues have encoded over 15 years. Last year, they demonstrated that Chematica's synthetic plans are as good as or better than those of human chemists in laboratory syntheses. "To this point, Grzybowski has been perhaps the staunchest proponent of the expert approach to synthesis planning software," says Connor W. Coley of the Massachusetts Institute of Technology, who has developed a machine-learning-based CASP program.

Now Grzybowski and colleagues have incorporated machine learning into Chematica. They trained machine-learning algorithms called neural networks on about 1.4 million product molecules that match one or more of Chematica's expert-coded reactions. Grzybowski says this hybrid approach teaches the algorithms which of those expert rules chemists actually use. That can help Chematica avoid a synthetic step that is possible but impractical, or favor a reaction that may be rarely seen in the literature but is necessary for certain transformations.

Grzybowski says human insight is important to include in a CASP program because chemical synthesis poses a more difficult challenge for machine-learning algorithms than playing chess or Go, games at which these programs consistently beat humans. For one, successful synthetic-route planning often involves considering two or three steps simultaneously. And unlike making a move in those games, calculating the effects of a given synthetic transformation (for example, the effects on electron density or stereochemistry) takes significant computing time.

The researchers compared the abilities of their hybrid algorithm with those of a purely neural-network-based approach published last year (Nature 2018, DOI: 10.1038/nature25978). The two methods were about equally effective at proposing synthetic steps that matched published reactions when their training data included thousands of examples of those reactions. But when there were fewer than 100 examples, the neural network approach rarely identified a verified transformation, while the hybrid version of Chematica found it more than 75% of the time. Several of the hybrid program's proposed reactions to synthesize the glaucoma drug bimatoprost were not represented in its training data, demonstrating its ability to use unusual reactions.

Chemists agree that this human-machine partnership shows promise, especially for less common reactions. "This is important because there has been a preference of modern retrosynthetic algorithms to favor well-precedented reactions," says Timothy A. Cernak, whose lab at the University of Michigan is sponsored by MilliporeSigma and uses Synthia. But Coley cautions that a fair comparison of a hybrid approach and a neural network alone is difficult because there's greater potential for human experts to bias the data that the system is trained and tested on.

The researchers have not verified the generated synthetic routes in lab experiments, but Grzybowski says his group will publish new, lab-tested natural product syntheses from this program soon. He also says there are plans to incorporate the hybrid system into Synthia.

Verification In The Era Of Autonomous Driving, Artificial Intelligence And Machine Learning – SemiEngineering

The importance of data is changing traditional value creation in electronics and forcing recalculations of return on investment.

The last couple of weeks have been busy with me participating on three panels that dealt with AI and machine learning in the contexts of automotive and aero/defense, in San Jose, Berlin and Detroit. The common theme? Data is indeed the new oil, and it messes with traditional value creation in electronics. Also, requirements for system design and verification are changing and there are completely new, blank-sheet opportunities that can help with verification and confirmation of what AI and ML actually do.

In the context of AI/ML, I have been using the movie quote a lot that "I think I am 90% excited and 10% scared, oh wait, perhaps I am 10% excited and 90% scared." The recent panel discussions that I was part of did not help that much.

First, I was part of a panel called "Collaborating to Realize an Autonomous Future" at Arm TechCon in San Jose. The panel was organized by Arm's Andrew Moore, and my co-panelists were Robert Day (Arm), Phil Magney (VSI Labs) and Hao Liu (AutoX Inc.). Given the technical audience, questions centered on how to break down hardware development for autonomous systems, how the autonomous software stack could be divided between developers, whether compute will end up being centralized or decentralized, and what the security and safety implications of a large amount of collaboration would be, basically boiling things down to a changing industry structure with new dynamics of control in the design chain.

For the second panel I was in Berlin, which was gearing up for the celebrations of the 30-year anniversary of the fall of the Berlin Wall. The panel title could roughly be translated as "If training data is the oil for digitalization enabled by artificial intelligence, how can the available oil be used best?" The panel was organized by Wolfgang Ecker (Infineon), and my co-panelists were Erich Biermann (Bosch), Raik Brinkmann (OneSpin), Matthias Kästner (Microchip), Stefan Mengel (BMBF) and Herbert Taucher (Siemens). Discussion points centered on ownership of data, whether users would be willing to share data with tool vendors, and whether this data could be trusted to be complete enough in the first place.

The third panel took place in Detroit, from which I just returned. It was held at the Association of the United States Army (AUSA) Autonomy and AI Symposium. Moderated by Major Amber Walker, my co-panelists were Margaret Amori (NVIDIA), BG Ross Coffman (United States Army Futures Command) and Ryan Close (United States Army Futures Command, C5ISR Center). Questions here centered on lessons learned from civilian autonomous vehicles and on how civilian and Army customization needs differ. We discussed advances in hardware and how ready developers are for new sensors and compute; resilience, trust, and the new vulnerabilities to cyber-attacks that AI would introduce; and design for customization, including how the best of both worlds, custom and adaptable, can be achieved.

Discussions and opinions were diverse, to say the least. Two big take-aways stick with me.

First, data really is the new oil! It needs protection: security and resilience are crucial in an Army context, in which data in the enemy's hands could have catastrophic consequences, and privacy is crucial in civilian applications as well. Data also changes the value chain in electronics. As I have written before in the context of IoT, the value really has to come from the overall system perspective and cannot be assigned to individual components alone. In a system value chain of sensors, network, storage and data, one may decide to give away the tracker if the data it creates allows value creation through advertisement. Calculating return on investment is becoming much more complicated.

Second, verification of what these neural networks actually do (and do not do) is becoming critical. I had mused in the past about a potential "Revenge of the Digital Twins," but these recent panel discussions have emphasized to me that the confirmability of what a CNN/DNN in an AI system does is seen as critical by many. In both the automotive and Army contexts, the safety of the vehicle and of the human lives involved is at risk if we cannot really confirm that the AI cannot be tricked. Examples that demonstrate how easy it is to trick self-driving cars by defacing street signs make me worry quite a bit here.

That said, from challenges come opportunities. Verification for CNNs/DNNs and their associated data sets will likely be an interesting new market in itself; I am definitely watching this space.

Workday talks machine learning and the future of human capital management – ZDNet

Because people are the most important resource in any organization, human capital management (HCM) is essential in every large enterprise. Changes in technology -- from mobile devices to AI -- are having a profound impact on how people in business work and interact.

Also:More on machine learning

To glimpse the future of HCM technology and the role of machine learning, I spoke with Cristina Goldt, vice president for HCM product management and Strategy at Workday. Cristina is a prominent HCM technology leader who is helping to shape human capital management. Our conversation took place at Workday Rising 2019, the company's annual customer event, held this year in Orlando.

Watch the video embedded above to see the future of HCM and read the edited transcript below. I recorded this video as part of the CxOTalk series of conversations with the world's top innovators in business, technology, government, and higher education.

We see technology -- artificial intelligence, machine learning -- changing work, changing jobs, and the relationship between people and machines. We see the world of work and alternative arrangements and agile teams. We see all of that playing into how work gets done.

And, very importantly, we see skills becoming a key factor or driver in how people are thinking about their workforce and their talent, and in executing on their talent strategy.

We need to support them. They're looking for how they support their people in this ever-changing world of HR and world of work. And so for us, it's how do we help them become those enterprises in the future to identify, develop and optimize talent. Scale and speed are what we endeavor to help them with.

I think [it's] executing on our strategy, where we're trying to match talent to talent demand, which is the work. For us, it's how do we take that data foundation, that rich foundation that we have, and build on it. I talked about skills earlier. It really is about building that skills and capability foundation, which we did using machine learning.

It's a common language of skills across all of our customers. And most importantly, if you think of software as a service, it's skills as a service because it's crowdsourced. It dynamically lives and breathes and grows based on data. We've solved the data challenge of understanding, categorizing and continually keeping your skills updated.

The next thing was to solve the challenge of [knowing] what skills my people have, using machine learning to do that: to infer skills and match people to work.

In the past, it has been very manual and challenging, if not almost impossible, because there wasn't a common language or way to do the matching. The technology wasn't there. Today that technology is there to sort through the huge volumes of data, to understand skills.
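
Workday's models are proprietary, but the general idea of matching a worker's skills to available work can be sketched with simple text similarity. Everything below (the job descriptions, profile, and scores) is invented for illustration.

```python
# Hypothetical skills-to-work matching via TF-IDF text similarity.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

jobs = {
    "data engineer": "python sql data pipelines etl warehousing",
    "ml engineer": "python machine learning models training deployment",
    "hr analyst": "compensation benefits reporting workforce analytics",
}
worker_profile = "python model training experiment tracking machine learning"

vectorizer = TfidfVectorizer()
job_vectors = vectorizer.fit_transform(jobs.values())
worker_vector = vectorizer.transform([worker_profile])

# Rank jobs by similarity to the worker's inferred skill profile.
scores = cosine_similarity(worker_vector, job_vectors)[0]
for title, score in sorted(zip(jobs, scores), key=lambda pair: -pair[1]):
    print(f"{title}: {score:.2f}")
```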

Data is the foundation to make this happen. At Workday, we started with that core system of record, which became that core foundation of data, and which now moves to a core system of capabilities. You now have data about your people that you can take action on and make recommendations from, using machine learning to make suggestions and make all of this happen. It doesn't happen without the data. That data foundation gets us to the next step.

The data comes from the billions of transactions and thousands of dimensions of the over 40 million workers in Workday.

Data is an important part of the future of work and a foundation for all the things we're going to do next.

Disclosure: Workday covered most of my travel to Workday Rising.

The role of machine learning in IT service management – ITProPortal

The service desk acts as the go-to place for all IT-related needs and issues, typically managing incidents or service disruptions, requests, and changes. The service desk's scope of work can be enormous and wide-ranging, depending on the nature and size of the organisation in question. As a critical function used by employees across a company, it needs to be managed appropriately.

Technology has upended the way business is done across all industries around the world. At the same time, traditional IT service management (ITSM) solutions have become inefficient in maintaining customer satisfaction levels and meeting increasing customer expectations in a fast-paced digital world.

According to the SolarWinds IT Trends Report 2019: Skills for Tech Pros of Tomorrow, 79 per cent of IT managers weren't able to spend sufficient time on value-added business activities or initiatives due to interruptions from day-to-day support-related issues. This resulted in misleading or incorrect manual entries into problem logs, which caused misinformed decision-making. With managers inundated with work, it's easy for them to fall victim to manual or human error.

With IT environments changing at an accelerating rate, it's crucial that IT service desks adopt emerging technologies. An explosion of data in recent years has intensified the pressure on IT professionals, but automated processes and machine learning (ML) can alleviate this pressure significantly. Artificial intelligence (AI) and ML aren't just buzzwords anymore. Enterprises worldwide are incorporating these technologies to enhance and improve operational efficiency.

Whether used for predictive analytics, business intelligence, performance monitoring of networks, applications and systems, or even self-driving cars, AI and ML are transforming the IT space. So, what are the applications of ML when it comes to ITSM? As an essential driver of how a business operates, a service desk solution can employ ML to streamline processes and reduce manual, time-intensive tasks, which will ultimately free up time for additional projects and training to deliver business-wide transformation.

Incident resolution time has the potential to be cut in half. ML will enable self-resolution of incidents without the involvement of technicians, and users will be able to search for solutions by themselves. Chatbots (like Google Assistant, for example) will be able to give information to end users without them having to log a ticket, by providing easy access to relevant knowledge base articles based on their queries. Through ML, help desks could learn from past incidents and data to route tickets to the appropriate technician or support group; a toy version of this is sketched below. This can considerably increase efficiency. Even better, automated help desks can run 24/7, making services available to employees at all hours, at their own convenience.
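
As a concrete illustration of that routing idea (not any vendor's product), here is a minimal text classifier trained on a handful of invented past tickets:

```python
# Hypothetical ML-based ticket routing: a classifier trained on past tickets
# predicts the right support group for a new one. Invented data, illustrative only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

past_tickets = [
    "cannot connect to vpn from home",
    "outlook crashes when opening attachments",
    "need access to the finance shared drive",
    "wifi keeps dropping in meeting room",
    "excel formula addin not loading",
    "permission denied on hr folder",
]
groups = ["network", "apps", "access", "network", "apps", "access"]

router = make_pipeline(TfidfVectorizer(), LogisticRegression())
router.fit(past_tickets, groups)

print(router.predict(["vpn disconnects every few minutes"]))  # ['network']
```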

Old IT assets can cause performance degradation for employees who rely on technology assets to do their jobs. In turn, this can result in a sizeable number of incidents in an organisation. Businesses spend a lot of money on hardware and software because of asset management solutions with poor transparency. This can be turned around using asset management solutions with ML technology to help track their performance based on insights from performance levels or incidents associated with a given asset. If incidents about a specific technology asset come into the system frequently or en masse, ML can recognise these as being associated and therefore indicative of a broader problem to be addressed.

ML can consume large datasets of past performance data to enable an analysis of incidents to predict future problems. Predictive capabilities can help save time, money, and effort for the entire organisation as steps can be taken before the severity or impact of the incident increases.

When end users submit a ticket, automation rules rely heavily on data like categories and subcategories to ensure accurate routing. ML helps facilitate this process by providing end users with suggestions for the most relevant categories and subcategories for a given ticket.

Service desk reporting can show trends such as seasonality. Predictive models, however, take into consideration the rate of change, the frequency of problems, and other key factors, helping predict service degradation and the increased incident flows likely to result. This can help determine when more coverage is needed to maintain service levels.

ML, while being versatile as-is, demonstrates some critical applications when it comes to ITSM. Increasingly, organisations are taking leaps and bounds in their digital journeys, and it is only right their IT services evolve with them.

Now is a critical time for the IT service management industry. The market is growing at a double-digit rate each year and is forecast by analyst house IDC to reach over $8.5 billion by 2023.

Today, organisations need to re-examine how they can use new IT management software incorporating machine learning capabilities. Only this can change the course of IT service management, which has historically been a cumbersome function of every business's IT department.

Just as with huge transformative initiatives, software and machine learning can help streamline processes and increase employee productivity to drive better business outcomes. Service desk software will let IT pros consolidate asset information from multiple sources and provide real-time asset intelligence, thus improving service delivery while enhancing flexibility for collecting and managing data. By removing the manual burden of tasks like ticketing and tracking of assets and their performance, this will enable IT professionals to focus on critical projects and business transformation.

Steve Stover, Vice President of Product and Strategy, SolarWinds

Microsoft reveals how it caught mutating Monero mining malware with machine learning – The Next Web

Microsoft's antivirus and malware division recently opened the bonnet on a malicious, mutating cryptocurrency miner. The Washington-based big tech firm revealed how machine learning was crucial in stopping it from spreading further.

According to the Microsoft Defender Advanced Threat Protection team, a strain of malware dubbed Dexphot has been infecting computers since last year, but since June 2019 it has been burning out thanks to machine learning.

Dexphot used a number of techniques, such as encryption, obfuscation layers, and randomized file names, to disguise itself and hijack legitimate systems. If successful, the malware would run a cryptocurrency miner on the device. What's more, re-infection would be triggered if system admins detected the malware and attempted to uninstall it.

Microsoft says Dexphot always uses a cryptocurrency miner, but doesn't always use the same one. XMRig and JCE Miner were both shown to be used over the course of Microsoft's research.

At its peak in June this year, 80,000 machines are believed to have displayed malicious behavior after being infected by Dexphot.

Detecting and protecting against malware like Dexphot is challenging as it is polymorphic. This means that the malware can change its identifiable characteristics to sneak past definition-based antivirus software.

While Microsoft claims it was able to prevent infections in most cases, it also says its behavior-based machine learning models acted as a safety net for when infections slipped through a system's primary defenses.

In simple terms, the machine learning model works by analyzing the behavior of a potentially infected system rather than scanning it for known infected files, a safeguard against polymorphic malware. This means systems can be partly protected against unknown threats that use mechanics similar to known attacks.

On a very basic level, system behaviors like high CPU usage could be a key indicator that a device has been infected. When this is spotted, antivirus software can take appropriate action to curtail the threat.
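
As a toy illustration of that behavioral signal, the sketch below watches for sustained high CPU usage with the psutil library. The threshold and sample counts are invented, and Microsoft's actual models weigh many behaviors, not just CPU.

```python
# Minimal behavioral monitor: alert on sustained high CPU usage, one possible
# indicator of an unauthorized coin miner. Hypothetical thresholds.
import time

import psutil

CPU_THRESHOLD = 90.0   # percent; hypothetical cutoff
SUSTAINED_SAMPLES = 5  # consecutive high samples before alerting

high_samples = 0
for _ in range(60):
    usage = psutil.cpu_percent(interval=1)  # average over the last second
    high_samples = high_samples + 1 if usage > CPU_THRESHOLD else 0
    if high_samples >= SUSTAINED_SAMPLES:
        print("alert: sustained high CPU usage, investigate running processes")
        break
```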

In the case of Dexphot, Microsoft says its machine learning-based detections blocked malicious system DLL (dynamic link library) files to prevent the attack in its early stages.

Microsoft has not released any information on how much cryptocurrency was earned as a result of the Dexphot campaign. But thanks to Microsoft's machine learning strategy, it seems to have put a lid on the operation, as infections have dropped by over 80 percent.

It seems as long as there is cryptocurrency, bad actors will attempt to get their hands on it.

Just yesterday, Hard Fork reported that the Stantinko botnet, which has infected 500,000 devices worldwide, has added a cryptocurrency miner to its batch of malicious files.
