How Can AI and ML Transform the Way We Read and Understand Data? – ReadWrite

Today's business is ruled by data and data-driven insight. How you understand data and translate it into business decisions has a direct impact on your conversion and growth. For a more precise understanding of data, we now have artificial intelligence (AI) and machine learning (ML) technologies on our side. No doubt, these technologies, which mimic human reasoning, can positively transform businesses and their strategies.

We need to understand the impact AI and ML technologies have in shaping our understanding of data and our ability to interpret it.

Every business understands the importance of communicating with customers individually. Because digital interfaces have opened up tremendous scope for individual preferences and choices, your business communication must take the preferences of individual customers into account. The growing importance of addressing individual choices for business conversion has pushed many companies to focus on data-driven personalization.

Not only large businesses but also startups and small businesses increasingly understand the importance of having access to relevant data for meeting the needs of visitors. AI can dig deeper into available user data and surface relevant patterns and insights that can then be used for data-driven personalization decisions. AI can also help scale such personalization efforts to every individual user.

A superb example of how AI can enable personalization in business operations is Starbucks. The global coffee chain designed 400,000 variants of its emails based on data about individual preferences, tastes, and choices. Such well-crafted, personalized communication helps brands create more engaging conversations with customers. Starbucks actually used AI to decipher the volumes of data corresponding to customer preferences and choices.

For smaller businesses and startups, such AI-based data collection and data-centric personalization may be a little expensive. But small businesses can embrace similar approaches to create very specific, data-oriented marketing campaigns of short duration to boost business conversion and customer engagement. Such AI-powered, data-driven campaigns can also help lift a company's brand image.

For the B2B segment, business conversion depends heavily on generating new leads. B2B companies also need to track contact data and reach out to prospects effectively through the lead generation funnel. Most marketers agree on the humongous range of challenges B2B businesses face in doing this. This is where AI can play a great role in streamlining the process of lead generation through intelligent automation.

Artificial intelligence (AI) powered lead generation and contact tracking solutions can analyze the customer base along with important trends and emerging patterns. These trends, patterns, anomalies, characteristics, and other attributes can deliver important insights for optimizing websites and web apps. Thanks to AI-based optimization insights, a website can adopt better programming languages, tools, features, and UI elements to generate more leads.

On the other hand, AI-based business data analysis can work hand in hand with big data analytics. This sophisticated and highly incisive approach to data utilization can easily help discover ideal customers for a business. B2B brands can analyze user interactions on web pages, and the corresponding data, with AI tools to produce the most relevant and actionable insights.

To make things easier for businesses, AI and machine learning capabilities for such analytical activities are now built into most leading analytics solutions across the spectrum. Even plain Google Analytics can offer highly result-oriented, precision-driven reports. Such technologies can easily pinpoint the shortcomings and loopholes behind declining traffic and falling business conversion.

There are also great tools like Finteza that use AI technology to monitor website traffic on a continuous basis and to check for other crucial issues and irregularities. These tools can also improve your data security: by detecting bad traffic, they automatically point out vulnerabilities in the web app.

Poor web traffic often results from DDoS attacks, manipulation of website cookies, and hackers or malicious programs impersonating legitimate bots. An AI-based lead generation solution can also reduce these security vulnerabilities.

AI optimizes the scope of personalization in a data-driven manner, and that is often portrayed as its principal use in dealing with data. But AI is also highly effective at optimizing web design and improving the user experience (UX).

AI achieves this optimization and improvement by analyzing user behavior, interaction data, and user feedback. Machine learning programs in particular can play a very effective role in learning from user behavior and adjusting various interactive elements accordingly.

AI and ML programs running behind the scenes collect a great deal of data on real user behavior so that real-time feedback about shortcomings and needed improvements can be communicated to business owners. An ML-based program can also make instant tweaks to UX attributes for better engagement.

Another important point that needs explaining is the great role AI plays in improving the efficiency of A/B tests. In the A/B testing process, AI and machine learning can deliver the most important insights about user demands and preferences, which guide further enhancement of the UI and UX.

The most important aspect of AI's impact on A/B testing is that it leaves no room for vague assessment or guessing. Data-driven insights guiding A/B testing are more attainable now because website cookies provide clear insights into user behavior.

Based on such insights, landing pages can reduce form fields according to user interests and preferences.
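The article does not name a specific algorithm, but one common machine learning approach to cutting the guesswork out of A/B testing is a multi-armed bandit. The sketch below uses Thompson sampling to decide which landing-page variant to serve next; the variant names and conversion counts are hypothetical.

```python
# Illustrative sketch of adaptive A/B testing with Thompson sampling.
# The variants and their (conversions, impressions) counts are hypothetical.
import random

results = {"variant_a": (42, 1000), "variant_b": (55, 1000)}

def choose_variant(results):
    """Sample a plausible conversion rate for each variant from a Beta
    posterior and serve the variant with the highest sampled rate."""
    best_name, best_sample = None, -1.0
    for name, (conversions, impressions) in results.items():
        sample = random.betavariate(conversions + 1, impressions - conversions + 1)
        if sample > best_sample:
            best_name, best_sample = name, sample
    return best_name

print(choose_variant(results))  # usually "variant_b", but weaker variants still get some traffic
```

Because the counts are updated from live traffic, the test gradually routes more visitors to the stronger variant instead of relying on a manually chosen sample size or a gut-feel cutoff.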

Biometric data from direct interactions with a web app can provide developers and marketers with a lot of actionable insights. Many advanced online services are now available in the market to help understand and decipher website data.

Biometric data, coupled with AI and machine learning technology, has opened up new possibilities for improved user experience.

Most of the available services for this kind of data interpretation rely on a combination of artificial intelligence and machine learning. These sophisticated solutions can easily track users' eye movements.

In addition, some of these services can also track facial expressions to assess user responses in different contexts. These services can extract the most organic kind of user data and generate the most valuable insights that can be used for UX design and performance optimization of websites.

As the trends stand, AI- and ML-based data analytics and data-centric optimization of business apps will only grow more dominant from this year onward. Thanks to these two technologies, there will be far less guesswork in design, development, and optimization decisions.

Atman Rathod is the co-founder of CMARIX TechnoLabs Pvt. Ltd., a leading web and mobile app development services company with 16+ years of experience. He loves to write about technology, startups, entrepreneurship, and business. His creative abilities, academic track record, and leadership skills have made him a key industry influencer.


This AI Could Bring Us Computers That Can Write Their Own Software – Singularity Hub

When OpenAI first published a paper on its new language generation AI, GPT-3, the hype was slow to build. The paper indicated GPT-3, the biggest natural language AI model yet, was advanced, but it included only a few written examples of its output. Then OpenAI gave select developers access to a beta version of GPT-3 to see what they would do with it, and minds were blown.

Developers playing with GPT-3 have taken to Twitter with examples of its capabilities: short stories, press releases, articles about itself, a search engine. Perhaps most surprising was the discovery that GPT-3 can write simple computer code. When web developer Sharif Shameem modified it to spit out HTML instead of natural language, the program generated code for webpage layouts from prompts like "a button that looks like a watermelon."

"I used to say that AI research seemed to have an odd blind spot towards automation of programming work, and I suspected a subconscious self-preservation bias," tweeted John Carmack, legendary computer game developer and consulting CTO at Oculus VR. "The recent, almost accidental, discovery that GPT-3 can sort of write code does generate a slight shiver."

While the discovery of GPT-3's coding skills may have been somewhat serendipitous, there is, in fact, a whole field dedicated to the development of machine learning algorithms that can code. The research has been making progress, and a new algorithm just recently took another step.

The algorithm, called machine inferred code similarity (MISIM), is the brainchild of researchers from Intel, Georgia Institute of Technology, University of Pennsylvania, and MIT. Trained on the huge amount of code already publicly available on the web, MISIM can figure out what a program is supposed to do. Then, after finding other similar programs and comparing it to them, MISIM can offer ways to make the program faster or more efficient.

It isn't the first machine learning algorithm to make recommendations or compare similarity, but according to the researchers' new preprint paper on MISIM, it was up to 40 times more accurate at the task when it went head to head with several of its most advanced competitors.

Near term, the AI could be a useful sidekick for today's programmers. Further out, the field could open programming to anyone who can describe what they want to create in everyday language, or bring machines that write and maintain their own code.

The pursuit of computers that can code is almost as old as modern computer science itself. While there have been advances in programming automation, the recent explosion in machine learning is accelerating progress in a field called machine programming.

In a 2018 paper on the field, a group of Intel and MIT researchers wrote, "The general goal of machine programming is to remove the burden of writing correct and efficient code from a human programmer and to instead place it on a machine."

Researchers are pursuing systems that can automate the steps required to transform a person's intent (that is, what they want a piece of software to do) into a working program. They're also aiming to automate the maintenance of software over time, like, for instance, finding and fixing bugs, keeping programs compatible, or updating code to keep up with hardware upgrades.

That's easier said than done, of course. Writing software is as much art as it is science. It takes a lot of experience and creativity to translate human intent into the language of machines.

But as GPT-3 shows, language is actually a skill machine learning is rapidly mastering, and programming languages are not so different from English, Chinese, or Swahili. That is why GPT-3 picking up a few coding skills as a byproduct of its natural language training is notable.

While algorithmic advances in machine learning, like GPT-3, are key to machine programming's success, they'd be useless without good training data. Luckily, there's a huge amount of publicly available code on sites like GitHub (replete with revision histories and notes) and code snippets and comment threads on sites like Stack Overflow. Even the internet at large, with its accessible webpages and code, is an abundant source of learning material for AI to gobble up.

In theory, just as GPT-3 ingests millions of example articles to learn how to write, machine programming AIs could consume millions of programs and learn to code. But how to make this work in practice is an open question, and that is where MISIM comes in.

MISIM advances machine programming a step by being able to accurately identify what a snippet of code is supposed to do. Once it has classified the code, it compares it to millions of other snippets in its database, surfaces those that are most similar, and suggests improvements to the code snippet based on those other examples.

Because MISIM classifies the code's purpose at a high level, it can find code snippets that do the same thing but are written differently (there's more than one way to solve the same problem) and even snippets in other programming languages. Simplistically, this is a bit like someone reading a New Yorker article, identifying its topic, and then finding all the other articles on that topic, whether they're in Der Spiegel or Xinhua.

Another benefit of working at that higher level of classification is that the program doesn't need the code to be compiled. That is, it doesn't have to translate it into the machine code that's executed by the computer. Since MISIM doesn't require a compiler, it can analyze code snippets as they're being written and offer similar bits of code that could be faster or more efficient. (This is a bit like an email autocomplete feature finishing your sentences.)
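The preprint describes MISIM's semantic model only at a high level, so the sketch below is not MISIM itself; it is a deliberately simple lexical baseline (bag of tokens plus cosine similarity) that makes the code-similarity task concrete. The two snippets are invented examples.

```python
# Toy code-similarity scoring: represent each snippet as a bag of tokens
# and compare the bags with cosine similarity. This is a lexical baseline,
# not MISIM's actual semantic model.
import math
import re
from collections import Counter

def tokenize(code):
    # Identifiers stay whole; every other non-space character is its own token.
    return Counter(re.findall(r"[A-Za-z_]\w*|\S", code))

def cosine_similarity(a, b):
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

snippet_1 = "total = 0\nfor x in values:\n    total += x"
snippet_2 = "s = sum(values)"

print(cosine_similarity(tokenize(snippet_1), tokenize(snippet_2)))
```

The two snippets compute the same sum, yet a purely lexical measure like this scores them as quite different. Closing that gap, by comparing what code does rather than how it happens to be written, is exactly the job of semantic approaches such as MISIM.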

Intel plans to offer MISIM to internal developers for just this purpose. The hope is it'll prove a useful sidekick, making the code-writing process faster, easier, and more effective. But there's potentially more it can do. Translation between computer languages, for example, could also be a valuable application; it could perhaps help coders update government software written in archaic languages to something more modern.

But Justin Gottschlich, director of machine programming at Intel, has an even grander vision: the full democratization of coding.

Combine MISIM (or something like it) with natural language AI, and future programmers could simply write down what they want a piece of software to do, and the computer whips up the code. That would open programming to anyone with a decent command of their native language and a desire to make something cool.

As Gottschlich told MIT Technology Review, "I would like to see 8 billion people create software in whatever way is most natural for them."

Image credit: Markus Spiske /Unsplash


Machine learning has discovered a galaxy with an extremely low oxygen – Tech Explorist

To understand galaxy evolution, astronomers need to examine galaxies in different phases of formation and evolution. Most of the galaxies in the modern Universe are mature galaxies, yet standard cosmology predicts that there may still be a few galaxies in the early formation stage in the modern Universe. Since these early-stage galaxies are rare, an international research team looked for them in wide-field imaging data taken with the Subaru Telescope.

However, it wasn't easy to find galaxies in the early stage of galaxy formation from the data because the wide-field data includes as many as 40 million objects.

In the new study, the research team developed a machine learning method to find such galaxies in the vast amount of data. They had a computer repeatedly learn the galaxy colors expected from theoretical models, and then let the computer select only galaxies in the early stage of galaxy formation.
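The team's exact model is not described here, so the following is only a rough sketch of the general recipe: train a classifier on colors predicted by theoretical models, then apply it to the observed catalog. The colors, the labeling rule, and the choice of classifier are all illustrative assumptions.

```python
# Sketch: select early-stage galaxy candidates by their colours.
# All numbers below are synthetic stand-ins, not Subaru data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Training set: two colours (e.g. g-r and r-i) drawn from theoretical models,
# labelled 1 for early-stage galaxies and 0 for everything else.
model_colours = rng.normal(size=(5000, 2))
labels = (model_colours[:, 0] < -0.5).astype(int)   # stand-in labelling rule

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(model_colours, labels)

# Apply to a (here simulated) observed catalogue; the real one held ~40 million objects.
observed_colours = rng.normal(size=(4000, 2))
candidates = observed_colours[clf.predict(observed_colours) == 1]
print(f"{len(candidates)} candidate early-stage galaxies selected for follow-up")
```

In the actual study, candidates selected this way were then confirmed with follow-up observations.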

By combining big data captured by the Subaru Telescope and the power of machine learning, scientists have discovered a galaxy with an extremely low oxygen abundance of 1.6% solar abundance, breaking the previous record of the lowest oxygen abundance.

The galaxy, named HSC J1631+4426, is located 430 million light-years away in the constellation Hercules. It has an oxygen abundance of only 1.6 percent of that of the Sun, the lowest value ever reported for a galaxy.

The measured oxygen abundance suggests that most of the stars in this galaxy formed very recently. In other words, this galaxy is undergoing an early stage of galaxy evolution.

Prof. Ouchi of the National Astronomical Observatory of Japan and the University of Tokyo said, "What is surprising is that the stellar mass of the HSC J1631+4426 galaxy is very small, 0.8 million solar masses. This stellar mass is only about 1/100,000 of our Milky Way galaxy, and comparable to the mass of a star cluster in our Milky Way. This small mass also supports the primordial nature of the HSC J1631+4426 galaxy."

There are two new indications from this discovery. First, this is evidence that a galaxy at such an early stage of galaxy evolution exists today. In the framework of standard cosmology, new galaxies are thought to be born in the present Universe, and the discovery of the HSC J1631+4426 galaxy backs up that picture. Second, we may be witnessing a newborn galaxy at the latest epoch of cosmic history. Standard cosmology suggests that the matter density rapidly drops in our Universe, whose expansion accelerates.


MLOps: What You Need To Know – Forbes


MLOps is a relatively new concept in the AI (artificial intelligence) world and stands for machine learning operations. It's about how best to manage data scientists and operations people to allow for the effective development, deployment, and monitoring of models.

"MLOps is the natural progression of DevOps in the context of AI," said Samir Tout, a professor of cybersecurity at Eastern Michigan University's School of Information Security & Applied Computing (SISAC). "While it leverages DevOps' focus on security, compliance, and management of IT resources, MLOps' real emphasis is on the consistent and smooth development of models and their scalability."

The origins of MLOps go back to 2015 and a paper entitled "Hidden Technical Debt in Machine Learning Systems." Since then, the growth has been particularly strong. Consider that the market for MLOps solutions is expected to reach $4 billion by 2025.

"Putting ML models in production, operating models, and scaling use cases has been challenging for companies due to technology sprawl and siloing," said Santiago Giraldo, senior product marketing manager and data engineer at Cloudera. "In fact, 87% of projects don't get past the experiment phase and, therefore, never make it into production."

Then how can MLOps help? Well, the handling of data is a big part of it.

"Some key best practices are having a reproducible pipeline for data preparation and training, having a centralized experiment tracking system with well-defined metrics, and implementing a model management solution that makes it easy to compare alternative models across various metrics and roll back to an old model if there is a problem in production," said Matei Zaharia, chief technologist at Databricks. "These tools make it easy for ML teams to understand the performance of new models and catch and repair errors in production."
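The quote does not name a specific product, but a centralized experiment-tracking workflow of the kind Zaharia describes might look like the sketch below, here using MLflow as one possible tool; the dataset, model, and metric names are illustrative.

```python
# Sketch: log parameters, metrics, and the trained model for each run so
# alternative models can be compared and rolled back to later.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

with mlflow.start_run(run_name="baseline-logreg"):
    params = {"C": 1.0, "max_iter": 200}
    model = LogisticRegression(**params).fit(X_train, y_train)

    mlflow.log_params(params)
    mlflow.log_metric("test_accuracy", accuracy_score(y_test, model.predict(X_test)))
    mlflow.sklearn.log_model(model, "model")
```

Every run recorded this way lands in the same tracking store, which is what makes side-by-side comparison and rollback practical.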

Something else to consider is that AI models are subject to change. This has certainly been apparent with the COVID-19 pandemic. The result is that many AI models have essentially gone haywire because of the lack of relevant datasets.

"People often think a given model can be deployed and continue operating forever, but this is not accurate," said Randy LeBlanc, VP of customer success at RapidMiner. "Like a machine, models must be continuously monitored and maintained over time to see how they're performing and shifting with new data, ensuring that they're delivering real, ongoing business impact. MLOps also allows for faster intervention when models degrade, meaning greater data security and accuracy, and allows businesses to develop and deploy models at a faster rate. For example, if you discovered an algorithm that will save you a million dollars per month, every month this model isn't in production or deployment costs you $1 million."

MLOps also requires rigorous tracking that is based on tangible metrics. If not, a project can easily go off the rails. "When monitoring models, you want to have standard performance KPIs as well as those that are specific to the business problem," said Sarah Gates, an analytics strategist at SAS. "This should be through a central location regardless of where the model is deployed or what language it was written in. That tracking should be automated, so you immediately know, and are alerted, when performance degrades. Performance monitoring should be multifaceted, so you are looking at your models from different perspectives."
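As a rough illustration of the automated alerting Gates describes (not a SAS-specific implementation), a monitoring job can recompute a KPI on each batch of newly labeled production data and raise an alert when it degrades; the baseline accuracy and threshold below are assumed values.

```python
# Sketch: alert when a deployed model's accuracy drifts below an agreed floor.
from sklearn.metrics import accuracy_score

BASELINE_ACCURACY = 0.92   # accuracy measured at deployment time (assumed)
ALERT_DROP = 0.05          # alert if accuracy falls more than 5 points

def check_batch(y_true, y_pred, notify):
    acc = accuracy_score(y_true, y_pred)
    if acc < BASELINE_ACCURACY - ALERT_DROP:
        notify(f"Model accuracy degraded to {acc:.3f}; investigate or roll back.")
    return acc

# In practice, wire `notify` to email, chat, or a paging system.
check_batch([1, 0, 1, 1, 0, 1], [1, 0, 0, 0, 0, 1], notify=print)
```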

While MLOps tools can be a huge help, there still needs to be discipline within the organization. Success is about more than just technology.

"Monitoring/testing of models requires a clear understanding of the data biases," said Michael Berthold, who is the CEO and co-founder of KNIME. "Scientific research on event, model change, and drift detection has most of the answers, but they are generally ignored in real life. You need to test on independent data, use challenger models and have frequent recalibration. Most data science toolboxes today totally ignore this aspect and have a very limited view on 'end-to-end' data science."

Tom (@ttaulli) is an advisor to startups and the author of Artificial Intelligence Basics: A Non-Technical Introduction and The Robotic Process Automation Handbook: A Guide to Implementing RPA Systems. He has also developed various online courses, such as one on the COBOL programming language.


Artificial Intelligence and Machine Learning Market with Future Prospects, Key Player SWOT Analysis and Forecast To 2025 – Express Journal

The Artificial Intelligence and Machine Learning Market report offers a detailed study of the industry with respect to competitive players, latest advancements, regional analysis, emerging trends, and current tendencies of the end user. The report also covers Artificial Intelligence and Machine Learning market size, market share, growth rate, revenue, and CAGR reported previously, along with its forecast estimation. It also examines performance in terms of revenue contribution from various segments and includes an in-depth analysis of the key factors influencing revenue growth of the Artificial Intelligence and Machine Learning market.

The report on Artificial Intelligence and Machine Learning market analyzes the primary growth factors, restraints and opportunities influencing the market outlook in the upcoming years. According to the research document, the market is predicted to generate significant revenue while registering a CAGR of XX% over the estimated timeframe (20XX-20XX).

The study provides detailed information regarding the impact of COVID-19 on the growth of the Artificial Intelligence and Machine Learning market. With the pandemic unceasing, stringent lockdown measures have withheld the revenue of several industries and will continue to have a lingering impact even after the economy rejuvenates. Most businesses across various industry verticals have revised their budget plans in a bid to re-establish profit trajectories for the ensuing years.

Request Sample Copy of this Report @ https://www.express-journal.com/request-sample/158893

Our detailed assessment of this business space allows you to devise a plan-of-action for navigating through the market uncertainty and build versatile contingency plans to stay ahead of the competition. Additionally, the report offers a granular analysis of the various market segmentations as well as the competitive scenario of this business sphere.

Major aspects from the Artificial Intelligence and Machine Learning market report:

Artificial Intelligence and Machine Learning Market segments enclosed in the report:

Regional segmentation: North America, Europe, Asia-Pacific, South America, Middle East & Africa.

Product types:

Applications spectrum:

Competitive outlook:

Reasons why you should buy this report

Request Customization on This Report @ https://www.express-journal.com/request-for-customization/158893


Hands-On Guide To Loss Functions Used To Evaluate A ML Algorithm – Analytics India Magazine

The loss function is a method of evaluating how well an algorithm performs on your dataset. Many people are confused about the difference between the loss function and the cost function; here we will use the term cost function for a single training example and loss function for the entire training dataset. We always try to reduce the loss of our models using optimization techniques like gradient descent.

Loss functions are broadly categorized into two types: regression losses and classification losses.

So how do you choose which loss function to use? It depends on the output tensor of the model. In regression problems the output is continuous data, while in classification problems the output is a set of probability values. Depending on the output variable, we need to choose the right loss function for our model. Sometimes, while building optimization models, we may use multiple loss functions depending on the output tensors.

In this article, we will discuss the following loss functions: Mean Squared Error (MSE), Mean Absolute Error (MAE), cross-entropy, and hinge loss.

Based on the implementations demonstrated below, we can use these loss functions to evaluate any machine learning algorithm.

MSE is one of the most popular loss functions for regression problems. It is an estimator measuring the average of squared errors: mathematically, it calculates the squared differences between the actual values and the predicted values and averages them. It is very easy to understand and implement.

A large MSE value indicates that predictions are spread widely around the actual values, while a smaller MSE indicates that predictions lie close to them.

Implementation in python
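One minimal NumPy sketch, with illustrative values:

```python
# Mean Squared Error: average of squared differences between actual and predicted values.
import numpy as np

def mse_loss(y_true, y_pred):
    y_true, y_pred = np.asarray(y_true, dtype=float), np.asarray(y_pred, dtype=float)
    return np.mean((y_true - y_pred) ** 2)

print(mse_loss([3.0, -0.5, 2.0, 7.0], [2.5, 0.0, 2.0, 8.0]))  # 0.375
```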

MAE loss is also used for regression problems. It calculates the average of the absolute differences between the actual values and the predicted values. We prefer MAE when our data includes outliers, meaning data points that are much larger or much smaller than the rest. With MAE, outliers have far less effect on the model than they do with squared losses.
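A matching NumPy sketch of MAE, with an illustrative outlier, shows why it is the safer choice when extreme values are present:

```python
# Mean Absolute Error: average of absolute differences; far less sensitive to outliers than MSE.
import numpy as np

def mae_loss(y_true, y_pred):
    y_true, y_pred = np.asarray(y_true, dtype=float), np.asarray(y_pred, dtype=float)
    return np.mean(np.abs(y_true - y_pred))

y_true = [3.0, -0.5, 2.0, 100.0]   # 100.0 acts as an outlier
y_pred = [2.5, 0.0, 2.0, 8.0]
print(mae_loss(y_true, y_pred))    # 23.25
# MSE on the same arrays is 2116.125: squaring lets a single outlier dominate the loss.
```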

Cross-entropy is the most commonly used loss function for classification problems. It measures the performance of a classification model whose output is a probability between 0 and 1: a prediction close to the true label gives a loss near 0, while a confident wrong prediction gives a high loss. In a multiclass classification problem, we calculate the individual loss for each class. When the output is a probability distribution, we use cross-entropy together with a softmax activation in the output layer.


The binary cross-entropy formula is L = -(y log(p) + (1 - y) log(1 - p)), which is similar to the likelihood function: if y = 0, the y log(p) term vanishes, and if y = 1, the (1 - y) log(1 - p) term vanishes.

We use cross-entropy loss in multi-class classification problems like object detection. Let's consider the ImageNet dataset, in which we have 1,000 classes over which to compute the loss.
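A minimal NumPy sketch of categorical cross-entropy follows; it uses three classes as a stand-in for ImageNet's 1,000, and the probabilities are illustrative.

```python
# Categorical cross-entropy: penalizes confident wrong predictions heavily.
import numpy as np

def cross_entropy_loss(y_true_onehot, y_pred_probs, eps=1e-12):
    y_pred_probs = np.clip(y_pred_probs, eps, 1.0)            # avoid log(0)
    return -np.mean(np.sum(y_true_onehot * np.log(y_pred_probs), axis=1))

y_true = np.array([[1, 0, 0],
                   [0, 0, 1]])
y_pred = np.array([[0.7, 0.2, 0.1],    # confident and correct -> low loss
                   [0.3, 0.4, 0.3]])   # unsure -> higher loss
print(cross_entropy_loss(y_true, y_pred))  # about 0.78
```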

The hinge loss function is used for binary classification problems. This loss evaluates how well a given boundary separates the data; hinge loss is mostly used in SVMs, in combination with the activation function of the last layer. For example, we can use hinge loss to classify whether an email is spam or not.
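A minimal NumPy sketch of hinge loss for the spam example, assuming labels in {-1, +1} and raw classifier scores:

```python
# Hinge loss: zero for points classified correctly beyond the margin,
# growing linearly for points inside the margin or on the wrong side.
import numpy as np

def hinge_loss(y_true, scores):
    y_true, scores = np.asarray(y_true, dtype=float), np.asarray(scores, dtype=float)
    return np.mean(np.maximum(0.0, 1.0 - y_true * scores))

y_true = [1, -1, 1, -1]         # +1 = spam, -1 = not spam (illustrative labels)
scores = [2.3, -0.8, 0.4, 1.1]  # the last score is on the wrong side of the boundary
print(hinge_loss(y_true, scores))  # 0.725
```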

In the above demonstration, we discussed various types of loss functions and saw that the choice of loss function depends on the output tensor. Choosing the loss function for our model is very important, because we use it both to evaluate the model and to guide optimization toward the global minimum of the error. Based on the implementations shown above, these loss functions can be used to evaluate any machine learning algorithm.



Surprisingly Recent Galaxy Discovered Using Machine Learning May Be the Last Generation Galaxy in the Long Cosmic History – SciTechDaily

HSC J1631+4426 broke the record for the lowest oxygen abundance. Credit: NAOJ/Kojima et al.

Breaking the lowest oxygen abundance record.

New results achieved by combining big data captured by the Subaru Telescope and the power of machine learning have discovered a galaxy with an extremely low oxygen abundance of 1.6% solar abundance, breaking the previous record of the lowest oxygen abundance. The measured oxygen abundance suggests that most of the stars in this galaxy formed very recently.

To understand galaxy evolution, astronomers need to study galaxies in various stages of formation and evolution. Most of the galaxies in the modern Universe are mature galaxies, but standard cosmology predicts that there may still be a few galaxies in the early formation stage in the modern Universe. Because these early-stage galaxies are rare, an international research team searched for them in wide-field imaging data taken with the Subaru Telescope. "To find the very faint, rare galaxies, deep, wide-field data taken with the Subaru Telescope was indispensable," emphasizes Dr. Takashi Kojima, the leader of the team.

However, it was difficult to find galaxies in the early stage of galaxy formation from the data because the wide-field data includes as many as 40 million objects. So the research team developed a new machine learning method to find such galaxies from the vast amount of data. They had a computer repeatedly learn the galaxy colors expected from theoretical models, and then let the computer select only galaxies in the early stage of galaxy formation.

The research team then performed follow-up observations to determine the elemental abundance ratios of 4 of the 27 candidates selected by the computer. They found that one galaxy (HSC J1631+4426), located 430 million light-years away in the constellation Hercules, has an oxygen abundance only 1.6 percent of that of the Sun. This is the lowest value ever reported for a galaxy. The measured oxygen abundance suggests that most of the stars in this galaxy formed very recently. In other words, this galaxy is undergoing an early stage of galaxy evolution.

"What is surprising is that the stellar mass of the HSC J1631+4426 galaxy is very small, 0.8 million solar masses. This stellar mass is only about 1/100,000 of our Milky Way galaxy, and comparable to the mass of a star cluster in our Milky Way," said Prof. Ouchi of the National Astronomical Observatory of Japan and the University of Tokyo. "This small mass also supports the primordial nature of the HSC J1631+4426 galaxy."

The research team thinks that there are two interesting indications from this discovery. First, this is evidence that a galaxy at such an early stage of galaxy evolution exists today. In the framework of the standard cosmology, new galaxies are thought to be born in the present universe, and the discovery of the HSC J1631+4426 galaxy backs up that picture. Second, we may be witnessing a newborn galaxy at the latest epoch of cosmic history. The standard cosmology suggests that the matter density rapidly drops in our universe, whose expansion accelerates. In such a future universe with rapid expansion, matter does not assemble by gravity, and new galaxies won't be born. The HSC J1631+4426 galaxy may be the last-generation galaxy in the long cosmic history.


IoT Automation Trend Rides Next Wave of Machine Learning, Big Data – IoT World Today

IoT automation has found a new raison d'être in the COVID-19 era.

An array of new methods, along with unexpected new pressures, casts today's IoT automation efforts in an utterly new light.

Progress today in IoT automation is based on fresh methods employing big data, machine learning, asset intelligence and edge computing architecture. It is also enabled by emerging approaches to service orchestration and workflow, and by ITOps efforts that stress better links between IT and operations.

On one end, advances in IoT automation include robotic process automation (RPA) tools that use sensor data to inform backroom and clerical tasks. On the other end are true robots that maintain the flow of goods on factory floors.

Meanwhile, nothing has focused business leaders on automation like COVID-19. Automation technologies have gained priority in light of 2020's pandemic, which is spurring the use of IoT sensors, robots, and software to enable additional remote monitoring. Still, this work was well underway before COVID-19 emerged.

Cybersecurity Drives Advances in IoT Automation

In particular, automated discovery of IoT environments for cybersecurity purposes has been an ongoing driver of IoT automation. That is simply because there is too much machine information to manually track, according to Lerry Wilson, senior director for innovation and digital ecosystems at Splunk. The target is anomalies found in data stream patterns.

"Anomalous behavior starts to trickle into the environment, and there's too much for humans to do," Wilson said. And, while much of this still requires a human somewhere in the loop, the role of automation continues to grow.
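Splunk's own pipeline is not detailed in the article, but the kind of automated anomaly flagging Wilson describes can be illustrated with a simple rolling z-score over a stream of machine readings; the sensor values, window size, and threshold below are hypothetical.

```python
# Sketch: flag readings that deviate sharply from the recent window of machine data.
from collections import deque
from statistics import mean, stdev

def stream_anomalies(readings, window=20, threshold=3.0):
    recent = deque(maxlen=window)
    for i, value in enumerate(readings):
        if len(recent) == window:
            mu, sigma = mean(recent), stdev(recent)
            if sigma > 0 and abs(value - mu) / sigma > threshold:
                yield i, value          # candidate anomaly for a human (or workflow) to review
        recent.append(value)

sensor = [20.1, 20.3, 19.9, 20.0] * 10 + [35.7] + [20.2] * 5   # hypothetical temperature stream
print(list(stream_anomalies(sensor)))   # flags the 35.7 spike
```

A production system would feed such flags into the automated work-order creation described below rather than simply printing them.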

Wilson said Splunk, which focuses on integrating a breadth of machine data, has worked with partners to ensure incoming data can now kick off useful functions in real time. These kinds of efforts are central to emerging information technology/operations technology (IT/OT) integration. This, along with machine learning (ML), promises increased automation of business workflows.

"Today, we and our partners are creating machine learning that will automatically set up a work order; people don't have to [manually] enter that anymore," he said, adding that what once took the form of analytical reports is now correlated with historic data for immediate execution.

"We moved past reporting to action," Wilson said.

Notable use cases Splunk has encountered include systems that collect signals to monitor and optimize factory floor and campus activity as well as to correlate asset information, Wilson indicated.

Hyperautomation Hyped

The move toward more coordinated, highly integrated systems automation is strong enough that Gartner has dubbed it hyperautomation, and included it in its Top 10 Strategic Technology Trends for 2020.

The research group describes hyperautomation as the orchestrated use of multiple technologies to catalyze business-driven process change, and declares that "everything that can be automated, will be automated."

The hyperautomation category includes process and task automation tools, ML, event-driven software, and RPA, according to Gartner. Coherent Market Insights valued the global market for hyperautomation at $4.2 billion in 2017 and predicted a 18.9% CAGR from 2019 through 2027.

Automation, hyper or otherwise, is supported in several products. These include workflow orchestration software from companies ranging from Broadcom and BMC to Radianse and Resolve Systems. The space also holds players like ServiceNow and Splunk.

The ranks include industrial IoT automation systems from GE, Honeywell, Rockwell Automation, Plex, PTC and Siemens, as well as IT infrastructure and ERP application software such as C3.ai, IBM and SAP.

And that is not to mention domain specialists like Esri, with geospatial data processing; Dassault Systèmes, with 3D design and engineering software; and many others working to automate aspects of IoT.

Business Process Automation

For Radianse, which integrates intelligent tracking and management software with tagged RFID and non-RFID devices, IoT automation means expanding real-time monitoring of staff tasks and automation of schedules from elder care facilities and hospitals to gyms, fitness centers and even bars.

In hospitals, naturally, asset tracking has gained new importance as respirator demand has vaulted. Cleaning schedules, too, now require new levels of tracking and efficiency. Change here is rapid.

"With the COVID-19 pandemic, you see pivots in approaches. You see interfaces that don't require touch menus, or that interface to users' own devices," according to Randy Ribeck, vice president of strategy for Radianse.

Ribeck said the company works with customers to implement systems that automate scheduling and asset use, and that the influx of data can be challenging. So paring down incoming data to the essentials is an important mission. "Otherwise, at times, you can be drinking from a fire hose," he said.

ITOps Automation

Agility has been the mantra of many organizations for years. That's taken the form of DevOps, ITOps, MLOps, and AIOps. All are methods organizations use to automate the repeatable steps developers and administrators take to keep apps running.

As use of IoT devices grows, more automation is being applied. Basically, more organizations are taking on the workflow styles of traditional telcos or cloud providers.

"There is a common problem around the proliferation of IoT [devices]. Organizations are left to manage all of these things, to make sure they are working properly," said Vijay Kurkal, CEO of Resolve Systems, maker of an AIOps platform for enterprise-wide incident response, automation, and process orchestration.

The problems grow greater as IoT devices take on more tasks. He cites one of the most ubiquitous of Things: the ATM.

More than ever, banks need to know ATMs are up, running, and functioning. That is because each ATM now serves multiple applications. "If they fail, you lose business and customers are frustrated," Kurkal said.

Moreover, a truck roll that requires technicians to be dispatched (in a truck) to ATM locations is expensive. All that makes AI and automation an integral part of capable incident resolution planning, he said.

IoT Automation on the Map

Automation takes on a different aspect when IoT data is introduced, according to Susan Foss, product manager for real-time visualization and analytics at Esri, the geographic information system (GIS) giant.

What is different? "It's the nature of the data being collected," she said. "Organizations have never had this type of information before, or at this granularity of time-space detail."

"Before it was more periodic. Now they have it in the form of a living, breathing, constant supply," she added. That ushers in event processing architectures, changes the pace with which teams have to work with data, and augurs more automation.

Foss said Esri is working with users to connect fast-arriving IoT data to location data. The goal is to create immediate visualizations of data on a map. This requires, Foss said, a delicate balance of compute horsepower against the incoming real-time data, as well as static data sources that might need to be used with it.

And, real-time activity mapping is going indoors in the face of the COVID-19 pandemic. To that end, Esri recently updated its ArcGIS Indoors offering with new space planning templates. The software uses beacons and Wi-Fi to collect data for display on a live map showing activity in offices and other physical plants. Clearly, such capabilities have special import in the wake of coronavirus.

Retooling for the Next Normal

Subtle changes are underway in IoT automation, driven by global events, according to Prashanth Mysore, director of DELMIA marketing and strategic development at Dassault Systèmes.

For one thing, a "next normal" is focused on ensuring employees' safety and security, Mysore said. He also anticipates more change in supply chains, as closer connections to sourcing become more important and real-time monitoring of supply chains is needed.

Mysore said systems simulation and 3D modeling will help in this regard, particularly where much new what-if analysis of system behavior must be swiftly completed. Like others, he singles out lightning-fast shifts to ventilator manufacturing by automakers and others as a harbinger of things to come.

"Things are so dynamic. For example, people have to look at how remote operations and networking affect security," he said, pointing as well to an upsurge in remote IoT system maintenance in times to come. This move to greater operational flexibility also signals the need for convergence between IT systems and operational systems, Mysore indicated.

Autonomizing the Unpredictable

Of course, the factory floor remains the citadel of automation. Key factors at play are big data, ML, and the general trend toward digitization, according to Juan Aparicio Ojea, head of advanced manufacturing automation at Siemens Corporate Technology.

Ojea said these factors combine to create what he calls "autonomous automation." This next step for automation, it seems, is to venture into the realm of the unpredictable.

"In traditional or classical automation, there is explicit motion programming," explains Ojea. Tasks and procedures are static and repetitive. That's due for change.

"Historically, we have been very good at automating the predictable process. For example, the welding line in automotive assembly," he said. This approach faces issues if parts are not represented perfectly, and changing these systems is programming intensive.

With next-generation autonomous automation, systems are based on ML modeling rather than explicit programming, said Ojea, who describes this as a move from "automating the predictable" to "autonomizing the unpredictable."

As the recent COVID-19 rush to retool production lines showed, shifts in production can be challenging. This could be a job for autonomous automation, which Ojea posits as a means toward more flexible automation and robotics.

Edge Computing Fits

IoT implementers should be aware that greater automation is about more than machine learning algorithms. Team leaders must also understand the full life cycle of products.

"Autonomous automation means you have to extract the data, maintain it, figure out where you store it; it's a different computing architecture, requiring a new way of planning," Ojea said. "Nothing comes free; machine learning is very compute and data intensive."

An answer to that issue in some cases will be robotics linked with edge computing. "It makes a lot of sense to put computer power very close to the process," Ojea said. "Edge fits well."

At the same time, autonomous automation should be viewed as an addition to classic automation methods, not a complete replacement, Ojea said.

From the days of the Jacquard automated loom through to Henry Ford's automated assembly line and beyond, automation has driven new technology use. Clearly, the technologies now ready to meet that call are many, giving tech leaders plenty to ponder as they reimagine automation as it applies to IoT.


What Robots Need to Succeed: Machine-Learning to Teach Effectively – Robotics Business Review

With machine learning, algorithms are automatically generated from large datasets, speeding the development and reducing the difficulty of creating complex systems, including robotics systems. While data at scale is what makes accurate machine learning go, the data used to train ML models must also be very accurate and of high quality.

By Hyun Kim | July 31, 2020

The mid-twentieth-century sociologist David Riesman was perhaps the first to wonder with unease what people would do with all of their free time once the encroaching machine automation of the 1960s liberated humans from their menial chores and decision-making. His prosperous, if anxious, vision of the future only half came to pass, however, as the complexities of life expanded to continually fill the days of both man and machine. Work alleviated by industrious machines, such as robotics systems, in the ensuing decades only freed humans to create increasingly elaborate new tasks to be labored over. Rather than give us more free time, the machines gave us more time to work.

Machine Learning

Today, the primary man-made assistants helping humans with their work are decreasingly likely to take the form of an assembly line of robot limbs or the robotic butlers first dreamed up during the era of the Space Race. Three quarters of a century later, it is robotic minds, and not necessarily bodies, that are in demand within nearly every sector of business. But humans can only teach artificial intelligence so much, or at least at so great a scale. Enter Machine Learning, the field of study in which algorithms and physical machines are taught using enormous caches of data. Machine learning has many different disciplines, with Deep Learning being a major subset.


Deep Learning Arrives

Deep Learning utilizes neural network layers to learn patterns from datasets. The field was first conceived 20-30 years ago but did not achieve popularity due to the limitations of computational power at the time. Today Deep Learning is finally experiencing its star turn, driven by the explosive potential of Deep Neural Network algorithms and hardware advancements. Deep Learning requires enormous amounts of computational power, but can ultimately be very powerful if one has enough computational capacity and the required datasets.

So who teaches the machines? Who decides what AI needs to know? First, engineers and scientists decide how AI learns. Domain experts then advise on how robots need to function and operate within the scope of the task that is being addressed, be that assisting warehouse logistics experts, security consultants, etc.

Planning and Learning

When it comes to AI receiving these inputs, it is important to make the distinction between Planning and Learning. Planning involves scenarios in which all the variables are already known, and the robot just has to work out at what pace it has to move each joint to complete a task such as grabbing an object. Learning, on the other hand, involves a more unstructured, dynamic environment in which the robot has to anticipate countless different inputs and react accordingly.

Learning can take place via Demonstrations (physically training movements through guided practice), Simulations (3D artificial environments), or even by being fed videos or data of a person or another robot performing the task the machine is hoping to master for itself. The latter is a form of Training Data: a set of labeled or annotated datasets that an AI algorithm can use to recognize and learn from. Training Data is increasingly necessary for today's complex Machine Learning behaviors. For ML algorithms to pick up patterns in data, ML teams need to feed them a large amount of data.
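As a minimal illustration of what learning from labeled Training Data means in practice, the sketch below fits a classifier on a handful of labeled examples; the features, labels, and grasping scenario are invented for illustration, not drawn from a real robotics dataset.

```python
# Toy supervised learning from labelled examples.
from sklearn.linear_model import LogisticRegression

# Each row: [object_width_cm, object_weight_kg]; label: 1 = graspable, 0 = not graspable.
X_train = [[5.0, 0.2], [7.5, 0.4], [30.0, 5.0], [45.0, 9.0], [6.2, 0.3], [38.0, 7.5]]
y_train = [1, 1, 0, 0, 1, 0]

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(model.predict([[8.0, 0.5], [40.0, 8.0]]))   # expected: graspable, then not graspable
```

Real systems differ from this toy mainly in scale, millions of labeled examples rather than six, which is why the quality and coverage of the training data matter so much.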

Accuracy and Abundance

Accuracy and abundance of data are critical. A diet of inaccurate or corrupted data will result in the algorithm not being able to learn correctly, or drawing the wrong conclusions. If your dataset is focused on Chihuahuas and you input a picture of a blueberry muffin, then you would still get a Chihuahua. This is known as a lack of proper data distribution.

Insufficient training data will result in a stilted learning curve that might not ever reach the full potential of how it was designed to perform. Enough data to encompass the majority of imagined scenarios and edge cases alike is critical for true learning to take place.

Hard at Work

Machine Learning is currently being deployed across a wide array of industries and types of applications, including those involving robotics systems. For example, unmanned vehicles are currently assisting the construction industry, deployed across live worksites. Construction companies use data training platforms such as Superb AI to create and manage datasets that can teach ML models to avoid humans and animals, and to engage in assembling and building.

In the medical sector, research labs at renowned international universities deploy training data to help computer vision models recognize tumors within MRIs and CT scans. These can eventually be used not only to accurately diagnose and prevent diseases, but also to train medical robots for surgery and other life-saving procedures. Even the best doctor in the world has a bad night's sleep sometimes, which can dull focus the next day. But a properly trained robotic tumor-hunting assistant can perform at peak efficiency every day.

Living Up to the Potential

So what's at stake here? There's a tremendous opportunity for training data, Machine Learning, and Artificial Intelligence to help robots live up to the potential that Riesman imagined all those decades ago. Technology companies employing complex Machine Learning initiatives have a responsibility to educate and create trust within the general public, so that these advancements can be permitted to truly help humanity level up. If the world can deploy well-trained, well-built, and well-purposed AI, coupled with advanced robotics, then we may very well live to see some of that leisure time that Riesman was so nervous about. I think most people today would agree that we certainly could use it.

Hyun Kim, Co-founder and CEO, Superb AI

Hyunsoo (Hyun) Kim is the co-founder and CEO of Superb AI and is on a mission to democratize data and artificial intelligence. With a background in Deep Learning and Robotics from his PhD studies at Duke University and a career as a Machine Learning Engineer, Kim saw the need for a more efficient way for companies to handle machine learning training data. Superb AI enables companies to create and manage the enormous amounts of data they need to train machine learning algorithms, and lowers the hurdle for industries to adopt the technology. Kim has also been selected as the featured honoree in the Enterprise Technology category of Forbes 30 Under 30 Asia 2020, and Superb AI last year joined Y Combinator, a prominent Silicon Valley startup accelerator.


How one company is using machine learning to remove bias from the hiring process – WRAL Tech Wire

Editor's note: Stuart Nisbet is chief data scientist at Cadient Talent, a talent acquisition firm based in Raleigh.

RALEIGH – At Cadient Talent, it's a question that we wrestle with on a daily basis: How do we eliminate bias from the hiring process?

The only way to address a problem or a bias is to acknowledge it head on, under the scrutiny of scientific examination. Through the application of machine learning, we are able to learn where we have erred in the past, allowing us to make less biased hiring decisions moving forward. When we uncover unconscious bias, or even conscious bias, and educate ourselves to do better based on unbiased machine learning, we are able to take the first step toward correcting an identified problem.

Bias is defined as a prejudgment or a prejudice in favor of or against one thing, person, or group compared with another, usually in a way that is considered to be unfair. Think of bias as three sets of facts: The first is a set of objective facts that are universally accepted. The second is a set of facts that confirms beliefs, in line with what an individual believes to be true. Where bias enters the picture is in the intersection between the objective facts and the facts that confirm personal beliefs.

By selectively choosing the facts that confirm particular beliefs and focusing on them, bias enters. If we look at hiring from that perspective, and if our goal is to remove bias from the hiring process, then we need to remove the personal choice of which data points are included in the process. All data points that contribute to a positive choice (hire the applicant) or a negative choice (decline the applicant) are included, and choosing the data points and their weights is done objectively through statistics, not subjectively through human choice.

How can computer algorithms help us do this? Our goal is to augment the intelligence of humans, in particular by using the experience and prior judgment embedded in past hiring decisions, with an emphasis on those that resulted in good hires. Good hiring can be measured in a number of ways that don't introduce inappropriate bias, such as the longevity of employees. If a new hire does not remain on the job very long, then perhaps the recruiting effort was not done well, and, in hindsight, you would not have chosen that applicant. But if you hire someone who is productive and stays for a long time, that person would be considered a good hire.

We want to remove bias when it is unintentional or has no bearing on whether an employee will be able to perform the job in a satisfactory manner. So, if a hiring manager's entire responsibility is to apply their knowledge and experience to determine the best fit, why do we use machine learning to eliminate bias? Because artificial intelligence removes only the bias toward non-work-related candidate attributes and augments decisions based on relevant work traits, where bias is appropriate.

Our goal is then to make the hiring process as transparent as possible and consider all of the variables used in a hiring decision. That's extremely complicated, if not impossible, with a purely human-based approach, because the decision-making of a hiring manager is far more complex and less understood than that of a machine learning algorithm. So we want to focus on the strength of simplicity in a machine learning algorithm, meaning we only want to look at variables, columns, and pieces of data that are pertinent to the hiring process and exclude any data points that are not relevant to performance.


An assessment result, for example, whether cognitive or personality-based, may be a very valid data point to consider if the traits being assessed are pertinent to the job. Work history and demonstrated achievement in similar roles may be very important to consider. The opposite is very clear, too: gender, ethnicity, and age should have no legitimate bearing on someone's job performance. This next point is critical. A hiring manager cannot meet an applicant in an interview and credibly say that they don't recognize the gender, ethnicity, or general age category of the person sitting across from them. No matter our intentions, this is incredibly hard to do. Conversely, it is the easiest task for an algorithm to perform.

If the algorithm is not provided gender, ethnicity, or age, there is no chance for those variables to be brought into the hiring decision. This involves bringing in only the data that is germane, having a computer look at which past hiring decisions resulted in high-performing, long-term employees, and then strengthening future decisions based on the past performance of good hiring management practices. This will ultimately remove the bias in hiring.
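A minimal sketch of that idea follows: the protected attributes are dropped before the model ever sees the data, so they cannot influence its predictions. The column names, the one-year-retention label, and the model choice are hypothetical, not Cadient Talent's actual system.

```python
# Sketch: train a hiring model only on job-relevant attributes.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier

PROTECTED = ["gender", "ethnicity", "age"]

applicants = pd.DataFrame({
    "assessment_score":   [82, 65, 90, 58],
    "years_similar_role": [3, 0, 5, 1],
    "gender":             ["F", "M", "F", "M"],
    "ethnicity":          ["A", "B", "A", "C"],
    "age":                [34, 22, 41, 29],
    "stayed_one_year":    [1, 0, 1, 0],   # longevity as a proxy for a "good hire"
})

features = applicants.drop(columns=PROTECTED + ["stayed_one_year"])
labels = applicants["stayed_one_year"]

model = GradientBoostingClassifier().fit(features, labels)
print(list(features.columns))   # only job-relevant attributes remain
```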

One thing that deserves consideration is the idea of perpetuating past practices that could be biased. If all we are doing is hiring the way we have hired in the past, and there have been prejudicial or biased hiring practices, that could promote institutional bias. Over time, we would have trained computers to do exactly what a biased manager would have done in the past. If the only data used to train the hiring model is data selected by the biases of the past, then it is difficult to train on data that is not biased. For example, if we identify gender as a bias in the hiring process and we take the gender variable out of the algorithm, gender will not be considered. When we flag previous bias, we are able to minimize future bias.

We should unabashedly look at whether we are able to identify and learn from hiring practices that may have had bias in the past. This is one of the greatest strengths of applying very simple machine learning algorithms in the area of hourly hiring.

An aspect of the hiring process that opens up a lot of opportunities in the area of artificial intelligence and machine learning is implementing diversity.

Artificial intelligence can really differentiate itself here. Machine learning can make the very best hiring decisions based on the data it's given; if you have diversity goals and want hiring practices to encourage a diverse work population, it is very simple to choose the best candidates from whichever populations are important to corporate goals. This can be done transparently and simply. It doesn't prioritize one person over another; it allows the hiring of the very best candidates from each population you're interested in having represented in the company.

Under scrutiny and scientific examination, machine learning can be a very valuable tool for augmenting the hiring decisions managers make every day and for helping us understand when bias has entered our decisions and yielded far less than our collective best.
