The Value of Machine-Driven Initiatives for K12 Schools – EdTech Magazine: Focus on Higher Education

K12 schools and districts are using artificial intelligence, specifically machine learning, to address a range of needs, from infrastructure efficiencies to targeted academic interventions.

Self-learning machines and intelligent algorithms can detect the signs of students vaping on campus or spikes of noise that might indicate a violent incident. AI-driven innovations can collect and analyze data on HVAC usage to help administrators identify inefficiencies. And those are just a few examples. Such evolving technologies promise plenty of benefits for K12 education.

So, what is machine learning? Machine learning algorithms use statistics to find patterns in massive amounts of data, according to the MIT Technology Review.

A key benefit of intelligent algorithms: detecting patterns in vast data sets that frustrate human efforts. Advanced machine learning tools now leverage human-inspired deep neural networks to deliver both pattern recognition and behavioral prediction, while AI solutions are designed to mimic human decision-making based on available data.

In K12 education, machine learning tools enable collating and correlating student performance data, and then identifying key indicators that suggest the need for specific teacher or administrative support. Administrators also have to navigate the privacy and security concerns surrounding AI-driven deployments, a growing challenge as the use of Big Data in education becomes more commonplace and districts' fleets of digitally connected classroom devices expand.

There's also more to machine learning than classroom data collection.

Many institutions now use machine learning to sift through operational IT data and search for patterns, says Mohan Rajagopalan, senior director of product management for Splunk. Doing so, he says, empowers them to detect anomalies, such as deviations from past behavior that indicate machine or network failures or unusual changes in access patterns that signal potential security issues, and allows IT staff to forecast usage trends and assist in capacity planning.
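
To make the idea concrete, here is a minimal, purely illustrative sketch of this kind of anomaly detection, using scikit-learn's IsolationForest on simulated metric readings; it is not a depiction of Splunk's own tooling.

```python
# Illustrative only: flag anomalous points in an operational IT metric
# (e.g., nightly bandwidth readings) with an isolation forest.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)
usage = rng.normal(loc=120.0, scale=10.0, size=(500, 1))  # simulated baseline readings
usage[::97] += 80.0                                       # inject a few spikes

model = IsolationForest(contamination=0.02, random_state=0).fit(usage)
flags = model.predict(usage)                              # -1 marks an anomaly
print("anomalous readings at indices:", np.where(flags == -1)[0])
```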

That data analysis is beneficial to K12 schools running on last-generation network technology while simultaneously managing one-to-one computing initiatives. Having the ability to predict potential downtime and understand student use trends can help administrators more effectively track technology spending and security. Machine learning leaders such as Splunk have already helped school districts prevent network outages and reduce their mean time to investigate and repair IT issues.

Education has a tech talent shortage. That's no surprise: K12 schools often can't offer competitive salaries, and many districts are located outside of large urban areas, making it harder to recruit from an already-limited talent pool. The result? Local teams are on the hook to run enterprise-scale networks with skeleton crews.

Institutions can leverage intelligent algorithms to supplement and augment human operators, Rajagopalan notes. For schools, these solutions offer a way to do more with less by enhancing the efficacy of smaller IT teams tasked with servicing technology solutions at scale, implementing data-first security features that prioritize student privacy and supporting both in-house and BYOD deployments.

Machine learning integration offers key IT infrastructure benefits, Rajagopalan says.

Effective school environments extend past classrooms, teachers and learning technologies to the basic building infrastructure. For example, sudden HVAC failure could cause building temperatures to plunge or skyrocket, forcing temporary closures or class relocation. Inefficient devices can also negatively impact school budgets if districts overspend on maintenance or replacement, draining funds that could be used for new computing technologies such as virtual reality assets or cloud-based assessments.

Technology companies such as Microsoft already are leveraging machine learning to reduce climate control costs and improve employee comfort. But recent research suggests schools, by virtue of their not-for-profit approach, often overlook cost-effective investments in energy efficiency.

The sheer amount of data that facility control systems and sensors generate provides the necessary foundation for machine learning to automatically look for failures and resolution, Rajagopalan says. He points to the example of machine failure due to overheating: If school facility managers received alerts of air conditioning units overheating based on current ambient temperature and usage patterns, they could temporarily shut down the units for repairs, reducing the need for costly replacement.
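
A facility alert of the kind described above can be sketched as a simple rule over sensor readings. The example below is hypothetical (the field names and thresholds are invented for illustration); in a real deployment, a trained model would learn those thresholds from historical sensor data.

```python
# Hypothetical example: flag AC units whose internal temperature is running hot
# relative to ambient temperature and recent duty cycle.
def overheating_alerts(readings, margin_c=18.0, duty_limit=0.9):
    """readings: list of dicts with unit_id, internal_c, ambient_c, duty_cycle."""
    alerts = []
    for r in readings:
        if r["internal_c"] - r["ambient_c"] > margin_c and r["duty_cycle"] > duty_limit:
            alerts.append(r["unit_id"])
    return alerts

sample = [
    {"unit_id": "AC-01", "internal_c": 55.0, "ambient_c": 31.0, "duty_cycle": 0.95},
    {"unit_id": "AC-02", "internal_c": 41.0, "ambient_c": 30.0, "duty_cycle": 0.40},
]
print(overheating_alerts(sample))  # ['AC-01']
```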

Despite the mind-boggling potential for machine learning, Rajagopalan says, the recipe for success lies not in developing more technology but in being able to successfully align technology with specific use cases and user needs.

For K12 schools, that means the application of machine learning and AI isn't about speed or scale, but specificity. From identifying IT patterns to bridging the tech talent shortage or avoiding costly failures, machine learning applied to solve specific challenges can help school districts maximize their digital strategy investments.

CMSWire’s Top 10 AI and Machine Learning Articles of 2019 – CMSWire

Would you believe me if I told you artificial intelligence (AI) wrote this article?

With 2020 on the horizon, and with all the progress made in AI and machine learning (ML) already, it probably wouldn't surprise you if that were indeed the case, which is bad news for writers like me (or not).

As we transition into a new year, it's worth noting that 73% of global consumers say they are open to businesses using AI if it makes life easier, and 83% of businesses say that AI is already a strategic priority for them. If that's not a recipe for even more progress in 2020 and beyond, then my name isn't CMSWire-Bot-927.

Today, we're looking back at the AI and ML articles which resonated with CMSWire's audience in 2019. Strap yourself in, because this list is about to blast you into the future.

ML and, more broadly, AI have become the tech industry's most important trends over the past 18 months. And despite the hype and, to some extent, fear surrounding the technology, many businesses are now embracing AI at an impressive speed.

Despite this progress, many of the pilot schemes are still highly experimental, and some organizations are struggling to understand how they can really embrace the technology.

As the business world grapples with the potential of AI and machine learning, new ethical challenges arise on a regular basis related to its use.

One area where tensions are being played out is in talent management: a struggle between relying on human expertise or in deferring decisions to machines so as to better understand employee needs, skills and career potential.

Marketing technology has evolved rapidly over the past decade, with one of the most exciting developments being the creation of publicly-available, cost-effective cognitive APIs by companies like Microsoft, IBM, Alphabet, Amazon and others. These APIs make it possible for businesses and organizations to tap into AI and ML technology for both customer-facing solutions as well as internal operations.

The workplace chatbots are coming! The workplace chatbots are coming!

OK, well, they're already here. And in a few years, there will be even more. According to Gartner, by 2021 the daily use of virtual assistants in the workplace will climb to 25%, up from less than 2% this year. Gartner also identified a workplace chatbot landscape of more than 1,000 vendors, so choosing a workplace chatbot won't be easy. IT leaders need to determine the capabilities they need from such a platform in the short term and select a vendor on that basis, according to Gartner.

High-quality metadata plays an outsized role in improving enterprise search results. But convincing people to consistently apply quality metadata has been an uphill battle for most companies. One solution that has been around for a long time now is to automate metadata's creation, using rules-based content auto-classification products.
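
As a toy illustration of what rules-based auto-classification looks like (not a depiction of any particular product), a classifier can be little more than keyword matching against a controlled vocabulary; the taxonomy terms below are invented.

```python
# Generic illustration of rules-based metadata auto-classification.
RULES = {
    "invoice": ["invoice", "purchase order", "net 30"],
    "hr-policy": ["vacation", "benefits", "code of conduct"],
    "engineering": ["api", "deployment", "incident report"],
}

def auto_tag(text):
    """Return every taxonomy tag whose keywords appear in the text."""
    text = text.lower()
    return sorted(tag for tag, keywords in RULES.items()
                  if any(k in text for k in keywords))

print(auto_tag("Incident report: API deployment failed during rollout"))
# ['engineering']
```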

Although enterprise interest in bots seems to be at an all-time high, Gartner reports that 68% of customer service leaders believe bots and virtual assistants will become even more important in the next two years. As bots are called upon to perform a greater range of tasks, chatbots will increasingly rely on back-office bots to find information and complete transactions on behalf of customers.

If digital workplaces are being disrupted by the ongoing development of AI-driven apps, by 2021 those disruptors could in their turn end up being disrupted. The emergence of a new form of AI, or a second wave of AI, known as augmented AI, is so significant that Gartner predicts that by 2021 it will be creating up to $2.9 trillion of business value and 6.2 billion hours of worker productivity globally.

AI and ML took center stage at IBM Think this year, and the show's major AI announcements served as a reminder that the company has some of the most differentiated and competitive services on the market for implementing AI in enterprise operational processes. But if Big Blue is to win the AI race against AWS, Microsoft and Google Cloud in 2019 and beyond, it must improve its developer strategy and strengthen its communications, especially in areas such as trusted AI and governance.

Sentiment analysis is the kind of tool a marketer dreams about. By gauging the public's opinion of an event or product through analysis of data on a scale no human could achieve, it gives your team the ability to figure out what people really think. Backed by a growing body of innovative research, sentiment-analysis tools have the ability to dramatically improve your ROI, yet many companies are overlooking it.
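
For a concrete, if simplified, taste of what such tooling does, the sketch below scores a couple of comments with NLTK's VADER analyzer; commercial sentiment-analysis products are considerably more sophisticated.

```python
# Toy sentiment scoring with NLTK's VADER lexicon (requires a one-time download).
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)
sia = SentimentIntensityAnalyzer()

for comment in ["Launch day was fantastic!", "The checkout flow keeps crashing."]:
    scores = sia.polarity_scores(comment)   # compound score ranges from -1 to 1
    print(comment, "->", scores["compound"])
```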

Pop quiz: Can you define the differences between AI and automation?

I won't judge you if the answer is no. There's a blurry line between AI and automation, with the terms often used interchangeably, even in tech-forward professions. But there's a very real difference between the two, and it's one that's becoming ever more critical for organizations to understand.

Machine Learning in 2019 Was About Balancing Privacy and Progress – ITPro Today

The overall theme of the year was two-fold: how can this technology make our lives easier, and how can we protect privacy while enjoying those benefits? Natural language processing development continued and enterprises increasingly looked to AI and machine learning in 2019 for automation. Meanwhile, consumers became more concerned about the privacy of all that data they're creating and enterprises are collecting, with consequences for businesses, especially those that rely on said data for various technological processes or must invest in ensuring its security.

This year was a big one for analytics, big data and artificial intelligence, but at the current pace of development, every subsequent year in this sector seems bigger than the last. Here are five of the leading stories in big data, AI and machine learning in 2019, with an eye to how they may continue to unfold in 2020.

The dominance of Amazon's digital personal assistant, Alexa, in the home is clear, but this fall's slew of new Alexa product announcements was a sign that the workplace is the logical next step. An Alexa-powered enterprise seems increasingly likely as Facebook, Google and Microsoft all put their own resources into advancing natural language processing for both voice-powered assistants and chatbots. The tech will become even more important if the growth of robotic process automation (see below) also continues and it emerges as another way to automate things in the enterprise space.

In 2019, it became increasingly clear that the enterprise is past preparing for the impact of machine learning on its operations; for organizations that want to stay ahead of the enterprise machine learning curve, it is now time for action. According to Gartner, seven out of 10 enterprises will be using some form of AI in the workplace by 2021.

The country's most populous state, and one that's home to many tech companies, finished negotiations for its GDPR-esque California Consumer Privacy Act in September, with the law taking effect on the first day of 2020. Many tech companies put up strong opposition to the CCPA, but Microsoft unexpectedly announced in November that it would apply the regulations to customers across the country. It's a sign that the tech giant anticipates that the CCPA isn't the only law of its kind likely to take effect in the U.S., especially as the push for federal regulations continues. Microsoft recently announced a regulatory compliance dashboard in Azure and AI-powered recommendations in the Microsoft 365 admin center to include guidance for compliance with the European Union's General Data Protection Regulation.

The world beyond the United States continued to affect the adoption and use of machine learning and big data in this country in 2019. Visa issues affected not just talent acquisition (a challenge for the enterprise in taking AI and machine learning in 2019 from the organizational wish list to implementation) but also research, as they hampered conference travel. China's own advancements in artificial intelligence, and the ethical issues related to data privacy that have emerged, could also affect policy and practices in the U.S., especially as things shift to 5G. Barring a sea change in China related to data collection and use, the country should continue to affect tech adoption here in the United States in 2020.

Robotic process automation, a group of technologies that lets line-of-business users set up, launch and administer virtual workers without involving the IT department, is still a small sector in software; worldwide revenue was at $850 million in 2018. However, it's also a quickly growing one, because it frees workers from routine work and cuts labor costs. As automation becomes more robust, natural language processing continues to advance quickly and data quality improves, look for this sector's growth to continue in 2020, with big potential in IT and HR departments in particular. Robotic process automation is here to assume the standardized, routine tasks for any organization that generates or uses data.

Another free web course to gain machine-learning skills (thanks, Finland), NIST probes ‘racist’ face-recog and more – The Register

Roundup As much of the Western world winds down for the Christmas period, here's a summary of this week's news from those machine-learning boffins who haven't broken into the eggnog too early.

Finland, Finland, Finland: The Nordic country everyone thinks is part of Scandinavia but isn't has long punched above its weight on the technology front as the home of Nokia, the Linux kernel, and so on. Now the Suomi state is making a crash course in artificial intelligence free to all.

The Elements of AI series was originally meant to be just for Finns to get up to speed on the basics of AI theory and practice. Many Finns have already done so, but as a Christmas present, the Finnish government is now making it available for everyone to try.

The course takes about six weeks to complete, with six individual modules, and is available in English, Swedish, Estonian, Finnish, and German. If you complete 90 per cent of the course and get 50 per cent of the answers right, then the course managers will send you a nice certificate.

Meanwhile, don't forget there are many cool and useful free online courses on neural networks and the like, such as Fast.ai's excellent series and Stanford's top-tier lectures and notes.

Yep, AI still racist and sexist: A major study by the US National Institute of Standards and Technology, better known as NIST, has revealed major failings in today's facial-recognition systems.

The study examined 189 software algorithms from 99 developers, although interestingly Amazon's Rekognition engine didn't take part, and the results aren't pretty. When it came to recognizing Asian and African American faces, the algorithms were wildly inaccurate compared to matching Caucasian faces, especially with systems from US developers.

"While it is usually incorrect to make statements across algorithms, we found empirical evidence for the existence of demographic differentials in the majority of the face recognition algorithms we studied," said Patrick Grother, a NIST computer scientist and the report's primary author.

"While we do not explore what might cause these differentials, this data will be valuable to policymakers, developers and end users in thinking about the limitations and appropriate use of these algorithms."

For sale: baby shoes, never worn: As Hemingway put it, the death of a child is one of the greatest tragedies that can occur, and Microsoft wants to do something about that using machine learning.

Redmond boffins worked with Tatiana Anderson and Jan-Marino Ramirez at Seattle Children's Research Institute, in America, and Edwin Mitchell at the University of Auckland, New Zealand, to analyse Sudden Unexpected Infant Death (SUID) cases. Using a decade's worth of data from the US Centers for Disease Control and Prevention (CDC), covering over 41 million births and 37,000 SUID deaths, the team sought to use specially prepared logistic-regression models to turn up some insights.

The results, published in the journal Pediatrics, were surprising: there was a clear difference between deaths that occurred in the first week after birth, dubbed SUEND, which stands for Sudden Unexpected Early Neonatal Death, and those that occurred between the first week and the end of a child's first year.

In the case of SUID, they found that rates were higher for unmarried, young mothers (between 15 and 24 years old), while this was not the case for SUEND cases. Instead, maternal smoking was highlighted as a major causative factor in SUEND situations, as were the length of pregnancy and birth weight.

The team are now using the model to look into other causative factors, be they genetic, environmental or something else. Hopefully such research will save many more lives in the future.

AI cracking calculus: Calculus, the bane of many schoolchildren's lives, appears to be right up AI's street.

A team of Facebook eggheads built a natural-language processing engine to understand and solve calculus problems, and compared the output with Wolfram Mathematica's output. The results were pretty stark: for basic equations, the AI solved them with 98 per cent accuracy, compared to 85 per cent for Mathematica.

With more complex calculations, however, the AIs accuracy drops off. It scored 81 per cent for a harder differential equation and just 40 per cent for more complex calculations.

"These results are surprising given the difficulty of neural models to perform simpler tasks like integer addition or multiplication," the team said in a paper [PDF] on arXiv. "These results suggest that in the future, standard mathematical frameworks may benefit from integrating neural components in their solvers."

Deep-fake crackdown: Speaking of Facebook: today, the antisocial network put out an announcement that it had shut down two sets of fake accounts pushing propaganda. One campaign, originating in the country of Georgia, had 39 Facebook accounts, 344 Pages, 13 Groups, and 22 Instagram accounts, now all shut down. The network was linked to the nation's Panda advertising agency, and was pushing pro-Georgian-government material.

What's the AI angle? Here it is: the other campaign was based in Vietnam, and was devoted to influencing US voters using Western-looking avatars generated by deep-fake software a la thispersondoesnotexist.com.

Some 610 accounts, 89 Pages, 156 Groups and 72 Instagram accounts were shut down. The effort was traced to a group calling itself Beauty of Life (BL), which Facebook linked to the Epoch Media Group, a stateside biz that's very fond of President Trump and spent $9.5m in Facebook advertising to push its messages.

"The BL-focused network repeatedly violated a number of our policies, including our policies against coordinated inauthentic behavior, spam and misrepresentation, to name just a few," said Nathaniel Gleicher, Head of Security Policy at Facebook.

"The BL is now banned from Facebook. We are continuing to investigate all linked networks, and will take action as appropriate if we determine they are engaged in deceptive behavior."

Facebook acknowledged that it took the action as a result of its own investigation and "benefited from open source reporting." This almost certainly refers to bullshit-busting website Snopes, which uncovered the BL network last month.

TinyML as a Service and machine learning at the edge – Ericsson

This is the second post in a series about tiny machine learning (TinyML) at the deep IoT edge. Read our earlier introduction to TinyML as-a-Service to learn how it relates to traditional cloud-based machine learning and to the embedded systems domain.

TinyML is an emerging concept (and community) for running ML inference on Ultra Low-Power (ULP, ~1mW) microcontrollers. TinyML as a Service will democratize TinyML, allowing manufacturers to start their AI business with TinyML running on microcontrollers.

In this article, we introduce the challenges behind the applicability of ML concepts within the IoT embedded world. Furthermore, we emphasize how these challenges are not simply due to the constraints imposed by the limited capabilities of embedded devices, but persist even where the computational capabilities of ML-based IoT deployments are augmented by additional resources confined to the network edge.

To summarize the nature of these challenges, we can say:

Below, we take a closer look at each of these challenges.

Edge computing promises higher performing service provisioning, both from a computational and a connectivity point of view.

Edge nodes support the latency requirements of mission critical communications thanks to their proximity to the end-devices, and enhanced hardware and software capabilities allow execution of increasingly complex and resource-demanding services in the edge nodes. There is growing attention, investments and R&D to make execution of ML tasks at the network edge easier. In fact, there are already several ML-dedicated "edge" hardware examples (e.g. Edge TPU by Google, Jetson Nano by Nvidia, Movidius by Intel) which confirm this.

Therefore, the question we are asking is: what are the issues that the edge computing paradigm has not been able to completely solve yet? And how can these issues undermine the applicability of ML concepts in IoT and edge computing scenarios?

We intend to focus on and analyze five areas in particular: (Note: Some areas we describe below may have solutions through other emerging types of edge computing but are not yet commonly available).

Figure 1

The web and the embedded worlds feature very heterogeneous characteristics. Figure 1 (above) depicts this high heterogeneity by comparing, qualitatively and quantitatively, the capacities of the two paradigms from both a hardware and a software perspective. Web services can rely on powerful underlying CPU architectures with high memory and storage capabilities. From a software perspective, web technologies can be designed to choose from and benefit from a multitude of sophisticated operating systems (OS) and complex software tools.

On the other hand, embedded systems rely on the limited capacity of microcontroller units (MCUs) and CPUs that are much less powerful than general-purpose and consumer CPUs. The same applies to memory and storage, where 500 KB of SRAM and a few MB of flash memory can already be considered generous. There have been several attempts to bring the flexibility of Linux-based systems to the embedded scenario (e.g. the Yocto Project), but most 32-bit MCU-based devices only have the capacity to run a real-time operating system, not a more complex distribution.

In simple terms, when Linux can run, system deployment is made easier since software portability becomes straightforward. Furthermore, an even higher cross-platform software portability is also made possible thanks to the wide support and usage of lightweight virtualization technologies such as containers. With almost no effort, developers can basically ship the same software functionalities between entities operating under Linux distributions, as happens in the case of cloud and edge.

The impossibility of running Linux and container-based virtualization on MCUs represents one of the most limiting issues and biggest challenges for current deployments. In typical "cloud-edge-embedded devices" scenarios, cloud and edge services are developed and deployed with hardware and software technologies that are fundamentally different from, and easier to manage than, embedded technologies.

TinyML as-a-Service tries to tackle this issue by taking advantage of alternative (and lightweight) software solutions.

Figure 2

In the previous section, we considered at a high level how the technological differences between the web and embedded domains can implicitly and significantly affect the execution of ML tasks on IoT devices. Here, we analyze how a big technological gap also exists in the availability of ML-dedicated hardware and software across web, edge, and embedded entities.

From a hardware perspective, during most of computing history there have been only a few types of processor, mostly available for general use. Recently, the relentless growth of artificial intelligence (AI) has led to the optimization of ML tasks for existing chip designs such as graphics processing units (GPUs), as well as the design of new dedicated hardware forms such as application specific integrated circuits (ASICs), which embed chips designed exclusively for the execution of specific ML operations. The common thread that connects all these new devices is their usage at the edge. In fact, these credit-card sized devices are designed with the idea of operating at the network edge.

At the beginning of this article we mentioned a few examples of this new family of devices (Edge TPU, Jetson Nano, Movidius). We foresee that in the near future chip and hardware manufacturers, big and small, will increasingly invest resources into the design and production of ML-dedicated hardware. However, at least so far, there has not been the same effort in the embedded world.

Such a lack of hardware availability somewhat undermines homogeneous and seamless ML "cloud-to-embedded" deployments. In many scenarios, software can help compensate for hardware deficiencies. However, the same boundaries that we find in the hardware sphere apply to the development of software tools. Today, in the web domain, there are hundreds of ML-oriented applications, and their number grows constantly, thanks in part to the open source initiatives that allow passionate developers all over the world to combine their efforts. The result is more effective, refined, and niche applications. However, the portability of these applications to embedded devices is not so straightforward. The use of high-level programming languages (e.g., Python), as well as the large size of the software runtime (meaning both the runtime system and the runtime phase of the program lifecycle), are just some of the reasons why software portability is painful if not impossible.

The main rationale behind the TinyML as-a-Service approach is precisely to break down the existing wall between cloud/edge and embedded entities. However, to expect exactly the same ML experience in the embedded domain as we have in the web and enterprise world would be unrealistic. It is still an irrefutable fact that size matters. The execution of ML inference is the only operation that we reasonably foresee being executed on an IoT device. We are happy to leave all the other cumbersome ML tasks, such as data processing and training, to the more equipped and resourceful side of the scenario depicted in Figure 2.
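
To illustrate that inference-only pattern, here is a minimal sketch using the TensorFlow Lite Python interpreter as a stand-in; on an actual microcontroller the equivalent step would run through TFLite Micro's C++ API, and the model file referenced here is a hypothetical placeholder.

```python
# Illustrative inference-only workflow: the model was trained and converted elsewhere,
# and the device merely loads it and runs predictions.
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="model.tflite")  # hypothetical model file
interpreter.allocate_tensors()

inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

x = np.zeros(inp["shape"], dtype=inp["dtype"])   # stand-in for a sensor reading
interpreter.set_tensor(inp["index"], x)
interpreter.invoke()
print(interpreter.get_tensor(out["index"]))      # the only ML step done on-device
```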

In the next article, we will go through the different features which characterize TinyML as-a-Service and share the technological approach underlying the TinyML as-a-Service concept.

In the meantime, if you have not read it yet, we recommend reading our earlier introduction to TinyML as-a-Service.

The IoT world needs a complete ML experience. TinyML as-a-service can be one possible solution for making this enhanced experience possible, as well as expanding potential technology opportunities. Stay tuned!

Want to dive into the lucrative world of deep learning? Take this $29 class. – Mashable

Just to let you know, if you buy something featured here, Mashable might earn an affiliate commission. From AI to sentiment analysis, the Ultimate Deep Learning class covers it all.

By StackCommerce, Mashable Shopping, 2019-12-24 10:00:00 UTC

TL;DR: Become a machine learning expert with The Ultimate Deep Learning and NLP Certification Bundle for $29, a 97% savings.

Artificial intelligence and deep learning might bring to mind pictures of Will Smith's misadventures in I, Robot, but in reality, these are the technologies behind tomorrow's world. In fact, by 2030, PwC predicts nearly 40 percent of all U.S. jobs could be replaced by AI and automation.

That's a bit frightening, but if you're one of the pros who understands the technology behind these next-generation concepts, you'll be in good shape. And luckily, this Ultimate Deep Learning and NLP Certification Bundle is on sale and a great place to start.

This bundle provides six premium courses and over 300 lessons introducing you to machine learning, neural networks, and core tools like Keras, TensorFlow, and Python. You'll get a core understanding of deep learning from a practical viewpoint and deal with issues that newcomers to the field face, all for just $29.

Once you get the gist of neural networks, you'll dive deeper into NLP, which helps computers understand, analyze, and manipulate human language. And eventually, you'll build your own applications for problems like text classification, neural machine translation, and stock prediction.

Usually retailing for $1,200, this learning bundle is currently on sale for just $29. And don't worry; you can complete all of the lessons on your own time and with lifetime access, you can return to it anytime you need to.

What is Machine Learning? A definition – Expert System

Machine learning is an application of artificial intelligence (AI) that provides systems the ability to automatically learn and improve from experience without being explicitly programmed. Machine learning focuses on the development of computer programs that can access data and use it to learn for themselves.

The process of learning begins with observations or data, such as examples, direct experience, or instruction, in order to look for patterns in data and make better decisions in the future based on the examples that we provide. The primary aim is to allow computers to learn automatically, without human intervention or assistance, and adjust their actions accordingly.

Machine learning algorithms are often categorized as supervised or unsupervised.
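
As a brief illustration of the distinction, the sketch below (using scikit-learn and synthetic data) fits a supervised classifier, which learns from labeled examples, and an unsupervised clustering model, which looks for structure without labels.

```python
# Minimal contrast between supervised and unsupervised learning on synthetic data.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)      # labels exist: supervised setting

clf = LogisticRegression().fit(X, y)          # learns a mapping from features to labels
print("supervised accuracy:", clf.score(X, y))

km = KMeans(n_clusters=2, n_init=10).fit(X)   # no labels: finds structure on its own
print("cluster sizes:", np.bincount(km.labels_))
```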

Machine learning enables analysis of massive quantities of data. While it generally delivers faster, more accurate results in order to identify profitable opportunities or dangerous risks, it may also require additional time and resources to train it properly. Combining machine learning with AI and cognitive technologies can make it even more effective in processing large volumes of information.

Kubernetes and containers are the perfect fit for machine learning – JAXenter

Machine learning is permeating every corner of the industry, from fraud detection to supply chain optimization to personalizing the customer experience. McKinsey has found that nearly half of enterprises have infused AI into at least one of their standard business processes, and Gartner says seven out of 10 enterprises will be using some form of AI by 2021. That's a short two years away.

But for businesses to take advantage of AI, they need an infrastructure that allows data scientists to experiment and iterate with different data sets, algorithms, and computing environments without slowing them down or placing a heavy burden on the IT department. That means they need a simple, automated way to quickly deploy code in a repeatable manner across local and cloud environments and to connect to the data sources they need.

A cloud-native environment built on containers is the most effective and efficient way to support this type of rapid development, evidenced by announcements from big vendors like Google and HPE, which have each released new software and services to enable machine learning and deep learning in containers. Much as containers can speed the deployment of enterprise applications by packaging the code in a wrapper along with its runtime requirements, these same qualities make containers highly practical for machine learning.

Broadly speaking, there are three phases of an AI project where containers are beneficial: exploration, training, and deployment. Here's a look at what each involves and how containers can assist with each by reducing costs and simplifying deployment, allowing innovation to flourish.

To build an AI model, data scientists experiment with different data sets and machine learning algorithms to find the right data and algorithms to predict outcomes with maximum accuracy and efficiency. There are various libraries and frameworks for creating machine learning models for different problem types and industries. Speed of iteration and the ability to run tests in parallel is essential for data teams as they try to uncover new revenue streams and meet business goals in a reasonable timeframe.

Containers provide a way to package up these libraries for specific domains, point to the right data source and deploy algorithms in a consistent fashion. That way, data scientists have an isolated environment they can customize for their exploration, without needing IT to manage multiple sets of libraries and frameworks in a shared environment.

Once an AI model has been built, it needs to be trained against large volumes of data across different platforms to maximize accuracy and minimize resource utilization. Training is highly compute-intensive, and containers make it easy to scale workloads up and down across multiple compute nodes quickly. A scheduler identifies the optimal node based on available resources and other factors.

A distributed cloud environment also allows compute and storage to be managed separately, which cuts storage utilization and therefore costs. Traditionally, compute and storage were tightly coupled, but containers, along with a modern data management plane, allow compute to be scaled independently and moved close to the data, wherever it resides.

With compute and storage separate, data scientists can run their models on different types of hardware, such as GPUs and specialized processors, to determine which model will provide the greatest accuracy and efficiency. They can also work to incrementally improve accuracy by adjusting weightings, biases and other parameters.

In production, a machine learning application will often combine several models that serve different purposes. One model might summarize the text in a social post, for example, while another assesses sentiment. Containers allow each model to be deployed as a microservice: an independent, lightweight program that developers can reuse in other applications.

Microservices also make it easier to deploy models in parallel in different production environments for purposes such as A/B testing, and the smaller programs allow models to be updated independently from the larger application, speeding release times and reducing the room for error.
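
As a minimal illustration of the one-model-per-microservice pattern, here is a sketch of a single model wrapped in a small Flask HTTP service of the sort that would be packaged into a container image; the model filename and route are hypothetical.

```python
# A minimal model-serving microservice, assuming a pre-trained scikit-learn model
# has been saved to "model.pkl" (hypothetical filename).
import pickle

import numpy as np
from flask import Flask, jsonify, request

app = Flask(__name__)
with open("model.pkl", "rb") as f:
    model = pickle.load(f)

@app.route("/predict", methods=["POST"])
def predict():
    # Expects JSON like {"features": [[1.2, 3.4, ...]]}
    features = np.array(request.get_json()["features"], ndmin=2)
    return jsonify(prediction=model.predict(features).tolist())

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)  # the container exposes this port
```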

At each stage of the process, containers allow data teams to explore, test and improve their machine learning programs more quickly and with minimal support from IT. Containers provide a portable and consistent environment that can be deployed rapidly in different environments to maximize the accuracy, performance, and efficiency of machine learning applications.

The cloud-native model has revolutionized how enterprise applications are deployed and managed by speeding innovation and reducing costs. It's time to bring these same advantages to machine learning and other forms of AI so that businesses can better serve their customers and compete more effectively.

Data science and machine learning: what to learn in 2020 – Packt Hub

It's hard to keep up with the pace of change in the data science and machine learning fields. And when you're under pressure to deliver projects, learning new skills and technologies might be the last thing on your mind. But if you don't have at least one eye on what you need to learn next, you run the risk of falling behind. In turn this means you miss out on new solutions and new opportunities to drive change: you might miss the chance to do things differently.

That's why we want to make it easy for you with this quick list of what you need to watch out for and learn in 2020.

TensorFlow remains the most popular deep learning framework in the world. With TensorFlow 2.0, the Google-based development team behind it has attempted to rectify a number of issues and improve overall performance. Most notably, some of the problems around usability have been addressed, which should help the project's continued growth and perhaps even lower the barrier to entry.
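
For a quick taste of what the 2.x workflow looks like in practice, here is a minimal Keras-style model built and trained on random, purely illustrative data.

```python
# Minimal TensorFlow 2.x / Keras sketch on synthetic data.
import numpy as np
import tensorflow as tf

X = np.random.rand(256, 4).astype("float32")
y = (X.sum(axis=1) > 2.0).astype("float32")      # toy binary labels

model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(4,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
print(model.evaluate(X, y, verbose=0))            # [loss, accuracy]
```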

Relatedly, TensorFlow.js is proving that the wider TensorFlow ecosystem is incredibly healthy. It will be interesting to see what projects emerge in 2020; it might even bring JavaScript web developers into the machine learning fold.

Explore Packt's huge range of TensorFlow eBooks and videos on the store.

PyTorch hasn't quite managed to topple TensorFlow from its perch, but it's nevertheless growing quickly. Easier to use and more accessible than TensorFlow, if you want to start building deep learning systems quickly, your best bet is probably to get started on PyTorch.

Search PyTorch eBooks and videos on the Packt store.
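
For comparison with the TensorFlow sketch above, here is the same toy problem written in idiomatic PyTorch; the data is again random and purely illustrative.

```python
# Minimal PyTorch training loop on synthetic data.
import torch
from torch import nn

X = torch.rand(256, 4)
y = (X.sum(dim=1, keepdim=True) > 2.0).float()   # toy binary labels

model = nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Linear(16, 1))
loss_fn = nn.BCEWithLogitsLoss()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)

for _ in range(100):
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()

print("final loss:", loss.item())
```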

When it comes to data analysis, one of the most pressing issues is to speed up pipelines. This is, of course, notoriously difficult; even in organizations that do their best to be agile and fast, it's not uncommon to find that their data is fragmented and diffuse, with little alignment across teams.

One of the opportunities for changing this is cloud. When used effectively, cloud platforms can dramatically speed up analytics pipelines and make it much easier for data scientists and analysts to deliver insights quickly. This might mean that we need increased collaboration between data professionals, engineers, and architects, but if we're to really deliver on the data at our disposal, then this shift could be massive.

Learn how to perform analytics on the cloud with Cloud Analytics with Microsoft Azure.

While cloud might help to smooth some of the friction that exists in our organizations when it comes to data analytics, there's no substitute for strong and clear leadership. The split between the engineering side of data and the more scientific or interpretive aspect has been noted, which means that there is going to be a real demand for people that have a strong understanding of what data can do, what it shows, and what it means in terms of action.

Indeed, the article just linked to also mentions that there is likely to be an increasing need for executive level understanding. That means data scientists have the opportunity to take a more senior role inside their organizations, by either working closely with execs or even moving up to that level.

Learn how to build and manage a data science team and initiative that delivers with Managing Data Science.

In the excitement about the opportunities of machine learning and artificial intelligence, it's possible that we've lost sight of some of the fundamentals: the algorithms. Indeed, given the conversation around algorithmic bias and unintended consequences, it certainly makes sense to place renewed attention on the algorithms that lie right at the center of our work.

Even if you're not an experienced data analyst or data scientist, if you're a beginner, it's just as important to dive deep into algorithms. This will give you a robust foundation for everything else you do. And while statistics and mathematics will feel a long way from the supposed sexiness of data science, carefully considering what role they play will ensure that the models you build are accurate and perform as they should.

Get stuck into algorithms with Data Science Algorithms in a Week.

Computer vision and Natural Language Processing are two of the most exciting aspects of modern machine learning and artificial intelligence. Both can be used for analytics projects, but they also have applications in real world digital products. Indeed, with augmented reality and conversational UI becoming more and more common, businesses need to be thinking very carefully about whether this could give them an edge in how they interact with customers.

These sorts of innovations can be driven from many different departments, but technologists and data professionals should be seizing the opportunity to lead the way on how innovation can transform customer relationships.

For more technology eBooks and videos to help you prepare for 2020, head to the Packt store.

Dotscience Forms Partnerships to Strengthen Machine Learning – Database Trends and Applications

Dotscience, a provider of DevOps for Machine Learning (MLOps) solutions, is forming partnerships with GitLab and Grafana Labs, along with strengthening integrations with several platforms and cloud providers.

The company is deepening integrations to include Scikit-learn, H2O.ai and TensorFlow; expanding multi-cloud support with Amazon Web Services (AWS) and Microsoft Azure; and entering a joint collaboration with global enterprises to develop an industry benchmark for helping enterprises get maximum ROI out of their AI initiatives.

"MLOps is poised to dominate the enterprise AI conversation in 2020, as it will directly address the challenges enterprises face when looking to create business value with AI," said Luke Marsden, CEO and founder at Dotscience. "Through new partnerships, expanded multi-cloud support, and collaborations with MLOps pioneers at global organizations in the Fortune 500, we are setting the bar for MLOps best practices for building production ML pipelines today."

Grafana Labs, the open observability platform, and Dotscience are partnering to deliver observability for ML in production.

With Dotscience, ML teams can statistically monitor the behavior of ML models in production on unlabelled production data by analyzing the statistical distribution of predictions.

The partnership simplifies the deployment of ML models to Kubernetes and adds the ability to set up monitoring dashboards for deployed ML models using cloud-native tools including Grafana and Prometheus, which reduces the time spent on these tasks from weeks to seconds.
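
As a generic illustration of distribution-based monitoring (not Dotscience's actual implementation), one simple approach is to compare the distribution of recent prediction scores against a baseline captured at validation time, for example with a two-sample Kolmogorov-Smirnov test.

```python
# Generic sketch of prediction-distribution monitoring on unlabelled production data.
import numpy as np
from scipy.stats import ks_2samp

baseline = np.random.beta(2, 5, size=5000)   # scores observed at validation time
live = np.random.beta(2.6, 5, size=1000)     # scores from the deployed model

stat, p_value = ks_2samp(baseline, live)
if p_value < 0.01:                            # arbitrary alert threshold
    print(f"prediction distribution shifted (KS={stat:.3f}, p={p_value:.4f})")
```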

As a GitLab Technology Partner, Dotscience is extending the use of its platform for collaborative, end-to-end ML data and model management to the more than 100,000 organizations and developers actively using GitLab as their DevOps platform.

Dotscience is now available on the AWS Marketplace, enabling AWS customers to easily and quickly deploy Dotscience directly through AWS Marketplace's 1-Click Deployment, and through Microsoft Azure.

Dotscience has expanded the frameworks in which data scientists can deploy tested and trained ML models into production and statistically monitor the productionized models, to include Scikit-learn, H2O.ai and TensorFlow.

These new integrations make Dotscience's recently added deploy and monitor platform advancements (the easiest way to deploy and monitor ML models on Kubernetes clusters) available to data scientists using a greater range of ML frameworks.

For more information about these partnerships and updates, visit https://dotscience.com/.
