Google crams machine learning into smartwatches in AI push

Google is bringing artificial intelligence to a whole new set of devices, including Android Wear 2.0 smartwatches and the Raspberry Pi board, later this year.

Notably, these devices don't require powerful CPUs and GPUs to carry out machine-learning tasks.

Google researchers are instead working to lighten the hardware load needed for basic AI tasks, as demonstrated by last week's release of the Android Wear 2.0 operating system for wearables.


Google has added some basic AI features to smartwatches with Android Wear 2.0, and those features work within the tight memory and CPU constraints of wearables.

Android Wear 2.0 has a "smart reply" feature, which provides basic responses to conversations. It works much like a predictive dictionary, but it can auto-reply to messages based on the context of the conversation.
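To see the analogy, here is a toy predictive-dictionary sketch in Python. It is entirely illustrative: the corpus, names, and counts are made up and have nothing to do with Google's model. A predictive dictionary only looks at the previous word, while smart reply scores whole replies against the whole incoming message.

```python
from collections import Counter, defaultdict

# Illustrative only: a toy bigram "predictive dictionary" of the kind the
# article compares smart reply to. It suggests the next word from counts
# of word pairs seen in a tiny made-up corpus.

corpus = [
    "see you at lunch",
    "see you tomorrow",
    "see you at the office",
]

bigrams = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    for prev, nxt in zip(words, words[1:]):
        bigrams[prev][nxt] += 1

def suggest_next(word: str) -> str:
    """Return the word most often observed after `word`."""
    counts = bigrams.get(word)
    return counts.most_common(1)[0][0] if counts else ""

print(suggest_next("you"))  # -> "at" (seen twice, vs. "tomorrow" once)
```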

Google uses a new way to analyze data on the fly without bogging down a smartwatch. Conventional machine-learning models need a lot of classified, labeled data to provide accurate answers. Instead, Android Wear 2.0 uses a "semi-supervised" learning technique to provide approximate answers.

"We're quite surprised and excited about how well it works even on Android wearable devices with very limited computation and memory resources," Sujith Ravi, staff research scientist at Google said in a blog entry.

For example, the slimmed-down machine-learning model can classify a few words -- based on sentiment and other clues -- and generate an answer. The model uses a streaming algorithm to process data, and it provides trained responses that also factor in previous interactions, word relationships, and vector analysis.

The process is faster because data is analyzed and compared as bit arrays -- strings of 1s and 0s -- which allows analysis on the fly and tremendously reduces the memory footprint. It skips the conventional step of consulting rich vocabulary models, which require a lot of hardware. The AI feature is not intended for sophisticated answers or for analyzing a large set of complex words.
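As a rough illustration of the bit-array idea, the sketch below hashes a message into a short binary signature with random projections and picks the canned reply whose signature is closest in Hamming distance. This is a generic locality-sensitive-hashing sketch, not Google's implementation; every constant, prompt, and reply here is hypothetical.

```python
import zlib
import numpy as np

# Illustrative only: a generic locality-sensitive-hashing sketch of the
# bit-array idea described above. Constants and data are hypothetical.

RNG = np.random.default_rng(0)
NUM_BITS = 64        # length of the binary signature per message
FEATURE_DIM = 1024   # hashed bag-of-words space (no stored vocabulary)

# Fixed random hyperplanes shared by all messages.
PLANES = RNG.standard_normal((NUM_BITS, FEATURE_DIM))

def featurize(text: str) -> np.ndarray:
    """Hash words into a bag-of-words vector without a vocabulary table."""
    vec = np.zeros(FEATURE_DIM)
    for word in text.lower().split():
        vec[zlib.crc32(word.encode()) % FEATURE_DIM] += 1.0
    return vec

def signature(text: str) -> np.ndarray:
    """Project onto the random hyperplanes and keep only the signs (1s and 0s)."""
    return (PLANES @ featurize(text) > 0).astype(np.uint8)

# Tiny table of canned replies keyed by example prompts.
REPLIES = {
    "are you free for lunch today": "Sure, noon works!",
    "running late be there soon": "No problem, see you soon.",
    "did you finish the report": "Almost done, sending it tonight.",
}
SIGNATURES = {prompt: signature(prompt) for prompt in REPLIES}

def smart_reply(incoming: str) -> str:
    """Pick the reply whose prompt signature is nearest in Hamming distance."""
    sig = signature(incoming)
    best = min(SIGNATURES, key=lambda p: int(np.sum(sig != SIGNATURES[p])))
    return REPLIES[best]

print(smart_reply("are you free for lunch"))  # expected: "Sure, noon works!"
```

Because only the short signatures are stored and compared, the model never has to keep a full vocabulary or embedding table in memory, which is the point the researchers make about wearable hardware.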

The feature can be used with third-party messaging apps, the researchers noted. It is loosely based on the same smart-reply technology in Google's Allo messaging app, which is built from the company's Expander set of semi-supervised learning tools.
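Expander's approach is graph-based semi-supervised learning, in which labels spread from a few labeled examples to similar unlabeled ones. The toy propagation below shows only the general idea; the graph, labels, and update rule are made up for illustration and are not Expander's actual algorithm.

```python
import numpy as np

# Illustrative only: toy graph-based semi-supervised label propagation.
# Nodes 0-1 are labeled examples; nodes 2-4 are unlabeled.

labels = np.array([
    [1.0, 0.0],   # node 0: class A (labeled)
    [0.0, 1.0],   # node 1: class B (labeled)
    [0.0, 0.0],   # node 2: unlabeled
    [0.0, 0.0],   # node 3: unlabeled
    [0.0, 0.0],   # node 4: unlabeled
])

# Symmetric adjacency matrix: edges connect similar examples.
adj = np.array([
    [0, 0, 1, 0, 0],
    [0, 0, 0, 1, 1],
    [1, 0, 0, 0, 0],
    [0, 1, 0, 0, 1],
    [0, 1, 0, 1, 0],
], dtype=float)

beliefs = labels.copy()
for _ in range(20):                       # iterate until beliefs settle
    spread = adj @ beliefs                # pull beliefs from neighbors
    beliefs = spread / np.maximum(spread.sum(axis=1, keepdims=True), 1e-9)
    beliefs[:2] = labels[:2]              # keep the labeled seeds fixed

print(beliefs.round(2))  # node 2 follows class A; nodes 3-4 follow class B
```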

The Android Wear team originally reached out to Google's researchers and expressed an interest in implementing the "smart reply" technology directly on smart devices, Ravi said.

AI is becoming pervasive in smartphones, PCs, and electronics like Amazon's Echo Dot, but it largely relies on machine learning that takes place in the cloud. Machine-learning models in the cloud are trained to recognize images or speech, and conventional machine learning relies on algorithms, superfast hardware, and huge amounts of data to produce more accurate answers.

Google's technology is different from Qualcomm's rougher implementation of machine learning on mobile devices, which pairs algorithms with digital signal processors (DSPs) for image recognition or natural-language processing. Qualcomm has tuned the DSPs in its upcoming Snapdragon 835 to process speech and images at higher speeds, so AI tasks are carried out faster.

Google has an ambitious plan to apply machine learning across its entire business. Google Assistant -- which is also in Android Wear 2.0 -- is its most visible AI, spanning smartphones, TVs, and other consumer devices. The search company also offers TensorFlow, an open-source machine-learning framework, and has its own inferencing chip, the Tensor Processing Unit.
