{"id":207666,"date":"2017-02-13T18:28:16","date_gmt":"2017-02-13T23:28:16","guid":{"rendered":"http:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/uncategorized\/google-crams-machine-learning-into-smartwatches-in-ai-push-cio.php"},"modified":"2022-05-18T07:10:44","modified_gmt":"2022-05-18T11:10:44","slug":"google-crams-machine-learning-into-smartwatches-in-ai-push-cio","status":"publish","type":"post","link":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/artificial-intelligence\/google-crams-machine-learning-into-smartwatches-in-ai-push-cio.php","title":{"rendered":"Google crams machine learning into smartwatches in AI push &#8211; CIO"},"content":{"rendered":"<p>Google is bringing artificial intelligence to a whole new set of devices, including Android Wear 2.0 smartwatches and the Raspberry Pi board, later this year.<\/p>\n<p>Notably, these devices don't need a set of powerful CPUs and GPUs to carry out machine learning tasks.<\/p>\n<p>Instead, Google researchers are lightening the hardware load required for basic AI tasks, as shown by last week's release of the Android Wear 2.0 operating system for wearables.<\/p>\n<p>Google has added some basic AI features to smartwatches with Android Wear 2.0, and those features work within the limited memory and CPU constraints of wearables.<\/p>\n<p>Android Wear 2.0 has a \"smart reply\" feature, which provides basic responses to conversations. It works much like a predictive dictionary, but it can auto-reply to messages based on the context of the conversation.<\/p>\n<p>Google uses a new way to analyze data on the fly without bogging down a smartwatch. In conventional machine-learning models, a lot of data needs to be classified and labeled to provide accurate answers. 
Instead, Android Wear 2.0 uses a \"semi-supervised\" learning technique to provide approximate answers.<\/p>\n<p>\"We're quite surprised and excited about how well it works even on Android wearable devices with very limited computation and memory resources,\" Sujith Ravi, a staff research scientist at Google, said in a blog entry.<\/p>\n<p>For example, the slimmed-down machine-learning model can classify a few words -- based on sentiment and other clues -- and create an answer. The model uses a streaming algorithm to process data, and it provides trained responses that also factor in previous interactions, word relationships, and vector analysis.<\/p>\n<p>The process is faster because the data is analyzed and compared as bit arrays -- sequences of 1s and 0s -- which allows data to be analyzed on the fly and greatly reduces the memory footprint. It skips the conventional step of consulting rich vocabulary models, which require a lot of hardware. The AI feature is not intended for sophisticated answers or for analyzing a large set of complex words.<\/p>\n<p>The feature can be used with third-party messaging apps, the researchers noted. It is loosely based on the same smart-reply technology in Google's Allo messaging app, which is built on the company's Expander set of semi-supervised learning tools.<\/p>\n<p>The Android Wear team originally reached out to Google's researchers and expressed an interest in implementing the \"smart reply\" technology directly in smart devices, Ravi said.<\/p>\n<p>AI is becoming pervasive in smartphones, PCs, and electronics like Amazon's Echo Dot, but it largely relies on machine learning taking place in the cloud, where models are trained to recognize images or speech. 
Conventional machine learning relies on algorithms, superfast hardware, and huge amounts of data to produce more accurate answers.<\/p>\n<p>Google's technology differs from Qualcomm's rough implementation of machine learning in mobile devices, which pairs algorithms with digital signal processors (DSPs) for image recognition or natural language processing. Qualcomm has tuned the DSPs in its upcoming Snapdragon 835 to process speech or images at higher speeds, so AI tasks are carried out faster.<\/p>\n<p>Google has an ambitious plan to apply machine learning across its entire business. The Google Assistant -- which is also in Android Wear 2.0 -- is a visible AI presence across smartphones, TVs, and other consumer devices. The search company has TensorFlow, an open-source machine-learning framework, and its own inferencing chip, the Tensor Processing Unit.<\/p>\n<p>Originally posted here:<\/p>\n<p><a target=\"_blank\" rel=\"nofollow noopener\" href=\"http:\/\/www.cio.com\/article\/3168734\/artificial-intelligence\/google-crams-machine-learning-into-smartwatches-in-ai-push.html\" title=\"Google crams machine learning into smartwatches in AI push - CIO\">Google crams machine learning into smartwatches in AI push - CIO<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Google is bringing artificial intelligence to a whole new set of devices, including Android Wear 2.0 smartwatches and the Raspberry Pi board, later this year. 
<a href=\"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/artificial-intelligence\/google-crams-machine-learning-into-smartwatches-in-ai-push-cio.php\">Continue reading <span class=\"meta-nav\">&rarr;<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"limit_modified_date":"","last_modified_date":"","_lmt_disableupdate":"","_lmt_disable":"","footnotes":""},"categories":[13],"tags":[],"class_list":["post-207666","post","type-post","status-publish","format-standard","hentry","category-artificial-intelligence"],"modified_by":"Danzig","_links":{"self":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/posts\/207666"}],"collection":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/comments?post=207666"}],"version-history":[{"count":0,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/posts\/207666\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/media?parent=207666"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/categories?post=207666"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/tags?post=207666"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}