{"id":120395,"date":"2014-03-31T20:41:00","date_gmt":"2014-04-01T00:41:00","guid":{"rendered":"http:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/uncategorized\/why-nvidia-thinks-it-can-power-the-ai-revolution.php"},"modified":"2014-03-31T20:41:00","modified_gmt":"2014-04-01T00:41:00","slug":"why-nvidia-thinks-it-can-power-the-ai-revolution","status":"publish","type":"post","link":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/artificial-intelligence\/why-nvidia-thinks-it-can-power-the-ai-revolution.php","title":{"rendered":"Why Nvidia thinks it can power the AI revolution"},"content":{"rendered":"<p>6 hours ago Mar. 31, 2014 - 12:07 PM PDT<\/p>\n<p>Smarter robots and devices are coming to a home near you, and chipmaker Nvidia wants to help make it happen. It won't develop the algorithms that dictate their behavior or build the sensors that let them take in our world, but its graphics-processing units, or GPUs, might be a great way to handle the heavy computing necessary to make many forms of artificial intelligence a reality.<\/p>\n<p>Most applications don't use GPUs exclusively, but rather offload the most computationally intensive tasks onto them from standard microprocessors. Called \"GPU acceleration,\" the practice is very common in supercomputing workloads, and it's becoming ubiquitous in the area of computer vision and object recognition, too. In 2013, more than 80 percent of the teams participating in the ImageNet image-recognition competition used GPUs, said Sumit Gupta, general manager of the Advanced Computing Group at Nvidia.<\/p>\n<p>In March 2013, Google acquired DNNresearch, a deep learning startup co-founded by University of Toronto professor Geoff Hinton.
Part of the rationale behind that acquisition was the performance of Hinton's team in the 2012 ImageNet competition, where the group's GPU-powered deep learning models easily bested previous approaches.<\/p>\n<p>Source: Nvidia<\/p>\n<p>\"It turns out that the deep neural network problem is just a slam dunk for the GPU,\" Gupta said. That's because deep learning algorithms often require a lot of computing power to process their data (e.g., images or text) and extract the defining features of the things included in that data. Especially during the training phase, when the models and algorithms are being tuned for accuracy, they need to process a lot of data.<\/p>\n<p>Numerous customers are using Nvidia's Tesla GPUs for image and speech recognition, including Adobe and Chinese search giant Baidu. Nvidia is working on other aspects of machine learning as well, Gupta noted. Netflix uses GPUs (in the Amazon Web Services cloud) to power its recommendation engine, Russian search company Yandex uses them to power its search engine, and IBM uses them to run clustering algorithms in Hadoop.<\/p>\n<p>Nvidia might be so excited about machine learning because it has been pushing GPUs as a general-purpose computing platform, not just a graphics and gaming chip, for years with mixed results. The company has tried to do this by simplifying the programming of its processors via the CUDA language it has developed, but Gupta acknowledged there's still an overall lack of knowledge about how to use GPUs effectively. That's why so much real innovation still remains with the large users that have the parallel-programming skills necessary to take advantage of 2,500 or more cores at a time (and even more in multi-GPU systems).
<\/p>\n<p>Source: Nvidia<\/p>\n<p>However, Nvidia is looking beyond servers and into robotics to fuel some of its machine learning ambitions over the next decade. Last week, the company announced its Jetson TK1 development kit, which Gupta called a \"supercomputing version of Raspberry Pi.\" At $192, the kit is programmable using CUDA and includes all the ports one might expect to see, as well as a Tegra K1 system-on-a-chip (the latest version of Nvidia's mobile processor) that comprises a 192-core Kepler GPU and an ARM Cortex A15 CPU, delivering 300 gigaflops of performance.<\/p>\n<p>More here:<\/p>\n<p><a target=\"_blank\" href=\"http:\/\/gigaom.com\/2014\/03\/31\/why-nvidia-thinks-it-can-power-the-ai-revolution\/\" title=\"Why Nvidia thinks it can power the AI revolution\">Why Nvidia thinks it can power the AI revolution<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>6 hours ago Mar. 31, 2014 - 12:07 PM PDT Smarter robots and devices are coming to a home near you, and chipmaker Nvidia wants to help make it happen. 
It won't develop the algorithms that dictate their behavior or build the sensors that let them take in our world, but its graphics-processing units, or GPUs, might be a great way to handle the heavy computing necessary to make many forms of artificial intelligence a reality. <a href=\"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/artificial-intelligence\/why-nvidia-thinks-it-can-power-the-ai-revolution.php\">Continue reading <span class=\"meta-nav\">&rarr;<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"limit_modified_date":"","last_modified_date":"","_lmt_disableupdate":"","_lmt_disable":"","footnotes":""},"categories":[13],"tags":[],"class_list":["post-120395","post","type-post","status-publish","format-standard","hentry","category-artificial-intelligence"],"modified_by":null,"_links":{"self":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/posts\/120395"}],"collection":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/comments?post=120395"}],"version-history":[{"count":0,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/posts\/120395\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/media?parent=120395"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/categories?post=120395"},{"taxonomy":"post_tag","e
mbeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/tags?post=120395"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}