{"id":130256,"date":"2014-05-05T15:40:48","date_gmt":"2014-05-05T19:40:48","guid":{"rendered":"http:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/uncategorized\/stephen-hawking-says-threat-of-artificial-intelligence-a-real-concern.php"},"modified":"2014-05-05T15:40:48","modified_gmt":"2014-05-05T19:40:48","slug":"stephen-hawking-says-threat-of-artificial-intelligence-a-real-concern","status":"publish","type":"post","link":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/artificial-intelligence\/stephen-hawking-says-threat-of-artificial-intelligence-a-real-concern.php","title":{"rendered":"Stephen Hawking says threat of artificial intelligence a real concern"},"content":{"rendered":"<p>Stephen Hawking, in an article inspired by the new Johnny Depp flick Transcendence, said it would be the \"worst mistake in history\" to dismiss the threat of artificial intelligence.<\/p>\n<p>In a paper he co-wrote with University of California, Berkeley computer-science professor Stuart Russell, and Massachusetts Institute of Technology physics professors Max Tegmark and Frank Wilczek, Hawking cited several achievements in the field of artificial intelligence, including self-driving cars, Siri and the computer that won Jeopardy!<\/p>\n<p>\"Such achievements will probably pale against what the coming decades will bring,\" the article in Britain's Independent said.<\/p>\n<p>\"Success in creating AI would be the biggest event in human history,\" the article continued. 
\"Unfortunately, it might also be the last, unless we learn how to avoid the risks.\"<\/p>\n<p>The professors wrote that in the future there may be nothing to prevent machines with superhuman intelligence from self-improving, triggering a so-called \"singularity.\"<\/p>\n<p>\"One can imagine such technology outsmarting financial markets, out-inventing human researchers, out-manipulating human leaders, and developing weapons we cannot even understand. Whereas the short-term impact of AI depends on who controls it, the long-term impact depends on whether it can be controlled at all,\" the article said.<\/p>\n<p>\"Although we are facing potentially the best or worst thing to happen to humanity in history, little serious research is devoted to these issues outside non-profit institutes such as the Cambridge Centre for the Study of Existential Risk, the Future of Humanity Institute, the Machine Intelligence Research Institute, and the Future of Life Institute. All of us should ask ourselves what we can do now to improve the chances of reaping the benefits and avoiding the risks.\"<\/p>\n<p>Go here to read the rest: <\/p>\n<p><a target=\"_blank\" href=\"http:\/\/www.spacedaily.com\/reports\/Stephen_Hawking_says_threat_of_artificial_intelligence_a_real_concern_999.html\/RK=0\/RS=nnyvZbsTB7Jh2fZ37kbOOAJH9O8-\" title=\"Stephen Hawking says threat of artificial intelligence a real concern\">Stephen Hawking says threat of artificial intelligence a real concern<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Stephen Hawking, in an article inspired by the new Johnny Depp flick Transcendence, said it would be the \"worst mistake in history\" to dismiss the threat of artificial intelligence. 
In a paper he co-wrote with University of California, Berkeley computer-science professor Stuart Russell, and Massachusetts Institute of Technology physics professors Max Tegmark and Frank Wilczek, Hawking cited several achievements in the field of artificial intelligence, including self-driving cars, Siri and the computer that won Jeopardy! \"Such achievements will probably pale against what the coming decades will bring,\" the article in Britain's Independent said. \"Success in creating AI would be the biggest event in human history,\" the article continued.  <a href=\"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/artificial-intelligence\/stephen-hawking-says-threat-of-artificial-intelligence-a-real-concern.php\">Continue reading <span class=\"meta-nav\">&rarr;<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"limit_modified_date":"","last_modified_date":"","_lmt_disableupdate":"","_lmt_disable":"","footnotes":""},"categories":[13],"tags":[],"class_list":["post-130256","post","type-post","status-publish","format-standard","hentry","category-artificial-intelligence"],"modified_by":null,"_links":{"self":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/posts\/130256"}],"collection":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/comments?post=130256"}],"version-history":[{"count":0,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/posts\/130256\/revision
s"}],"wp:attachment":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/media?parent=130256"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/categories?post=130256"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/tags?post=130256"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}