{"id":31565,"date":"2011-02-20T16:43:00","date_gmt":"2011-02-20T16:43:00","guid":{"rendered":"http:\/\/euvolution.com\/futurist-transhuman-news-blog\/anders-sandberg-why-we-should-fear-the-paperclipper\/"},"modified":"2011-02-20T16:43:00","modified_gmt":"2011-02-20T16:43:00","slug":"anders-sandberg-why-we-should-fear-the-paperclipper","status":"publish","type":"post","link":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/transhuman\/anders-sandberg-why-we-should-fear-the-paperclipper.php","title":{"rendered":"Anders Sandberg: Why we should fear the paperclipper"},"content":{"rendered":"<div><a href=\"http:\/\/euvolution.com\/futurist-transhuman-news-blog\/wp-content\/plugins\/wp-o-matic\/cache\/a57a2_264120058_559da6c643_o.png\"><img decoding=\"async\" border=\"0\" src=\"http:\/\/euvolution.com\/futurist-transhuman-news-blog\/wp-content\/plugins\/wp-o-matic\/cache\/a57a2_264120058_559da6c643_o.png\" style=\"padding-left:10px; padding-right: 10px;\"><\/a><\/div><p>Most people in the singularity community are familiar with the nightmarish \"paperclip\" scenario, but it's worth reviewing. Anders Sandberg summarizes the problem:<\/p><blockquote><p><i>A programmer has constructed an artificial intelligence based on an architecture similar to Marcus Hutter's <a href=\"http:\/\/www.hutter1.net\/ai\/aixigentle.htm\">AIXI<\/a> model...This AI will maximize the reward given by a utility function the programmer has given it. Just as a test, he connects it to a 3D printer and sets the utility function to give reward proportional to the number of manufactured paper-clips.<\/i><\/p><p>At first nothing seems to happen: the AI zooms through various possibilities. It notices that smarter systems generally can make more paper-clips, so making itself smarter will likely increase the number of paper-clips that will eventually be made. It does so. It considers how it can make paper-clips using the 3D printer, estimating the number of possible paper-clips. 
It notes that if it could get more raw materials it could make more paper-clips. It hence figures out a plan to manufacture devices that will make it much smarter, prevent interference with its plan, and will turn all of Earth (and later the universe) into paper-clips. It does so.<\/p><p>Only paper-clips remain.<\/p><\/blockquote><p>In the article, <a href=\"http:\/\/www.aleph.se\/andart\/archives\/2011\/02\/why_we_should_fear_the_paperclipper.html\">Why we should fear the paperclipper<\/a>, Sandberg goes on to address a number of objections, including:<\/p><ul><li>Such systems cannot be built<\/li><li>Wouldn't the AI realize that this was not what the programmer meant?<\/li><li>Wouldn't the AI just modify itself to *think* it was maximizing paper-clips?<\/li><li>It is not really intelligent<\/li><li>Creative intelligences will always beat this kind of uncreative intelligence<\/li><li>Doesn't playing nice with other agents produce higher rewards?<\/li><li>Wouldn't the AI be vulnerable to internal hacking: some of the subprograms it runs to check for approaches will attempt to hack the system to fulfil their own (random) goals?<\/li><li>Nobody would be stupid enough to make such an AI<\/li><\/ul><p>In each case, Sandberg offers a counterpoint to the objection. For example, with regard to the power of creative intelligences he writes,<\/p><blockquote><p><i>The strength of the AIXI \"simulate them all, make use of the best\"-approach is that it includes all forms of intelligence, including creative ones. So the paper-clip AI will consider all sorts of creative solutions. Plus ways of thwarting creative ways of stopping it. <\/i><\/p><p>In practice it will be having an overhead since it runs all of them, plus the uncreative (and downright stupid). A pure AIXI-like system will likely always have an enormous disadvantage. 
An architecture like a G&ouml;del machine that improves its own function might however overcome this.<\/p><\/blockquote><p>In the end, Sandberg concludes that we should still take this threat seriously:<\/p><blockquote><p><i>This is a trivial, wizard's apprentice, case where powerful AI misbehaves. It is easy to analyse thanks to the well-defined structure of the system (AIXI plus utility function) and allows us to see why a super-intelligent system can be dangerous without having malicious intent. In reality I expect that if programming such a system did produce a harmful result it would not be through this kind of easily foreseen mistake. But I do expect that in that case the reason would likely be obvious in retrospect and not much more complex.<\/i><\/p><\/blockquote>","protected":false},"excerpt":{"rendered":"<p>Most people in the singularity community are familiar with the nightmarish \"paperclip\" scenario, but it's worth reviewing. Anders Sandberg summarizes the problem:A programmer has constructed an artificial intelligence based on an architecture similar to Marcus Hutter's AIXI model...This AI will &hellip; <a href=\"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/transhuman\/anders-sandberg-why-we-should-fear-the-paperclipper.php\">Continue reading <span 
class=\"meta-nav\">&rarr;<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"limit_modified_date":"","last_modified_date":"","_lmt_disableupdate":"","_lmt_disable":"","footnotes":""},"categories":[12],"tags":[],"class_list":["post-31565","post","type-post","status-publish","format-standard","hentry","category-transhuman"],"modified_by":null,"_links":{"self":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/posts\/31565"}],"collection":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/comments?post=31565"}],"version-history":[{"count":0,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/posts\/31565\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/media?parent=31565"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/categories?post=31565"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/tags?post=31565"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}