{"id":204991,"date":"2017-01-27T08:09:06","date_gmt":"2017-01-27T13:09:06","guid":{"rendered":"http:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/uncategorized\/superintelligence-nick-bostrom-oxford-university-press.php"},"modified":"2017-01-27T08:09:06","modified_gmt":"2017-01-27T13:09:06","slug":"superintelligence-nick-bostrom-oxford-university-press","status":"publish","type":"post","link":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/superintelligence\/superintelligence-nick-bostrom-oxford-university-press.php","title":{"rendered":"Superintelligence &#8211; Nick Bostrom &#8211; Oxford University Press"},"content":{"rendered":"<p>Superintelligence: Paths, Dangers, Strategies<\/p>\n<p>Nick Bostrom<\/p>\n<p>Reviews and Awards<\/p>\n<p>\"I highly recommend this book.\" --Bill Gates<\/p>\n<p>\"Nick Bostrom makes a persuasive case that the future impact of AI is perhaps the most important issue the human race has ever faced. Instead of passively drifting, we need to steer a course. Superintelligence charts the submerged rocks of the future with unprecedented detail. It marks the beginning of a new era.\" --Stuart Russell, Professor of Computer Science, University of California, Berkeley<\/p>\n<p>\"Those disposed to dismiss an 'AI takeover' as science fiction may think again after reading this original and well-argued book.\" --Martin Rees, Past President, Royal Society<\/p>\n<p>\"This superb analysis by one of the world's clearest thinkers tackles one of humanity's greatest challenges: if future superhuman artificial intelligence becomes the biggest event in human history, then how can we ensure that it doesn't become the last?\" --Professor Max Tegmark, MIT<\/p>\n<p>\"Terribly important ... groundbreaking ... extraordinary sagacity and clarity, enabling him to combine his wide-ranging knowledge over an impressively broad spectrum of disciplines - engineering, natural sciences, medicine, social sciences and philosophy - into a comprehensible whole ... If this book gets the reception that it deserves, it may turn out the most important alarm bell since Rachel Carson's Silent Spring from 1962, or ever.\" --Olle H\u00e4ggstr\u00f6m, Professor of Mathematical Statistics<\/p>\n<p>\"Valuable. The implications of introducing a second intelligent species onto Earth are far-reaching enough to deserve hard thinking.\" --The Economist<\/p>\n<p>\"There is no doubting the force of [Bostrom's] arguments ... the problem is a research challenge worthy of the next generation's best mathematical talent. Human civilisation is at stake.\" --Clive Cookson, Financial Times<\/p>\n<p>\"Worth reading ... We need to be super careful with AI. Potentially more dangerous than nukes.\" --Elon Musk, Founder of SpaceX and Tesla<\/p>\n<p>\"Every intelligent person should read it.\" --Nils Nilsson, Artificial Intelligence Pioneer, Stanford University<\/p>\n<p>More here: <\/p>\n<p><a target=\"_blank\" rel=\"nofollow\" href=\"https:\/\/global.oup.com\/academic\/product\/superintelligence-9780199678112\" title=\"Superintelligence - Nick Bostrom - Oxford University Press\">Superintelligence - Nick Bostrom - Oxford University Press<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Superintelligence: Paths, Dangers, Strategies. Nick Bostrom. Reviews and Awards. \"I highly recommend this book.\" --Bill Gates. \"Nick Bostrom makes a persuasive case that the future impact of AI is perhaps the most important issue the human race has ever faced. Instead of passively drifting, we need to steer a course. 
<a href=\"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/superintelligence\/superintelligence-nick-bostrom-oxford-university-press.php\">Continue reading <span class=\"meta-nav\">&rarr;<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"limit_modified_date":"","last_modified_date":"","_lmt_disableupdate":"","_lmt_disable":"","footnotes":""},"categories":[431612],"tags":[],"class_list":["post-204991","post","type-post","status-publish","format-standard","hentry","category-superintelligence"],"modified_by":null,"_links":{"self":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/posts\/204991"}],"collection":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/comments?post=204991"}],"version-history":[{"count":0,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/posts\/204991\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/media?parent=204991"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/categories?post=204991"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/tags?post=204991"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}