{"id":181707,"date":"2017-03-06T15:03:32","date_gmt":"2017-03-06T20:03:32","guid":{"rendered":"http:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/singularitarianism-transhumanism-wiki-fandom-powered\/"},"modified":"2017-03-06T15:03:32","modified_gmt":"2017-03-06T20:03:32","slug":"singularitarianism-transhumanism-wiki-fandom-powered","status":"publish","type":"post","link":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/singularitarianism\/singularitarianism-transhumanism-wiki-fandom-powered\/","title":{"rendered":"Singularitarianism | Transhumanism Wiki | Fandom powered &#8230;"},"content":{"rendered":"<p><p>    Singularitarianism is a moral    philosophy based upon the belief that a technological singularity  the    technological creation of smarter-than-human intelligence     is possible, and advocating deliberate action to bring it into    effect and ensure its safety. While many futurists and    transhumanists speculate on the possibility    and nature of this technological development (often referred to    as the Singularity), Singularitarians believe it is not    only possible, but desirable if, and only if, guided safely.    Accordingly, they might sometimes \"dedicate their lives\" to    acting in ways they believe will contribute to its safe    implementation.  <\/p>\n<p>    The term \"singularitarian\" was originally defined by Extropian Mark Plus in 1991 to mean    \"one who believes the concept of a Singularity\". This term has    since been redefined to mean \"Singularity activist\" or \"friend    of the Singularity\"; that is, one who acts so as to bring about    the Singularity.[1]  <\/p>\n<p>    Ray    Kurzweil, the author of the book The Singularity    is Near, defines a Singularitarian as someone \"who    understands the Singularity and who has reflected on its    implications for his or her own life\".[2]  <\/p>\n<p>    In his 2000 essay, \"Singularitarian Principles\", Eliezer    Yudkowsky writes that there are four qualities that define    a Singularitarian:[3]  <\/p>\n<p>    In July 2000 Eliezer Yudkowsky, Brian Atkins and Sabine Atkins    founded the Singularity    Institute for Artificial Intelligence to work towards the    creation of self-improving Friendly AI. The    Singularity Institute's writings argue for the idea that an AI    with the ability to improve upon its own design (Seed AI) would rapidly lead    to superintelligence. Singularitarians    believe that reaching the Singularity swiftly and safely is the    best possible way to minimize net existential    risk.  <\/p>\n<p>    Many believe a technological singularity is possible without    adopting Singularitarianism as a moral philosophy. Although the    exact numbers are hard to quantify, Singularitarianism is    presently a small movement. Other prominent Singularitarians    include Ray    Kurzweil and Nick Bostrom.  
<\/p>\n<p>Many critics, ridiculing the Singularity as \"the Rapture for nerds\", have dismissed Singularitarianism as a pseudoreligion of fringe science.[4] However, some green anarchist militants have taken Singularitarian rhetoric seriously enough to call for violent direct action to stop the Singularity.[5]<\/p>\n<p>Original post:<\/p>\n<p><a target=\"_blank\" rel=\"nofollow\" href=\"http:\/\/transhumanism.wikia.com\/wiki\/Singularitarianism\" title=\"Singularitarianism | Transhumanism Wiki | Fandom powered ...\">Singularitarianism | Transhumanism Wiki | Fandom powered ...<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Singularitarianism is a moral philosophy based upon the belief that a technological singularity, the technological creation of smarter-than-human intelligence, is possible, and advocating deliberate action to bring it about and to ensure its safety. While many futurists and transhumanists speculate on the possibility and nature of this technological development (often referred to as the Singularity), Singularitarians believe it is not only possible, but desirable if, and only if, guided safely <a href=\"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/singularitarianism\/singularitarianism-transhumanism-wiki-fandom-powered\/\">Continue reading <span class=\"meta-nav\">&rarr;<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[187724],"tags":[],"class_list":["post-181707","post","type-post","status-publish","format-standard","hentry","category-singularitarianism"],"_links":{"self":[{"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/posts\/181707"}],"collection":[{"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/comments?post=181707"}],"version-history":[{"count":0,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/posts\/181707\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/media?parent=181707"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/categories?post=181707"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/tags?post=181707"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}