{"id":176848,"date":"2017-02-12T06:41:15","date_gmt":"2017-02-12T11:41:15","guid":{"rendered":"http:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/its-already-too-late-to-stop-the-singularity-big-think\/"},"modified":"2017-02-12T06:41:15","modified_gmt":"2017-02-12T11:41:15","slug":"its-already-too-late-to-stop-the-singularity-big-think","status":"publish","type":"post","link":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/transhuman-news-blog\/transhuman\/its-already-too-late-to-stop-the-singularity-big-think\/","title":{"rendered":"It&#8217;s Already Too Late to Stop the Singularity &#8211; Big Think"},"content":{"rendered":"<p><p>Ben Goertzel: Some people are gravely worried about the uncertainty and the negative potential associated with transhuman, superhuman AGI. And indeed we are stepping into a great unknown realm.<\/p>\n<p>It's almost like a Rorschach type of thing, really. We fundamentally don't know what a superhuman AI is going to do, and that's the truth of it, right? If you tend to be an optimist, you'll focus on the good possibilities. If you tend to be a worried person who's pessimistic, you'll focus on the bad possibilities. If you tend to be a Hollywood moviemaker, you'll focus on scary possibilities, maybe with a happy ending, because that's what sells movies. We don't know what's going to happen.<\/p>\n<p>I do think, however, that this is the situation humanity has been in for a very long time. When the cavemen stepped out of their caves and began agriculture, we really had no idea that was going to lead to cities and spaceflight and so forth. And when the first early humans created language to carry out simple communication about the moose they had just killed over there, they did not envision Facebook, differential calculus, MC Hammer, and all the rest, right? 
I mean, there's so much that has come about from early inventions which humans couldn't ever have foreseen, and I think we're in just the same situation. The invention of language or civilization could have led to everyone's death, right? And in a way it still could. The creation of superhuman AI could kill everyone, and I don't want it to. Almost none of us do.<\/p>\n<p>Of course, the way we got to this point as a species and a culture has been to keep doing amazing new things that we didn't fully understand, and that's what we're going to keep on doing. Nick Bostrom's book was influential, but I felt that in some ways it was a bit deceptive in the way he phrased things. If you read his precise philosophical arguments, which are very logically drawn, what Bostrom says in his book, Superintelligence, is that we cannot rule out the possibility that a superintelligence will do some very bad things. And that's true. On the other hand, some of the associated rhetoric makes it sound like it's very likely a superintelligence will do these bad things, and if you follow his philosophical arguments closely, he doesn't show that. What he shows is that you can't rule it out, and that we don't know what's going on.<\/p>\n<p>I don't think Nick Bostrom or anyone else is going to stop the human race from developing advanced AI, because it's a source of tremendous intellectual curiosity but also of tremendous economic advantage. So if, let's say, President Trump decided to ban artificial intelligence research (I don't think he's going to, but suppose he did), China would keep doing artificial intelligence research. If the U.S. and China banned it, Africa would do it. Everywhere around the world has AI textbooks and computers, and everyone now knows you can make people's lives better and make money by developing more advanced AI. So there's no possibility in practice of halting AI development. 
What we can do is try to direct it in the most beneficial direction according to our best judgment. That's part of what leads me to pursue AGI via an open-source project such as OpenCog. I respect very much what Google, Baidu, Facebook, Microsoft, and these other big companies are doing in AI. There are many good people there doing good research with good-hearted motivations. But I guess I'm enough of an old leftist, raised by socialists, that I'm skeptical that a company whose main motive is to maximize shareholder value is really going to do the best thing for the human race if it creates a human-level AI.<\/p>\n<p>I mean, it might. On the other hand, there are a lot of other motivations there, and a public company in the end has a fiduciary responsibility to its shareholders. All in all, I think the odds are better if AI is developed in a way that is owned by the whole human race and can be developed by all of humanity for its own good. And open-source software is the closest approximation that we have to that now. So our aspiration is to grow OpenCog into something like the Linux of AGI, and to have people all around the world developing it to serve their own local needs, putting their own values and understanding into it as it becomes more and more intelligent.<\/p>\n<p>Certainly this doesn't give us any guarantee. We can observe that Linux has fewer bugs than Windows or OS X, and it's open source; more eyeballs on something can sometimes make it more reliable. But there's no solid guarantee that making an AGI open source will make the singularity come out well. Still, my gut feeling is that there are enough hard problems in creating a superhuman AI and having it respect human values and maintain a relationship of empathy with people as it grows. 
There are enough problems there without the young AGI getting wrapped up in competition of country versus country, company versus company, and internal politics within companies or militaries. I feel we don't want to add these problems of human and primate social-status competition dynamics to the challenges already faced in AGI development.<\/p>\n<\/p>\n<p><!-- Auto Generated --><\/p>\n<p>Read more from the original source:<br \/>\n<a target=\"_blank\" href=\"http:\/\/bigthink.com\/videos\/ben-goertzel-will-superhuman-agi-kill-us\" title=\"It's Already Too Late to Stop the Singularity - Big Think\">It's Already Too Late to Stop the Singularity - Big Think<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p> Ben Goertzel: Some people are gravely worried about the uncertainty and the negative potential associated with transhuman, superhuman AGI.  <a href=\"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/transhuman-news-blog\/transhuman\/its-already-too-late-to-stop-the-singularity-big-think\/\">Continue reading <span 
class=\"meta-nav\">&rarr;<\/span><\/a><\/p>\n","protected":false},"author":4,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[14],"tags":[],"class_list":["post-176848","post","type-post","status-publish","format-standard","hentry","category-transhuman"],"_links":{"self":[{"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/posts\/176848"}],"collection":[{"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/users\/4"}],"replies":[{"embeddable":true,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/comments?post=176848"}],"version-history":[{"count":0,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/posts\/176848\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/media?parent=176848"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/categories?post=176848"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/tags?post=176848"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}