Turchin: SETI at risk of downloading a trojan horse

Russian physicist Alexey Turchin contends that passive SETI may be just as dangerous as active SETI, if not more so.

I was fortunate enough to talk to Turchin at the Humanity+ Summit at Harvard earlier this year, where he clarified his argument to me.

Turchin worries that humanity may be tricked by an out-of-control script propagating throughout the Galaxy. This script, which uses pre-Singularity civilizations as its vectors, fools its hosts with a lure of some kind (e.g. immortality, or access to a Galactic Internet); the hosts then unwittingly build a device that produces a malign extraterrestrial artificial intelligence (ETAI). This ETAI takes over all the resources of the planet so that it can re-broadcast itself into the cosmos in search of its next victim.

This concept is similar to Carl Sagan's interstellar transportation machine in Contact, except that it would work to destroy our civilization rather than advance it.

It's worth noting that this ETAI and its script may be a mutation of some sort, with no civilization actually responsible for designing the damn thing. It may simply be a successful replicative schema following Darwinian principles.

It's also worth noting that I warned of this back in 2004.

Moving forward, Turchin suggests we raise awareness of the potential problem, change the guidelines for SETI research, and consider prohibiting SETI until we have developed our own AI.

Turchin's idea sounds ludicrous, but it's one of those crazy notions that provokes a nervous laugh. I think it needs to be discussed, as there may be some merit to it. We need to be careful.

