Transhumanism Evolved


Introduction to transtopianism

[Image: Fractal Inferno]


"Biology is not destiny. It was never more than tendency. It was just nature's first quick and dirty way to compute with meat. Chips are destiny."

-- Bart Kosko, Ph.D., Heaven in a Chip

"I admit, it is difficult to think, encased in this rotting piece of meat. The stink of it filling every breath, a suffocating cloud you can't escape. Disgusting! Look at how pathetically fragile it is. Nothing this weak is meant to survive."

-- Agent Smith, The Matrix Revolutions (Matrix III)



For centuries people have feared Hell, the place where "sinful" souls suffer eternal, unimaginable torment. Gradually, as the light of reason began to dawn upon the world, or at least certain parts of it, the concept of Hell began to lose its power. It became a somewhat politically incorrect relic of a bygone dark age, not to be taken too literally or seriously. Nowadays, few people in the civilized world believe that Hell exists, although, oddly enough, many still claim to believe in God. Well, they're wrong. While God, Heaven, or angels may be little more than childish fantasies, Hell most certainly does exist. It is right here, right now. Hell is all around us. We live it, breathe it; it's our reality. A reality ruled by entropy, where suffering and death are the sickening defaults for every living thing. The question isn't whether you will suffer and die, but rather how much, and when. Some may be luckier than others, but in the end no flesh is spared. The shadow of death and misery hangs over us like Damocles' sword, and poisons all pleasures. Ours is a brutal natural order where organisms have to kill each other for food, the struggle for status and resources is constant, and disease, violence, disasters, and accidents perpetually vie with steady degeneration in their bid to inflict suffering and death upon the living. In a world such as this, it can hardly be deemed surprising that many seek blissful oblivion via drugs, sex, religious psychosis...or suicide.

After aeons of pain, fear, despair, frustration, and hopeless drudgery, of countless lost generations, there is, however, finally some genuine hope. Ex machina libertas; technology will set you free. Science and technology have, ever since man learned to control fire and began to make simple tools, steadily improved the quality of life. It is they who gave birth to civilization, man's ongoing rebellion against a harsh, chaotic, uncaring universe. Now, in what is commonly known as the 21st century, this technology-driven rebellion is about to reach its zenith. Soon, our species will have the ability not only to liberate itself from Hell, but to create Heaven on Earth; to finally fulfill its destiny among the stars and "become as gods".

"Within thirty years, we will have the technological means to create superhuman intelligence. Shortly after, the human era will be ended."

-- Vernor Vinge at the 1993 NASA VISION-21 Symposium

This event, the relatively sudden emergence of superintelligence (SI), is often referred to as the (technological) Singularity in Transhumanist circles. The longer definition is:

"SINGULARITY: the postulated point or short period in our future when our self-guided evolutionary development accelerates enormously (powered by nanotech, neuroscience, AI, and perhaps uploading) so that nothing beyond that time can reliably be conceived." [Vernor Vinge, 1986]

-- Lextropicon

Assuming, of course, that we actually survive the next 15-40 years and reach the Singularity, which may be assuming too much. There is no shortage of existential risks, after all, with runaway or military nanotech, genetically engineered or synthetic bacteria and viruses, and good old nuclear warfare being among the most likely candidates for near-term human extinction. Even if superhuman intelligence wins the race, survival is by no means guaranteed for those who don't participate or fall behind in this burst of self-directed hyperevolution. Technology, like most things, is a double-edged sword, and will give us or our creations not just the means to improve life immeasurably --to banish aging, disease, and suffering forever-- but also the means to extinguish it on an unprecedented scale, practically unopposed. Now I am become Death, the destroyer of worlds...

[Image: Deepblue]

Directed efforts to stop some of the more dangerous technologies which might cause a "malevolent" Singularity (or global destruction in general) can only slow the process down somewhat, not stop it entirely. Unless, of course, one is prepared to sacrifice science, technology, and civilization itself. This, however, would be a certain death sentence for every individual and, eventually, the species itself, while the alternative offers at least some hope, even if the odds are stacked against us. Rather than trying to enforce an essentially immoral and ultimately doomed program of relinquishment, as suggested by Bill Joy et al., we should try to develop the most empowering, mind & body-enhancing technologies as soon as possible. Above all, we need to become smarter and more rational than we currently are in order to deal intelligently with the complex challenges ahead.

"Evolution says organisms are replaced by species of superior adaptability. When our robots are tired of taking orders, they may, if we're lucky, show more compassion to us than we've shown the species we pushed into oblivion. Perhaps they will put us into zoos, throw peanuts at us and make us dance inside our cages."

-- Dr. Michio Kaku in "The Future of Technology"
(Time magazine cover story, June 2000)

Vae victis! The Posthuman future may be glorious, filled with wonders far beyond our current comprehension, but what good is that to a person if he can't be part of it? If, for example, AIs become superintelligent before humans do, this will reduce us to second-rate beings that are almost completely at the mercy of this new "master race". During our dominion over the Earth we have wiped out countless animal species, brought others (including our "cousins", the apes) to the brink of extinction, used them for scientific experiments, put them in cages for our enjoyment, and so on. This is the privilege of (near-absolute) power. If we lose the top spot to our own creations, we will find ourselves in the same precarious position that animals are in now. While it may be more or less impossible to predict what exactly an "alien" Superintelligence would do to or with lower life forms such as humans, the mere fact that we'd be completely at its mercy should be reason enough for concern.

Needless to say, from a personal perspective it doesn't matter much who or what exactly will become superintelligent (AIs, genetically engineered humans, cyborgs) -- in each case you'd be faced with an unpredictable, vastly superior being. A god, in effect. Because one's personality would almost certainly change, perhaps even completely beyond recognition, once the augmentation process starts, it doesn't even really matter whether the person was "good" or "bad" to begin with; the result would be "unknowable" anyway. Many (most? all??) of our current emotions and attitudes, the legacy of our evolutionary past, could easily become as antiquated as our biological bodies in the Posthuman world. Altruism may be useful in an evolutionary context where weak, imperfect beings have to rely on cooperation to survive, but to a solitary god-like SI it would just be a dangerous handicap. What would it gain by letting others ascend? Most likely nothing. What could it lose? Possibly everything. Consequently, if its concept of logic even remotely resembles ours, it probably won't let others become its peers. And even if it's completely, utterly alien, it could still harm or kill us for other (apparently incomprehensible) reasons, or even more or less accidentally, as a side-effect of its ascension, for example. How many insects does the average human crush or otherwise kill during his lifetime? Many thousands, no doubt, and often without even knowing about it. Usually it's not malice or anything of the sort, merely utter indifference. The insects simply aren't important enough to care about -- unless they get in the way, that is, in which case they're bound to be exterminated with some chemical weapon of mass destruction. They're non-entities to be ignored and casually stepped on at best, annoying pests to be eradicated at worst. Such are the eternal laws of power. So what's the moral of the story? Make sure that you'll be one of the first Posthumans, obviously, but more on that later.

Sometimes I see
How the brave new world arrives
And I see how it thrives
In the ashes of our lives
Oh yes, man is a fool
And he thinks he'll be okay
Dragging on, feet of clay
Never knowing he's astray
Keeps on going anyway...

-- ABBA, "Happy New Year"

True, the future doesn't necessarily have to be bad for the less-than-superintelligent -- the SIs could be "eternal" philanthropists for all we know, altruism might turn out to be the most logically stable "Objective Morality", they could be our obedient, genie-like servants, or they might simply choose to ignore us altogether and fly off into space -- but depending on such positive scenarios in the face of unknowability is dangerously naive wishful thinking. Yet, though lip service is occasionally paid to the dangers of the Singularity and powerful new technologies in general, there is no known coordinated effort within the Transhumanist community to actively prepare for the coming changes. This has to do with the generally (too) optimistic, idealistic, and technophilic attitude of many Transhumanists, and perhaps with a desire to make and keep the philosophy socially acceptable and [thus] easier to proliferate. Visions of a harsh, devouring technocalypse, no matter how realistic, are usually dismissed as too "pessimistic". Of course lethargy, defeatism, strife, and conservative thinking also contribute to the lack of focus and momentum in Transhumanism, but the main problem seems to be that "we" aren't taking our own ideas seriously enough and fail to fully grasp the implications of things like nanotech, AI, and the Singularity. It's all talk and no action.

[Image: Emerald City]

Enter transtopianism. This philosophy follows the general outlines of Transhumanism in that it advocates overcoming our biological and social limits by means of reason, science, and technology, but there are also some important differences. Principally, these are: 1) a much heavier emphasis on the Singularity, 2) the explicit inclusion of various elements -philosophical, political, artistic, economic & otherwise- which are "optional" (or nonexistent) in general Transhumanism, and 3) the intention to become a movement with a clearly defined organizational structure instead of just a (very) loose collection of more or less like-minded individuals (which is what Transhumanism, and to a somewhat lesser extent Extropianism, currently are). Essentially, transtopianism is an attempt to realize Transhumanism's full potential as a practical way to (significantly) improve one's life in the present, and to survive radical future changes. Unlike regular Transhumanism or even Extropianism, this is a "holistic" philosophy; a complete worldview for those who seek "perfection" in all fields of human endeavor. It is also a strongly dualistic philosophy, motivated by equal amounts of optimism and pessimism, instead of blind (or at least weak-sighted) technophilia.

    Note: A complete overview of transtopian goals, means & values can be found in the Principles section. The Enlightenment Test is an interactive summary of the same, and the Singularity Club goes into more detail regarding the organizational and (other) practical aspects of the transtopian movement. For "hardcore" socio-politics, see the PC-free zone.

transtopianism's main message is as simple as it is radical: assuming that we don't destroy ourselves first, technological progress will profoundly impact society in the (relatively) near future, culminating in the emergence of superintelligence and [thus] the Singularity. Those who acquire a dominant position during this event, quite possibly a classical Darwinian struggle (survival of the fittest), will undoubtedly reap enormous benefits; they will become "persons of unprecedented physical, intellectual, and psychological capacity. Self-programming, self-constituting, potentially immortal, unlimited individuals". Those who, for whatever reason, won't participate or who fall behind will face a very uncertain future, and quite possibly extermination.

"Those who have some means think that the most important thing in the world is love. The poor know that it is money."

-- Gerald Brenan

Wealth and power don't just make the present considerably more palatable; acquiring them is nothing short of a logical imperative for those who are serious about realizing their Transhuman hopes and dreams. It is, after all, nearly always the rich & powerful who get first access to new technologies. Unlike, for example, cars, TV sets, and cell phones, the technologies that will enable people to become or create our evolutionary successors aren't likely to eventually trickle down to the general public. Superintelligence won't be for sale at your local supermarket 20-50 years from now, for much the same reasons why one can't buy nukes in gun shops, even though the basic design is now more than half a century old (indeed, one can't even legally purchase a simple machine gun, let alone more advanced military hardware, in most countries). Bottom line: the really powerful, dangerous technologies are always hoarded and jealously guarded by (self-proclaimed) elite groups, and nothing is more powerful and potentially dangerous than Superintelligence. Even all-out WW3 would be trivial compared to the damage that an SI (with its arsenal of advanced nanoweapons and who knows what else) could inflict. By "ascending" you don't just obtain the ultimate weapon -- you become it. It doesn't seem very likely, or indeed sane, that SIs would freely hand out such power to anyone who asks for it. And even if they wouldn't object to sharing their power in principle, why would they bother? Just because something can be done doesn't automatically mean that it will be done, or "has" to be done. Would we start mass-upgrading ("uplifting") ants or mice if we had the technology? Probably not. Insignificance can be a real killer...

Thus, it logically follows that we should put considerable effort into acquiring a good starting position for the Singularity, which includes things like gathering as much wealth as possible, keeping abreast of the latest technological developments, and implementing them to become more efficient and powerful individuals. Our primary interim subgoal is (must be) becoming part of the economic & technological elite. The Players. Our primary supergoal is (obviously) to become uploads -inorganic, "digital" beings- which will open the door to virtually unlimited further enhancements, and ultimately godhood itself.

"He who sits in the heavens shall laugh."

-- Psalm 2:4

This is a tall order indeed, and to increase one's chances of success, cooperation with like-minded individuals is essential (unless, perhaps, you happen to be some kind of tech-savvy billionaire). Hence the transtopian movement and its Singularity Club, an "elite" mutual aid association for those who want to fully enjoy the present while effectively preparing for the radical future. Our radical future. It is the only known group of its kind. The WTA, Extropy Institute, and the various national/regional Transhumanist groups have a social and meme-spreading function. Useful? Certainly. Adequate? Not at all. They do not specifically aim to improve their members' social, mental, physical, or financial situation (true, it occasionally happens, but the effect is usually very limited, short-lived, and not part of a clear strategy). Their vision of a prosperous, peaceful Trans- & Posthuman society largely depends on the wisdom and cooperation of the Establishment and/or "the masses" -- neither of which has a particularly good track record when it comes to rational behavior.

To each his own. We salute the Transhuman, Extropian, and Singularitarian giants on whose shoulders we stand, but we have seen further, and now it's time to move on. transtopians don't like to depend on the whims of society, governments, fate, or luck for their current and future well-being; they want to take control of their destiny and make their own rules, their own "luck". Of course, the chances of success may be slim, but since there's nothing to lose (we're all on death row anyway) and a universe to gain, why not give it a try? Given the circumstances, it is the rational thing to do. As the saying goes: shoot for the moon; even if you miss, you'll land among the stars (well, maybe it will just be your disassembled molecules that land among the stars, but you get the idea; it never hurts to try). When crossing uncharted territory, it is wiser to travel in groups, and even if the road ultimately leads to a dead end, the journey itself might still be well worth the effort. Are you game?

"I have nothing to offer but blood, toil, tears, and sweat. We have before us an ordeal of the most grievous kind. We have before us many, many months of struggle and suffering. You ask, what is our aim? I can answer in one word. It is victory. Victory at all costs - Victory in spite of all terrors - Victory, however long and hard the road may be, for without victory there is no survival."

-- Sir Winston S. Churchill, May 13, 1940



© Copyright 2003 transtopia


