Introduction to transtopianism
The relatively sudden emergence of superintelligence (SI) is often referred to in Transhumanist circles as the (technological) Singularity.
This assumes, of course, that we actually survive the next 15-40 years and reach the Singularity, which may be assuming too much. There is no shortage of existential risks, after all, with runaway or military nanotech, genetically engineered or synthetic bacteria and viruses, and good old nuclear warfare being among the most likely candidates for near-term human extinction. Even if superhuman intelligence wins the race, survival is by no means guaranteed for those who don't participate or fall behind in this burst of self-directed hyperevolution. Technology, like most things, is a double-edged sword, and will give us or our creations not just the means to improve life immeasurably, to banish aging, disease, and suffering forever, but also the means to extinguish it on an unprecedented scale, practically unopposed. Now I am become death, the destroyer of worlds...
Directed efforts to stop some of the more dangerous technologies that might cause a "malevolent" Singularity (or global destruction in general) can only slow the process down somewhat, not stop it entirely. Unless, of course, one is prepared to sacrifice science, technology, and civilization itself. That, however, would be a certain death sentence for every individual and, eventually, for the species itself, while the alternative offers at least some hope, even if the odds are stacked against us. Rather than trying to enforce an essentially immoral and ultimately doomed program of relinquishment, as suggested by Bill Joy et al., we should try to develop the most empowering, mind- and body-enhancing technologies as soon as possible. Above all, we need to become smarter and more rational than we currently are in order to deal intelligently with the complex challenges ahead.
Vae victis! (Woe to the vanquished!) The Posthuman future may be glorious, filled with wonders far beyond our current comprehension, but what good is that to a person if he can't be part of it? If, for example, AIs become superintelligent before humans do, this will reduce us to second-rate beings that are almost completely at the mercy of this new "master race". During our dominion of the Earth we have wiped out countless animal species, brought others (including our "cousins", the apes) to the brink of extinction, used them for scientific experiments, put them in cages for our enjoyment, and so on. This is the privilege of (near-absolute) power. If we lose the top spot to our own creations, we will find ourselves in the same precarious position that animals are in now. While it may be more or less impossible to predict what exactly an "alien" Superintelligence would do to or with lower life forms such as humans, the mere fact that we'd be completely at its mercy should be reason enough for concern.

Needless to say, from a personal perspective it doesn't matter much who or what exactly will become superintelligent (AIs, genetically engineered humans, cyborgs) -- in each case you'd be faced with an unpredictable, vastly superior being. A god, in effect. Because one's personality would almost certainly change, perhaps even completely beyond recognition, once the augmentation process starts, it doesn't even really matter whether the person was "good" or "bad" to begin with; the result would be "unknowable" anyway. Many (most? all?) of our current emotions and attitudes, the legacy of our evolutionary past, could easily become as antiquated as our biological bodies in the Posthuman world. Altruism may be useful in an evolutionary context where weak, imperfect beings have to rely on cooperation to survive, but to a solitary god-like SI it would just be a dangerous handicap. What would it gain by letting others ascend? Most likely nothing. What could it lose? Possibly everything. Consequently, if its concept of logic even remotely resembles ours, it probably won't let others become its peers. And even if it's completely, utterly alien, it could still harm or kill us for other (apparently incomprehensible) reasons, or even more or less accidentally, as a side-effect of its ascension, for example.

How many insects does the average human crush or otherwise kill during his lifetime? Many thousands, no doubt, and often without even knowing about it. Usually it's not malice or anything of the sort, merely utter indifference. The insects simply aren't important enough to care about -- unless they get in the way, that is, in which case they're bound to be exterminated with some chemical weapon of mass destruction. They're non-entities to be ignored and casually stepped on at best, annoying pests to be eradicated at worst. Such are the eternal laws of power. So what's the moral of the story here? Well, make sure that you'll be one of the first Posthumans, obviously, but more on that later.
True, the future doesn't necessarily have to be bad for the less-than-superintelligent -- the SIs could be "eternal" philanthropists for all we know, altruism might turn out to be the most logically stable "Objective Morality", they could be our obedient, genie-like servants, or they might simply choose to ignore us altogether and fly off into space -- but depending on such positive scenarios in the face of unknowability is dangerously naive (wishful thinking). Yet, though lip service is occasionally paid to the dangers of the Singularity and powerful new technologies in general, there is no known coordinated effort within the Transhumanist community to actively prepare for the coming changes. This has to do with the generally (too) optimistic, idealistic, and technophilic attitude of many Transhumanists, and perhaps with a desire to make or keep the philosophy socially acceptable and [thus] easier to proliferate. Visions of a harsh, devouring technocalypse, no matter how realistic, are usually dismissed as being too "pessimistic". Of course lethargy, defeatism, strife, and conservative thinking also contribute to the lack of focus and momentum in Transhumanism, but the main problem seems to be that "we" aren't taking our own ideas seriously enough and fail to fully grasp the implications of things like nanotech, AI, and the Singularity. It's all talk and no action.
Enter transtopianism. This philosophy follows the general outlines of Transhumanism in that it advocates overcoming our biological and social limits by means of reason, science, and technology, but there are also some important differences. Principally, these are: 1) a much heavier emphasis on the Singularity, 2) the explicit inclusion of various elements (philosophical, political, artistic, economic, and otherwise) which are "optional" (or nonexistent) in general Transhumanism, and 3) the intention to become a movement with a clearly defined organizational structure instead of just a (very) loose collection of more or less like-minded individuals (which is what Transhumanism, and to a somewhat lesser extent Extropianism, are). Essentially, transtopianism is an attempt to realize Transhumanism's full potential as a practical way to (significantly) improve one's life in the present, and to survive radical future changes. Unlike regular Transhumanism or even Extropianism, it is a "holistic" philosophy: a complete worldview for those who seek "perfection" in all fields of human endeavor. It is also a strongly dualistic philosophy, motivated by equal amounts of optimism and pessimism, instead of blind (or at least weak-sighted) technophilia.
transtopianism's main message is as simple as it is radical: assuming that we don't destroy ourselves first, technological progress will profoundly impact society in the (relatively) near future, culminating in the emergence of superintelligence and [thus] the Singularity. Those who acquire a dominant position during this event, quite possibly a classical Darwinian struggle (survival of the fittest), will undoubtedly reap enormous benefits; they will become "persons of unprecedented physical, intellectual, and psychological capacity. Self-programming, self-constituting, potentially immortal, unlimited individuals". Those who for whatever reason don't participate or fall behind will face a very uncertain future, and quite possibly extermination.
Wealth and power not only make the present considerably more palatable; acquiring them is nothing short of a logical imperative for those who are serious about realizing their Transhuman hopes and dreams. It is, after all, nearly always the rich & powerful who get first access to new technologies. Unlike, for example, cars, TV sets, and cell phones, the technologies that will enable people to become or create our evolutionary successors aren't likely to eventually trickle down to the general public. Superintelligence won't be for sale at your local supermarket 20-50 years from now, for pretty much the same reasons one can't buy nukes in gun shops even though the basic design is now more than half a century old (indeed, one can't even legally purchase a simple machine gun, let alone more advanced military hardware, in most countries). Bottom line: the really powerful, dangerous technologies are always hoarded and jealously guarded by (self-proclaimed) elite groups, and nothing is more powerful and potentially dangerous than Superintelligence. Even an all-out WW3 would be trivial compared to the damage that an SI (with its arsenal of advanced nanoweapons and who knows what else) could inflict. By "ascending" you don't just obtain the ultimate weapon -- you become it.

It doesn't seem very likely, or indeed sane, that SIs will freely hand out such power to anyone who asks for it. And even if they wouldn't object to sharing their power in principle, why would they bother? Just because something can be done doesn't automatically mean that it will be done, or "has" to be done. Would we start mass-upgrading ("uplifting") ants or mice if we had the technology? Probably not. Insignificance can be a real killer... Thus, it logically follows that we should put considerable effort into acquiring a good starting position for the Singularity, which includes things like gathering as much wealth as possible, keeping abreast of the latest technological developments, and implementing them to become more efficient and powerful individuals. Our primary interim subgoal is (must be) becoming part of the economic & technological elite. The Players. Our primary interim supergoal is (obviously) to become uploads (inorganic, "digital" beings), which will open the door to virtually unlimited additional enhancements, and ultimately godhood itself.
This is a tall order indeed, and to increase one's chances of success, cooperation with like-minded individuals is essential (unless, perhaps, you happen to be some kind of tech-savvy billionaire). Hence the transtopian movement and its Singularity Club, an "elite" mutual aid association for those who want to fully enjoy the present while effectively preparing for the radical future. Our radical future. This is the only known group of its kind in existence. The WTA, the Extropy Institute, and the various national/regional Transhumanist groups have a social and meme-spreading function. Useful? Certainly. Adequate? Not at all. They do not specifically aim to improve their members' social, mental, physical, financial, and other circumstances (true, occasionally it does happen, but the effect is usually very limited, short-lived, and not part of a clear strategy). Their vision of a prosperous, peaceful Trans- & Posthuman society largely depends on the wisdom and cooperation of the Establishment and/or "the masses" -- neither of which has a particularly good track record when it comes to rational behavior. To each his own. We salute the Transhuman, Extropian, and Singularitarian giants on whose shoulders we stand, but we have seen further and now it's time to move on.

transtopians don't like to depend on the whims of society, governments, fate, or luck for their current and future well-being; they want to take control of their destiny, make their own rules, their own "luck". Of course, the chances of success may be slim, but since there's nothing to lose (we're all on death row anyway) and a universe to gain, why not give it a try? Given the circumstances, it is the rational thing to do. As the saying goes: shoot for the moon; even if you miss, you'll land among the stars (well, maybe it will just be your disassembled molecules that land among the stars, but you get the idea; it never hurts to try). When crossing uncharted territory it is wiser to travel in groups, and even if the road ultimately leads to a dead end, the journey itself might still be well worth the effort. Are you game?
© Copyright 2003 transtopia