How ‘Longtermism’ Is Helping the Tech Elite Justify Ruining the World

Posted: September 6, 2022 at 4:17 am

“It’s as if they want to build a car that goes fast enough to escape from its own exhaust.”

These words were the takeaway from a meeting that Douglas Rushkoff, who describes himself as a Marxist media theorist, had with five extremely powerful people in tech who were looking to survive the impending climate catastrophe. Rushkoff’s account of this in “Survival of the Richest” reveals something sinister: high-ranking elites in tech genuinely know what’s coming, but rather than stopping it, they’re planning on saving themselves from it, in the form of underground luxury bunkers and armed guards made up of Navy SEALs.

The people who are almost directly responsible for the world’s biggest problems today (the climate crisis, the erosion of democratic institutions, and the sale of people’s attention for profit) don’t consider themselves accountable. Not only that: accountability, or even fixing today’s problems, isn’t a desirable goal for them. At least not according to longtermism, the philosophy undergirding much of tech’s trajectory and, if we’re not careful, our own destruction as a species.

It’s an idea that, with its forward-looking scope, seems ambitious and futuristic at first glance. But it’s one that believes in a future only for a few people, self-appointed representatives of humanity, at the cost of all the rest. And when billionaires begin to shack up underground or shoot off into space in a bid to colonize other planets, they’re not doing it for humanity as a whole; they’re doing it for a humanity that consists exclusively of their own ilk.

Stephen Hawking famously declared just a few years ago that “we are at the most dangerous moment in the development of humanity.” Even if we’re hardly doing anything about it, most people agree in some form that things are looking bleak, and we’re already seeing the effects of climate change in countries that have done little to cause it. But if longtermism has its way, this is nothing more than a blip in humanity’s record. What makes the philosophy so dangerous is its ethical foundation, summed up by one of its early theoreticians, Nick Bostrom: a “non-existential disaster causing the breakdown of global civilisation is, from the perspective of humanity as a whole, a potentially recoverable setback.”


Bostrom’s work is heartily endorsed by tech giants with the resources and capacity to not only outrun any of the world’s current crises, but also irreversibly influence the direction of our species as a whole. There are two key concepts in Bostrom’s argument: potential, and existential risk. Potential is what longtermists understand to be humanity’s capacity on a cosmic scale, a trillion years into the future. Our potential is as vast as the universe itself. An existential risk, according to the longtermist ethic, is one that threatens to wipe out humanity, and with it, humanity’s potential. This is the most tragic outcome, one that has to be avoided at all costs. Now, it’s possible that only a few people, say 15% of the world’s population, survive climate change. That doesn’t wipe out our potential, even if it wipes out an unfathomable number of people, and so, according to longtermism, it isn’t an existential risk.

“The case for longtermism rests on the simple idea that future people matter… Just as we should care about the lives of people who are distant from us in space, we should care about people who are distant from us in time,” wrote William MacAskill, the public face of longtermism. His book was endorsed by Elon Musk, who cited MacAskill’s philosophy as a close match for his own. Musk also happens to be one of the biggest players in the privatized space race, and his vision of colonizing Mars is increasingly no longer a semi-ironic joke.

Longtermism’s roots lie in a philosophy called effective altruism. “Effective altruism, which used to be a loose, Internet-enabled affiliation of the like-minded, is now a broadly influential faction, especially in Silicon Valley, and controls philanthropic resources on the order of thirty billion dollars,” notes a profile of MacAskill in The New Yorker.

There’s a web of influential figures writing the script of longtermism from various think tanks; together, they comprise an enterprise that’s worth more than 40 billion dollars. Among others, some advocate for “sex redistribution,” while others say that saving lives in rich countries is more important than saving lives in poor countries, as philosopher Émile P. Torres reported in Salon. Longtermism’s utopia is a future where human beings are engineered to perfection, leading to the creation of posthumans who possess only the best and most superior traits, with no flaws at all. This is an idea rooted in eugenics, and it fuels the most civilizationally cynical ideas of who gets to be considered superior, and who qualifies as inferior enough to be flushed out of our collective gene pool. What holds all of these ideas together is the benign-sounding idea of longtermism, and it’s even creeping into the United Nations. “The foreign policy community in general and the United Nations in particular are beginning to embrace longtermism,” noted one UN Dispatch report.

But if it wasn’t already clear why the ideas themselves are dangerous, the people formulating them make it clear whose interests are at stake, and whose aren’t. “[C]ontributors to fast-growing fields like the study of existential risk or global catastrophic risk are overwhelmingly white… Bostrom idealizes a future in which the continued evolution of (post)humanity culminates in a form of technological maturity that adheres to mainstream norms of white maleness: deeply disembodied, unattached to place, and dominant over, or independent from, nature,” note scholars Audra Mitchell and Aadita Chaudhury, who work in the areas of ethics, ecology, science, and technology.


Tech overlords figuring out ways to survive what they know to be coming, and euphemistically refer to as “the event,” isn’t just a short-sighted way out of the mess they themselves are complicit in. It’s all part of the long game, perhaps the longest one we’ve ever envisioned.

Nick Bostrom enjoys considerable ideological heft. As the director of Oxford’s Future of Humanity Institute (FHI), he is one among a growing group of philosophers who have their sights set on our future, in terms of how much more we can think, accomplish, build, and discover on a scale previously considered unimaginable. “Anders Sandberg, a research fellow at the Future of Humanity Institute, told me that humans might be able to colonise a third of the now-visible universe, before dark energy pushes the rest out of reach. That would give us access to 100 billion galaxies, a mind-bending quantity of matter and energy to play with,” wrote Ross Andersen, who investigated the philosophies actively shaping the future of our civilization.

Tech is key to achieving this kind of potential, which is why people in tech are so heavily invested (and investing) in the idea. At the heart of the ethical deliberations is a cost-benefit analysis: how much is it okay to lose for the sake of ensuring the potential of future people? Maybe even post-people? It’s the “greater good” dilemma that has already been used to justify devastating wars and policy decisions: “Now imagine what might be justified if the ‘greater good’ isn’t national security but the cosmic potential of Earth-originating intelligent life over the coming trillions of years,” Torres asks.

“[T]he crucial fact that longtermists miss is that technology is far more likely to cause our extinction before this distant future event than to save us from it,” they add.

But the lack of transparency, the inordinate resources, and the power concentrated among the overwhelmingly cis, white, techno-optimistic men who hold the world’s future in their hands all stand in the way of recognizing this crucial fact.
