What’s the worst that could happen? Tackling existential risk – The Interpreter

Posted: December 29, 2021 at 9:58 am

What would happen if you decided to cross the road without checking the traffic? Odds are that you'd survive unscathed. But do it enough times and you're likely to come a cropper.

As a species, humanity is now playing with technological innovations that pose a small but real risk of ending our existence. Tens of thousands of nuclear weapons pointed at major cities. Biotechnology that could allow the creation of deadly pathogens. Computer technology that could create a machine that is smarter than us and doesn't share our goals. And all the while, climate change could lead to unstoppable feedback loops.

As a teenager, I joined Palm Sunday anti-nuclear rallies. As an adult, I've been a strong advocate of climate change action. But when I entered parliament in 2010, the issue of existential risk didn't loom large on my radar. My priority was people's quality of life, not the end of life itself.

I've come to believe that catastrophic risk is a vital issue. In my new book What's the Worst That Could Happen? Existential Risk and Extreme Politics (MIT Press), I quote the estimate of Oxford philosopher Toby Ord that the chance of a species-ending event in the next century is one in six. That basically means that humanity is playing Russian roulette once a century. If we keep it up for another millennium, there's a five in six chance that humans never make it to the year 3000.
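As a rough sketch of the arithmetic behind that last claim (my own working, not from the book, and assuming each century is an independent one-in-six gamble):

```latex
% Ten centuries, each an independent one-in-six gamble.
% P(surviving any one century) = 5/6.
\[
  P(\text{reach the year 3000}) = \left(\frac{5}{6}\right)^{10} \approx 0.16,
  \qquad
  P(\text{extinction first}) = 1 - \left(\frac{5}{6}\right)^{10} \approx 0.84 \approx \frac{5}{6}.
\]
```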

Like a person who crosses the road without checking for traffic, humanity will eventually get hit.

That's tragic for those who perish, and for those who would never get to experience life at all. We've got another billion years or so before the sun makes Earth uninhabitable. That's enough time for another 30 million generations of humans. Not bad for a species that's only been around for about 10,000 generations so far. Far from being the stuff of science fiction, ensuring the safety of the human project should be a vital responsibility for all of us today.

What are the biggest risks? Naturally occurring hazards aren't trivial. They include supervolcanoes such as the one that formed Yellowstone National Park, an asteroid strike of the kind that wiped out the dinosaurs 66 million years ago, and naturally occurring pandemics such as the Black Death. Such dangers are real and merit an appropriate response. In September, NASA's Planetary Defense Coordination Office will carry out an experiment in which it intercepts a nearby asteroid (not one that threatens Earth) and attempts to knock it off course.

But the biggest risks are the ones that our technologies have wrought. Unexpected climate change feedback loops, such as the melting of the Greenland and Antarctic ice sheets, could lead to long-term temperature rises of 6°C or more. Nuclear missiles kept on hair-trigger alert might lead to a miscalculation that ends in a large-scale nuclear conflict. The misuse of genetic technologies could see terrorists produce a bug that spreads as quickly as measles, but is far more deadly. When computers become smarter than humans, we need to ensure that the first superintelligence doesn't regard humanity the way most of us see the world's insects.

Underlying all of this is the rise of populism: the philosophy that politics is a conflict between the pure mass of people and a vile elite. Since 1990, the number of populist leaders holding office worldwide has quintupled. Most are right-wing populists, who demonise intellectuals, immigrants and the international order.

As Covid-19 demonstrated, populists' angry approach to politics, scorn for experts and disdain for institutions made the pandemic much worse. The same goes for other catastrophic risks. Donald Trump's unilateral withdrawal from the Iran nuclear deal and the Paris climate accord made these two catastrophic risks worse. Forging an international agreement on artificial intelligence safety will likely prove impossible if populists run the show.

What can we do about it? For each existential peril, there's a handful of sensible solutions. For example, to reduce the threat of bioterrorism, we should improve the security of DNA synthesis. To tackle climate change, we need to cut carbon emissions and assist developing nations to follow a low-emissions path. To lower the chance of atomic catastrophe, we should take missiles off hair-trigger alert and adopt a universal principle of no first use. To improve the odds that a superintelligent computer will serve humanity's goals, research teams should adopt programming principles that require advanced computers to be observant, humble and altruistic.

Beyond this, everyone who cares passionately about the future of humanity should view populism as a cross-cutting danger and consider how to stem its rise. This means sustaining well-paid jobs in communities hit by technological change. Ensuring that the education system is accessible to everyone, not just the fortunate few. And reforming democracy so that electoral outcomes represent the popular will. Instead of angry populism, the cardinal Stoic virtues of courage, prudence, justice and moderation can guide a more principled politics and ultimately shape a better world.

Andrew Leigh is a member of the Australian parliament and the author of What's the Worst That Could Happen? Existential Risk and Extreme Politics (MIT Press).
