It may seem strange, during the worst global crisis since the Second World War, to spend time thinking about even more destructive catastrophes. But that’s exactly what I’ve been doing, reading Toby Ord’s new book The Precipice. Ord writes about so-called “existential risks” for humanity, which include not just outright extinction but also the permanent collapse of civilization, or an irrevocable dystopia such as a world-spanning totalitarian regime. What these scenarios have in common is the permanent destruction of humanity’s potential for flourishing.
Now, coming from anyone else, I’d probably brush off such talk about “human extinction” as sensationalist doomsday prophesying. I probably wouldn’t even buy the book. So why did I? Not just because Ord is a philosophical colleague who teaches at Oxford University, but mainly because he’s one of the founding figures of the Effective Altruism movement, which I hold in very high regard. In my experience, few communities have higher epistemic standards than Effective Altruism.
And indeed, Ord’s book didn’t disappoint. While it’s highly readable even for a lay audience, it’s meticulously well-researched (half of the book consists of long discursive footnotes with technical details and references). And it changed my thinking on a number of issues.
First, it’s important to note that Ord is not a pessimist. He does not claim that existential catastrophes WILL occur, or even that they are likely to occur. Indeed, like other Effective Altruists, he believes that humanity has achieved tremendous progress over the past 200 years, and that a much better world is possible, if we put our minds to it. Humanity can live on for millions or even billions of years, achieving much higher levels of flourishing than today, provided we don’t screw up somewhere along the way.
Precisely because we have already achieved so much, conquering poverty, hunger and disease for the first time in history, there is so much at stake. Science and technology have brought incredible progress, but they have also given us unprecedented powers of destruction. Effective Altruists have mostly focused on solving current world problems (what’s the best way to alleviate extreme poverty and eradicate disease?), but you can apply the same rational principles to the future: What’s the best way to safeguard humanity’s potential for flourishing? What are the biggest risks facing humanity that could permanently destroy that potential?
So what sort of catastrophes is Ord talking about? Many people are worried about the future of our climate, and with good reason. But though there is a long chapter on extreme climate change, Ord is at least as concerned about pandemics, especially ones caused by lab-created viruses. (He wrote all this before the current coronavirus pandemic.) For an existential catastrophe in the next century caused by a man-made pandemic, Ord assigns a probability of no less than 1/30, much higher than his estimate for extinction through runaway climate change (1/1000). Other major risks covered in the book include supervolcanoes, asteroids, nuclear war, and AI running amok.
I’m still inclined to think that Ord overestimates the risks. Most of his probabilities are not derived from calculations, but are personal best estimates reflecting his knowledge of the field. But I don’t know for sure. And uncertainty is not reassuring in this domain. As my colleague Simon Friederich and I write in a forthcoming piece: “These scenarios have one thing in common: the chances that they will ever occur may be (very) small, but it’s still prudent to seek protection against them, given the high levels of uncertainty and how much is at stake.”
Anyway, The Precipice is one of the most thought-provoking and well-researched books I’ve read in a while. Recommended to my fellow philosophers as well as to ordinary mortals!