We just published an interview: Christian Ruhl on why we’re entering a new nuclear age — and how to reduce the risks. You can click through for the audio, a full transcript, and related links. Below are the episode summary and some key excerpts.
Episode summary
We really, really want to make sure that nuclear war never breaks out. But we also know — from all of the examples of the Cold War, all these close calls — that it very well could, as long as there are nuclear weapons in the world. So if it does, we want to have some ways of preventing that from turning into a civilisation-threatening, cataclysmic kind of war.
And those kinds of interventions — war limitation, intrawar escalation management, civil defence — those are kind of the seatbelts and airbags of the nuclear world. So to borrow a phrase from one of my colleagues, right-of-boom is a class of interventions for when “shit hits the fan.”
- Christian Ruhl
In this episode of 80k After Hours, Luisa Rodriguez and Christian Ruhl discuss underrated best bets to avert civilisational collapse from global catastrophic risks — things like great power war, frontier military technologies, and nuclear winter.
They cover:
- How the geopolitical situation has changed in recent years into a “three-body problem” between the US, Russia, and China.
- How adding AI-enabled technologies into the mix makes things even more unstable and unpredictable.
- Why Christian recommends many philanthropists focus on “right-of-boom” interventions — those that mitigate the damage after a catastrophe — over traditional preventative measures.
- Concrete things policymakers should be considering to reduce the devastating effects of unthinkable tragedies.
- And on a more personal note, Christian’s experience of having a stutter.
Who this episode is for:
- People interested in the most cost-effective ways to reduce the risks from nuclear war, such as:
  - De-escalating after accidental nuclear use.
  - Civil defence and war termination.
  - Mitigating nuclear winter.
Who this episode isn’t for:
- People interested in the least cost-effective ways to reduce the risks from nuclear war, such as:
  - Coating every nuclear weapon on Earth in solid gold so they’re no longer functional.
  - Creating a TV show called The Real Housewives of Nuclear Winter about the personal and professional lives of women in Beverly Hills after a nuclear holocaust.
  - A multibillion-dollar programme to invent a laser beam that could write permanent messages on the Moon, and using it just once to spell out #nonukesnovember.
Producer: Keiran Harris
Audio Engineering Lead: Ben Cordell
Technical editing: Ben Cordell and Milo McGuire
Content editing: Katy Moore, Luisa Rodriguez, and Keiran Harris
Transcriptions: Katy Moore
“Gershwin – Rhapsody in Blue, original 1924 version” by Jason Weinberger is licensed under Creative Commons
Highlights
The three-body problem
Christian Ruhl: For much of the Cold War, the US and the Soviet Union were the two nuclear superpowers. Other states eventually did acquire nuclear weapons, but in terms of arsenals, those two just towered over all of them. We’re talking orders of magnitude bigger. And that had been the case for a long time, this kind of bipolar order.
After the Cold War, people in many cases kind of stopped paying attention to this altogether. And what’s happened in the last couple of years is that China seems poised to expand its own arsenal. So in 2020, their number of warheads, best estimate, was in the low 200s — 220 or so. Last year, that was up to 400-something. And now we’re talking 500, and the projections suggest it could be as high as 1,000 by 2030 and 1,500 by 2035 — so really this massive increase.
Luisa Rodriguez: Wow. Yeah. In thinking through the significance of this: when I was learning about nuclear war and nuclear weapons a few years ago, I remember kind of concluding for myself that nuclear war between the US and Russia seemed most terrifying, because they had so many warheads between them that you could get this terrible, scary thing called nuclear winter — which theoretically seems likely to happen only when you have thousands of nuclear warheads detonated. So one thing that just sticks out to me immediately is like, agh, there’s another global power that might eventually, potentially have enough warheads to create this kind of catastrophic outcome.
Are there other significant things about this, besides just that nuclear wars — at least the ones now involving China — could be much worse? For example, things about the game theoretic dynamics of how all of these countries relate to each other?
Christian Ruhl: Yeah, that’s exactly right. Not all nuclear wars would be the same, and the very biggest wars would be by far the worst. So I think what you’ve written in the past is exactly right on the issue.
But yeah, I think there are also some structural changes that happen. So negotiations just become more complex when you have three parties rather than two, and there are issues with targeting when you’re potentially facing two adversaries.
I think it’s helpful to think about this with an analogy. Let’s say you’re an outlaw, and a fellow outlaw has challenged you to a duel. And you’re outside, and the tumbleweed is rolling, and the vultures are flying overhead — it’s a standoff, and we’ve been in that standoff for a while. And suddenly a third person joins, and you don’t know what to do. Are they going to point their gun at you? And that totally changes the structure of the game.
How AI could change the pace of war
Christian Ruhl: AI-enabled warfare changes the risk of nuclear war. You can even just have AI integrated in conventional warfare — where, let’s say, you automate more and more of the process, where AI-enabled weapon systems are just faster. You have AI-enabled ships, support systems, and many things start happening at machine speed, and shaving off some time here and there. The cumulative result of that might look like speeding up the pace of war. In China, they sometimes call this idea “battlefield singularity.” And in the West, it’s sometimes called “hyperwar.”
I think it’s kind of a subset of the broader point that AI is this general purpose technology. So we might expect it to transform a whole lot of how our world works, and this is a subset of that.
To maybe make this more concrete, there’s this article from Michael Horowitz called “When speed kills,” and that gets at some of the intuitions. One of those could be that, with this increasing speed of war, you’re compressing a two-week crisis into two hours, right?
So imagine the Cuban Missile Crisis playing out much, much faster. So that might not leave time for people like Vasily Arkhipov, this Soviet naval officer who famously helped prevent a nuclear torpedo launch during the crisis. When things happen so fast, there just might not be time for that.
Interventions to the “right of boom”
Christian Ruhl: The logic here is we should have a layered defence against catastrophic risks. There’s this great article, I think from 2020, called “Defence in depth against human extinction.” So imagine you live in a world, again, with cars — but no seatbelts, airbags, or any other safety features. That’s the world we live in right now when it comes to nuclear war. And fundamentally, that’s why I think we should be dedicating more resources to right-of-boom interventions.
So that’s the general idea, but it’s also, as you suggest, a subtler argument about philanthropic strategy here, and about making allocations in philanthropy under high uncertainty. So fundamentally, this is about taking not just one step back but like 10 steps back, and thinking about the structure of the problem at a really high level, to kind of figure out the most effective ways to do good at the margins.
So we know a few things about nuclear war. First of all, not all nuclear wars are created equal. There’s a qualitative difference between a single weapon going off, and the superpowers unleashing their full arsenals. One of those is, as you said, a truly horrific humanitarian disaster, but it’s mostly local. And the other one is this unprecedented global cataclysm that might well threaten modern civilisation itself.
So Herman Kahn, the Cold War strategist, has this phrase from his book On Thermonuclear War, in which he points to “tragic but distinguishable postwar states.” What he’s saying is the largest nuclear wars are disproportionately worse than smaller nuclear wars, which means that much of the total expected cost there lies with those largest wars. It’s a familiar feature in catastrophic risk; I think we see something very similar when looking at pandemics and biosecurity.
That’s for a few reasons. One of those is, as you pointed out, nuclear winter potentially kicking in. And now it turns out that these very interventions that, as we just established, might be the most important ones for keeping a limited nuclear war from turning into the largest possible nuclear war, also happen to be the interventions that are very neglected. And from a philanthropist’s point of view, that’s a philanthropic jackpot. That’s exactly what you want.
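To make the expected-value logic above concrete, here is a toy sketch. Every probability and fatality figure below is invented purely for illustration — these are assumptions for the example, not estimates from the episode or from Christian’s research.

```python
# Toy illustration of why the largest wars can dominate expected harm:
# even if a full-scale exchange is far less likely than a limited one,
# its disproportionate severity can account for most expected fatalities.
# All numbers below are hypothetical.

scenarios = {
    # name: (assumed annual probability, assumed fatalities)
    "single detonation": (1e-2, 1e5),
    "limited regional exchange": (3e-3, 5e7),
    "full-scale exchange with nuclear winter": (5e-4, 5e9),
}

# Expected fatalities per scenario, and each scenario's share of the total.
total = sum(p * deaths for p, deaths in scenarios.values())
for name, (p, deaths) in scenarios.items():
    share = p * deaths / total
    print(f"{name}: {share:.1%} of expected fatalities")
```

On these made-up numbers, the full-scale scenario is 20 times less likely than a single detonation, yet it accounts for roughly 94% of the expected fatalities — Kahn’s “disproportionately worse” point in quantitative form.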
Nuclear hotlines
Christian Ruhl: So back to the car analogy. Let’s say your car suddenly breaks down on the highway: you can honk your horn, you can turn on your warning lights to signal to other drivers you had an accident, you can call somebody to help take care of your car.
So in nuclear crises, that might include making sure that leader-to-leader hotlines exist, and that they’re actually used as intended if, god forbid, something goes wrong and a nuclear weapon accidentally goes off. I have a report on this called Call Me, Maybe? that goes into this a bit more, but essentially the US and Russia have this long history of working together to reduce nuclear risks, and that includes the hotline that was established after the Cuban Missile Crisis.
So if, god forbid, something goes wrong, you can contact the other state and say, “Hey, sorry, please don’t nuke us back.” This actually isn’t a literal phone; it’s called the DCL — the direct communications link. It used to be teletype via cable. Now it’s basically email via satellites.
One big concrete problem here is that China is very bad at answering during crisis situations. There’s one example here where Chinese leaders didn’t respond to repeated US contact attempts during the Hainan Island incident in 2001. So in this incident, Chinese fighter jets got too close to a US spy plane doing routine operations, and the spy plane had to make an emergency landing on Hainan Island. And the US plane contained highly classified technology, and the crew sort of tried to destroy as much of it as possible before being captured. If you read through some of the reports, apparently they were pouring coffee on it at one point.
Throughout the incident, US leaders tried to reach Chinese leaders via the hotline, but the Chinese didn’t answer. So the deputy secretary of state at the time remarked, “It seems to be the case that when very difficult issues arise, it’s sometimes hard to get the Chinese to answer the phone.”
So this was 2001. Scary enough back then. I think with heightened tensions now over Taiwan and over the South China Sea, we can imagine what might happen. A Biden administration official recently said that hotlines that have been set up have just “rung kind of endlessly in empty rooms in China.” So here we have a concrete problem, a concrete funding opportunity that people actually haven’t looked into much: basically to fund a study to understand Chinese attitudes towards these systems, fund track two diplomatic dialogues, see if they can find common ground on, “Hey, maybe pick up the phone?”
I loved this episode as it clearly laid out the challenges with nuclear weapons and looked at possible interventions. I am a bit curious why it was "demoted" to After Hours — it felt perhaps more relevant than some recent "main show" episodes on evolutionary X (evolutionary history, evolutionary psychology, etc.). Or maybe you are trying to draw in a wider audience by covering a wider array of topics, including topics starting to fall outside of priority causes.
Thanks for sharing!
I think this overstates the case for focussing on large nuclear wars (relatedly), because these may be significantly less likely. To illustrate:
Nuclear wars arguably scale much faster than conventional ones, but it is not obvious to me whether large nuclear wars are being unfairly neglected given the consideration above.