Artificial Escalation
Our fictional film depicts a world where artificial intelligence ('AI') is integrated into nuclear command, control and communications systems ('NC3'), with terrifying results. When disaster strikes, American, Chinese and Taiwanese military commanders quickly discover that with their new operating system in place, everything has sped up. They have little time to work out what is going on, and even less time to prevent the situation from escalating into a major catastrophe.
See the related Op-Ed in the Bulletin of the Atomic Scientists.
What is the danger of AI in NC3?
The safety of our world is already at risk from accidental or intentional nuclear war. AI integration into the critical functions of NC3 systems could further destabilize this delicate dynamic, with calamitous consequences.
Here are just a few of the reasons why, each of which is depicted in the film:
- The Nature of AI. AI can be unpredictable, unreliable and vulnerable to cyberattacks – hardly ideal qualities for the systems controlling the world's most dangerous weapons. With no real nuclear war scenarios to train on, these systems would learn primarily from simulations, meaning they may respond erratically to unexpected events. Nuclear escalations are unlikely to unfold by the book, and AI systems often react (or fail) in ways quite different from humans.
- Losing Control at Breakneck Speed. AI can accelerate the pace of warfare, leaving less time for understanding, communication and clear-headed decision-making. With only moments to think, commanders are more likely to trust computer readouts or judgments, and less likely to interrogate or reject them.
- Geopolitical Instability. A world of arms races and nuclear tensions often prioritizes speed over safety, conflict over diplomacy, action over understanding. At such times, novel technology could be adopted before it has been properly tested.
Now is the time for countries to draw clear lines prohibiting certain uses of AI in NC3, develop robust mitigation measures, and identify stabilizing policies that ensure humans always maintain control over nuclear weapons.
What can you do about this?
- Learn more about the Risks of AI in NC3 and potential policy solutions.
- Learn more about FLI's work to Mitigate the risks of AI integration in nuclear launch.
- Spread the word by sharing the film and this webpage.
How Bad Can It Be?
Along with explosions, radioactive fallout and electromagnetic pulses, a nuclear war could send black smoke into the atmosphere, blocking sunlight across the Northern Hemisphere and destroying agriculture for several years. This is called nuclear winter, and it could kill 2 in 3 people on Earth. Watch the video below to see what this would look like.
Further resources
If you would like to learn more about these scenarios, we recommend the following papers:
Xia L, Robock A, Scherrer K, et al. Global food insecurity and famine from reduced crop, marine fishery and livestock production due to climate disruption from nuclear war soot injection. Nature Food. August 2022.
Boulanin V, Saalman L, Topychkanov P, Su F, Carlsson MP. Artificial Intelligence, Strategic Stability and Nuclear Risk. Stockholm International Peace Research Institute. 2020.
Hruby J, Miller MN. Assessing and Managing the Benefits and Risks of Artificial Intelligence in Nuclear-Weapon Systems. Nuclear Threat Initiative. 2021.
Wehsener A, Walker L, Beck R, Philips L, Leader A. Forecasting the AI and Nuclear Landscape. Institute for Security and Technology. September 2022.
Bajema N, Gower J. A Handbook for Nuclear Decision-making and Risk Reduction in an Era of Technological Complexity. Council on Strategic Risks. December 2022.