
FLI August 2022 Newsletter

Published
September 14, 2022
Author
Will Jones

Contents

Nuclear Winter Deservedly Back in the Public Eye

On 6th August, the Future of Life Institute proudly announced the winners of the 2022 Future of Life Award. They are John Birks, Paul Crutzen, Jeannie Peterson, Alan Robock, Carl Sagan, Georgiy Stenchikov, Brian Toon and Richard Turco.

All eight of these heroes win the Future of Life Award for their roles in discovering and popularising nuclear winter.

We hope that drawing attention to these individuals’ work helps to refocus attention on the risks of nuclear war. As FLI President Max Tegmark put it, ‘The current geopolitical conflict discourse is absurdly cavalier about nuclear war risk’, given that ‘the latest nuclear winter research confirms that Reagan was right when he said that a nuclear war cannot be won and must never be fought.’

That same day, FLI held an evening event called ‘Winter is Coming?’ at Pioneer Works, Brooklyn, to celebrate these winners and explore the science and history of their discovery, nuclear winter. The date held particular significance for the subject matter: 6th August marked the 77th anniversary of the nuclear bombing of Hiroshima at the end of World War II.

At the event, one panel discussion, moderated by FLI Advocacy Director Dr. Emilia Javorsky, saw Robock, Toon and Turco explaining how they came to their initial findings in the 1980s, and what they’ve discovered since. Another, chaired by FLI President Max Tegmark, looked at ‘The Communication of Nuclear Winter’, with contributions by Carl Sagan’s widow and collaborator, Ann Druyan, alongside Birks and Stenchikov. The winners, or relatives on their behalf, then received their prizes. It was a delight to see these unsung heroes get the applause they deserved from such a large audience.

You can watch this video to hear ‘The Story of Nuclear Winter’, and find out more about the winners and their award here. We hope that these heroes will continue to be honoured through the further sharing and recognition of their terrifying warning.

New Study Confirms Worst Fears About Nuclear Winter and Its Impact on Global Food Supplies

Last week, a team of researchers, including Future of Life Award winners Brian Toon and Alan Robock, and led by Lili Xia of Rutgers University, published a paper in Nature Food examining ‘Global food insecurity and famine from reduced crop, marine fishery and livestock production due to climate disruption from nuclear war soot injection’. FLI released a video depicting ‘What nuclear war looks like from space’, made using state-of-the-art simulation data from this and other recent science papers on nuclear winter. Xia and her team’s study was covered by Metro, Science magazine, Sky News, The Sun, and numerous other media outlets. It would appear that the facts – that full-scale nuclear war would cause a global famine, killing five billion people, and that even a regional war could devastate agriculture around the world and affect billions – caught people’s attention.

Grants

Applications Re-Open for the Vitalik Buterin PhD Fellowship

The Future of Life Institute is delighted to announce that it is now accepting applications for its PhD Fellowship focused on AI existential safety research; new fellows will join the eight current fellows. Both current and future PhD students are invited to apply. Fellows will receive funding to cover the tuition, fees and stipend of their PhD program, up to $40,000, as well as a $10,000 fund for research-related expenses such as travel and computing. Funding is conditional on being accepted to a PhD program, working on AI existential safety research, and having an advisor who confirms that they will support the student’s work in this area. The deadline to apply is November 15, 2022. This fellowship is made possible by the generosity of cryptocurrency pioneer Vitalik Buterin and the Shiba Inu community.

New Grant Opportunities for Research into the Humanitarian Impact of Nuclear War

Building upon this year’s Future of Life Award, which celebrated contributions to the discovery and popularisation of nuclear winter, FLI is excited to launch a new grant opportunity for research into the humanitarian impacts of nuclear war. The full RFP will be available soon; possible research areas include nuclear war scenario development, fuel loads, urban fires, climate simulation models, and the impacts of nuclear war on humanity. We anticipate that the deadline for brief pre-proposals will be November 15, 2022. Please stay tuned to our website and social media for more details.

Policy

NIST Framework

On August 18, the United States National Institute of Standards and Technology (NIST) released the second draft of the Artificial Intelligence Risk Management Framework (AI RMF). The AI RMF, which will likely be published in early 2023, will serve as a voluntary set of risk assessment and management guidelines for AI developers, users, and auditors. FLI scholars previously commented on the first draft of the framework and offered their thoughts to the EU on how the RMF can augment the EU AI Act. We look forward to engaging with NIST to strengthen the RMF and expand its utility to other jurisdictions.

Media

Synthetic Media Raising Genuine Concerns

AI systems like DALL-E 2 and GPT-3 are creating new marketplaces, particularly for synthetic media. Last month, TechCrunch profiled the rise of “Prompt Engineering.” Kyle Wiggers reports that the idea behind prompt engineering is ‘to provide an AI system “guidelines” or detailed instructions so that it, drawing on its knowledge of the world, reliably accomplishes the thing being asked of it.’
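
To make that idea concrete, here is a minimal Python sketch of the approach Wiggers describes: a raw request is wrapped in explicit guidelines before it is sent to a text-generation model. The build_prompt function and the guideline wording are illustrative assumptions, not any particular product’s API, and nothing here calls a real model.

# A minimal, hypothetical sketch of the prompt-engineering idea described above:
# wrap a raw task in explicit guidelines so the model's output becomes more
# predictable. No real model API is called; build_prompt is illustrative only.

def build_prompt(task: str) -> str:
    """Prepend explicit instructions ('guidelines') to a raw task."""
    guidelines = (
        "You are a careful assistant.\n"
        "Follow these rules:\n"
        "1. Answer in exactly three bullet points.\n"
        "2. Do not invent statistics or sources.\n"
        "3. If the task is ambiguous, say so rather than guessing.\n"
    )
    return f"{guidelines}\nTask: {task}\nAnswer:"

if __name__ == "__main__":
    engineered = build_prompt("Summarise the risks of synthetic media.")
    print(engineered)  # This string would then be passed to a text-generation model.

The point is simply that the instructions surrounding the request, not just the request itself, shape what the system produces.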

While these developments raise several ethical and moral concerns, researchers at the Stockholm International Peace Research Institute and the United Nations Office for Disarmament Affairs are worried about strategic stability. Vincent Boulanin and Charles Obink argue that the AI community isn’t paying enough attention to how these commercial developments might be misused by malicious state and non-state actors.

Among the risks they cite is the impact of deepfakes on strategic stability. Earlier this year, a synthetic video of President Zelensky asking his troops to surrender was widely circulated in the media before it was debunked. How long before these developments affect nuclear security? The Pope is certainly concerned. At the 10th Review Conference of the Non-Proliferation Treaty, which concluded on 26th August, the Vatican warned of the ‘serious risks of deep fakes and poisoned data triggering nuclear weapons use in quick response to false information.’

Other Media You Might Like

Measuring capabilities and collaboration: The Centre for Security and Emerging Technology at Georgetown University released a new dataset which provides country-level data for researchers to measure domestic AI capabilities and cross-national collaboration.

Consensus eludes the 10th NPT RevCon: The Tenth Review Conference of the Parties to the Treaty on the Non-Proliferation of Nuclear Weapons concluded without consensus on August 26. While lamenting the RevCon’s failure “to address the urgency of the moment,” the International Campaign to Abolish Nuclear Weapons noted that there were some successes this year. Earlier, in June, States Parties to the Treaty on the Prohibition of Nuclear Weapons adopted an ambitious agenda towards nuclear disarmament.

How do we represent the future? The Journal of Ethics and International Affairs published a series of articles on climate change negotiations and the representation of future generations this August.

Hidden preferences and collective action: A new paper published in Nature Communications by Gregg Sparkman and colleagues finds that the American public drastically underestimates how concerned other Americans are about climate change: “while 66–80% Americans support these policies, Americans estimate the prevalence to only be between 37–43% on average.” In an accompanying comment, Cynthia Frantz writes that science communicators must address this knowledge gap to overcome policy inaction.

The future of food: Are genetically edited crops the future of agriculture and food security? Find out in the Wall Street Journal’s Future of Everything podcast.

Stay informed on the EU AI Act: For regular updates on upcoming AI regulation from Europe, follow our EU Policy Researcher Risto Uuk’s EU AI Act Newsletter. Twice a month, Risto highlights the latest legislative updates and curates important analysis about the EU’s AI Act.

FLI is a 501(c)(3) non-profit organisation, meaning donations are tax exempt in the United States. If you need our organisation number (EIN) for your tax return, it’s 47-1052538. FLI is registered in the EU Transparency Register. Our ID number is 787064543128-10.
