FLI February 2021 Newsletter

Published
February 20, 2021
Author
Georgiana Gilgallon

Nominate an Unsung Hero for the 2021 Future of Life Award!


We’re excited to share that we’re accepting nominations for the 2021 Future of Life Award!

The Future of Life Award is given to an individual who, without having received much recognition at the time, has helped make today dramatically better than it might otherwise have been.

The first two recipients, Vasili Arkhipov and Stanislav Petrov, made judgements that likely prevented a full-scale nuclear war between the U.S. and U.S.S.R. In 1962, amid the Cuban Missile Crisis, Arkhipov, stationed aboard a Soviet submarine headed for Cuba, refused to give his consent for the launch of a nuclear torpedo when the captain became convinced that war had broken out. In 1983, Petrov decided not to act on an early-warning detection system that had erroneously indicated five incoming U.S. nuclear missiles. We know today that a global nuclear war would cause a nuclear winter, possibly bringing about the permanent collapse of civilisation, if not human extinction.

The third recipient, Matthew Meselson, was the driving force behind the 1972 Biological Weapons Convention. Ratified by 183 countries, the treaty is credited with preventing biological weapons from ever entering into mainstream use.

The 2020 winners, William Foege and Viktor Zhdanov, made critical contributions towards the eradication of smallpox. Foege pioneered the public health strategy of ‘ring vaccination’ and surveillance, while Zhdanov, then the Soviet Union’s Deputy Minister of Health, convinced the WHO to launch and fund a global eradication programme. Smallpox is thought to have killed 500 million people in its last century; its eradication in 1980 is estimated to have saved 200 million lives so far.

The Award is intended not only to celebrate humanity’s unsung heroes, but to foster a dialogue about the existential risks we face. We also hope that by raising the profile of individuals worth emulating, the Award will contribute to the development of desirable behavioural norms.

If you know of someone who has performed an incredible act of service to humanity but been overlooked by history, nominate them for the 2021 Award. This person may have made a critical contribution to a piece of groundbreaking research, set an important legal precedent, or perhaps alerted the world to a looming crisis; we’re open to suggestions! If your nominee wins, you’ll receive $3,000 from FLI as a token of our gratitude.

Policy & Outreach Efforts


FLI’s Director of AI Projects Richard Mallah Co-Founded the Consortium on the Landscape of AI Safety

FLI’s Director of AI Projects Richard Mallah co-founded the Consortium on the Landscape of AI Safety (CLAIS). An IEEE Industry Standards and Technology Organization (IEEE-ISTO) programme as of 1st January 2021, CLAIS is a formal industry consortium that works to bridge disparate perspectives on trustworthy AI. Richard will represent FLI as a member of the consortium.

On 8th January, Richard presented on the benefits and upcoming work products of CLAIS at the AI Safety workshop of the 2021 International Joint Conference on Artificial Intelligence, which explores new ideas in safety engineering, as well as broader strategic, ethical and policy aspects of safety-critical AI-based systems.

Richard also co-chaired the SafeAI workshop at AAAI on 8th February and co-led panel discussions on avoiding negative side effects, adversarial robustness, and safe reinforcement learning.

FLI Launches nuclearweapons.info

In celebration of the United Nations’ Treaty on the Prohibition of Nuclear Weapons entering into force on 22nd January, FLI launched nuclearweapons.info. The concept for the site was first developed by Aaron Stupple, who worked with FLI to support its development. The site represents a collaboration between two dozen organisations; with it, we’ve created a single place on the internet where you can learn about the threat of nuclear weapons, the organisations working to end that threat, and easy ways you can help.

The site has launched with 150 resources, and we expect it to continue growing. You can learn about why nuclear weapons pose more of a threat today than at almost any other point in history, the odds of accidental launches, the existing national and international policies for reducing nuclear arsenals, and more. Whether you learn best through reading articles, watching videos, listening to podcasts or playing around with interactive applications, this site has it all.


FLI Engages with the U.S. National Institute of Standards and Technology

FLI is engaged in discussions with the U.S. National Institute of Standards and Technology (NIST) as it continues to develop its Four Principles of Explainable AI. FLI will continue to actively advise NIST as it develops a “risk management framework for trustworthy AI systems”, as required by recent law.


FLI Seeks to Develop Policy Recommendations to Ensure Productive AI Governance

With the turn of the new year, a new Administration in place in the U.S., and the European Union planning to build on last year’s work on AI policy, FLI is working internally and collaborating with numerous organisations to develop a concrete set of ambitious policy recommendations. The aim is to ensure that near-term AI governance efforts help shape a more positive future as increasingly powerful AI systems are developed and deployed.

New Podcast Episodes


Beatrice Fihn on the Total Elimination of Nuclear Weapons

In this commemorative episode of the Future of Life Podcast, we are joined by Beatrice Fihn, Executive Director of the International Campaign to Abolish Nuclear Weapons.

Beatrice discusses the relative risks of accidental versus intentional nuclear war, whether nuclear deterrence makes sense as a defence strategy and the policy initiatives she would like to see the Biden administration pursue as a matter of high priority. These include renewing the New START Treaty, adopting a No First Use policy and removing ground-based missile systems from the nuclear triad. In addition, she considers the pervasiveness of toxic masculinity among the leaders of nuclear-armed nations and what it truly means for a country to be strong.


Max Tegmark and the FLI Team on 2020 and Existential Risk Reduction in the New Year

In this episode of the annual X-Hope podcast, FLI team members discuss our favourite projects from 2020, what we’ve learned amid the global pandemic, and what we think is needed for existential risk reduction in 2021.

The most obvious and in-our-face lesson is that humanity is more fragile than many would have liked to think. Before 2020, it was very easy to dismiss people who worried about existential risk or global catastrophic risk and say ‘Hey, you’re just a bunch of loser doomsayers. It’s not going to happen. Or if things start to happen, don’t worry; I’m sure our governments are competent enough to handle this.’ After COVID-19 and the response we’ve seen, I think people are much more receptive to the idea that we are actually much more vulnerable than we should be. – Max Tegmark

News & Reading


Max Tegmark: AI and Physics | Lex Fridman Podcast #155

Max Tegmark makes his second appearance on the Lex Fridman podcast. He discusses whether AI could discover new laws of physics, how we can reverse the growing phenomenon of fake news and widespread misinformation, the dawn of lethal autonomous weapons, and Elon Musk’s thinking about AI, among other things.


The Doomsday Clock Remains at 100 Seconds to Midnight

On 27th January, the Bulletin of the Atomic Scientists announced that the Doomsday Clock would remain at 100 seconds to midnight for 2021.

The Clock “has become a universally recognised indicator of the world’s vulnerability to catastrophe from nuclear weapons, climate change, and disruptive technologies in other domains.” It began in 1947 at 7 minutes to midnight, where midnight represents the apocalypse, and by the end of the Cold War had been moved back to 17 minutes to midnight. Unfortunately, the situation has steadily worsened since: in January 2020, the Clock was set to 100 seconds to midnight, closer than at any other point in its history.


Global Ice Loss Accelerating at Record Rate, Study Finds

A report has found that the rate of global ice loss is now in line with the worst-case scenarios of the Intergovernmental Panel on Climate Change, the world’s leading authority on climate. About 28 trillion tonnes of ice were lost between 1994 and 2017, enough to put an ice sheet 100 metres thick across the United Kingdom. It’s estimated that about two thirds of the loss was driven by the warming of the atmosphere, with the remainder caused by rising sea temperatures.
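
As a rough sanity check on that “100 metres thick” comparison, here is a minimal back-of-the-envelope calculation. It is only a sketch: the glacial ice density (~917 kg/m³) and UK land area (~244,000 km²) are assumed reference values, not figures taken from the study.

```python
# Back-of-the-envelope check of the "ice sheet 100 metres thick
# across the UK" comparison. Assumed values (not from the article):
# glacial ice density ~917 kg/m^3, UK land area ~244,000 km^2.

ICE_LOST_TONNES = 28e12            # 28 trillion tonnes lost, 1994-2017
ICE_DENSITY_KG_M3 = 917.0          # typical density of glacial ice
UK_AREA_M2 = 244_000 * 1e6         # ~244,000 km^2 in square metres

mass_kg = ICE_LOST_TONNES * 1000           # tonnes -> kilograms
volume_m3 = mass_kg / ICE_DENSITY_KG_M3    # volume = mass / density
thickness_m = volume_m3 / UK_AREA_M2       # spread evenly over the UK

print(f"Equivalent ice-sheet thickness over the UK: {thickness_m:.0f} m")
# Prints ~125 m, the same order of magnitude as the article's figure.
```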


Winners of FLI-sponsored 2020 Short Fiction Contest Announced

In 2020, FLI partnered with Sapiens Plurum to sponsor a short fiction contest. The competition challenged participants to consider how technology could one day be used to increase connection and empathy, even between different species. In particular, it welcomed stories that encouraged the reader to view life from another species’ point of view.

The winning entry tells the story of a human and a hyena who, by virtue of protein microchips implanted in their brains, are able to communicate from afar via their thoughts. The runner-up focuses on Annalia, a young girl assigned to study Harvey, an octopus in captivity, using an AI trained to recognise and report animals’ basic emotional states. The third-place story is about a quadriplegic who uses a miniature drone to discover, infiltrate and attempt to rescue a dying bee colony.
