
FLI May 2020 Newsletter

Published
15 July 2020
Author
Revathi Kumar


New Podcast Episodes: Sam Harris & George Church

On Global Priorities, Existential Risk, and What Matters Most with Sam Harris


Human civilization increasingly has the potential both to improve the lives of everyone and to completely destroy everything. The proliferation of emerging technologies calls our attention to this never-before-seen power — and the need to cultivate the wisdom with which to steer it towards beneficial outcomes. If we’re serious both as individuals and as a species about improving the world, it’s crucial that we converge around the reality of our situation and what matters most. What are the most important problems in the world today and why? In this episode of the Future of Life Institute Podcast, Sam Harris joins us to discuss some of these global priorities, the ethics surrounding them, and what we can do to address them. Listen here.

You can find all the FLI Podcasts here and all the AI Alignment Podcasts here. Or listen on SoundCloud, iTunes, Google Play and Stitcher.

On the Future of Computation, Synthetic Biology, and Life with George Church


Progress in synthetic biology and genetic engineering promises to bring advances in human health by curing disease, augmenting human capabilities, and even reversing aging. At the same time, such technology could be used to unleash novel diseases and biological agents that could pose global catastrophic and existential risks to life on Earth. George Church, a titan of synthetic biology, joins us on this episode of the FLI Podcast to discuss the benefits and risks of our growing knowledge of synthetic biology, its role in the future of life, and what we can do to make sure it remains beneficial. Will our wisdom keep pace with our expanding capabilities? Listen here.

FLI’s Unsung Hero Search

Help Us Find the Next Winner of the FLI Award!


The Future of Life Institute is seeking nominations for the Future of Life Award, a $50,000 prize given to an individual who, without receiving much recognition at the time, has helped make today dramatically better than it might otherwise have been. Our first two recipients were Vasili Arkhipov and Stanislav Petrov, whose heroic actions may have prevented all-out nuclear war in 1962 and 1983, respectively. Our 2019 recipient was Dr. Matthew Meselson, who spearheaded the international ban on bioweapons. We are confident that there are many more unsung heroes out there who have done incredible work to ensure a beneficial future of life on Earth, and we need your help to ensure they get the recognition and honor they deserve.

Do you know someone who deserves the Future of Life Award? If so, please submit their name to our Unsung Hero Search page. If your nominee wins the award, you will receive a $3,000 prize from FLI as a token of our appreciation. Moreover, to incentivize the search, we also reward those who recruit other nominators. Inspired by MIT's successful red balloon strategy, we allocate up to $6,000 in nomination bonuses for each award: the first person to nominate the winner gets $3,000, whoever invited them to nominate gets $1,500, whoever invited that inviter gets $750, the next person up the chain gets $375, and so on.

So please pass this email along to anyone else who may have information about unsung heroes! For example, do you know someone who may know someone who has worked in nuclear command-and-control or high-security biolabs and knows about a close call when someone averted disaster? Or someone who, behind the scenes, averted a war, massacre, pandemic, or new type of arms race? Let's pay it forward by honoring these unsung heroes!
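As a quick back-of-the-envelope check on where the $6,000 cap comes from (assuming the bonus simply keeps halving at each step up the invitation chain), the payouts form a geometric series:

$$\$3{,}000 + \$1{,}500 + \$750 + \$375 + \dots \;=\; \sum_{k=0}^{\infty} \frac{\$3{,}000}{2^{k}} \;=\; \$6{,}000.$$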

Nominate a Hero

FLI in the News


MODERN WAR INSTITUTE: Swarms of Mass Destruction: The Case for Declaring Armed and Fully Autonomous Drone Swarms as WMD
