
Grantmaking work

Supporting vital cutting-edge work with a wise, future-oriented mindset.

Introduction

Financial support for promising work aligned with our mission.

Crises like COVID-19 show us that our civilisation is fragile and needs to plan ahead better. FLI’s grants are for those who take this fragility seriously, who wish to study the risks from ever more powerful technologies and develop strategies for reducing them. The goal is to win the wisdom race: the race between the growing power of our technology and the wisdom with which we manage it.

Grants process

We are excited to offer multiple opportunities to apply for support:

RFPs and Contests

Requests for Proposals (RFPs) and Contests, which we publish on our website on a regular basis. Examples include our grants program on the Humanitarian Impacts of Nuclear War, our call for papers on AI and the SDGs, and our Worldbuilding competition.

Collaborations

Collaborations in direct support of FLI internal projects and initiatives. For example, FLI has funded film production on autonomous weapons at partner organisations. FLI staff work closely with these grantees.

Fellowships

The Vitalik Buterin Fellowships bolster the talent pipeline by supporting both PhD students and postdoctoral researchers. Since 2021, we have offered PhD and Postdoctoral fellowships in Technical AI Existential Safety. In 2024, we are excited to launch a PhD fellowship in US-China AI Governance.
View fellowships

We do not accept unsolicited requests. Please subscribe to our newsletter and follow us on social media to learn more about our upcoming RFPs and Contests. Any questions about grants can be sent to grants@futureoflife.org.

Grant programs

All our grant programs

Open programs

Multistakeholder Engagement for Safe and Prosperous AI

Open for submissions
Deadline: 4 February 2025, 23:59 EST

Technical Postdoctoral Fellowships

Open for submissions
Deadline: 6 January 2025

Closed or completed programs

AI Safety Community

A community dedicated to ensuring AI is developed safely.

The way to ensure a better, safer future with AI is not to impede the development of this new technology but to accelerate our wisdom in handling it by supporting AI safety research.

Since it may take decades to complete this research, it is prudent to start now. AI safety research prepares us better for the future by pre-emptively making AI beneficial to society and reducing its risks.

This mission motivates research across many disciplines, from economics and law to technical areas like verification, validity, security, and control. We’d love you to join!

View the community
Our content

Related posts

Here are various posts relating to our grantmaking work:

Future of Life Institute Announces 16 Grants for Problem-Solving AI

Announcing the 16 recipients of our newest grants program supporting research on how AI can be safely harnessed to solve specific, intractable problems facing humanity around the world.
11 July, 2024

Realising Aspirational Futures – New FLI Grants Opportunities

Our Futures Program, launched in 2023, aims to guide humanity towards the beneficial outcomes made possible by transformative technologies. This year, as […]
14 February, 2024

Statement on a controversial rejected grant proposal

For those unfamiliar with the Future of Life Institute (FLI), we are a nonprofit charitable organization that since 2014 works […]
18 January, 2023

FLI September 2022 Newsletter: $3M Impacts of Nuclear War Grants Program!

Welcome to the FLI newsletter! Every month, we bring 24,000+ subscribers the latest news on how emerging technologies are transforming our world - for better and worse.
17 October, 2022

The Future of Life Institute announces grants program for existential risk reduction

The Future of Life Institute announces $25M grants program for existential risk reduction. Emerging technologies have the potential to help […]
3 June, 2021

New International Grants Program Jump-Starts Research to Ensure AI Remains Beneficial

Elon Musk-backed program signals growing interest in new branch of artificial intelligence research. July 1, 2015. Amid rapid industry investment in […]
28 October, 2015

Contact us

Let's put you in touch with the right person.

We do our best to respond to all incoming queries within three business days. Our team is spread across the globe, so please be considerate and remember that the person you are contacting may not be in your timezone.
Please direct media requests and speaking invitations for Max Tegmark to press@futureoflife.org. All other inquiries can be sent to contact@futureoflife.org.

Sign up for the Future of Life Institute newsletter

Join 40,000+ others receiving periodic updates on our work and cause areas.