    The Future of Life Institute announces $25M grants program for existential risk reduction

    Emerging technologies have the potential to help life flourish like never before – or self-destruct. The Future of Life Institute is delighted to announce a $25M multi-year grant program aimed at tipping the balance toward flourishing, away from extinction. This is made possible by the generosity of cryptocurrency pioneer Vitalik Buterin and the Shiba Inu community.

    COVID-19 showed that our civilization is fragile, and that it handles risk better when it plans ahead. Our grants are for those who have taken these lessons to heart, who wish to study the risks from ever more powerful technologies, and who want to develop strategies for reducing them. The goal is to help humanity win the wisdom race: the race between the growing power of our technology and the wisdom with which we manage it.

    Program areas

    Our grants program is focused on reducing the very greatest risks, which receive remarkably little funding and attention relative to their importance. Specifically, it is focused on x-risk (existential risk, i.e., events that could cause human extinction or permanently and drastically curtail humanity’s potential) and ways of reducing it directly or indirectly:

    1. Directly reduce x-risk
      Example: Ensure that increasingly powerful artificial intelligence is aligned with humanity’s interests.
    2. Don’t destroy collaboration
      Avoid things that significantly increase x-risk by destabilizing the world and reducing geopolitical cooperation. Examples: nuclear war, bioengineered pandemics, a lethal autonomous weapons arms race, media-bias-fueled hyper-nationalism and jingoism.
    3. Support collaboration
      Support things that significantly decrease x-risk by improving geopolitical cooperation. Examples: institutions, processes and activities that improve global communication and cooperation toward shared goals.
    4. Create incentives & goals for collaboration
      Develop shared positive visions for the long-term future that incentivize global cooperation and the development of beneficial technologies. Examples: nurture existential hope, study how people can be helped and incentivized to set and pursue positive long-term goals.

    The emphasis on collaboration stems from FLI’s conviction that technology is not a zero-sum game, and that the most likely outcomes are that all of humanity will ultimately flourish or flounder together.

    Types of grants

    We will be running a series of grants competitions of two types: Shiba Inu Grants and Vitalik Buterin Fellowships. Shiba Inu Grants support projects, specifically research, education or other beneficial activities in the program areas. Buterin Fellowships bolster the talent pipeline through which much-needed talent flows into our program areas, tentatively including funding for high school summer programs, college summer internships, graduate fellowships and postdoctoral fellowships. For example, the Vitalik Buterin Postdoctoral Fellowship for AI Safety will tentatively open for applications in September, and will fund computer science postdocs for three years at institutions of their choice. Academic research grants and fellowships are focused in three areas: computer science, behavioral science, and policy/governance.

    To conclude, we wish to once again express our profound gratitude to all who’ve made this possible, from Vitalik Buterin and the Shiba Inu Community to the amazing team at Alameda Research.

    Media inquiries: Max Tegmark, max@futureoflife.org


    FREQUENTLY ASKED QUESTIONS:

    Q: What’s the Future of Life Institute?

    A: A 501(c)(3) non-profit that wants the long-term future of life to exist and to be as positive as possible. We focus particularly on the benefits and risks of transformative technology.

    Q: Who’s Vitalik Buterin?

    A: A cryptocurrency pioneer and philanthropic supporter of effective altruism.

    Q: What’s the Shiba Inu Community?

    A: An experiment in decentralized, spontaneous community building with hundreds of thousands of members that, by promoting the Shiba Inu cryptocurrency token, is having a remarkable positive impact on charities, including Indian COVID-19 relief.

    Q: When and how can I apply?

    A: If you sign up for our mailing list, we will send you instructions when grants programs open for applications. For efficiency and fairness, we do not accept unsolicited applications.

    Q: Who can apply?

    A: We wish to support promising people and ideas anywhere in the world. Since we are a non-profit organization, we are normally only able to support work associated with research institutions and other non-profits; if you’re unsure whether you qualify, please reach out once our application portal is live.

    Q: Will the Shiba Inu Grants be paid in cryptocurrency?

    A: No. As with our past grants, they will be paid in US dollars and other conventional currencies.

    Q: Why would humanity cause its own destruction?

    A: Through mistakes or miscommunication, which have brought humanity to the brink of catastrophe many times in the past (example, more examples, comic relief), and biotech and AI pose arguably even greater threats.

    Q: Isn’t it naïve to think that humanity would abstain from developing destructive technologies?

    A: No. Several national bioweapon programs existed around 1970, and yet bioweapons are now illegal under international law. Thanks in significant part to Future of Life Award winner Prof. Matthew Meselson, these weapons of mass destruction never entered widespread use, and biology’s main use today is saving lives.

    “The saddest aspect of life right now is that science gathers knowledge faster than society gathers wisdom.”

    Isaac Asimov
