FLI June 2021 Newsletter
FLI Announces $25M Grants Program for Existential Risk Reduction
The Future of Life Institute is delighted to announce a $25M multi-year grant program aimed at reducing existential risk. Existential risks are events that could cause human extinction or permanently and drastically curtail humanity’s potential; currently, efforts to mitigate these risks receive remarkably little funding and attention relative to their importance. This program is made possible by the generosity of cryptocurrency pioneer Vitalik Buterin and the Shiba Inu community.
Specifically, the program will support interventions designed to directly reduce existential risk; prevent politically destabilising events that compromise international cooperation; actively improve international cooperation; and develop positive visions for the long-term future that incentivise both international cooperation and the development of beneficial technologies. The emphasis on collaboration stems from our conviction that technology is not a zero-sum game, and that in all likelihood it will cause humanity either to flourish or to flounder.
Shiba Inu Grants will support projects, particularly research. Vitalik Buterin Fellowships will bolster the pipeline through which talent flows towards our areas of focus; this may include funding for high school summer programs, college summer internships, graduate fellowships and postdoctoral fellowships.
To read more about the program, click here.
New Podcast Episodes
Nicolas Berggruen on the Dynamics of Power, Wisdom and Ideas in the Age of AI
In this episode of the Future of Life Institute Podcast, Lucas is joined by investor and philanthropist Nicolas Berggruen to discuss the nature of wisdom, why it lags behind technological growth and the power that comes with technology, and the role ideas play in the value alignment of technology.
Later in the episode, the conversation turns to the increasing concentration of power and wealth in society, universal basic income and a proposal for universal basic capital.
To listen, click here.
Reading & Resources
The Centre for the Study of Existential Risk is hiring for a Deputy Director!
The Centre for the Study of Existential Risk, University of Cambridge, is looking for a new Deputy Director. This role will involve taking full operational responsibility for the day-to-day activities of the Centre, including people and financial management, and contributing to strategic planning for the Centre.
CSER is looking for someone with strong experience in operations and strategy, with the interest and intellectual versatility to engage with and communicate the Centre’s research.
The deadline for applications is Sunday 4 July. More details on both the role and person profile are available in the further particulars, here.
The Leverhulme Centre for the Future of Intelligence (CFI) and CSER are also hiring for a Centre Administrator to lead the Department’s professional services support team. Further details can be found here.
The Global Catastrophic Risk Institute is looking for collaborators and advisees!
The Global Catastrophic Risk Institute (GCRI) is currently welcoming inquiries from people who are interested in seeking its advice and/or collaborating with it. These inquiries can concern any aspect of global catastrophic risk, but GCRI is particularly keen to hear from those interested in its ongoing projects. These projects include AI policy, expert judgement on long-term AI, forecasting global catastrophic risks and improving China-West relations.
Participation can consist of anything from a short email exchange to more extensive project work. In some cases, people may be able to get involved by contributing to ongoing dialogue, collaborating on research and outreach activities, and co-authoring publications. Inquiries are welcome from people at any career point (including students), from any academic or professional background, and from any place in the world. People from underrepresented groups are especially encouraged to reach out.
Find more details here!