Steering transformative technology towards benefiting life and away from extreme large-scale risks.
Since its earliest days, human civilization - and through it increasingly all life on Earth - has been shaped by technology. In recent decades, however, something unprecedented has happened. Humanity now holds the power to affect, and even destroy, all life on Earth. Through the continued development of biotechnology and AI, we have entered an era in which life will be engineered by intelligence rather than by evolution. The rapidly increasing power of these technologies means that these changes will be profound - and perilous.
The mission of the Future of Life Institute, established in 2015, is to steer transformative technology towards benefiting life and away from extreme large-scale risks.
In pursuit of this mission, we support the development of institutions and visions necessary to manage world-driving technologies and enable positive futures. Furthermore, we seek to reduce large-scale harm and existential risk resulting from accidental or intentional misuse of transformative technologies.
Towards these aims, FLI engages in a number of initiatives and strategies, including:
- Policy development and advocacy to bridge the gap between the experts who understand transformative technologies and the public institutions with the legitimacy and means to govern them.
- Outreach and education to help policymakers, technologists and the general public understand the challenges and opportunities we face. This includes envisioning positive futures as an antidote to prevailing defeatism, and formulating the interventions we might need now if we're to steer toward such futures.
- Research and research support, because managing transformative technologies will require work on problems and in fields that existing funders have not sufficiently supported.
- Grantmaking to support research on problems and in fields stemming from transformative technologies that are otherwise insufficiently resourced.
- Institution-building to design, launch, and support new organizations and agreements to improve the governance of transformative technologies.
- Convening and coordinating events and activities, because these issues are global and addressing them will require large-scale coordinated action, even amongst rivals.
FLI presently focuses its energy on issues of advanced artificial intelligence, militarized AI, nuclear war, and new pro-social platforms. We also pursue topics in other critical fields including climate change, bio-risk and the preservation of biodiversity. However, we consider our purview to be as wide as necessary to pursue our core mission.
In carrying out its activities, FLI is committed to a number of core principles:
- Impact-driven: We take on projects that, in our assessment, will have a large positive impact, both now and in the longer term.
- Cognizant of urgency: Analysis paralysis should not indefinitely forestall action based on the best available information. Key events in human history are transpiring, and people and the Earth are being harmed right now.
- Forward-thinking and anticipatory: While recognizing that some aspects of the future are difficult or impossible to forecast, many problems and opportunities can be anticipated and must be planned for before they are fully manifest.
- Driven by science and reason: FLI aims to act on humanity's hard-won understanding of both how the world functions, and how we can come to better understand how it functions.
- Inclusive: In our work we seek to honour the experiences, rights, and autonomy of the world's people and other sentient beings. FLI believes that technology, social structures and governments are instrumental: they are meant to serve us, not to be served. People, in contrast, are not instrumental, which means that positive impact on the world should be achieved while maintaining integrity, kindness, and respect for others.