FLI Open Letters
We believe that individuals need to make their voices heard on emerging technologies and the risks they pose. The Future of Life Institute has facilitated this dialogue through many open letters over the years.
Featured letters
Add your signature to our most important published open letters:
Closed
AI Licensing for a Better Future: On Addressing Both Present Harms and Emerging Threats
This joint open letter by Encode Justice and the Future of Life Institute calls for the implementation of three concrete US policies in order to address current and future harms of AI.
25 October, 2023
31,810 signatures
Pause Giant AI Experiments: An Open Letter
We call on all AI labs to immediately pause for at least 6 months the training of AI systems more powerful than GPT-4.
22 March, 2023
5,218 signatures
Lethal Autonomous Weapons Pledge
Artificial intelligence (AI) is poised to play an increasing role in military systems. There is an urgent opportunity and necessity for citizens, policymakers, and leaders to distinguish between acceptable and unacceptable uses of AI. In this light, we the undersigned agree that the decision to take a human life should never be delegated to a machine.
6 June, 2018
5,720 signatures
Asilomar AI Principles
The Asilomar AI Principles, coordinated by FLI and developed at the Beneficial AI 2017 conference, are one of the earliest and most influential sets of AI governance principles.
11 August, 2017
34,378 signatures
Autonomous Weapons Open Letter: AI & Robotics Researchers
Autonomous weapons select and engage targets without human intervention. Artificial Intelligence (AI) technology has reached a point where the deployment of such systems is — practically if not legally — feasible within years, not decades, and the stakes are high: autonomous weapons have been described as the third revolution in warfare, after gunpowder and nuclear arms.
9 February, 2016
11,251 signatures
Research Priorities for Robust and Beneficial Artificial Intelligence: An Open Letter
There is now a broad consensus that AI research is progressing steadily, and that its impact on society is likely to increase. The potential benefits are huge, since everything that civilization has to offer is a product of human intelligence. Because of the great potential of AI, it is important to research how to reap its benefits while avoiding potential pitfalls.
28 October, 2015
All our other open letters
Here are all of the other open letters we have published:
2,672 signatures
Open letter calling on world leaders to show long-view leadership on existential threats
The Elders, Future of Life Institute and a diverse range of co-signatories call on decision-makers to urgently address the ongoing impact and escalating risks of the climate crisis, pandemics, nuclear weapons, and ungoverned AI.
14 February, 2024
998 signatures
Open Letter Against Reckless Nuclear Escalation and Use
The abhorrent Ukraine war has the potential to escalate into an all-out NATO-Russia nuclear conflict that would be the greatest catastrophe in human history. More must be done to prevent such escalation.
18 October, 2022
Closed
Foresight in AI Regulation Open Letter
The emergence of artificial intelligence (AI) promises dramatic changes in our economic and social structures as well as everyday life […]
14 June, 2020
276 signatures
Autonomous Weapons Open Letter: Global Health Community
Given our commitment to do no harm, the global health community has a long history of successful advocacy against inhumane weapons, and the World and American Medical Associations have called for bans on nuclear, chemical and biological weapons. Now, recent advances in artificial intelligence have brought us to the brink of a new arms race in lethal autonomous weapons.
13 March, 2019
Closed
2018 Statement to United Nations on Behalf of LAWS Open Letter Signatories
The following statement was read on the floor of the United Nations during the August 2018 CCW meeting, in which […]
4 September, 2018
3,789 signatures
UN Ban on Nuclear Weapons Open Letter
Nuclear arms are the only weapons of mass destruction not yet prohibited by an international convention, even though they are the most destructive and indiscriminate weapons ever created. We scientists bear a special responsibility for nuclear weapons, since it was scientists who invented them and discovered that their effects are even more horrific than first thought.
19 June, 2018
110 signatures
An Open Letter to the United Nations Convention on Certain Conventional Weapons
As companies building the technologies in Artificial Intelligence and Robotics that may be repurposed to develop autonomous weapons, we feel […]
20 August, 2017
Closed
Autonomous Weapons Open Letter: AI & Robotics Researchers – Signatories List
Click here to view the Autonomous Weapons Open Letter for AI & Robotics Researchers.
9 February, 2016
Closed
AI Open Letter – Signatories List
Click here to view the Research Priorities for Robust and Beneficial AI Open Letter.
4 February, 2016
Closed
Digital Economy Open Letter
An open letter by a team of economists about AI’s future impact on the economy. It includes specific policy suggestions […]
25 January, 2016
Closed
AI Economics Open Letter
Inspired by our Puerto Rico AI conference and open letter, a team of economists and business leaders has now launched […]
19 June, 2015