
Lethal Autonomous Weapons Pledge

Published: 6 June, 2018
Signatures: 5218

Artificial intelligence (AI) is poised to play an increasing role in military systems. There is an urgent opportunity and necessity for citizens, policymakers, and leaders to distinguish between acceptable and unacceptable uses of AI.

In this light, we the undersigned agree that the decision to take a human life should never be delegated to a machine. There is a moral component to this position, that we should not allow machines to make life-taking decisions for which others – or nobody – will be culpable. There is also a powerful pragmatic argument: lethal autonomous weapons, selecting and engaging targets without human intervention, would be dangerously destabilizing for every country and individual. Thousands of AI researchers agree that by removing the risk, attributability, and difficulty of taking human lives, lethal autonomous weapons could become powerful instruments of violence and oppression, especially when linked to surveillance and data systems. Moreover, lethal autonomous weapons have characteristics quite different from nuclear, chemical and biological weapons, and the unilateral actions of a single group could too easily spark an arms race that the international community lacks the technical tools and global governance systems to manage. Stigmatizing and preventing such an arms race should be a high priority for national and global security.

We, the undersigned, call upon governments and government leaders to create a future with strong international norms, regulations and laws against lethal autonomous weapons. These currently being absent, we opt to hold ourselves to a high standard: we will neither participate in nor support the development, manufacture, trade, or use of lethal autonomous weapons. We ask that technology companies and organizations, as well as leaders, policymakers, and other individuals, join us in this pledge.


Independently of this pledge, 30 countries in the United Nations have explicitly endorsed the call for a ban on lethal autonomous weapons systems: Algeria, Argentina, Austria, Bolivia, Brazil, Chile, China, Colombia, Costa Rica, Cuba, Djibouti, Ecuador, Egypt, El Salvador, Ghana, Guatemala, Holy See, Iraq, Jordan, Mexico, Morocco, Namibia, Nicaragua, Pakistan, Panama, Peru, State of Palestine, Uganda, Venezuela, Zimbabwe.

 


For more information about what is at stake and about ongoing efforts at the UN, please visit autonomousweapons.org and stopkillerrobots.org.

If you have questions about this letter, please contact FLI.

Signatories

To date, this pledge has been signed by 5,218 organizations and individuals.

