Autonomous Weapons Open Letter: Global Health Community
Hosting, signature verification and list management are supported by FLI; for administrative questions about this letter, please contact Dr. Emilia Javorsky.
Lethal Autonomous Weapons: An Open Letter from the Global Health Community
Given our commitment to do no harm, the global health community has a long history of successful advocacy against inhumane weapons, and the World and American Medical Associations have called for bans on nuclear, chemical and biological weapons. Now, recent advances in artificial intelligence have brought us to the brink of a new arms race in lethal autonomous weapons.
In contrast to semi-autonomous weapons, which require human oversight to ensure that each target is validated as ethically and legally legitimate, fully autonomous weapons select and engage targets without human intervention, representing the complete automation of lethal harm. This ability to selectively and anonymously target groups of people without human oversight would carry dire humanitarian consequences and be highly destabilizing. Because they are cheap and easy to mass produce, lethal autonomous weapons can fall into the hands of terrorists and despots, lower the barriers to armed conflict, and become weapons of mass destruction enabling very few to kill very many. Furthermore, autonomous weapons are morally abhorrent: we should never cede the decision to take a human life to algorithms. As healthcare professionals, we believe that breakthroughs in science have tremendous potential to benefit society and should not be used to automate harm. We therefore call for an international ban on lethal autonomous weapons.
A similar letter addressed to AI & Robotics Researchers is also available.
National & International Organizations: