
FLI November Newsletter

Published
November 18, 2015
Author
Ariel Conn
News Site Launch
We are excited to present our new xrisk news site! With improved layout and design, it aims to provide you with daily technology news relevant to the long-term future of civilization, covering both opportunities and risks. This will, of course, include news about the projects we and our partner organizations are involved in to help prevent these risks. We’re also developing a section of the site that will provide more background information about the major risks, as well as what people can do to help reduce them and keep society flourishing.
Reducing Risk of Nuclear War

Some investments in nuclear weapons systems might increase the risk of accidental nuclear war and are arguably made primarily for profit rather than national security. Illuminating these financial drivers provides another opportunity to reduce the risk of nuclear war. FLI is pleased to support financial research into who invests in and profits from the production of new nuclear weapons systems, with the aim of drawing attention to and stigmatizing such production.

On November 12, Don’t Bank on the Bomb released their 2015 report on European financial institutions that have committed to divesting from any companies involved in the manufacture of nuclear weapons. The report also highlights financial groups that have made positive steps toward divestment, and it provides a detailed list of companies that are still heavily invested in nuclear weapons. With the Cold War long over, many people don’t realize that the risk of nuclear war still persists and that many experts believe it to be increasing. Here is FLI’s assessment of and position on the nuclear weapons situation.

In case you missed it…
Here are some other interesting things we and our partners have done in the last few months:
  • On September 1, FLI and CSER co-organized an event at the Policy Exchange in London where Huw Price, Stuart Russell, Nick Bostrom, Michael Osborne and Murray Shanahan discussed AI safety in front of a veritable who’s who of the scientifically minded in Westminster, including many British members of parliament.
  • Max Tegmark and Nick Bostrom were invited to speak at a United Nations event about AI safety.
  • Stephen Hawking answered the AMA questions about artificial intelligence.
  • Our co-founder Meia Chita-Tegmark wrote a spooky Halloween op-ed, featured in the Huffington Post, about the man who saved the world from nuclear apocalypse in 1962.
  • Nobel Prize-winning physicist Frank Wilczek shared a sci-fi short story he wrote about a future of AI wars.
  • FLI volunteer Eric Gastfriend wrote a popular piece in which he considers the impact of the exponential increase in the number of scientists.
  • And two of our partner organizations have published their newsletters: the Machine Intelligence Research Institute (MIRI) released October and November newsletters, and the Global Catastrophic Risk Institute released newsletters in September and October.
