Op-ed: Climate Change Is the Most Urgent Existential Risk

Climate change and biodiversity loss may pose the most immediate and important threat to human survival given their indirect effects on other risk scenarios.

Humanity faces a number of formidable challenges this century. Threats to our collective survival stem from asteroids and comets, supervolcanoes, global pandemics, climate change, biodiversity loss, nuclear weapons, biotechnology, synthetic biology, nanotechnology, and artificial superintelligence.

With such threats in mind, an informal survey conducted by the Future of Humanity Institute placed the probability of human extinction this century at 19%. To put this in perspective, it means that the average American is more than a thousand times more likely to die in a human extinction event than a plane crash.*

So, given limited resources, which risks should we prioritize? Many intellectual leaders, including Elon Musk, Stephen Hawking, and Bill Gates, have suggested that artificial superintelligence constitutes one of the most significant risks to humanity. And this may be correct in the long term. But I would argue that two other risks, namely climate change and biodiversity loss, should take priority right now over every other known threat.

Why? Because these ongoing catastrophes in slow-motion will frame our existential predicament on Earth not just for the rest of this century, but for literally thousands of years to come. As such, they have the capacity to raise or lower the probability of other risk scenarios unfolding.

Multiplying Threats

Ask yourself the following: are wars more or less likely in a world marked by extreme weather events, megadroughts, food supply disruptions, and sea-level rise? Are terrorist attacks more or less likely in a world beset by the collapse of global ecosystems, agricultural failures, economic uncertainty, and political instability?

Both government officials and scientists agree that the answer is “more likely.” For example, the current Director of the CIA, John Brennan, recently identified “the impact of climate change” as one of the “deeper causes of this rising instability” in countries like Syria, Iraq, Yemen, Libya, and Ukraine. Similarly, the former Secretary of Defense, Chuck Hagel, has described climate change as a “threat multiplier” with “the potential to exacerbate many of the challenges we are dealing with today — from infectious disease to terrorism.”

The Department of Defense has also affirmed a connection. In a 2015 report, it states, “Global climate change will aggravate problems such as poverty, social tensions, environmental degradation, ineffectual leadership and weak political institutions that threaten stability in a number of countries.”

Scientific studies have further shown a connection between the environmental crisis and violent conflicts. For example, a 2015 paper in the Proceedings of the National Academy of Sciences argues that climate change was a causal factor behind the record-breaking 2007-2010 drought in Syria. This drought led to a mass migration of farmers into urban centers, which fueled the 2011 Syrian civil war. Some observers, including myself, have suggested that this struggle could be the beginning of World War III, given the complex tangle of international involvement and overlapping interests.

The study’s conclusion is also significant because the Syrian civil war was the Petri dish in which the Islamic State consolidated its forces, later emerging as the largest and most powerful terrorist organization in human history.

A Perfect Storm

The point is that climate change and biodiversity loss could very easily push societies to the brink of collapse. This will exacerbate existing geopolitical tensions and introduce entirely new power struggles between state and nonstate actors. At the same time, advanced technologies will very likely become increasingly powerful and accessible. As I’ve written elsewhere, the malicious agents of the future will have bulldozers rather than shovels to dig mass graves for their enemies.

The result is a perfect storm of more conflicts in the world along with unprecedentedly dangerous weapons.

If the conversation were to end here, we’d have ample reason for placing climate change and biodiversity loss at the top of our priority lists. But there are other reasons they ought to be considered urgent threats. I would argue that they could make humanity more vulnerable to a catastrophe involving superintelligence and even asteroids.

The basic reasoning is the same for both cases. Consider superintelligence first. Programming a superintelligence whose values align with ours is a formidable task even in stable circumstances. As Nick Bostrom argues in his 2014 book Superintelligence, we should recognize the “default outcome” of superintelligence to be “doom.”

Now imagine trying to solve these problems amidst a rising tide of interstate wars, civil unrest, terrorist attacks, and other tragedies. The societal stress caused by climate change and biodiversity loss will almost certainly compromise important conditions for creating friendly AI, such as sufficient funding, academic programs to train new scientists, conferences on AI, peer-reviewed journal publications, and communication and collaboration between experts in different fields, such as computer science and ethics.

It could even make an “AI arms race” more likely, thereby raising the probability of a malevolent superintelligence being created either on purpose or by mistake.

Similarly, imagine that astronomers discover a behemoth asteroid barreling toward Earth. Will designing, building, and launching a spacecraft to divert the assassin past our planet be easier or more difficult in a world preoccupied with other survival issues?

In a relatively peaceful world, one could imagine an asteroid actually bringing humanity together by directing our attention toward a common threat. But if the “conflict multipliers” of climate change and biodiversity loss have already catapulted civilization into chaos and turmoil, I strongly suspect that humanity will become more, rather than less, susceptible to dangers of this sort.

Context Risks

We can describe the dual threats of climate change and biodiversity loss as “context risks.” Neither is likely to directly cause the extinction of our species. But both will define the context in which civilization confronts all the other threats before us. In this way, they could indirectly contribute to the overall danger of annihilation — and this worrisome effect could be significant.

For example, according to the Intergovernmental Panel on Climate Change, the effects of climate change will be “severe,” “pervasive,” and “irreversible.” Or, as a 2016 study published in Nature and authored by over twenty scientists puts it, the consequences of climate change “will extend longer than the entire history of human civilization thus far.”

Furthermore, a recent article in Science Advances confirms that humanity has already escorted the biosphere into the sixth mass extinction event in life’s 3.8-billion-year history on Earth. Yet another study suggests that we could be approaching a sudden, irreversible, catastrophic collapse of the global ecosystem. If this were to occur, it could result in “widespread social unrest, economic instability and loss of human life.”

Given the potential for environmental degradation to elevate the likelihood of nuclear wars, nuclear terrorism, engineered pandemics, a superintelligence takeover, and perhaps even an impact winter, it ought to take precedence over all other risk concerns — at least in the near-term. Let’s make sure we get our priorities straight.

* How did I calculate this? First, the average American’s lifetime chance of dying in an “Air and space transport accident” was 1 in 9737 as of 2013, according to the Insurance Information Institute. US life expectancy is currently 78.74 years, which we can round up to 80 years for simplicity. Second, the informal Future of Humanity Institute (FHI) survey puts the probability of human extinction this century at 19%. Assuming this risk is spread evenly across the century (that is, a constant annual hazard), the probability of human extinction in an 80-year period (the US life expectancy) is 1 − (1 − 0.19)^(80/100) ≈ 15.5%. Finally, the last step is to compare the 15.5% figure with the 1 in 9737 statistic. To do this, divide 0.155 by 1/9737. This gives 1509.235. And from here we can conclude that, if the FHI survey is accurate, “the average American is more than a thousand times more likely to die in a human extinction event than a plane crash.”
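The footnote’s arithmetic can be sketched in a few lines of Python. The 19% survey figure and the 1-in-9737 plane-crash odds come from the footnote itself; spreading the century-level risk uniformly over 80 years (a constant annual hazard) is an assumption I am making explicit:

```python
# Sketch of the footnote's calculation. The input figures are from the
# footnote; the constant-annual-hazard assumption is mine.
p_century = 0.19        # FHI survey: probability of human extinction this century
lifetime_years = 80     # US life expectancy (78.74), rounded up for simplicity
p_plane = 1 / 9737      # lifetime odds of dying in an air/space transport accident

# Probability of extinction within an 80-year window, if the century-level
# risk accrues uniformly: survive 80 of the century's 100 years.
p_lifetime = 1 - (1 - p_century) ** (lifetime_years / 100)

# How many times more likely extinction is than a fatal plane crash.
ratio = p_lifetime / p_plane

print(f"p_lifetime = {p_lifetime:.1%}")   # roughly 15.5%
print(f"ratio = {ratio:.0f}")             # roughly 1500x
```

Running this reproduces the footnote’s numbers: an 80-year extinction probability of about 15.5%, which is on the order of 1,500 times the lifetime plane-crash risk.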
4 replies
  1. Mindey says:

    Yes, there are non-linear dependencies between problems. For example, very few people die from traffic congestion per se, but given that it takes only about five minutes without oxygen for the human brain to die in the case of a heart attack, traffic congestion could be contributing to a significant number of deaths from heart failure (~7.4 million in 2012). Arguably, if we had efficient transportation, many of these deaths would not occur, and there are many other problems that depend on efficient transportation.

    Now, looking from this perspective, of course climate change can affect our ability to deal with other existential problems and maintain social stability. It would be great to have these non-linear dependencies modeled more precisely, with something like Bayesian networks of probabilistic programs. What if people become mobile ( http://infty.xyz/i/16/en ), don’t live in houses over the next 15-20 years, and can be wherever on the globe they want; will climate change still pose a threat like that? How fast could runaway climate change ( http://infty.xyz/g/13/en ) happen, and what is the near-term risk of it happening?

    • Phil Torres says:

      Mindey:

      Great points! Yes, it would be great (as you say) to have more precise models of non-linear dependencies. I suspect that there are researchers working on this, but I’m not that conversant with the relevant literature. As for a runaway greenhouse effect, I haven’t yet read the recent James Hansen et al. article that you link to, but my understanding is that there’s considerable uncertainty about the probability of this happening. Nonetheless, it appears possible. (The marvelous book Global Catastrophic Risks has some helpful insights on the issue.)

      Thanks so much for your comment and please don’t hesitate to post more.

  2. Alexey Turchin says:

    I agree that context risks are very important, and I would even name them x-risks of the second kind.

    I would like to add that there is perhaps a 1 percent probability that global warming will be much more severe than is currently thought, and that it could by itself result in civilisational collapse or even human extinction in the next few decades. This is connected with the idea of runaway global warming caused by methane hydrate eruptions from the Arctic. I read about it at http://arctic-news.blogspot.ru/ , where there are many arguments for why it could happen.

    I even have my own strong argument for it: we are underestimating the fragility of our environment because of anthropic observation selection effects. These effects make the atmosphere much more unstable than we would expect based on previous observations.

    The second point I would like to present is that earlier extinction events overshadow later ones. If we have a 50 percent probability of dying from UFAI by 2100 and a 50 percent probability of dying from climate change by 2050, then the combined probability that we will die from UFAI is only 25 percent, since we can only face the later risk if we survive the earlier one.
    This makes prevention of earlier risks more important even if they are less severe.

    The main problem with climate change is the problem of truth and belief. Our climate models are too complex to be perfect, and public belief in them is weaker still.

  3. Mike says:

    I understand the argument, “climate change could make everything worse.” I don’t follow how you then conclude that climate change must be the top priority. Say a global pandemic is a baseline 5% risk this century, and climate change raises it to 6%. If we prioritize climate change, we might bring that down to 5.8%. But if we prioritize direct work on pandemic prevention, we might bring it down to 1.8%.
    There just seems to be no logical connection between “this makes things worse” and “it must be our top priority.”
