Possible Global Catastrophic Risks

Note: Before the launch of the Open Philanthropy Project Blog, this post appeared on the GiveWell Blog. Uses of “we” and “our” in the below post may refer to the Open Philanthropy Project or to GiveWell as an organization. Additional comments may be available at the original post.

I previously discussed our view that in general, further economic development and general human empowerment are likely to be substantially net positive, and are likely to lead to improvement on many dimensions in unexpected ways. In my view, the most worrying counterpoint to this view is the possibility of global catastrophic risks. Broadly speaking, while increasing interconnectedness and power over our environment seem to have many good consequences, these things may also put us at greater risk for a major catastrophe - one that affects the entire world (or a large portion of it) and threatens to reverse, halt, or substantially slow the ongoing global progress in living standards.

This post lists the most worrying global catastrophic risks that I’m aware of, and briefly discusses the role that further technological and economic development could play in exacerbating - or mitigating - them. A future post will discuss how I think about the overall contribution of economic/technological development to exacerbating/mitigating global catastrophic risks in general (including risks that aren’t salient today). The purpose of this post is to (a) continue fleshing out the broad view that further economic development and general human empowerment are likely to be substantially net positive, which is one of the deep value judgments and worldview characteristics underlying our approach to giving recommendations; (b) catalogue some possible candidates for philanthropic focus areas (under the theory that major global catastrophic risks are potentially promising areas for philanthropy to address).

Possible global catastrophic risks that I’m aware of
I consider the following to be the most worrying possibilities I’m aware of for reversing, halting, or substantially slowing the ongoing global progress in living standards. There are likely many such risks I’m not aware of, and likely many such risks that essentially no one today is aware of. I hope that readers of this post will mention important possibilities that I’ve neglected in the comments.

In general, I’m trying to list factors that could do not just large damage, but the kind of damage that could create an unprecedented global challenge.

  1. More powerful technology - particularly in areas such as nuclear weapons, biological weapons, and artificial intelligence - may make wars, terrorist acts, and accidents more dangerous. Further technological progress is likely to lead to technology with far more potential to do damage. Somewhat offsetting this, technological and economic progress may also lead to improved security measures and lower risks of war and terrorism.
  2. A natural pandemic may cause unprecedented damage, perhaps assisted by the development of resistance to today’s common antibiotics. On this front I see technological and economic development as mostly risk-reducing, via the development of better surveillance systems, better antibiotics, better systems for predicting/understanding/responding to pandemics, etc.
  3. Climate change may lead to a major humanitarian crisis (such as unprecedented numbers of refugees due to sea level rise) or to other unanticipated consequences. Economic development may speed this danger by increasing the global rate of CO2 emissions; economic and technological development may mitigate this danger via the development of better energy sources (as well as energy storage and grid systems and other technology for more efficiently using energy), as well as greater wealth leading to more interest in - and perceived ability to afford - emissions reduction.
  4. Technological and economic progress could slow or stop due to a failure to keep innovating at a sufficient rate. Gradual growth in living standards has been the norm for a long time, and a prolonged stagnation could cause unanticipated problems (e.g., values could change significantly if people don’t perceive living standards as continuing to rise).
  5. Global economic growth could become bottlenecked by a scarcity of a particular resource (the most commonly mentioned concern along these lines is “peak oil,” but I have also heard concerns about supplies of food and of water for irrigation). Technological and economic progress could worsen this risk by speeding our consumption of a key resource, or could mitigate it by leading to the development of better technologies for finding and extracting resources and/or effective alternatives to such resources.
  6. An asteroid, supervolcano or solar flare could cause unprecedented damage. Here I largely see economic and technological progress as risk-reducing factors, as they may give us better tools for predicting, preventing and/or mitigating damage from such natural disasters.
  7. An oppressive government may gain power over a substantial part of the world. Technological progress could worsen this risk by improving the tools of such a government to wage war and monitor and control citizens; technological and economic progress could mitigate this risk by strengthening others’ abilities to defend themselves.

I should note that I perceive the odds of complete human extinction from any of the above factors, over the next hundred years or so, to be quite low. #1 would require the development of weapons with destructive potential far in excess of anything that exists today, plus the deployment of such weapons either by superpowers (which seems unlikely if they hold the potential for destroying the human race) or by rogue states/individuals (which seems unlikely since rogue states/individuals don’t have much recent track record of successfully obtaining and deploying the world’s most powerful weapons). #2 would require a disease to emerge with a historically unusual combination of propensity-to-kill and propensity-to-spread. And in either case, the odds of killing all people - taking into account the protected refuges that many governments likely have in place and the substantial number of people who live in remote areas - seem substantially less than the odds of killing many people. We have looked into #3 and parts of #6 to some degree, and currently believe that there are no particularly likely-seeming scenarios with risk of human extinction.

Global upside possibilities
In addition to global catastrophic risks, there are what I call “global upside possibilities.” That is, future developments may lead to extremely dramatic improvements in quality and quantity of life, and in the robustness of civilization to catastrophic risks. Broadly speaking, these may include:

  • Massive reduction or elimination of poverty.
  • Massive improvements in quality of life for the non-poor.
  • Improved intelligence, wisdom, and propensity for making good decisions across society.
  • Increased interconnectedness, empathy and altruism.
  • Space colonization or other developments leading to lowered potential consequences of global catastrophic risks.

I feel that humanity’s future may end up being massively better than its past, and unexpected new developments (particularly technological innovation) may move us toward such a future with surprising speed. Quantifying just how much better such a future would be does not strike me as a very useful exercise, but very broadly, it’s easy for me to imagine a possible future that is at least as desirable as human extinction is undesirable. In other words, if I somehow knew that economic and technological development were equally likely to lead to human extinction or to a brighter long-term future, it’s easy for me to imagine that I could still prefer such development to stagnation.

I see technological and economic development as essential to raising the odds of reaching a much brighter long-term future, and I see such a future as being much less vulnerable to global catastrophic risks than today’s world. I believe that any discussion of global catastrophic risks (and the role of technological/economic development in such risks) is incomplete if it leaves out this consideration.

A future post will discuss how I think about the overall contribution of economic/technological development to our odds of having a very bright, as opposed to very problematic, future. For now, I’d appreciate comments on any major, broad far-future considerations this post has neglected.