
Possible Global Catastrophic Risks

  • Category: Global Catastrophic Risks
  • Content Type: Blog Posts
Published: May 23, 2013 | by Holden Karnofsky

Note: Before the launch of the Open Philanthropy Project Blog, this post appeared on the GiveWell Blog. Uses of “we” and “our” in the below post may refer to the Open Philanthropy Project or to GiveWell as an organization. Additional comments may be available at the original post.

I previously discussed our view that in general, further economic development and general human empowerment are likely to be substantially net positive, and are likely to lead to improvement on many dimensions in unexpected ways. In my view, the most worrying counterpoint to this view is the possibility of global catastrophic risks. Broadly speaking, while increasing interconnectedness and power over our environment seem to have many good consequences, these things may also put us at greater risk for a major catastrophe – one that affects the entire world (or a large portion of it) and threatens to reverse, halt, or substantially slow the ongoing global progress in living standards.

This post lists the most worrying global catastrophic risks that I’m aware of, and briefly discusses the role that further technological and economic development could play in exacerbating – or mitigating – them. A future post will discuss how I think about the overall contribution of economic/technological development to exacerbating/mitigating global catastrophic risks in general (including risks that aren’t salient today). The purpose of this post is to (a) continue fleshing out the broad view that further economic development and general human empowerment are likely to be substantially net positive, which is one of the deep value judgments and worldview characteristics underlying our approach to giving recommendations; (b) catalogue some possible candidates for philanthropic focus areas (under the theory that major global catastrophic risks are potentially promising areas for philanthropy to address).

Possible global catastrophic risks that I’m aware of

I consider the following to be the most worrying possibilities I’m aware of for reversing, halting, or substantially slowing the ongoing global progress in living standards. There are likely many such risks I’m not aware of, and likely many such risks that essentially no one today is aware of. I hope that readers of this post will mention important possibilities that I’ve neglected in the comments.

In general, I’m trying to list factors that could do not just large damage, but the kind of damage that could create an unprecedented global challenge.

  1. More powerful technology – particularly in areas such as nuclear weapons, biological weapons, and artificial intelligence – may make wars, terrorist acts, and accidents more dangerous. Further technological progress is likely to lead to technology with far more potential to do damage. Somewhat offsetting this, technological and economic progress may also lead to improved security measures and lower risks of war and terrorism.
  2. A natural pandemic may cause unprecedented damage, perhaps assisted by the development of resistance to today’s common antibiotics. On this front I see technological and economic development as mostly risk-reducing, via the development of better surveillance systems, better antibiotics, better systems for predicting/understanding/responding to pandemics, etc.
  3. Climate change may lead to a major humanitarian crisis (such as unprecedented numbers of refugees due to sea level rise) or to other unanticipated consequences. Economic development may speed this danger by increasing the global rate of CO2 emissions; economic and technological development may mitigate this danger via the development of better energy sources (as well as energy storage and grid systems and other technology for more efficiently using energy), as well as greater wealth leading to more interest in – and perceived ability to afford – emissions reduction.
  4. Technological and economic progress could slow or stop due to a failure to keep innovating at a sufficient rate. Gradual growth in living standards has been the norm for a long time, and a prolonged stagnation could cause unanticipated problems (e.g., values could change significantly if people don’t perceive living standards as continuing to rise).
  5. Global economic growth could become bottlenecked by a scarcity of a particular resource (the most commonly mentioned concern along these lines is “peak oil,” but I have also heard concerns about supplies of food and of water for irrigation). Technological and economic progress could worsen this risk by speeding our consumption of a key resource, or could mitigate it by leading to the development of better technologies for finding and extracting resources and/or effective alternatives to such resources.
  6. An asteroid, supervolcano or solar flare could cause unprecedented damage. Here I largely see economic and technological progress as risk-reducing factors, as they may give us better tools for predicting, preventing and/or mitigating damage from such natural disasters.
  7. An oppressive government may gain power over a substantial part of the world. Technological progress could worsen this risk by improving the tools of such a government to wage war and monitor and control citizens; technological and economic progress could mitigate this risk by strengthening others’ abilities to defend themselves.

I should note that I perceive the odds of complete human extinction from any of the above factors, over the next hundred years or so, to be quite low. #1 would require the development of weapons with destructive potential far in excess of anything that exists today, plus the deployment of such weapons either by superpowers (which seems unlikely if they hold the potential for destroying the human race) or by rogue states/individuals (which seems unlikely since rogue states/individuals don’t have much recent track record of successfully obtaining and deploying the world’s most powerful weapons). #2 would require a disease to emerge with a historically unusual combination of propensity-to-kill and propensity-to-spread. And in either case, the odds of killing all people – taking into account the protected refuges that many governments likely have in place and the substantial number of people who live in remote areas – seem substantially less than the odds of killing many people. We have looked into #3 and parts of #6 to some degree, and currently believe that there are no particularly likely-seeming scenarios with risk of human extinction.

Global upside possibilities

In addition to global catastrophic risks, there are what I call “global upside possibilities.” That is, future developments may lead to extremely dramatic improvements in quality and quantity of life, and in the robustness of civilization to catastrophic risks. Broadly speaking, these may include:

  • Massive reduction or elimination of poverty.
  • Massive improvements in quality of life for the non-poor.
  • Improved intelligence, wisdom, and propensity for making good decisions across society.
  • Increased interconnectedness, empathy and altruism.
  • Space colonization or other developments leading to lowered potential consequences of global catastrophic risks.

I feel that humanity’s future may end up being massively better than its past, and unexpected new developments (particularly technological innovation) may move us toward such a future with surprising speed. Quantifying just how much better such a future would be does not strike me as a very useful exercise, but very broadly, it’s easy for me to imagine a possible future that is at least as desirable as human extinction is undesirable. In other words, if I somehow knew that economic and technological development were equally likely to lead to human extinction or to a brighter long-term future, it’s easy for me to imagine that I could still prefer such development to stagnation.
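To make the thought experiment above concrete, here is a minimal sketch of the implied expected-value comparison. The 50/50 split is the hypothetical from the paragraph above; the specific utility values are illustrative assumptions, not estimates from this post.

# Illustrative only: toy expected-value comparison for the thought
# experiment above. The utility values are assumptions chosen for
# illustration; the 50/50 probabilities are the post's hypothetical,
# not an actual estimate.
U_BRIGHT_FUTURE = 100.0   # assumed value of a dramatically better long-term future
U_EXTINCTION = -100.0     # assumed disvalue of extinction (symmetric stakes)
U_STAGNATION = 0.0        # baseline: living standards stop improving

P_BRIGHT, P_EXTINCTION = 0.5, 0.5  # "equally likely" hypothetical

ev_development = P_BRIGHT * U_BRIGHT_FUTURE + P_EXTINCTION * U_EXTINCTION
print(f"E[development] = {ev_development}")   # 0.0 under symmetric stakes
print(f"E[stagnation]  = {U_STAGNATION}")     # 0.0

On these assumptions, development merely ties stagnation; if the bright future is judged even slightly more desirable than extinction is undesirable, the expected value of development pulls ahead.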

I see technological and economic development as essential to raising the odds of reaching a much brighter long-term future, and I see such a future as being much less vulnerable to global catastrophic risks than today’s world. I believe that any discussion of global catastrophic risks (and the role of technological/economic development in such risks) is incomplete if it leaves out this consideration.

A future post will discuss how I think about the overall contribution of economic/technological development to our odds of having a very bright, as opposed to very problematic, future. For now, I’d appreciate comments on any major, broad far-future considerations this post has neglected.
