
Stanford Existential Risks Initiative — Global Catastrophic Risk Education

  • Category: Longtermism
  • Organization Name: Stanford Existential Risks Initiative
  • Amount: $1,500,000
  • Award Date: January 2021


    Grant Investigator: Claire Zabel

    This page was reviewed but not written by the grant investigator. Stanford Existential Risks Initiative staff also reviewed this page prior to publication.


    Open Philanthropy recommended a grant of $1,500,000 over two years to the Stanford Existential Risks Initiative to support student education about global catastrophic risks. Stephen Luby and Paul Edwards, who together teach the course “Preventing Human Extinction,” plan to use these funds to expand their activities in this area, including student fellowships, research, career development opportunities, a speaker series, and faculty support.

    This follows our December 2019 support and falls within our work on global catastrophic risks.

    Related Items

    • Longtermism

      Forecasting Research Institute — Science of Forecasting

      Open Philanthropy recommended a grant of $5,352,800 over three years to the Forecasting Research Institute (FRI) to support several projects related to advancing the science of forecasting as...

    • Longtermism

      Berkeley Existential Risk Initiative — Machine Learning Alignment Theory Scholars

      Open Philanthropy recommended a grant of $2,047,268 to the Berkeley Existential Risk Initiative to support their collaboration with the Stanford Existential Risks Initiative (SERI) on SERI’s Machine Learning Alignment Theory Scholars...

    • Longtermism

      Future of Humanity Institute — Administrative and Operational Support

      Open Philanthropy recommended a grant of $400,000 to the Future of Humanity Institute (FHI), via Effective Ventures Foundation UK, to support their administrative and operational expenses. FHI is...
