
Stanford University — Percy Liang Planning Grant

  • Focus Area: Potential Risks from Advanced AI
  • Organization Name: Stanford University
  • Amount: $25,000
  • Award Date: March 2017
    Grant investigator: Daniel Dewey
    This page was reviewed but not written by the grant investigator. Stanford University staff also reviewed this page prior to publication.

    The Open Philanthropy Project recommended a planning grant of $25,000 to Professor Percy Liang at Stanford University. This grant was recommended to enable Professor Liang to spend significant time engaging in our process to determine whether to provide his research group with a much larger grant. We did end up recommending that larger grant, which we have written about in more detail here.

    Related Items

    • Potential Risks from Advanced AI

      Stanford University — AI Index

      Open Philanthropy recommended a grant of $78,000 to Stanford University to support the AI Index, which collects and reports data related to artificial intelligence, including data relevant to AI...

    • Potential Risks from Advanced AI

      Stanford University — AI Alignment Research (Barrett and Viteri)

      Open Philanthropy recommended a grant of $153,820 to Stanford University to support research on AI alignment by Professor Clark Barrett and Stanford student Scott Viteri. This falls within...

    • Potential Risks from Advanced AI

      Stanford University — Adversarial Robustness Research (Dimitris Tsipras)

      Open Philanthropy recommended a grant of $330,792 over three years to Stanford University to support early-career research by Dimitris Tsipras on adversarial robustness as a means to improve...
