• Focus Areas
    • Cause Selection
    • Global Health & Wellbeing
      • Effective Altruism (Global Health and Wellbeing)
      • Farm Animal Welfare
      • Global Aid Policy
      • Global Health & Development
      • Global Health R&D
      • Innovation Policy
      • Land Use Reform
      • Scientific Research
      • South Asian Air Quality
    • Global Catastrophic Risks
      • Biosecurity & Pandemic Preparedness
      • Global Catastrophic Risks Capacity Building
      • Potential Risks from Advanced AI
    • Other Areas
      • Criminal Justice Reform
      • History of Philanthropy
      • Immigration Policy
      • Macroeconomic Stabilization Policy
  • Grants
  • Research & Updates
    • Research Reports
    • Blog Posts
    • Notable Lessons
    • In the News
  • About Us
    • Grantmaking Process
    • How to Apply for Funding
    • Team
    • Contact Us
    • Stay Updated
  • We’re hiring!

AI Safety Support — Situational Awareness Research

  • Focus Area: Potential Risks from Advanced AI
  • Organization Name: AI Safety Support
  • Amount: $443,716
  • Award Date: April 2023

    Open Philanthropy recommended three grants totaling $443,716 to support research led by Owain Evans to evaluate whether machine learning models have situational awareness. These grants were made to AI Safety Support, Effective Ventures Foundation USA, and the Berkeley Existential Risk Initiative, and will support salaries, office space, and compute for this research project.

    This falls within our focus area of potential risks from advanced artificial intelligence.

    Related Items

    • Potential Risks from Advanced AI

      AI Safety Support — SERI MATS Program

      Open Philanthropy recommended three grants totaling $1,538,000 to AI Safety Support to support their collaboration with Stanford Existential Risks Initiative (SERI) on SERI’s Machine Learning Alignment Theory Scholars (MATS) program. MATS is...

      Read more
    • Potential Risks from Advanced AI

      AI Safety Support — Research on Trends in Machine Learning

      Open Philanthropy recommended a grant of $42,000 to AI Safety Support to scale up a research group, led by Jaime Sevilla, which studies trends in machine learning. This...

      Read more
    • Potential Risks from Advanced AI

      Brian Christian — Psychology Research

      Open Philanthropy recommended a grant of £29,700 (approximately $37,903 at the time of conversion) to Brian Christian to support a DPhil in psychology at the University of Oxford....

      Read more
    Open Philanthropy
    • Careers
    • Press Kit
    • Governance
    • Privacy Policy
    • Stay Updated
    Mailing Address
    Open Philanthropy
    182 Howard Street #225
    San Francisco, CA 94105
    Email
    [email protected]
    Media Inquiries
    [email protected]
    Anonymous Feedback
    Feedback Form

    © Open Philanthropy 2022 Except where otherwise noted, this work is licensed under a Creative Commons Attribution-Noncommercial 4.0 International License. If you'd like to translate this content into another language, please get in touch!