This program aims to provide grant support to academics for the development of new university courses (including online courses). At present, we are looking to fund the development of courses on topics relevant to the areas of Open Philanthropy’s grantmaking that form part of our work to reduce global catastrophic risks (potential risks from advanced AI, biosecurity and pandemic preparedness, and other global catastrophic risks), or to issues of cross-cutting relevance to our work. We are primarily looking to fund the development of new courses, but we are also accepting proposals from applicants who are seeking funding to turn courses they have already taught in person into freely available online courses.

Applications are open until further notice and will be assessed on a rolling basis.

Apply here

1. Possible topics

We are interested in funding the development of courses on the following topics:

  • Technical work on AI alignment, i.e. the problem of creating AI systems that will reliably do what their designers want them to do even when they become much more capable than their designers across a broad range of tasks. We would be interested in funding the development of courses that discuss the nature of AI alignment risk(s) (especially potential existential risks), explore relevant areas of technical AI research (see here for a non-exhaustive list of examples), and examine how this work bears on the problem.
  • Other topics relevant to understanding the long-run impacts of transformative developments in AI. For example, we would be interested in funding the development of courses on the likely effects of human-level or near-human-level AI on economic growth (as discussed e.g. in this paper).
  • Global catastrophic biological risks. We are concerned that as biotechnology progresses, it will become easier to develop biological agents with even more destructive potential than COVID-19, to the extent that they could threaten humanity’s long-term future. We are interested in funding the development of courses that focus to a substantial degree on averting these global catastrophic biological risks, whether by exploring potential technical countermeasures or governance/policy solutions.
  • Global catastrophic risks, i.e. adverse events that cause serious harm on a global scale and thereby threaten to permanently worsen humanity’s future. In addition to courses specifically on the risks mentioned above (potential risks from advanced AI and global catastrophic biological risks), we are interested in funding the development of overview courses that discuss a broader range of risks, including, for example, risks from nuclear war or extreme climate change.
  • Effective altruism, i.e. (roughly) the idea of using reason and evidence to do as much good as possible with a given amount of resources. We are open to funding the development of courses that engage with this topic from a range of different angles, including:
    • Courses designed to help students think better about how to do good for others over the course of their lives, e.g. via their career choices, civic engagement, volunteering, and/or donations;
    • Courses that focus on the philosophical foundations of effective altruism;
    • Courses that explore the topic through the lens of disciplines other than philosophy (for example, psychology).
  • Longtermism, i.e. (roughly) the philosophical view that in deciding how to act, we should pay significant attention to the long-term consequences of our actions (as discussed e.g. here and here). We are open to funding both:
    • Courses that focus on the philosophical case for and against longtermism;
    • Courses that focus more on exploring the view’s implications (e.g. what types of interventions a longtermist policymaker or philanthropist should favor).
  • Other topics. We expect that we’ll primarily support the development of courses on the topics listed above. That said, we are in principle open to considering proposals for courses on other topics that are directly or indirectly relevant to the above-mentioned areas of grantmaking and to the project of improving the long-term future more broadly. If you have a promising idea for a course on a topic that is not included in the list above but that you think we might be interested in funding, please email coursedevelopmentgrants@openphilanthropy.org with a brief description of the envisaged course. We’ll let you know whether we think it is a good fit; if so, we’ll invite you to submit a full application.

For all of these topics, we would be excited to fund grantees who approach the topic from original (including critical) perspectives.

2. Other information

  • We are looking to fund academics who have demonstrated excellence in their respective fields and who propose courses that seem compelling, on-topic, and likely to engage students in productive ways.
  • Other things being equal, we prefer to fund courses that are likely to attract substantial enrollments. However, we are also open to funding smaller courses (including graduate classes and seminars) which look like an especially good fit in other respects.
  • Applications are open until further notice and will be assessed on a rolling basis. We aim to respond to candidates within a few months of receiving their applications, and possibly sooner. Candidates who require a more timely decision can indicate this on the application form, and we may be able to expedite the decision process in such cases.
  • We welcome joint applications from pairs or larger groups of faculty members who would like to work together on developing a course.
  • In some cases, we may ask outside advisors to help us review and evaluate applications. By submitting your application, you agree that we may share your application with our outside advisors for evaluation purposes.
  • We may make changes to this program from time to time. Any such changes will be reflected on this page.
  • We encourage individuals with diverse backgrounds and experiences to apply, especially self-identified women and people of color.
  • The program is open to applicants in any country.1

3. Examples of existing courses

Past Open Philanthropy course development grantees:

Courses which were not funded by Open Philanthropy,2 but which we would have been excited to fund:

  • Andrew Critch & Stuart Russell, “Safety and Control for Artificial General Intelligence” (syllabus)
  • Paul Edwards & Stephen Luby, “Preventing Human Extinction” (syllabus)
  • Kevin Esvelt, “The Great Problems” (syllabus)
  • Hilary Greaves, “Foundational issues in effective altruism” (syllabus)
  • Anton Korinek, “Growth and Labor in the Age of AI” (syllabus)
  • Jeff Sanford Russell, “The End of the World” (syllabus)
  • Peter Singer, “Effective Altruism” (syllabus)

4. How funding can be used

The purpose of the funding is to assist in the development and execution of your course. Eligible uses of funding include (but are not necessarily limited to):

  • Summer salary;
  • Teaching buyouts;
  • Graduate student assistance;
  • Guest speakers (≤ $10,000).

Note that as a general policy, when making grants to universities, Open Philanthropy will not pay for indirect costs in excess of 10% of direct costs.

Grant size will vary depending on the career stage of the grantee, but as a rough indication, we expect grant totals of approximately $30,000–$50,000 for mid-career academics applying for grants that cover summer salary/teaching buyouts (potentially along with other costs).

5. How to apply

To apply, please use this form, which asks you to provide us with:

  1. A brief outline (roughly one page in length) describing in broad terms:
    a. Your current (tentative) thinking on how you might structure the course,
    b. At whom the course would be aimed, and
    c. Roughly what enrollment numbers you anticipate;
  2. A brief (by no means exhaustive) list of readings you are considering using for the course;
  3. An estimated budget;
  4. An up-to-date CV or resume.

If you are submitting a joint application, please designate a primary contact for the purposes of this application and complete only one (joint) application form.

We are aware that your thinking about how best to structure the course may evolve significantly once you receive a grant and begin developing it. Accordingly, we would not regard the exact details of your one-page outline as binding, and we would not view it as a problem if the course you end up teaching looks somewhat different from the course described in your application. We would, however, expect you to stick to the proposed topic unless we explicitly agree otherwise.

6. Grantee expectations

  • We would like grantees to continue teaching the course they develop (ideally at least three times), but this is not a requirement of the grant.
  • Grantees are required to provide us, after completion of the course, with a copy of the course syllabus, a copy of the final exam/final paper (if permitted by the relevant university’s policies), enrollment statistics, student evaluations, and a brief summary (roughly half a page in length) describing their own experience teaching the course.
  • We will strongly encourage grantees to make their syllabi available online, but we won’t require this.

Questions? Please contact coursedevelopmentgrants@openphilanthropy.org.

Footnotes

1. However, we may decline to make an award if we are not able to comply with local laws.

2. Note that although we did not fund the development of these courses, some of the instructors work for organizations that are Open Philanthropy grantees:

  • Stuart Russell leads the Center for Human-Compatible AI (CHAI).
  • Hilary Greaves leads the Global Priorities Institute.
  • Kevin Esvelt leads a lab at the MIT Media Lab.
  • Paul Edwards and Stephen Luby lead the Stanford Existential Risk Initiative, which works on student education about global catastrophic risks, building on Professor Edwards and Professor Luby’s work in teaching “Preventing Human Extinction”.