We are interested in funding the development of courses on the following topics:
- Technical work on AI alignment, i.e. the problem of creating AI systems that will reliably do what their designers want, even as those systems become much more capable than their designers across a broad range of tasks. We would be interested in funding the development of courses that discuss the nature of AI alignment risk(s) (especially potential existential risks), explore relevant areas of technical AI research (see here for a non-exhaustive list of examples), and examine how this work bears on the problem.
- Other topics relevant to understanding the long-run impacts of transformative developments in AI. For example, we would be interested in funding the development of courses on the likely effects of human-level or near-human-level AI on economic growth (as discussed e.g. in this paper).
- Global catastrophic biological risks. We are concerned that as biotechnology progresses, it will become easier to develop biological agents with even more destructive potential than COVID-19, to the extent that they could threaten humanity’s long-term future. We are interested in funding the development of courses that focus to a substantial degree on averting these global catastrophic biological risks, whether by exploring potential technical countermeasures or governance/policy solutions.
- Global catastrophic risks, i.e. adverse events that cause serious harm on a global scale and thereby threaten to permanently worsen humanity’s future. In addition to courses specifically on the risks mentioned above (potential risks from advanced AI and global catastrophic biological risks), we are interested in funding the development of overview courses that discuss a broader range of risks, including, for example, risks from nuclear war or extreme climate change.
- Effective altruism, i.e. (roughly) the idea of using reason and evidence to do as much good as possible with a given amount of resources. We are open to funding the development of courses that engage with this topic from a range of different angles, including:
- Courses designed to help students think better about how to do good for others over the course of their lives, e.g. via their career choices, civic engagement, volunteering, and/or donations;
- Courses that focus on the philosophical foundations of effective altruism;
- Courses that explore the topic through the lens of disciplines other than philosophy (for example, psychology).
- Longtermism, i.e. (roughly) the philosophical view that in deciding how to act, we should pay significant attention to the long-term consequences of our actions (as discussed e.g. here and here). We are open to funding courses that either:
- Focus on the philosophical case for and against longtermism;
- Focus more on exploring the view’s implications (e.g. courses exploring what types of interventions a longtermist policymaker or philanthropist should favor).
- Critical thinking. We are interested in funding the development of critical thinking courses that approach this topic in an original way, for example by focusing less on deductive fallacies and more on providing students with the tools to improve their probabilistic reasoning, their decision-making, and their ability to analyze, weigh, and critique evidence in real-life situations.
- Cost-effectiveness analysis, with a particular focus on real-life applications across a range of domains that students are likely to encounter in their future careers.
- Other topics. We expect that we’ll primarily support the development of courses on the topics listed above. That said, we are in principle open to considering proposals for courses on other topics that are directly or indirectly relevant to the above-mentioned areas of grantmaking and to the project of improving the long-term future more broadly. If you have a promising idea for a course on a topic not included in the list above that you think we might be interested in funding, please email [email protected] with a brief description of the envisaged course. We’ll let you know whether we think it is a good fit, and if so, we’ll invite you to submit a full application.
For all of these topics, we would be excited to fund grantees who approach the subject from original (including critical) perspectives.