This program aims to provide support – in the form of funding for graduate study, unpaid internships, independent study, career transition and exploration periods, and other activities relevant to building career capital – for individuals at any career stage who want to pursue careers that could help to reduce global catastrophic risks or otherwise improve the long-term future.1

Apply here.

We’re especially interested in supporting individuals who want to pursue careers that are in some way related to mitigating potential risks posed by future advances in artificial intelligence or global catastrophic biological risks.

Applications are open until further notice and will be assessed on a rolling basis.

Generally speaking, we aim to review proposals within six weeks of receiving them, although this may not prove possible for all applications. Candidates who require more timely decisions can indicate this in their application forms, and we may be able to expedite the decision process in such cases.

Until recently, this program was known as the “early-career funding program”, but we’ve decided to broaden its scope to explicitly include later-career individuals.

1. Scope

We’re open to receiving applications from individuals who are already pursuing careers related to reducing global catastrophic risk (or otherwise improving the long-term future), looking to transition into such careers from other lines of work, or only just starting their careers. We think there are many career tracks which are potentially promising from this perspective (including many of the ones in this list from 80,000 Hours), and there is therefore a correspondingly wide range of proposals we would consider funding.

We’re open to supporting a variety of career development and transition activities, including (but not necessarily limited to) graduate study, unpaid internships, independent study, career transition and exploration periods, postdocs, obtaining professional certifications, online courses, and other types of one-off career-capital-building activities.

To name a few concrete examples of the kinds of applicants we’re open to funding, in no particular order:

  • A final-year undergraduate student who wants to pursue a master’s or a PhD program in machine learning in order to contribute to technical research that helps mitigate risks from advanced artificial intelligence.
  • An individual who wants to do an unpaid internship at a think tank focused on biosecurity, with the aim of pursuing a career dedicated to reducing global catastrophic biological risk.
  • A former senior ML engineer at an AI company who wants to spend six months on independent study and career exploration in order to gain context on and investigate career options in AI risk mitigation.
  • An individual who wants to attend law school or obtain an MPP, with the aim of working in government on policy issues relevant to improving the long-term future.
  • A recent physics PhD who wants to spend six months going through a self-guided ML curriculum and working on projects in interpretability, in order to transition to contributing to technical research that helps mitigate risks from advanced AI systems.
  • A software engineer who wants to spend the next three months doing independent study in order to gain relevant certifications for a career in information security, with the longer-term goal of working for an organization focused on reducing global catastrophic risk.
  • An experienced management consultant who wants to spend three months exploring different ways to apply their skill set to reducing global catastrophic risk and applying to relevant jobs, with an eye to transitioning to a related career.
  • A PhD graduate in an unrelated sub-area of computational biology who wants to spend four months getting up to speed on DNA synthesis screening in order to transition to working on this topic.
  • A professor in machine learning, theoretical computer science, or another technical field who wants funding to take a one-year sabbatical to explore ways to contribute to technical AI safety or AI governance.
  • An individual who wants to attend journalism school, with the aim of covering topics relevant to the long-term future (potentially among other important topics).

2. Funding criteria

  • This program aims to provide support for individuals who want to pursue careers that could help to reduce global catastrophic risk or otherwise improve the long-term future. We are particularly interested in funding people who have deeply engaged with questions about global catastrophic risk and/or the long-term future, and who have skills and abilities that could allow them to make substantial contributions in the relevant areas.
  • Candidates should describe how the activity for which they are seeking funding will help them enter or transition into a career path that plausibly allows them to make these contributions. We appreciate that candidates’ plans may be uncertain or even unlikely to work out, but we are looking for evidence that candidates have thought in a critical and reasonably detailed manner about those plans — not just about what career path(s) might open up for them, but also about how entering said career path(s) could allow them to reduce global catastrophic risk or otherwise positively impact the long-term future.
  • We are looking to fund applications where our funding would make a difference — i.e. where the candidate is otherwise unable to find sufficient funding, or the funding they were able to secure imposes significant restrictions or requirements on them (for example, in the case of graduate study, restrictions on their research focus or teaching requirements). We may therefore turn down promising applicants who were able to secure equivalent support from other sources.
  • The program is open to applicants in any country.2

3. Other information

  • There is neither a maximum nor a minimum number of applications we intend to fund; rather, we intend to fund any applications that seem to us sufficiently promising to clear our general funding bar for this program.
  • In some cases, we may ask outside advisors to help us review and evaluate applications. By submitting your application, you agree that we may share your application with our outside advisors for evaluation purposes.
  • We encourage individuals with diverse backgrounds and experiences to apply, especially self-identified women and people of color.
  • We plan to respond to all applications.
  • This program now subsumes what was previously called the Open Philanthropy Biosecurity Scholarship; for the time being, candidates who would previously have applied to that program should apply to this program instead. (We may decide to split out the Biosecurity Scholarship again as a separate program at a later point, but for practical purposes, current applicants can ignore this.) 
  • We may make changes to this program from time to time. Any such changes will be reflected on this page.

4. Application process

Applications are open until further notice. You can apply using this form.

Required application materials:

  • Proposal, no longer than 500 words
  • Personal statement, no longer than 500 words
  • Approximate budget, no longer than half a page
  • CV or resume, no longer than 2 pages
  • Academic transcript (undergrad and graduate, if applicable)
  • Answers to a few other questions (see application form)

We may contact you to request additional information. In some cases, the assessment may also involve a brief interview via video teleconference.

We are aware that if you are applying for graduate school or internships, you will typically not know at this point which specific programs will admit you. If you are applying to several different but related programs and are doing so with a similar career trajectory in mind (e.g. different law schools and MPP programs to pursue a career in public policy), please specify this in your proposal and budget and submit a single application. If you are applying to several rather different programs with clearly distinct career trajectories in mind (e.g. journalism schools and history PhD programs), please submit separate applications.

If you have questions, you can reach us via [email protected].


1. By “improving the long-term future”, we specifically mean actions that could positively affect the very long-run trajectory of civilization over millions of years or even longer timeframes, as discussed for example by Beckstead (2013) or Greaves & MacAskill (2019). One way to affect the long-term future is to mitigate the risk of human extinction (see “The Precipice” for a recent discussion), but there may be other ways to improve the long-run trajectory of civilization (see this post by 80,000 Hours for some potential ideas).

2. However, we may decline to make an award if we are not able to comply with local laws.