Open Philanthropy recommended a grant of $189,350 to the Supervised Program for Alignment Research to support a program matching aspiring researchers with mentors for AI safety research projects.
Open Philanthropy recommended a grant of $6,600,000 to 80,000 Hours for general support. 80,000 Hours works to help people have more impact with their careers by providing online content and one-on-one coaching.
Open Philanthropy recommended a grant of $195,000 to the Fred Hutchinson Cancer Center to support research investigating a potential way to make CAR T-cell therapy safer for cancer patients. The research, led by Alexandre V. Hirayama, will study whether an antibody (anti-Syndecan 2) can reduce dangerous side effects of CAR T-cell therapy while maintaining its effectiveness against cancer.
Open Philanthropy recommended a grant of $1,170,000 to the Cambridge Boston Alignment Initiative to support operating costs for Harvard’s AI Safety Student Team (AISST) and MIT AI Alignment (MAIA).
Open Philanthropy recommended a grant of $886,285 to the Kairos Project to support its work to improve the talent pipeline for the field of AI safety through targeted advising, workshops, residencies, and mentorship programs for AI safety groups.
Open Philanthropy recommended two grants totaling $3,382,029 to MATS Research to support the ML Alignment & Theory Scholars (MATS) program. The MATS program is an educational seminar and independent research program that provides talented scholars with talks, workshops, and research mentorship in the fields of AI alignment, interpretability, and governance. The program also connects participants with the Berkeley AI safety research community.
Open Philanthropy recommended two grants totaling $2,381,609 to AI Safety Support to support the ML Alignment & Theory Scholars (MATS) program. The MATS program is an educational seminar and independent research program that provides talented scholars with talks, workshops, and research mentorship in the fields of AI alignment, interpretability, and governance. The program also connects participants with the Berkeley AI safety research community.
Open Philanthropy recommended two grants totaling $461,069 to Training for Good to support the EU Tech Policy Fellowship, a seven-month program that helps fellows launch careers focused on emerging technologies. These grants will support two cohorts of the program, as well as provide general operating support to Training for Good.
Open Philanthropy recommended a grant of $469,625 to Harmony Intelligence to support the development of a publicly available benchmark on the autonomous moneymaking capabilities of large language model (LLM) agents.
Open Philanthropy recommended a grant of £298,384 (approximately $589,903 at the time of conversion) to support the Cambridge AI Safety Hub (CAISH), a network of students and professionals in Cambridge, UK, working on AI safety. CAISH will use the funds to host events, programs, fellowships, and retreats, as well as pay for staff salaries and other expenses.