Research Fellowship

The Pivotal Research Fellowship is a 9-week program designed to enable promising researchers to produce impactful research and accelerate their careers in AI safety, AI governance, and biosecurity. Throughout the fellowship, fellows collaborate with experienced mentors, engage in workshops and seminars, and build strong networks within the AI safety research community in London and beyond.

The 2025 Q1 Research Fellowship will run from February 3rd to April 4th. Fellows will receive:

  • Direct mentorship from established researchers

  • The opportunity to conduct research in person in London at the London Initiative for Safe AI, alongside leading AI safety researchers

  • A £5,000 stipend, meals included, and support for travel, accommodation, and compute costs

This marks our fifth research fellowship, building on a strong track record of supporting researchers in tackling important questions about the safety and governance of emerging technology.

Applications are open until November 21st, 2024.

For questions about the application process, please reach out.

Deadline: November 21st, 2024 (AoE)

Dates & Locations

February 3rd to April 4th, 2025
London, UK

Eligibility

Anyone passionate about a career in global catastrophic risk reduction

Financial Support

£5,000 stipend, relocation support, free weekday lunches

We will also run a Research Fellowship in summer 2025. Express interest in all upcoming opportunities.

Recommend others who may be a good fit.
Receive $100 for each candidate you refer who is accepted.

FAQs
  • The exact application process may change, but we currently expect three short stages:

    1. Written Application

    2. Short, Automated Interview

    3. Personal Interview (with opportunity for applicants to ask questions)

  • Fellows typically work on research papers ranging from 10 to 20 pages. However, alternative formats such as blog posts, forum posts, or audiovisual work are also welcome. Our main priority is ensuring a valuable learning process and giving you the opportunity to test your fit for this type of work.

  • While most fellows work on individual research projects with mentor support, we are open to collaborative work on certain topics. After acceptance, we'll work together to determine the optimal arrangement of fellows, research projects, and potential teammates.

  • While meeting all relevant entry requirements is each fellow's responsibility, we provide an information document on entry requirements for the Research Fellowship (accuracy and completeness not guaranteed). We may be able to offer assistance during the process, depending on the circumstances.

  • Anyone who is at least 18 years old at the start of the fellowship is eligible to apply.

  • To offer context for potential research areas, we have curated a selection of resources that cover possible directions: AI Governance & Policy (here, here, here), Technical AI Safety, Technical AI Governance, and Biosecurity.

    These examples illustrate the range of research we support but are not meant to limit your options. We encourage you to develop your own original research proposal that reflects your interests and expertise.

  • Global catastrophic risks (GCRs) are events or situations that pose a risk of major harm on a global scale, with the potential to impact millions or even billions of people.

  • Pivotal is committed to addressing global catastrophic risks, which we consider to be among the most important and neglected global challenges. We believe that building a strong field of researchers, policymakers, and academics who are dedicated to reducing these risks is crucial. Our primary focus at the moment is on delivering a high-quality research fellowship to cultivate talent in the global catastrophic risk field. In the future, we may explore opportunities to create year-round research opportunities.

  • We’re always excited to hear from people who want to work with us in some form. You can find out more at “Work With Us.”

Previous Mentors
  • Jonas Schuett

    2022 & 2023 Mentor

    Senior Research Fellow at the Centre for the Governance of AI.

    LinkedIn

  • Ingvild Bode

    2024 Mentor

    Professor, Center for War Studies, University of Southern Denmark.

    Website

  • Sebastien Krier

    2023 Mentor

    Policy Development & Strategy Manager at DeepMind.

    LinkedIn

  • Jonas Sandbrink

    2022 & 2023 Mentor

    Chem-Bio Lead, UK AI Safety Institute.

    LinkedIn

  • Jenny Xiao

    2022 Mentor

    Researcher & VC at Leonis Capital.

    LinkedIn

  • Marius Hobbhahn

    2022 & 2023 Mentor

    Director and co-founder of Apollo Research.

    LinkedIn

  • Dave Denkenberger

    2021 & 2023 Mentor

    Assistant Professor of Mechanical Engineering; Co-founder of ALLFED.

    LinkedIn

  • Jakob Graabak

    2023 Mentor

    Research Lead at ICFG.eu, working on risks from AI and biotechnology.

    LinkedIn

  • Jérémy Scheurer

    2022 Mentor

    Research Scientist, AI Alignment, at Apollo Research.

    LinkedIn

  • Lee Sharkey

    2022 Mentor

    Research Engineer at Conjecture, working on neural data analysis and AI safety.

    LinkedIn

  • Sophie-Charlotte Fischer

    2021 Mentor

    Senior Researcher at the Center for Security Studies at ETH Zurich.

    LinkedIn

  • Nicolas Moës

    2022 Mentor

    Executive Director at The Future Society.

    LinkedIn

  • Michael Parker

    2024 Mentor

    Assistant Dean, Georgetown University.

    LinkedIn

Testimonials