Scaling AI Safety Through Mentorship w/ Dr. Ryan Kidd

What does it actually take to build a successful AI safety organization? I'm joined by Dr. Ryan Kidd, who has helped grow MATS from a small pilot program into one of the field's premier talent pipelines. In this episode, he reveals the low-hanging fruit in AI safety field-building that most people are missing: the amplifier archetype.

I pushed Ryan on some hard questions, from balancing funder priorities and research independence, to building a robust selection process for both mentors and participants. Whether you're considering a career pivot into AI safety or already working in the field, this conversation offers practical advice on how to actually make an impact.

Chapters
  • (00:00) - Intro
  • (08:16) - Building MATS Post-FTX & Summer of Love
  • (13:09) - Balancing Funder Priorities and Research Independence
  • (19:44) - The MATS Selection Process
  • (33:15) - Talent Archetypes in AI Safety
  • (50:22) - Comparative Advantage and Career Capital in AI Safety
  • (01:04:35) - Building the AI Safety Ecosystem
  • (01:15:28) - What Makes a Great AI Safety Amplifier
  • (01:21:44) - Lightning Round Questions
  • (01:30:30) - Final Thoughts & Outro

Links
Ryan's Writing
  • LessWrong post - Talent needs of technical AI safety teams
  • LessWrong post - AI safety undervalues founders
  • LessWrong comment - Comment permalink with 2025 MATS program details
  • LessWrong post - Talk: AI Safety Fieldbuilding at MATS
  • LessWrong post - MATS Mentor Selection
  • LessWrong post - Why I funded PIBBSS
  • EA Forum post - How MATS addresses mass movement building concerns
FTX Funding of AI Safety
  • LessWrong post - An Overview of the AI Safety Funding Situation
  • Fortune article - Why Sam Bankman-Fried’s FTX debacle is roiling A.I. research
  • NY Times article - FTX probes $6.5M in payments to AI safety group amid clawback crusade
  • Cointelegraph article - FTX probes $6.5M in payments to AI safety group amid clawback crusade
  • FTX Future Fund article - Future Fund June 2022 Update (archive)
  • Tracxn page - Anthropic Funding and Investors
Other Sources
  • AXRP website - The AI X-risk Research Podcast
  • LessWrong post - Shard Theory: An Overview

Creators and Guests

Jacob Haimes
Host of the podcast and all-around great dude.
Dr. Ryan Kidd
Co-executive director of MATS and AI safety field-building expert.