
Opportunity: BASE Fellows Spring 2026 — Black in AI Safety & Ethics Fellowship

The Black in AI Safety & Ethics (BASE) Fellowship is now accepting applications for its Spring 2026 cohort. This 13-week, part-time, fully remote fellowship begins in April 2026 and is designed to cultivate the next generation of Black researchers, practitioners, and leaders in AI Safety, AI Security, and AI Governance.

With expert mentorship, structured training, and hands-on research experience, BASE Fellows gain the skills, networks, and knowledge needed to shape the rapidly evolving AI safety ecosystem.


Program Overview

The BASE Fellowship offers a dual-phase structure and three specialized research tracks to support impactful, mission-driven research.

Research Tracks

1. AI Alignment (Technical AI Safety)

  • Interpretability
  • Oversight & Control
  • Agency and autonomous systems

2. AI Security

  • Cybersecurity and infosec
  • Risk management
  • Adversarial robustness
  • Threat detection

3. AI Governance

  • Policy and regulatory frameworks
  • Standards and evaluation
  • Systemic risk analysis

Program Structure

Phase 1 (Weeks 1–5): Foundations & Training

Fellows receive baseline training tailored to their track:

  • Technical AI Safety
  • AI Governance
  • InfoSec / AI Security

Core activities include:

  • Seminars led by researchers and practitioners
  • Guided assignments to strengthen technical/analytical skills
  • Structured learning in AI safety fundamentals

Phase 2 (Weeks 6–13): Research & Project Development

Fellows conduct a focused AI safety project with a track-aligned mentor.
Features include:

  • Weekly 1-hour mentor meetings
  • Research support and feedback loops
  • Project milestones and structured development
  • A capstone output such as:
    • A research preprint
    • A technical blog post
    • A fellowship or grant application

Minimum Requirements

Applicants must:

  • Be enrolled in or have completed a BA/BS degree
  • Demonstrate alignment with BASE’s mission of empowering the global Black diaspora
  • Have strong coding/ML skills for technical tracks
  • Show relevant academic or professional experience for governance/security tracks
  • Commit 20–25 hours/week to seminars, assignments, and research
  • Submit a final capstone project

Preferred Qualifications

  • Prior research experience or work in collaborative environments
  • Clear intention to continue work in AI Safety, Security, or Governance
    (e.g., graduate study, research roles, policy internships, technical positions)
  • Strong alignment with BASE’s values of fairness, accountability, and collective well-being

Skills by Track

AI Safety

  • Background in Computer Science, ML, Mathematics, or related fields
  • Proficiency in Python, ML models, or research workflows

AI Security

  • Background in Cybersecurity, Infosec, Threat Detection, Adversarial ML, etc.

AI Governance

  • Experience or interest in policy, law, political science, economics, sociology, or technology governance

Application & Selection Process

Phase 1 — Initial Review

  • Review of application and references
  • Pre-screening assessments in mid-to-late January
    • Coding test (technical tracks)
    • Work test (governance/security tracks)
  • Successful candidates proceed to mentor project selection

Phase 2 — Mentor Review & Offers

  • Mentors review aligned applicants
  • Mentor-mentee matches occur
  • Final acceptance offers sent in early March

Why Join BASE?

BASE is building a connected, empowered, and informed global Black community working at the forefront of AI Safety, Security, and Governance. The fellowship aims to ensure that emerging AI systems reflect fairness, accountability, and collective well-being—with Black voices playing a central role in shaping that future.


Deadline: January 9, 2026
Apply Now

