SUCHIR SALHAN

Cognitive AI Researcher & Founder of Per Capita Media


Suchir Salhan studies interpretable AI and small language models while leading Per Capita Media, translating research into actionable insights for policy and society.

Figure 1: Conceptual overview of research themes and cognitive modeling.

Research

My research explores how abstract linguistic structure emerges in compact neural systems under realistic developmental and multilingual constraints. I focus on interpretability, learning dynamics, and practical applications of small language models.

Core Claims

  • Small models can acquire structure under severe data constraints
  • Multilingual exposure shifts inductive biases, not just capacity
  • Evaluation should capture learning dynamics, not only end-state performance
PicoLM

A Small Language Model Pretraining Framework

*This work draws on BabyLM benchmarks and related developmental datasets.*

Per Capita Media

Per Capita Media is a national independent student newspaper and magazine. PCM is led by students at the University of Cambridge, with contributors from across the UK, including Oxford, London, Birmingham, and Bristol. I am the Editor of Per Capita Media and President of Cambridge University PCM.

What We Do

  • Original reporting, features, and cultural commentary on politics, society, technology, arts, and more
  • Platform for emerging writers and thinkers beyond traditional university media
  • Outreach and mentorship initiatives for secondary and sixth-form students

Our Mission

We explore issues that matter to young people and broader audiences through thoughtful, research-informed journalism and commentary, building a community of independent student voices. Learn more about our team and vision on LinkedIn.

*Founded August 2023. Contributions from students across the UK.*

Research Projects

  • Pretraining bilingual and multilingual small language models
  • Using IRT to evaluate in-context and developmental learning
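The IRT-based evaluation above can be illustrated with the standard two-parameter logistic (2PL) model, in which an item's probability of being answered correctly depends on the model's latent ability and the item's difficulty and discrimination. This is a minimal sketch for illustration only; the function name and parameter values are assumptions, not drawn from the projects listed.

```python
import math

def p_correct(theta: float, a: float, b: float) -> float:
    """2PL IRT item response function.

    theta: latent ability of the language model (e.g. at a training checkpoint)
    a:     item discrimination (how sharply the item separates abilities)
    b:     item difficulty (ability level at which P(correct) = 0.5)
    """
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# For a fixed ability, an easier item (lower b) yields a higher
# probability of a correct response than a harder one:
easy = p_correct(0.5, a=1.0, b=-1.0)
hard = p_correct(0.5, a=1.0, b=1.5)
```

Fitting `theta` per training checkpoint lets item-level difficulty stay fixed while ability is tracked over time, which is one way such curves can capture learning dynamics rather than only end-state scores.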

Research Focus

Affiliation

University of Cambridge

Research Areas

Multilingual NLP · Language Acquisition · Cognitive Modeling

Methods

Pretraining · Evaluation · IRT · Developmental Benchmarks

Tools

PyTorch · Hugging Face · BabyLM · Weights & Biases

Download full CV (PDF)

Selected Writing

Contact
