PyData MCR talk on training LLMs

My talk on training LLMs at PyData MCR

September 25, 2025 · 1 min · 145 words

Distributed communication for GPUs (part 2)

Introduction to collective communication operations used for distributed training.

September 13, 2025 · 13 min · 2567 words

Distributed communication for GPUs (part 1)

Introduction to distributed communication for GPUs.

September 9, 2025 · 11 min · 2146 words

Choosing a batch size and provider for LLM training

Notes on choosing an appropriate batch size and compute provider for training LLMs.

June 27, 2025 · 4 min · 756 words

Ultra-Scale Playbook - ZeRO Sharding

Notes on training LLMs using ZeRO sharding strategies.

June 21, 2025 · 8 min · 1518 words