Mixed Precision Training | Explanation and PyTorch Implementation from Scratch

ExplainingAI · Beginner · 📄 Research Papers Explained · 4mo ago
In this video, we break down mixed precision training. You'll learn why FP16, BF16, and FP32 matter, what we gain (and lose) when we switch precision, and how mixed precision training lets us train AI models faster and with fewer resources without sacrificing accuracy. We start by understanding floating point formats (specifically FP32) and what precision means, then transition to the lower precision formats FP16 and BF16. We then explore the real benefits of lower precision, implement mixed precision from scratch, and finally switch to PyTorch's built-in AMP for training our deep learning model.
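The precision trade-off described above can be seen without any deep learning framework: Python's `struct` module supports the IEEE-754 half-precision (`'e'`) format, so we can round-trip a value through FP16 and inspect what gets lost. This is a minimal sketch for illustration, not code from the video:

```python
import struct

# Bit layouts: FP32 = 1 sign + 8 exponent + 23 mantissa bits;
#              FP16 = 1 + 5 + 10;  BF16 = 1 + 8 + 7 (FP32's range, less precision).

def to_fp16_and_back(x: float) -> float:
    """Pack x as IEEE-754 half precision ('e'), then unpack to a Python float."""
    return struct.unpack("e", struct.pack("e", x))[0]

value = 0.1
print(f"original: {value:.10f}")                   # 0.1000000000
print(f"via FP16: {to_fp16_and_back(value):.10f}")  # 0.0999755859
```

With only 10 mantissa bits, FP16 keeps roughly 3 decimal digits of precision, and its largest finite value is 65504, which is why large activations and gradients need care.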
Watch on YouTube ↗

Chapters (7)

0:00 Why care about Mixed Precision?
1:19 What is Precision? (FP32 vs FP16 vs BF16 Explained)
10:55 Why Lower Precision Helps
14:01 Mixed Precision Training From Scratch (Step-by-Step)
25:10 Loss Scaling
29:30 Mixed Precision Training in PyTorch (autocast + GradScaler)
31:50 Summary
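The loss-scaling idea from the middle chapters can be sketched in plain Python. This is a hedged illustration, not the video's code; the `fp16` helper simulates a half-precision cast via `struct`:

```python
import struct

def fp16(x: float) -> float:
    """Simulate an FP16 cast: pack as IEEE-754 half precision, unpack to float."""
    return struct.unpack("e", struct.pack("e", x))[0]

grad = 1e-8              # a tiny gradient, below FP16's smallest subnormal (~6e-8)
print(fp16(grad))        # 0.0 -- in pure FP16 this update vanishes (underflow)

# Loss scaling: multiply the loss by a large constant before backprop so the
# resulting gradients land in FP16's representable range, then divide the
# scale back out in full precision before the optimizer step.
scale = 2.0 ** 10
scaled_grad = fp16(grad * scale)   # now representable in FP16
restored = scaled_grad / scale     # unscale in FP32
print(restored)                    # close to the original 1e-8
```

This manual pattern is what PyTorch's `torch.amp` automates: `autocast` picks per-op precision, while `GradScaler` scales the loss, unscales gradients before `optimizer.step()`, and skips steps whose gradients overflowed.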