How to train a Million Context LLM — with Mark Huang of Gradient.ai

Latent Space · Advanced · 🧠 Large Language Models · 1y ago

Scaling Llama 3 beyond a 1M-token context window with near-perfect utilization, the difference between ALiBi and RoPE, how to use GPT-4 to create synthetic data for your context-extension finetunes, and more! Full writeup: https://www.latent.space/p/gradient
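The episode covers raising RoPE's theta base to extend context. A minimal sketch of the idea: RoPE rotates each pair of embedding channels at a frequency derived from theta, and increasing theta slows the rotations so positions far beyond the original window stay distinguishable. The head dimension (128) and base theta (500,000) match Llama 3's published config; the 8x bump to 4,000,000 is an illustrative choice, not Gradient's exact recipe.

```python
def rope_inv_freq(dim: int, theta: float) -> list[float]:
    """Inverse rotation frequencies for each channel pair in RoPE.

    Channel pair i rotates by position * theta^(-2i/dim) radians,
    so larger theta means slower rotation in the low-frequency pairs.
    """
    return [theta ** (-2 * i / dim) for i in range(dim // 2)]

# Llama 3 ships with rope_theta = 500_000; context-extension finetunes
# raise it (8x here, an illustrative value) before continued pretraining
# on long sequences.
base = rope_inv_freq(128, 500_000.0)
extended = rope_inv_freq(128, 4_000_000.0)

# The slowest channel pair now rotates ~7.7x slower (close to the 8x
# theta ratio), stretching the range of positions it can disambiguate.
ratio = base[-1] / extended[-1]
```

The first channel pair (i = 0) always has frequency 1.0 regardless of theta; the theta bump mostly stretches the low-frequency tail, which is what encodes long-range position.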

Playlist

Uploads from Latent Space · 30 of 60

1. Ep 18: Petaflops to the People — with George Hotz of tinycorp
2. FlashAttention-2: Making Transformers 800% faster AND exact
3. RWKV: Reinventing RNNs for the Transformer Era
4. Generating your AI Media Empire - with Youssef Rizk of Wondercraft.ai
5. RAG is a hack - with Jerry Liu of LlamaIndex
6. The End of Finetuning — with Jeremy Howard of Fast.ai
7. Why AI Agents Don't Work (yet) - with Kanjun Qiu of Imbue
8. Powering your Copilot for Data - with Artem Keydunov from Cube.dev
9. Beating GPT-4 with Open Source Models - with Michael Royzen of Phind
10. The State of Silicon and the GPU Poors - with Dylan Patel of SemiAnalysis
11. The "Normsky" architecture for AI coding agents — with Beyang Liu + Steve Yegge of SourceGraph
12. The AI-First Graphics Editor - with Suhail Doshi of Playground AI
13. The Accidental AI Canvas - with Steve Ruiz of tldraw
14. The Origin and Future of RLHF: the secret ingredient for ChatGPT - with Nathan Lambert
15. The Four Wars of the AI Stack - Dec 2023 Recap
16. The State of AI in production — with David Hsu of Retool
17. Building an open AI company - with Ce and Vipul of Together AI
18. Truly Serverless Infra for AI Engineers - with Erik Bernhardsson of Modal
19. A Brief History of the Open Source AI Hacker - with Ben Firshman of Replicate
20. Open Source AI is AI we can Trust — with Soumith Chintala of Meta AI
21. Making Transformers Sing - with Mikey Shulman of Suno
22. A Comprehensive Overview of Large Language Models - Latent Space Paper Club
23. Why Google failed to make GPT-3 -- with David Luan of Adept
24. Personal AI Meetup - Bee, BasedHardware, LangChain LangFriend, Deepgram EmilyAI
25. Supervise the Process of AI Research — with Jungwon Byun and Andreas Stuhlmüller of Elicit
26. Breaking down the OG GPT Paper by Alec Radford
27. High Agency Pydantic over VC Backed Frameworks — with Jason Liu of Instructor
28. This World Does Not Exist — Joscha Bach, Karan Malhotra, Rob Haisfield (WorldSim, WebSim, Liquid AI)
29. LLM Asia Paper Club Survey Round
30. How to train a Million Context LLM — with Mark Huang of Gradient.ai (this episode)
31. How AI is Eating Finance - with Mike Conover of Brightwave
32. How To Hire AI Engineers (ft. James Brady and Adam Wiggins of Elicit)
33. State of the Art: Training 70B LLMs on 10,000 H100 clusters
34. The 10,000x Yolo Researcher Metagame — with Yi Tay of Reka
35. Training Llama 2, 3 & 4: The Path to Open Source AGI — with Thomas Scialom of Meta AI
36. [LLM Paper Club] Llama 3.1 Paper: The Llama Family of Models
37. Synthetic data + tool use for LLM improvements 🦙
38. RLHF vs SFT to break out of local maxima 📈
39. The Winds of AI Winter (Q2 Four Wars of the AI Stack Recap)
40. Segment Anything 2: Memory + Vision = Object Permanence — with Nikhila Ravi and Joseph Nelson
41. Answer.ai & AI Magic with Jeremy Howard
42. Is finetuning GPT4o worth it?
43. Personal benchmarks vs HumanEval - with Nicholas Carlini of DeepMind
44. Building AGI with OpenAI's Structured Outputs API
45. Q* for model distillation 🍓
46. Finetuning LoRAs on BILLIONS of tokens 🤖
47. Cursor UX team is CRACKED 💻
48. Choosing the BEST OpenAI model 🏆
49. How will OpenAI voice mode change API design?
50. STEALING OpenAI models data 🥷
51. [Paper Club] 🍓 On Reasoning: Q-STaR and Friends!
52. [Paper Club] Writing in the Margins: Chunked Prefill KV Caching for Long Context Retrieval
53. The Ultimate Guide to Prompting - with Sander Schulhoff from LearnPrompting.org
54. llm.c's Origin and the Future of LLM Compilers - Andrej Karpathy at CUDA MODE
55. Prompt Engineer is NOT a job 📝
56. Prompt Mining LLMs for better prompts ⛏️
57. The six pillars of few-shot prompting 🔧
58. Language Agents: From Reasoning to Acting — with Shunyu Yao of OpenAI, Harrison Chase of LangGraph
59. [Paper Club] Who Validates the Validators? Aligning LLM-Judges with Humans (w/ Eugene Yan)
60. Can you separate intelligence and knowledge?


Chapters (23)

0:00 Introductions
1:30 Founding story of Gradient and its mission
4:35 "Minimum viable agents"
9:19 Differentiating ML and AI, focusing on out-of-domain generalization
10:12 Extending Llama 3 to 1M tokens
14:32 Technical challenges with long context sequences
17:45 Data quality and the importance of diverse datasets
19:45 What's a theta value?
22:42 RoPE vs Ring Attention vs ALiBi vs YaRN
25:06 Why RingAttention matters
28:01 How to refine datasets for context extension
33:34 Multi-stage training data and avoiding overfitting to recent data
34:27 The potential of using synthetic data in training
38:22 Applying LoRA adapters to extend model capabilities
42:25 Benchmarking long context models and evaluating their performance
47:20 Pushing to 4M context and output quality degradation
50:08 What do you need this context for?
52:57 Impact of long context in chat vs docs summarization
56:25 Future directions for long context models and multimodality
59:38 How do you know what research matters?
1:02:47 Routine for staying updated with AI research and industry news
1:05:33 Deciding which AI developments to invest time in
1:10:37 Request for collaboration and dataset construction for long context