Greg Brockman on OpenAI's Road to AGI
Greg Brockman, co-founder and president of OpenAI, joins us to talk about GPT-5 and GPT-OSS, the future of software engineering, why reinforcement learning is still scaling, and how OpenAI is planning to get to AGI.
https://latent.space/
https://kernellabs.ai/
Chapters (25)
0:00 Introductions
1:04 The Evolution of Reasoning at OpenAI
4:01 Online vs Offline Learning in Language Models
6:44 Sample Efficiency and Human Curation in Reinforcement Learning
8:16 Scaling Compute and Supercritical Learning
13:21 Wall clock time limitations in RL and real-world interactions
16:34 Experience with ARC Institute and DNA neural networks
19:33 Defining the GPT-5 Era
22:46 Evaluating Model Intelligence and Task Difficulty
25:06 Practical Advice for Developers Using GPT-5
31:48 Model Specs
37:21 Challenges in RL Preferences (e.g., try/catch)
39:13 Model Routing and Hybrid Architectures in GPT-5
43:58 GPT-5 pricing and compute efficiency improvements
46:04 Self-Improving Coding Agents and Tool Usage
49:11 On-Device Models and Local vs Remote Agent Systems
51:34 Engineering at OpenAI and Leveraging LLMs
54:16 Structuring Codebases and Teams for AI Optimization
55:27 The Value of Engineers in the Age of AGI
58:42 Current state of AI research and lab diversity
1:01:11 OpenAI's Prioritization and Focus Areas
1:03:05 Advice for Founders: It's Not Too Late
1:04:20 Future outlook and closing thoughts
1:04:33 Time Capsule to 2045: Future of Compute and Abundance
1:07:07 Time Capsule to 2005: More Problems Will Emerge