Better Data is All You Need — Ari Morcos, Datology
Our chat with Ari makes the case that *data curation is the most impactful and underinvested area in AI.* He argues that the prevailing focus on model architecture and compute scaling overlooks a corollary of the "bitter lesson": *"models are what they eat."* Effective data curation, a sophisticated process involving filtering, rebalancing, sequencing (curriculum), and synthetic data generation, enables training models that are simultaneously *faster, better, and smaller.* Ari recounts his journey from focusing on model-centric inductive biases to realizing that data quality is the primary lever for breaking the diminishing returns of naive scaling laws. Datology's mission is to automate this complex curation process, making state-of-the-art data curation accessible to any organization and enabling a new paradigm of AI development in which data efficiency, not just raw scale, drives progress.
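As a rough illustration of what filtering, rebalancing, and deduplication can look like in practice, here is a minimal toy sketch in Python. The scoring heuristic, thresholds, and `target_share` knob are hypothetical stand-ins for illustration, not Datology's actual pipeline.

```python
# Toy data-curation pass: exact deduplication, a quality filter, and
# per-source rebalancing. All heuristics here are hypothetical stand-ins.
import hashlib
import random
from collections import defaultdict

def dedupe(docs):
    """Drop exact duplicates by hashing normalized text."""
    seen, out = set(), []
    for d in docs:
        h = hashlib.sha256(d["text"].strip().lower().encode()).hexdigest()
        if h not in seen:
            seen.add(h)
            out.append(d)
    return out

def quality_score(doc):
    """Hypothetical heuristic: longer, less repetitive documents score higher."""
    words = doc["text"].split()
    if len(doc["text"]) < 200:
        return 0.0
    return len(set(words)) / max(len(words), 1)

def rebalance(docs, target_share, seed=0):
    """Downsample over-represented sources toward a target share per source."""
    random.seed(seed)
    by_source = defaultdict(list)
    for d in docs:
        by_source[d["source"]].append(d)
    total = len(docs)
    out = []
    for source, group in by_source.items():
        cap = int(target_share.get(source, 1.0) * total)
        out.extend(random.sample(group, min(len(group), cap)))
    return out

def curate(docs, min_quality=0.3, target_share=None):
    """Compose the three toy stages into one curation pass."""
    docs = dedupe(docs)
    docs = [d for d in docs if quality_score(d) >= min_quality]
    return rebalance(docs, target_share or {})
```

Real pipelines (fuzzy dedup, model-based quality classifiers, curriculum ordering) are far more involved, but the shape, namely score, filter, and reweight, is the same.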
*Timestamps*
00:00 Introduction
00:46 What is Datology? The mission to train models faster, better, and smaller through data curation.
01:59 Ari's background: From neuroscience to realizing the "Bitter Lesson" of AI.
05:30 Key Insight: Inductive biases from architecture become less important and even harmful as data scale increases.
08:08 Thesis: Data is the most underinvested area of AI research relative to its impact.
10:15 Why data work is culturally undervalued in research and industry.
12:19 How self-supervised learning changed everything, moving from a data-scarce to a data-abundant regime.
17:05 Why automated curation is superior to human-in-the-loop, citing the DCLM study.
19:22 The "Elephants vs. Dogs" analogy for managing data redundancy and complexity.
22:46 A brief history and commentary on key datasets (Common Crawl, GitHub, Books3).
26:24 Breaking naive scaling laws by improving data quality to maintain high marginal information gain (see the sketch after these timestamps).
29:07 Datology's demonstrated impact: Achieving baseline performance 12x faster.
34:19 The business of data: Datology's moat and its relation
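For readers who want the math behind the "breaking naive scaling laws" discussion at 26:24, here is a sketch. The power law below is the Chinchilla-style parametric form from Hoffmann et al. (2022); the multiplier k and the "curated" variant are a hypothetical toy model of what better data quality could do, not a formula from the episode.

```latex
% Chinchilla-style parametric scaling law (Hoffmann et al., 2022):
% expected loss as a function of parameter count N and training tokens D.
\[
  L(N, D) = E + \frac{A}{N^{\alpha}} + \frac{B}{D^{\beta}}
\]
% Hypothetical toy model of curation: each curated token carries more
% marginal information, acting as an effective multiplier k > 1 on D.
\[
  L_{\text{curated}}(N, D) \approx E + \frac{A}{N^{\alpha}} + \frac{B}{(kD)^{\beta}}
\]
% Under this toy model, the baseline loss is matched with roughly D / k
% tokens; a k on the order of 12 would correspond to the "12x faster"
% result mentioned at 29:07.
```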