Understanding Continual Pretraining: What It Is and How It Works

AppliedAI · Intermediate · 🧠 Large Language Models · 1y ago
In this video, we explore continual pretraining, a specialized approach to enhancing large language models. Learn what it is, how it differs from fine-tuning, and when it's necessary for solving domain-specific challenges. We'll cover:

1️⃣ Typical Training Workflow – From pretraining to fine-tuning, understand the standard process for building large language models.
2️⃣ What is Continual Pretraining? – Discover how continual pretraining extends the capabilities of general-purpose models by focusing on specific domains or languages.
3️⃣ When to Use It – Explore scenarios like domain mismatch, …
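The core idea described above is that continual pretraining reuses the original pretraining objective (next-token prediction) on a new domain corpus, rather than adding a task-specific head as fine-tuning does. As a minimal illustration only, here is a toy bigram language model (not a real LLM; all corpora and names are invented for the example) that is first "pretrained" on general text and then continues training on domain text, shifting probability mass toward domain vocabulary:

```python
from collections import Counter, defaultdict

class BigramLM:
    """Toy bigram model: same next-token counting objective is used for
    both the initial 'pretraining' pass and the continual pretraining pass."""

    def __init__(self):
        # counts[prev][next] = how often `next` followed `prev`
        self.counts = defaultdict(Counter)

    def train(self, corpus):
        # Identical objective in both stages: predict the next token.
        for sentence in corpus:
            tokens = sentence.lower().split()
            for prev, nxt in zip(tokens, tokens[1:]):
                self.counts[prev][nxt] += 1

    def prob(self, prev, nxt):
        total = sum(self.counts[prev].values())
        return self.counts[prev][nxt] / total if total else 0.0

# Stage 1: "pretraining" on a small general-purpose corpus.
general = ["the cat sat on the mat", "the dog ran in the park"]
# Stage 2: continual pretraining on a domain corpus (here: legal-style text).
domain = ["the court ruled on the motion", "the court denied the appeal"]

model = BigramLM()
model.train(general)
p_before = model.prob("the", "court")  # 0.0 — domain term unseen so far
model.train(domain)                    # continue the same objective on domain data
p_after = model.prob("the", "court")   # now positive: domain mismatch reduced
```

In a real setting the same pattern holds at scale: you resume gradient-based training of the full model on domain documents, typically mixing in some general data to limit catastrophic forgetting.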
Watch on YouTube ↗