DeepCamp

📄 Research Papers Explained

The latest AI papers, broken down: attention, RLHF, diffusion, MoE, and more.

8 lessons

Mixture of Experts (MoE), Visually Explained
Jia-Bin Huang · Advanced · 31:46 · 1mo ago

Mixture of Experts (MoE) Introduction
Vizuara · Beginner · 29:59 · 11mo ago

Reinforcement Learning from Human Feedback explained with math derivations and the PyTorch code.
Umar Jamil · Beginner · 2:15:13 · 2y ago

Mistral / Mixtral Explained: Sliding Window Attention, Sparse Mixture of Experts, Rolling Buffer
Umar Jamil · Beginner · 1:26:21 · 2y ago

Fine-Tune Mixtral 8x7B (Mistral's Mixture of Experts MoE) Model - Walkthrough Guide
Brev · Beginner · 23:12 · 2y ago

Mixtral 8x7B DESTROYS Other Models (MoE = AGI?)
Matthew Berman · Advanced · 20:50 · 2y ago

RLHF - Reinforcement Learning from Human Feedback
West Coast Machine Learning · Beginner · 56:30 · 2y ago

Research Paper Deep Dive - The Sparsely-Gated Mixture-of-Experts (MoE)
650 AI Lab · Advanced · 22:39 · 3y ago
