Stop Using Fixed-Size Chunking for RAG #rag #machinelearning #llm

Shane | LLM Implementation · Beginner · 🧠 Large Language Models · 1w ago
Fixed-size chunking cuts words in half, splits table rows, and breaks context at every chunk boundary. There's a better default.

📚 Full tutorial: https://www.youtube.com/watch?v=EbXlqjk8cZ4
This is from Module 2 of a 10-part RAG course.
📋 Playlist: https://www.youtube.com/playlist?list=PL0G6--HT7Yq_sxLFyWFWL6KHYWlosspj_
💬 Discord: https://discord.gg/KpnJQbgpjt
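A minimal sketch of the failure mode, assuming plain character-count chunking versus a simple word-boundary-aware splitter (the greedy splitter and the `chunk_size` value here are illustrative, not the exact method from the video):

```python
def fixed_size_chunks(text: str, size: int) -> list[str]:
    """Naive chunking: slice every `size` characters, mid-word or not."""
    return [text[i:i + size] for i in range(0, len(text), size)]


def boundary_aware_chunks(text: str, size: int) -> list[str]:
    """Greedy splitter: pack whole words up to `size` characters,
    so no chunk ever ends in the middle of a word."""
    chunks, current = [], ""
    for word in text.split():
        candidate = f"{current} {word}".strip()
        if len(candidate) <= size:
            current = candidate
        else:
            if current:
                chunks.append(current)
            current = word
    if current:
        chunks.append(current)
    return chunks


text = ("Retrieval quality depends on chunking. "
        "A chunk that ends mid-word embeds poorly.")

fixed = fixed_size_chunks(text, 37)
aware = boundary_aware_chunks(text, 37)

print(fixed)  # the last fixed chunk is the fragment "oorly."
print(aware)  # every chunk starts and ends on a word boundary
```

The fixed-size version leaves fragments like `"oorly."` in the index, which embed as noise; the boundary-aware version keeps every chunk a sequence of whole words at the cost of slightly uneven chunk lengths.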