How To Implement Short Term Memory Using LangGraph
In this video, we continue the Agentic AI using LangGraph series by learning how to implement short-term memory in LLM-based systems. The video explains why LLMs are stateless, how short-term memory works using conversation history, and how to implement it in LangGraph using checkpointers and thread IDs. You’ll then learn why in-memory storage is not suitable for production and how to persist conversation state using PostgreSQL with Docker. Finally, the video covers the context window problem in LLMs and teaches practical techniques like trimming, deletion, and summarization to handle long conversations.
Watch on YouTube ↗
Chapters (16)
0:00 Introduction and video overview
1:38 Recap: LLMs are stateless and memory fundamentals
3:33 Short-term memory concept and checkpointers in LangGraph
4:46 Implementing short-term memory with threads and in-memory storage
9:22 Limitations of in-memory storage
11:16 Need for persistence in production systems
12:38 Setting up PostgreSQL using Docker
15:54 Implementing persistent memory with the PostgreSQL checkpointer
17:36 Verifying persistence after application restart
19:30 Context window and context overflow problem
21:38 Trimming strategy to control token limits
31:19 Limitations of trimming
32:03 Summarization concept for long conversations
35:24 Deletion and its role in summarization
40:12 Implementing summarization workflow in LangGraph
49:26 Testing summarization behavior in practice
DeepCamp AI