Asynchronous Python LLM APIs | FastAPI, Redis, AsyncIO

Code with Irtiza · Intermediate · 🧠 Large Language Models · 11mo ago
In this video we build an API that processes LLM responses asynchronously in the background using Python's AsyncIO library, taking advantage of thread pools and background workers.

🔴 More from me: https://irtizahafiz.com?utm_source=youtube
🟢 Join my mailing list: https://irtizahafiz.com/newsletter?utm_source=youtube

Chapters (7)

0:00 API Demos
6:22 Async POST API
10:44 Python AsyncIO Task Queues & Threads
15:50 Async LLM Processing
19:15 GET API to track task progress
21:15 Threads, Workers & Parallel Processing
26:50 Productionizing Using Message Brokers