Distilling LLMs with Datawizz and Fireworks AI

Datawizz · Beginner · 🧠 Large Language Models · 11mo ago
Use Datawizz to distill small, efficient SLMs and deploy them to the Fireworks AI platform for fast, efficient inference. This quick tutorial covers:

- Connecting Datawizz as a proxy to collect LLM logs
- Fine-tuning a Llama 3.2 model with these logs
- Deploying the new model to a dedicated server on Fireworks AI
- Smartly routing traffic to the new model

Read more here: https://docs.datawizz.ai/models/model-deployment#fireworks-ai
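The last step, routing traffic between the original model and the distilled SLM, can be sketched as a simple fraction-based (canary-style) router. This is only an illustrative sketch, not the actual Datawizz routing logic; both model identifiers below are placeholders:

```python
import random

# Placeholder model IDs -- not actual Datawizz or Fireworks identifiers.
BASE_MODEL = "accounts/fireworks/models/llama-v3p2-3b-instruct"
DISTILLED_MODEL = "accounts/my-team/models/my-distilled-slm"

def route_model(slm_fraction: float = 0.2, rng=random.random) -> str:
    """Send `slm_fraction` of requests to the distilled SLM and the
    rest to the base model, enabling a gradual rollout."""
    return DISTILLED_MODEL if rng() < slm_fraction else BASE_MODEL
```

In practice the routing fraction would be raised as the distilled model's evaluations catch up to the base model's.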