778: Mixtral 8x22B: SOTA Open-Source LLM Capabilities at a Fraction of the Compute — with Jon Krohn
Super Data Science: ML & AI Podcast with Jon Krohn · Beginner · 🧠 Large Language Models · 6:31 · 1y ago
#Mixtral #Mistral #OpenSourceLLM — Mixtral 8x22B is the focus in this week's Five-Minute Friday as @JonKrohnLearns unpacks the ...
Watch on YouTube ↗