Connect Semantic Kernel to Open Source models via Ollama
In this video you will learn what #ollama is and its key capabilities. I also demonstrate how to run #llama3 locally on your computer via #ollama.
Lastly, I walk through how to connect #semantickernel to Llama 3.1 served by Ollama.
00:00 - Intro
00:20 - Ollama Overview
01:44 - Running Llama 3.1 locally via Ollama
02:44 - Connecting Semantic Kernel to Llama 3.1 via Ollama
04:22 - Recap of what we have learned in this video
#ollama #huggingface #llm #ai #aiagent #openai
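For reference, once Ollama is running (`ollama run llama3.1` pulls and serves the model), it exposes a local REST API on port 11434 that clients such as Semantic Kernel connect to. Below is a minimal stdlib-only Python sketch of that API, not the code from the video; the endpoint path `/api/chat`, the default port, and the helper names are assumptions based on Ollama's standard setup.

```python
import json
from urllib import request

# Ollama's local server listens on port 11434 by default.
OLLAMA_CHAT_URL = "http://localhost:11434/api/chat"

def build_chat_payload(model: str, user_message: str) -> dict:
    """Build the JSON body that Ollama's /api/chat endpoint expects."""
    return {
        "model": model,  # e.g. "llama3.1", as pulled via `ollama run llama3.1`
        "messages": [{"role": "user", "content": user_message}],
        "stream": False,  # ask for one complete response rather than a token stream
    }

def chat(model: str, user_message: str) -> str:
    """Send a chat request to the local Ollama server (requires Ollama to be running)."""
    body = json.dumps(build_chat_payload(model, user_message)).encode("utf-8")
    req = request.Request(
        OLLAMA_CHAT_URL,
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]

# Example (only works with Ollama running and llama3.1 pulled):
# print(chat("llama3.1", "Why is the sky blue?"))
```

Semantic Kernel's Ollama connector (or any OpenAI-compatible client pointed at the local server) wraps this same request/response cycle, which is why no cloud API key is needed.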
DeepCamp AI