How to run LLMs locally in under 2 minutes, no code. (Mistral, Llama 2)
In this video, I show you how to install and use any open-source LLM in under 2 minutes using Ollama.
🔗 Links
- Buy me a coffee: https://www.buymeacoffee.com/redaitoronto
- Ollama Download: ollama.ai
- Mistral AI: https://mistral.ai/
- Ollama Repo: https://ollama.ai/
- Follow me on twitter: https://twitter.com/Ali_isRed
- Join my AI email list: https://www.redai.ai
- My discord: https://discord.gg/self
⏱️ Timestamps
0:00 Demo
0:14 Mistral In Short
0:30 First step, go to Ollama.ai
0:40 Open downloaded Ollama App
0:48 Models supported by Ollama
1:17 Open terminal and type ollama run mistral
1:28 Testing my Local Mistral instance
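The terminal steps from the video can be sketched as a few Ollama CLI commands (a minimal sketch, assuming the Ollama app from ollama.ai is already installed and running; the example prompt is illustrative):

```shell
# Download the Mistral model weights (first run only; several GB)
ollama pull mistral

# Start an interactive chat session with the model in your terminal
ollama run mistral

# Or send a one-shot prompt without entering the interactive session
ollama run mistral "Explain what a local LLM is in one sentence."

# List the models you have downloaded locally
ollama list
```

The same `ollama run <model>` pattern works for the other models Ollama supports (e.g. `ollama run llama2`); if the model has not been pulled yet, `run` downloads it first.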