Run LLMs locally | LM Studio Tutorial | Generative AI
Let me show you an easy way to run LLMs locally! This one is for my non-techie friends. No coding required.
Step 1: Download and install LM Studio (https://lmstudio.ai/)
Step 2: Select the model of your choice. I recommend giving Llama 3.2 1B a shot. It's small yet powerful.
Step 3: Chat, set, and go!
For my techie friends who have not yet used LM Studio, below are some cool things you can do with it. LM Studio supports most models in GGUF format.
1/ Built in RAG support (new in 0.3)
2/ Use models through the in-app Chat UI
3/ Download any compatible model files from Hu…
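One more thing for the techie crowd: LM Studio can also run a local server that speaks an OpenAI-compatible HTTP API, so you can call your downloaded model from code. Here is a minimal sketch using only the Python standard library. It assumes the server is running at the default `http://localhost:1234` address, and the model name `llama-3.2-1b-instruct` is a placeholder; use the identifier shown in your LM Studio app.

```python
import json
from urllib import request

# Assumed default address of LM Studio's local server; yours may differ.
BASE_URL = "http://localhost:1234/v1"

def build_chat_request(prompt, model="llama-3.2-1b-instruct"):
    """Build the URL and JSON body for an OpenAI-style chat completion.

    The model name here is a placeholder -- substitute the identifier
    LM Studio displays for the model you loaded.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }
    return f"{BASE_URL}/chat/completions", payload

def chat(prompt):
    """Send the prompt to the running LM Studio server and return the reply text."""
    url, payload = build_chat_request(prompt)
    req = request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

if __name__ == "__main__":
    # Requires LM Studio's local server to be running with a model loaded.
    print(chat("Say hello in five words."))
```

Because the API follows the OpenAI chat-completions shape, the official OpenAI Python client also works by pointing its base URL at the local server.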
DeepCamp AI