Turn Your Old PC Into a Local AI Server (Ollama Ubuntu Setup)
Blog Link: https://selftuts.in/run-ai-models-locally-and-access-on-other-device/
In this video, we will learn how to turn an old Ubuntu machine into a Local AI Server using Ollama.
Instead of running heavy AI models on your daily laptop, you can run them on an older machine and access them from any device on your local Wi-Fi network.
This setup lets you run local LLMs such as Llama, Qwen, and others without paying for expensive cloud APIs.
In this tutorial we will cover:
• Running Ollama on an Ubuntu machine
• Exposing the Ollama server to the local network
• Accessing the AI serve…
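The network-exposure step above can be sketched as follows. This is a minimal example that assumes Ollama was installed with the official installer and runs as a systemd service; `OLLAMA_HOST=0.0.0.0` is Ollama's documented setting for listening on all interfaces, but verify the service name and editor workflow on your own machine:

```shell
# Create a systemd override so Ollama listens on all network
# interfaces instead of only 127.0.0.1 (run on the Ubuntu machine).
sudo systemctl edit ollama
# In the editor that opens, add:
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0"

# Apply the new unit configuration and restart the server.
sudo systemctl daemon-reload
sudo systemctl restart ollama

# Confirm the service is active and listening.
sudo systemctl status ollama --no-pager
```

Binding to 0.0.0.0 makes the server reachable from other devices on your LAN; on a trusted home network that is usually fine, but do not expose the port to the public internet.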
Chapters (20)
0:09 Introduction
0:23 Architecture Overview
0:56 Local AI Server Setup
1:24 Why Running AI on a Laptop Is a Problem
1:41 Best Architecture for Local AI
2:02 Prerequisites (Ubuntu + Ollama)
3:10 Why Local AI Is Important in 2026
3:54 SSH into Ubuntu Machine
4:06 Check Ollama Version
4:40 Enable Network Access
5:25 Configure Ollama Host
6:03 Save Configuration File
6:50 Reload Systemd
7:13 Restart Ollama Server
7:40 Check Server Status
8:03 Find Ubuntu IP Address
8:36 Access Ollama from Mac
9:18 Test Server Connection
9:33 Using Ollama API
Final Thoughts
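The client-side chapters (finding the Ubuntu IP, accessing the server from a Mac, and calling the API) can be sketched like this. The endpoints `/api/tags` and `/api/generate` and the default port 11434 come from Ollama's REST API; the address 192.168.1.50 and the model name llama3 are placeholders, so substitute your own values:

```shell
# On the Ubuntu machine: find its LAN IP address.
ip -4 addr show | grep inet

# From any other device on the same Wi-Fi network
# (192.168.1.50 is a placeholder -- use the IP found above):
curl http://192.168.1.50:11434/api/tags        # list installed models

# Send a prompt to a model (assumes "llama3" has been pulled).
curl http://192.168.1.50:11434/api/generate \
  -d '{"model": "llama3", "prompt": "Hello", "stream": false}'
```

If the `/api/tags` call returns JSON, the server is reachable and the network configuration worked.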
DeepCamp AI