Local LLMs with llamafile

Coursera Course · Free to audit


Coursera · Beginner · 🧠 Large Language Models
In this 1-hour project-based course, you will learn to:

* Package open-source AI models into portable llamafile executables
* Deploy llamafiles locally across Windows, macOS, and Linux
* Monitor system metrics such as GPU usage while running models
* Query llamafile APIs with Python to process generated text
* Experience real-time inference through hands-on examples
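As a taste of the "query llamafile APIs with Python" step: llamafile bundles a local server that exposes an OpenAI-compatible chat-completions endpoint (by default at http://localhost:8080/v1). A minimal sketch using only the standard library might look like the following; the endpoint path and port are llamafile defaults, and the placeholder model name and helper function names are this sketch's own assumptions, not from the course.

```python
import json
import urllib.request

# llamafile's built-in server defaults to port 8080 with an
# OpenAI-compatible API; adjust if you started it differently.
BASE_URL = "http://localhost:8080/v1/chat/completions"


def build_payload(prompt, max_tokens=64):
    """Build an OpenAI-style chat-completion request body."""
    return {
        # llamafile serves a single model, so the name is a placeholder.
        "model": "LLaMA_CPP",
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }


def query_llamafile(prompt):
    """POST the prompt to the local llamafile server and return the text."""
    req = urllib.request.Request(
        BASE_URL,
        data=json.dumps(build_payload(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # The generated text sits in the first choice's message content.
    return body["choices"][0]["message"]["content"]


# Example usage (with a llamafile server already running locally):
#   print(query_llamafile("Say hello in one sentence."))
```

Because the API mirrors OpenAI's, the official `openai` Python client also works against it by pointing `base_url` at the local server.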
Watch on Coursera ↗