Running Local LLMs With Ollama

Prerequisites

In this lesson, you'll learn how to use Ollama to run large language models (LLMs) locally and incorporate them into AI-powered applications. Along the way, we’ll introduce key concepts in AI so you can see how each piece fits together.
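
As a quick preview of what this lesson covers, working with a local model typically starts with Ollama's command-line interface. The sketch below assumes Ollama is already installed; the model name llama3 is only an illustration, and you can substitute any model available in the Ollama library.

# Download a model to your machine (run once per model)
ollama pull llama3

# Start an interactive chat session with the model in your terminal
ollama run llama3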

Required Skills

| Skill | Description |
| --- | --- |
| AI Chatbot Familiarity | Having used tools like ChatGPT, Claude, or similar will help you follow along. |
| Shell Commands | Basic knowledge of commands such as curl to send HTTP requests from your terminal (see the example after this table). |
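
To give a concrete sense of that second prerequisite, the sketch below shows a single curl request sent to Ollama's local HTTP API, which listens on port 11434 by default. This assumes Ollama is running and that a model named llama3 has already been pulled; swap in whichever model you actually have.

# Ask a locally running model a question via Ollama's HTTP API
curl http://localhost:11434/api/generate -d '{
  "model": "llama3",
  "prompt": "Why is the sky blue?",
  "stream": false
}'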

Note

Prior programming experience is not required for this lesson. However, if you know a programming language, it may speed up your progress when we start building AI applications in later modules.

Diagram: an AI application interface (with chatbot and media icons) on the left connects to a curl command on the right.
