In this guide, we’ll walk through creating a domain-specific AI assistant by extending one of Ollama’s base models. We’ll customize Gromo’s assistant, named Harris, so it:
- Knows its own name
- Understands Gromo’s investment context
- Defaults to Indian Rupees (INR) when no currency is specified
1. Define Your Modelfile
First, create a `Modelfile` that builds on Llama 3.2, lowers creativity for financial precision, and sets up a system prompt. It uses three directives:

| Directive | Purpose | Example |
|---|---|---|
| FROM | Selects the base LLM | FROM llama3.2 |
| PARAMETER | Adjusts model settings (e.g., creativity, temperature) | PARAMETER temperature 0.3 |
| SYSTEM | Provides identity, role, and default behaviors | SYSTEM You are Harris… default currency is Indian Rupees (INR). |
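Putting the three directives together, a minimal `Modelfile` might look like the sketch below. The exact wording of the `SYSTEM` prompt is illustrative, not a fixed specification:

```modelfile
# Base model to extend
FROM llama3.2

# Lower temperature for more deterministic, fact-driven answers
PARAMETER temperature 0.3

# Identity, role, and default behaviors (wording is an example)
SYSTEM """
You are Harris, Gromo's investment assistant. You help users with
questions about investing. When no currency is specified, assume
all amounts are in Indian Rupees (INR).
"""
```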
Using `PARAMETER temperature 0.3` yields more accurate, fact-driven responses, which is crucial for financial applications.
2. Build the Custom Model
With your `Modelfile` ready, run:
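The build step can be sketched as follows, assuming the `Modelfile` sits in the current directory and you name the model `harris` (the name is an assumption; pick any identifier you like):

```shell
# Build the custom model from the Modelfile in the current directory
ollama create harris -f ./Modelfile
```

Afterwards, `ollama list` should show `harris` alongside your other local models.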

3. Verify Assistant Behavior
Run Harris to confirm it recognizes its name, understands Gromo’s investment context, and defaults to INR, all as defined in its `SYSTEM` prompt.
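A quick smoke test, again assuming the model was created under the name `harris`:

```shell
# Start an interactive session with the custom model
ollama run harris

# Or ask a one-off question non-interactively
ollama run harris "What is your name, and which currency do you use by default?"
```

The reply should mention Harris by name and quote amounts in Indian Rupees unless you specify another currency.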
4. Next Steps: Share Your Model
Once you’re satisfied, push Harris to a registry so colleagues can pull it:
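Publishing involves copying the model under your registry namespace and then pushing it; `your-username` below is a placeholder for your actual Ollama account name:

```shell
# Tag the local model under your registry namespace (placeholder username)
ollama cp harris your-username/harris

# Push the namespaced model to the registry
ollama push your-username/harris
```

Colleagues can then fetch it with `ollama pull your-username/harris`.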
Ensure your registry credentials are configured before pushing or pulling custom models to avoid authentication errors.