How to Run Stakpak with Local LLM Models Using Ollama

What is Stakpak?

Stakpak is an open-source agent that lives on your machines 24/7, keeps your apps running, and only pings you when it needs a human.

Install

curl -sSL https://stakpak.dev/install.sh | sh

or see the Install Stakpak guide.

Usage with Ollama

Quick setup

  1. Download the Ollama model you want to use.

  2. Open your terminal and run:

stakpak config

  3. Choose "Create new profile".

  4. Enter a profile name, for example:

ollama

  5. Choose "Use my own Model".

  6. Choose "Bring your own model".

  7. Enter the provider name.

  8. Enter the endpoint.

  9. Enter the API key.

  10. Add your model name for smart mode.

  11. Add your model name for eco mode.

  12. Press "y" to save the config.

  13. Press "y" to open Stakpak.
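As an illustration, the configuration prompts above might be answered like this for a local Ollama instance. The endpoint and key shown are assumptions based on Ollama's OpenAI-compatible API, which by default listens on http://localhost:11434/v1 and accepts any placeholder API key; the model names are only examples:

```
Provider name:     ollama
Endpoint:          http://localhost:11434/v1
API Key:           ollama          (Ollama ignores the key; any placeholder works)
Smart mode model:  qwen3-coder
Eco mode model:    gpt-oss:120b
```

Pick the smart/eco models to match whatever you actually pulled into your local Ollama store.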

NOTE: To open Stakpak with the Ollama profile, all you have to do is open your terminal and launch Stakpak with that profile selected.

Example Ollama models:

  • qwen3-coder

  • glm-4.7

  • gpt-oss:120b
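Each of these can be fetched with `ollama pull` before configuring Stakpak. A minimal sketch, assuming Ollama is installed and running locally:

```
# Download one of the models listed above into the local Ollama store
ollama pull qwen3-coder

# Confirm it is available
ollama list
```

The model name you pull here is the same name you enter for smart or eco mode during `stakpak config`.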

Cloud models are also available at ollama.com/search?c=cloud.
