Ollama (Offline)
Run Axiom entirely on-device with no API key using Ollama.
- Install Ollama from ollama.com
- Pull a model:

  ```shell
  ollama pull llama3.2
  ```

- Run `axiom-wiki init` and select Ollama (local)
Axiom connects to http://localhost:11434 by default and validates the connection during setup.
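If setup reports a connection failure, it can help to confirm that the Ollama server is actually answering before re-running init. Here is a minimal sketch using Ollama's `/api/tags` endpoint (which lists installed models); the helper name is illustrative and not part of Axiom:

```python
import urllib.request
import urllib.error

def ollama_reachable(base_url="http://localhost:11434", timeout=2):
    """Return True if an Ollama server answers at base_url.

    Probes /api/tags, which lists locally installed models and
    responds immediately when the server is up.
    """
    try:
        with urllib.request.urlopen(base_url + "/api/tags", timeout=timeout):
            return True
    except (urllib.error.URLError, OSError):
        return False
```

If this returns False, check that `ollama serve` is running and that nothing else is bound to port 11434.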
Docker + Ollama
```yaml
services:
  axiom-wiki:
    image: axiomwiki/axiom-wiki
    volumes:
      - ./wiki:/app/wiki
      - ./raw:/app/raw
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434/api
  ollama:
    image: ollama/ollama
    volumes:
      - ollama_data:/root/.ollama

volumes:
  ollama_data:
```
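If you also want to reach the containerized Ollama from the host (for example, to pull models with a local `ollama` CLI), you can additionally publish its port. A sketch of the extra lines for the `ollama` service, assuming the compose file above:

```yaml
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"   # expose Ollama's API to the host
    volumes:
      - ollama_data:/root/.ollama
```

With the port published, `ollama pull llama3.2` run on the host will populate the same `ollama_data` volume the containerized server uses.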
Model recommendations

For wiki use, models with strong instruction following and reliable JSON output work best:
- llama3.2 — good balance of quality and speed
- mistral — fast, good for smaller documents
- qwen2.5 — strong multilingual support
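To see why JSON output matters, here is a minimal sketch of a request that asks Ollama for strict JSON via the `format` field of its `/api/generate` endpoint. The helper name is an illustration, not Axiom's internal API:

```python
import json
import urllib.request

def build_generate_request(base_url, model, prompt):
    """Build a POST request asking Ollama for strict-JSON output."""
    payload = {
        "model": model,
        "prompt": prompt,
        "format": "json",  # constrains the model to emit valid JSON
        "stream": False,   # return one complete response object
    }
    return urllib.request.Request(
        base_url + "/api/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
```

Sending this request (for example with `urllib.request.urlopen`) returns a single response object whose `response` field is a JSON string, which is what makes structured wiki generation reliable with the models listed above.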