How to add a local Ollama model to Hermes Agent as a selectable provider with correct context window, auth, and cron defaults.
Ollama serves its API at http://localhost:11434 by default; pull a model with `ollama pull <model>:<tag>`. The name `ollama` is a reserved built-in alias in Hermes (hermes_cli/auth.py line 960: `"ollama": "custom"`). Using `ollama` as a provider key in config.yaml causes provider resolution to fail silently, because `_get_named_custom_provider()` treats it as a built-in and skips the config lookup.
Always use a different key like local-ollama for the provider name.
Error you'll see if you use ollama as the key:
RuntimeError: No LLM provider configured. Run hermes model to select a provider...
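A minimal sketch of the failure mode, assuming the behavior described above (the alias table and lookup function here are simplified stand-ins, not Hermes's actual code):

```python
# Simplified sketch of the provider-resolution pitfall (NOT Hermes's real code).
# Built-in aliases mirror the "ollama": "custom" entry from hermes_cli/auth.py.
BUILTIN_ALIASES = {"ollama": "custom"}

def get_named_custom_provider(name: str, config: dict):
    # Built-in aliases short-circuit resolution, so a user-defined
    # "ollama" entry in config.yaml is silently ignored.
    if name in BUILTIN_ALIASES:
        return None
    return config.get("providers", {}).get(name)

config = {
    "providers": {
        "ollama": {"base_url": "http://localhost:11434"},        # never found
        "local-ollama": {"base_url": "http://localhost:11434"},  # found normally
    }
}

print(get_named_custom_provider("ollama", config))        # None -> "No LLM provider configured"
print(get_named_custom_provider("local-ollama", config))  # the config entry
```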
# Pull a tag with large context baked in
ollama pull gemma4:e4b-128k
# Or create a custom Modelfile
cat <<'EOF' > /tmp/Modelfile
FROM gemma4:e4b
PARAMETER num_ctx 131072
EOF
ollama create gemma4:e4b-128k -f /tmp/Modelfile
# Verify
ollama show gemma4:e4b-128k --modelfile | grep num_ctx
# Should show: PARAMETER num_ctx 131072
Add under providers: in ~/.hermes/config.yaml:
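A sketch of what that entry might look like. The field names below (type, base_url, api_key, model, context_window) are assumptions based on common custom-provider configs; check the Hermes docs for the exact schema:

```yaml
providers:
  local-ollama:                  # NOT "ollama" -- that key is a reserved built-in alias
    type: custom                 # field names are assumptions, not confirmed Hermes schema
    base_url: http://localhost:11434
    api_key: ollama              # Ollama ignores the key, but clients often require a non-empty value
    model: gemma4:e4b-128k
    context_window: 131072       # must match the num_ctx baked into the tag
```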