Long-term semantic memory for OpenClaw, backed by ChromaDB with local Ollama embeddings. Auto-recall injects relevant context into every turn. No cloud APIs required: fully self-hosted.
## Features

- **Auto-recall**: relevant memories are injected into the agent's context every turn
- **`chromadb_search` tool**: manual semantic search over your ChromaDB collection

## Requirements

1. ChromaDB running (Docker recommended):

   ```bash
   docker run -d --name chromadb -p 8100:8000 chromadb/chroma:latest
   ```

2. Ollama with an embedding model:

   ```bash
   ollama pull nomic-embed-text
   ```

3. Indexed documents in ChromaDB. Use any ChromaDB-compatible indexer to populate your collection.
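Indexers typically split documents into overlapping windows before embedding them, so that each stored memory stays small enough to recall usefully. A minimal sketch of that step; the 300-character window and 50-character overlap are arbitrary assumptions, not plugin defaults:

```typescript
// Split a document into overlapping character windows prior to embedding.
// size: window length in characters; overlap: characters shared between
// consecutive windows (both values are illustrative assumptions).
function chunkText(text: string, size = 300, overlap = 50): string[] {
  if (size <= overlap) throw new Error("size must exceed overlap");
  const chunks: string[] = [];
  for (let start = 0; start < text.length; start += size - overlap) {
    chunks.push(text.slice(start, start + size));
    if (start + size >= text.length) break; // last window reached the end
  }
  return chunks;
}
```

Each chunk would then be embedded via Ollama and added to the collection with a stable ID, so reindexing overwrites rather than duplicates entries.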
## Install

1. Copy the plugin extension:

   ```bash
   mkdir -p ~/.openclaw/extensions/chromadb-memory
   cp {baseDir}/scripts/index.ts ~/.openclaw/extensions/chromadb-memory/
   cp {baseDir}/scripts/openclaw.plugin.json ~/.openclaw/extensions/chromadb-memory/
   ```
2. Add to your OpenClaw config (`~/.openclaw/openclaw.json`):

   ```json
   {
     "plugins": {
       "entries": {
         "chromadb-memory": {
           "enabled": true,
           "config": {
             "chromaUrl": "http://localhost:8100",
             "collectionName": "longterm_memory",
             "ollamaUrl": "http://localhost:11434",
             "embeddingModel": "nomic-embed-text",
             "autoRecall": true,
             "autoRecallResults": 3,
             "minScore": 0.5
           }
         }
       }
     }
   }
   ```
3. Restart the gateway:

   ```bash
   openclaw gateway restart
   ```
## Configuration

| Option | Default | Description |
|---|---|---|
| `chromaUrl` | `http://localhost:8100` | ChromaDB server URL |
| `collectionName` | `longterm_memory` | Collection name (auto-resolves to its UUID, survives reindexing) |
| `collectionId` | — | Collection UUID (optional fallback) |
| `ollamaUrl` | `http://localhost:11434` | Ollama API URL |
| `embeddingModel` | `nomic-embed-text` | Ollama embedding model |
| `autoRecall` | `true` | Auto-inject relevant memories each turn |
| `autoRecallResults` | `3` | Max auto-recall results per turn |
| `minScore` | `0.5` | Minimum similarity score (0–1) |
Memories scoring at or above `minScore` are injected into the agent's context inside a `<chromadb-memories>` block. Auto-recall adds roughly 275 tokens per turn in the worst case (3 results × ~300 chars, plus the wrapper). Against a 200K+ context window, this is negligible.
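The injection step can be sketched as a pure function over scored results, following the behavior described above (`minScore` filter, result cap, `<chromadb-memories>` wrapper). The `Memory` shape and the chars-per-token divisor are illustrative assumptions, not the plugin's actual types:

```typescript
// Hypothetical shape of a scored recall result.
interface Memory {
  text: string;
  score: number; // similarity in [0, 1]; higher is more relevant
}

// ChromaDB queries return distances; for a cosine-distance index,
// similarity ≈ 1 - distance (an assumption about the index metric).
function distanceToScore(distance: number): number {
  return 1 - distance;
}

// Keep memories at or above minScore, best first, capped at maxResults,
// wrapped in the <chromadb-memories> block described above.
function buildMemoryBlock(memories: Memory[], minScore = 0.5, maxResults = 3): string {
  const kept = memories
    .filter((m) => m.score >= minScore)
    .sort((a, b) => b.score - a.score)
    .slice(0, maxResults);
  if (kept.length === 0) return ""; // nothing relevant: inject nothing
  const body = kept.map((m) => `- ${m.text}`).join("\n");
  return `<chromadb-memories>\n${body}\n</chromadb-memories>`;
}

// Rough token estimate: ~4 characters per token (a common heuristic).
function estimateTokens(block: string): number {
  return Math.ceil(block.length / 4);
}
```

With the defaults above, an empty or all-below-threshold result set injects nothing at all, so quiet turns cost zero context.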
## Tuning

- Too much irrelevant recall: raise `minScore` to 0.6 or 0.7
- Missing relevant memories: lower `minScore` to 0.4 and increase `autoRecallResults` to 5
- Manual control only: set `autoRecall: false` and use the `chromadb_search` tool

## How it works

```
User Message → Ollama (embed) → ChromaDB (query) → Context Injection
                                                          ↓
                                                   Agent Response
```
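The pipeline boils down to two HTTP calls per turn. The Ollama path below (`POST /api/embeddings` with `{ model, prompt }`) is its documented embeddings API; the ChromaDB path follows the older v1 REST query endpoint and may differ on recent server versions, so treat both URLs and response shapes as assumptions to verify. `fetch` is injectable so the flow can be exercised without live servers:

```typescript
// Minimal fetch-shaped interface so the pipeline can be tested with a stub.
type FetchLike = (
  url: string,
  init?: { method?: string; headers?: Record<string, string>; body?: string }
) => Promise<{ json(): Promise<any> }>;

// Embed the user message with Ollama (assumed endpoint: POST /api/embeddings).
async function embed(
  text: string,
  ollamaUrl: string,
  model: string,
  f: FetchLike = fetch as unknown as FetchLike
): Promise<number[]> {
  const res = await f(`${ollamaUrl}/api/embeddings`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, prompt: text }),
  });
  return (await res.json()).embedding;
}

// Query ChromaDB with that embedding (assumed v1 REST endpoint; verify
// against your server version). Returns the matched documents.
async function recall(
  embedding: number[],
  chromaUrl: string,
  collectionId: string,
  nResults: number,
  f: FetchLike = fetch as unknown as FetchLike
): Promise<string[]> {
  const res = await f(`${chromaUrl}/api/v1/collections/${collectionId}/query`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      query_embeddings: [embedding],
      n_results: nResults,
      include: ["documents", "distances"],
    }),
  });
  const data = await res.json();
  return data.documents?.[0] ?? [];
}
```

The recalled documents are then score-filtered and wrapped before being prepended to the agent's context; everything runs against localhost, so no request ever leaves the machine.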
No OpenAI. No cloud. Your memories stay on your hardware.