Local persistent memory for OpenClaw agents. Captures conversations, extracts structured facts via LLM, and auto-recalls relevant knowledge before each turn. Privacy-first: all stored data stays local in SQLite.
Memento gives your agents long-term memory. It captures conversations, extracts structured facts using an LLM, and auto-injects relevant knowledge before each AI turn.
All stored data stays on your machine — no cloud sync, no subscriptions. Extraction uses your configured LLM provider; use a local model (Ollama) for fully air-gapped operation.
⚠️ Privacy note: When `autoExtract` is enabled, conversation segments are sent to your configured LLM provider for fact extraction. If you use a cloud provider (Anthropic, OpenAI, Mistral), that text leaves your machine. For fully local operation, set `extractionModel` to `ollama/<model>` and keep Ollama running locally.
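Following that note, a fully local setup might look like this. This is a sketch: the Ollama model name is illustrative, while the keys come from the configuration reference in this README.

```json
{
  "extractionModel": "ollama/llama3.1",
  "extraction": {
    "autoExtract": true
  }
}
```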
Facts are classified as `shared`, `private`, or `secret` based on content, with hard overrides for sensitive categories (medical, financial, credentials).

Install the plugin, restart your gateway, and Memento starts capturing automatically. Extraction is off by default — enable it explicitly when ready.
Download a local embedding model for richer recall:
```shell
mkdir -p ~/.node-llama-cpp/models
curl -L -o ~/.node-llama-cpp/models/bge-m3-Q8_0.gguf \
  "https://huggingface.co/gpustack/bge-m3-GGUF/resolve/main/bge-m3-Q8_0.gguf"
```
All environment variables are optional — you only need the one matching your chosen LLM provider:
| Variable | When Needed |
|---|---|
| `ANTHROPIC_API_KEY` | Using `anthropic/*` models for extraction |
| `OPENAI_API_KEY` | Using `openai/*` models for extraction |
| `MISTRAL_API_KEY` | Using `mistral/*` models for extraction |
| `MEMENTO_API_KEY` | Generic fallback for any provider |
| `MEMENTO_WORKSPACE_MAIN` | Migration only: path to agent workspace for bootstrapping |
No API key is needed for `ollama/*` models (local inference).
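The table above maps one-to-one onto the model's provider prefix. A hypothetical helper makes the rule explicit (illustrative only, not part of Memento's code):

```typescript
// Illustrative helper (not Memento's API): which env var a given
// extractionModel needs, based on its provider prefix.
function requiredApiKeyVar(model: string): string | null {
  if (model.startsWith("ollama/")) return null;            // local inference, no key
  if (model.startsWith("anthropic/")) return "ANTHROPIC_API_KEY";
  if (model.startsWith("openai/")) return "OPENAI_API_KEY";
  if (model.startsWith("mistral/")) return "MISTRAL_API_KEY";
  return "MEMENTO_API_KEY";                                // generic fallback
}
```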
Add to your `openclaw.json` under `plugins.entries.memento.config`:

```json
{
  "memento": {
    "autoCapture": true,
    "extractionModel": "anthropic/claude-sonnet-4-6",
    "extraction": {
      "autoExtract": true,
      "minTurnsForExtraction": 3
    },
    "recall": {
      "autoRecall": true,
      "maxFacts": 20,
      "crossAgentRecall": true,
      "autoQueryPlanning": false
    }
  }
}
```
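For orientation, the snippet sits inside the plugin entry. The nesting below is a sketch inferred from the `plugins.entries.memento.config` path mentioned above, not a verbatim excerpt of a real `openclaw.json`:

```json
{
  "plugins": {
    "entries": {
      "memento": {
        "config": {
          "autoCapture": true,
          "extractionModel": "anthropic/claude-sonnet-4-6"
        }
      }
    }
  }
}
```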
`autoExtract: true` is an explicit opt-in (default: `false`). When enabled, conversation segments are sent to the configured `extractionModel` for LLM-based fact extraction. Omit or set to `false` to keep everything local.
`autoQueryPlanning: true` is an explicit opt-in (default: `false`). When enabled, a fast LLM call runs before each recall search to expand the query with synonyms and identify relevant categories — improving precision at the cost of one extra LLM call per turn.
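As a mental model of the recall ranking described elsewhere in this README (multi-factor scoring of recency × frequency × category weight, with a 1.5× boost for facts reached through causal edges), here is a hypothetical sketch. None of these names, fields, or constants are Memento's actual API; the 30-day half-life is an assumed parameter.

```typescript
// Hypothetical sketch of multi-factor recall scoring; names are illustrative.
interface Fact {
  text: string;
  lastSeen: number;       // ms epoch of the most recent mention
  frequency: number;      // how many times the fact was observed
  categoryWeight: number; // e.g. preferences weighted above trivia
  viaCausalEdge: boolean; // reached through a causal graph edge
}

const HALF_LIFE_MS = 30 * 24 * 60 * 60 * 1000; // assumed 30-day recency half-life

function scoreFact(f: Fact, now: number): number {
  const age = now - f.lastSeen;
  const recency = Math.pow(0.5, age / HALF_LIFE_MS); // exponential decay
  const frequency = Math.log1p(f.frequency);         // diminishing returns
  let score = recency * frequency * f.categoryWeight;
  if (f.viaCausalEdge) score *= 1.5;                 // causal-edge boost from the docs
  return score;
}

function topFacts(facts: Fact[], maxFacts: number, now = Date.now()): Fact[] {
  return [...facts]
    .sort((a, b) => scoreFact(b, now) - scoreFact(a, now))
    .slice(0, maxFacts);
}
```

The multiplicative form means a fact must be at least somewhat recent *and* somewhat frequent to rank well; a cap like `maxFacts: 20` then bounds how much context is injected per turn.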
Memento stores all data locally:
| Path | Contents |
|---|---|
| `~/.engram/conversations.sqlite` | Main database: conversations, facts, embeddings |
| `~/.engram/segments/*.jsonl` | Human-readable conversation backups |
| `~/.engram/migration-config.json` | Optional: migration workspace paths (only for bootstrapping) |
| Feature | Data leaves machine? | Details |
|---|---|---|
| `autoCapture` (default: `true`) | ❌ No | Writes to local SQLite + JSONL only |
| `autoExtract` (default: `false`) | ⚠️ Yes, if cloud LLM | Sends conversation text to configured provider. Use `ollama/*` for local. |
| `autoRecall` (default: `true`) | ❌ No | Reads from local SQLite only |
| Secret facts | ❌ Never | Filtered from extraction context — never sent to any LLM |
| Migration | ❌ No | Reads local workspace files, writes to local SQLite |
Migration is an optional, one-time process to seed Memento from existing agent memory/markdown files. It is user-initiated only — never runs automatically.
Migration reads only the files you explicitly list in the config. It does not scan your filesystem, read arbitrary files, or access anything outside the configured paths.
Create `~/.engram/migration-config.json` (or set `MEMENTO_WORKSPACE_MAIN`):

```json
{
  "agents": [
    {
      "agentId": "main",
      "workspace": "/path/to/your-workspace",
      "paths": ["MEMORY.md", "memory/*.md"]
    }
  ]
}
```
```shell
npx tsx src/extraction/migrate.ts --all --dry-run
```
The dry-run prints every file path it would read — review this before proceeding.
```shell
npx tsx src/extraction/migrate.ts --all
```
- Capture: hooks `message:received` + `message:sent`, buffers multi-turn segments
- Extraction: LLM-based facts with temporal tracking (`previous_value`), and knowledge graph relations (including causal edges with `causal_weight`)
- Storage: SQLite (`fact_relations` with `causal_weight`), multi-layer clusters, and temporal transition tracking (`previous_value`)
- Recall: optional query planning (`autoQueryPlanning`), multi-factor scoring (recency × frequency × category weight), 1-hop graph traversal with causal edge 1.5× boost, injected via `before_prompt_build` hook

```shell
# From ClawHub
clawhub install memento

# Or for local development
git clone https://github.com/braibaud/Memento
cd Memento
npm install
```
Note: `better-sqlite3` includes native bindings that compile during `npm install`. This is expected behavior for SQLite access.