Deploy Memoria with Docker Compose or Kubernetes. Environment variables, multi-instance setup, security. Use when deploying or configuring Memoria.
```bash
cd Memoria
cp .env.example .env   # set MEMORIA_MASTER_KEY, MEMORIA_EMBEDDING_API_KEY
docker compose up -d
```
Services: API on `:8100`, MatrixOne on `:6001`. Verify with `curl http://localhost:8100/health`.
Required:

| Variable | Description |
|---|---|
| MEMORIA_MASTER_KEY | Admin API key (min 16 chars) |
Database connection:

| Variable | Default | Description |
|---|---|---|
| MEMORIA_DB_HOST | matrixone | MatrixOne host |
| MEMORIA_DB_PORT | 6001 | MatrixOne port |
| MEMORIA_DB_USER | root | Database user |
| MEMORIA_DB_PASSWORD | 111 | Database password |
| MEMORIA_DB_NAME | memoria | Database name |
Embedding:

| Variable | Default | Description |
|---|---|---|
| MEMORIA_EMBEDDING_PROVIDER | local | `local` or `openai` |
| MEMORIA_EMBEDDING_MODEL | all-MiniLM-L6-v2 | Model name |
| MEMORIA_EMBEDDING_API_KEY | — | Required if provider is `openai` (single-backend) |
| MEMORIA_EMBEDDING_BASE_URL | — | Custom OpenAI-compatible endpoint (single-backend) |
| MEMORIA_EMBEDDING_ENDPOINTS | — | JSON array for multi-backend round-robin; when set, supersedes BASE_URL/API_KEY. Format: `[{"url":"https://api1.example.com/v1","api_key":"sk-1"},{"url":"https://api2.example.com/v1","api_key":"sk-2"}]`. All endpoints must serve the same model; requests rotate round-robin, and failed or rate-limited endpoints are skipped automatically. |
| MEMORIA_EMBEDDING_DIM | 0 (auto) | Embedding dimension |
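For example, a multi-backend `.env` setup might look like the following (the URLs, keys, and model name are placeholders, not values from this project):

```env
# Two OpenAI-compatible backends serving the same embedding model.
# With MEMORIA_EMBEDDING_ENDPOINTS set, BASE_URL/API_KEY are ignored.
MEMORIA_EMBEDDING_PROVIDER=openai
MEMORIA_EMBEDDING_MODEL=text-embedding-3-small
MEMORIA_EMBEDDING_ENDPOINTS=[{"url":"https://api1.example.com/v1","api_key":"sk-1"},{"url":"https://api2.example.com/v1","api_key":"sk-2"}]
```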
Multi-instance coordination:

| Variable | Default | Description |
|---|---|---|
| MEMORIA_INSTANCE_ID | random UUID | Unique instance ID; set to the Pod name in Kubernetes |
| MEMORIA_LOCK_TTL_SECS | 120 | Distributed lock TTL; the heartbeat renews every TTL/3 |
Governance:

| Variable | Default | Description |
|---|---|---|
| MEMORIA_GOVERNANCE_ENABLED | false | Enable the background governance scheduler |
| MEMORIA_GOVERNANCE_PLUGIN_BINDING | default | Repository binding name |
| MEMORIA_GOVERNANCE_PLUGIN_SUBJECT | system | Subject key for binding resolution |
| MEMORIA_GOVERNANCE_PLUGIN_DIR | — | Local plugin directory (dev mode, skips signature checks) |
Security:

| Variable | Default | Description |
|---|---|---|
| MEMORIA_API_KEY_SECRET | — | HMAC secret for API key hashing; set independently of MASTER_KEY |
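One way to generate a high-entropy secret (a sketch; any sufficiently random string works):

```shell
# Generate a 64-character hex secret and append it to .env
MEMORIA_API_KEY_SECRET=$(openssl rand -hex 32)
echo "MEMORIA_API_KEY_SECRET=$MEMORIA_API_KEY_SECRET" >> .env
```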
LLM:

| Variable | Description |
|---|---|
| LLM_API_KEY | OpenAI-compatible API key |
| LLM_BASE_URL | API base URL |
| LLM_MODEL | Model name |
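A typical `.env` fragment for these variables might look like this (the key, URL, and model are illustrative placeholders):

```env
LLM_API_KEY=sk-your-key
LLM_BASE_URL=https://api.openai.com/v1
LLM_MODEL=gpt-4o-mini
```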
Use an existing MatrixOne instead of the bundled one:
```bash
MEMORIA_DB_HOST=your-host MEMORIA_DB_PORT=6001 docker compose up -d api
```
Tables are auto-created on first startup.
All coordination uses the shared MatrixOne database — zero external dependencies.
How it works:
- Async tasks are recorded in the mem_async_tasks table, which is visible cross-instance.
- Single-instance deployments use a NoopDistributedLock: zero overhead, identical behavior.
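For Kubernetes, a minimal Deployment sketch could look like the following. The image name, replica count, and Secret name are assumptions for illustration; only the port, the MatrixOne host, and the Pod-name instance ID come from the settings above.

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: memoria-api
spec:
  replicas: 2
  selector:
    matchLabels:
      app: memoria-api
  template:
    metadata:
      labels:
        app: memoria-api
    spec:
      containers:
        - name: api
          image: memoria/api:latest   # assumed image name
          ports:
            - containerPort: 8100
          env:
            - name: MEMORIA_INSTANCE_ID   # unique per Pod, as recommended above
              valueFrom:
                fieldRef:
                  fieldPath: metadata.name
            - name: MEMORIA_DB_HOST
              value: matrixone
            - name: MEMORIA_MASTER_KEY    # assumed Secret name and key
              valueFrom:
                secretKeyRef:
                  name: memoria-secrets
                  key: master-key
```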