Mem0 Platform SDK for adding persistent memory to AI applications. TRIGGER when: user mentions "mem0", "MemoryClient", "memory layer", "remember user preferences", "persistent context", "personalization", or needs to add long-term memory to chatbots, agents, or AI apps. Covers Python SDK (mem0ai), TypeScript SDK (mem0ai), and framework integrations (LangChain, CrewAI, OpenAI Agents SDK, Pipecat, LlamaIndex, AutoGen, LangGraph). Also covers the open-source self-hosted Memory class. This is the DEFAULT mem0 skill for ambiguous queries. DO NOT TRIGGER when: user asks about CLI commands, terminal usage, or shell scripts (use mem0-cli), or Vercel AI SDK / @mem0/vercel-ai-provider / createMem0 (use mem0-vercel-ai-sdk).
Skill Graph: This skill is part of the Mem0 skill graph:
- mem0 (this skill) -- Platform Client SDK + OSS (Python + TypeScript)
- mem0-cli (GitHub) -- Command-line interface
- mem0-vercel-ai-sdk (GitHub) -- Vercel AI SDK provider
Mem0 is a managed memory layer for AI applications. It stores, retrieves, and manages user memories via API — no infrastructure to deploy. For self-hosted usage, see the OSS section in the client references below.
Python:
pip install mem0ai
export MEM0_API_KEY="m0-your-api-key"
TypeScript/JavaScript:
npm install mem0ai
export MEM0_API_KEY="m0-your-api-key"
Get an API key at: https://app.mem0.ai/dashboard/api-keys
Python:
from mem0 import MemoryClient
client = MemoryClient(api_key="m0-xxx")
TypeScript:
import MemoryClient from 'mem0ai';
const client = new MemoryClient({ apiKey: 'm0-xxx' });
For async Python, use AsyncMemoryClient.
Every Mem0 integration follows the same pattern: retrieve → generate → store.
messages = [
    {"role": "user", "content": "I'm a vegetarian and allergic to nuts."},
    {"role": "assistant", "content": "Got it! I'll remember that."}
]
client.add(messages, user_id="alice")
results = client.search("dietary preferences", filters={"user_id": "alice"})
for mem in results.get("results", []):
    print(mem["memory"])
all_memories = client.get_all(filters={"user_id": "alice"})
client.update("memory-uuid", text="Updated: vegetarian, nut allergy, prefers organic")
client.delete("memory-uuid")
client.delete_all(user_id="alice") # delete all for a user
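Before wiring in an LLM, the retrieve → generate → store loop can be exercised offline with a stand-in store. `StubMemoryStore` below is hypothetical (naive keyword overlap where the real `MemoryClient` does semantic search over the API); it only illustrates the shape of the loop:

```python
class StubMemoryStore:
    """Stand-in for MemoryClient: stores plain strings per user and
    'searches' by keyword overlap (the real client does semantic search)."""

    def __init__(self):
        self._memories = {}  # user_id -> list[str]

    def add(self, fact, user_id):
        self._memories.setdefault(user_id, []).append(fact)

    def search(self, query, user_id):
        query_words = set(query.lower().split())
        return [m for m in self._memories.get(user_id, [])
                if query_words & set(m.lower().split())]


store = StubMemoryStore()
store.add("vegetarian and allergic to nuts", user_id="alice")

# 1. Retrieve: find memories relevant to the incoming message
hits = store.search("what can alice eat? no nuts", user_id="alice")
context = "\n".join(hits)

# 2. Generate: the LLM call goes here, with `context` in the system prompt
# 3. Store: persist the new turn for future retrieval
store.add("asked for meal suggestions", user_id="alice")
```

Swapping the stub for a real `MemoryClient` changes only the transport, not the loop.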
from mem0 import MemoryClient
from openai import OpenAI
mem0 = MemoryClient()
openai = OpenAI()
def chat(user_input: str, user_id: str) -> str:
    # 1. Retrieve relevant memories
    memories = mem0.search(user_input, filters={"user_id": user_id})
    context = "\n".join([m["memory"] for m in memories.get("results", [])])
    # 2. Generate response with memory context
    response = openai.chat.completions.create(
        model="gpt-5-mini",
        messages=[
            {"role": "system", "content": f"User context:\n{context}"},
            {"role": "user", "content": user_input},
        ]
    )
    reply = response.choices[0].message.content
    # 3. Store interaction for future context
    mem0.add(
        [{"role": "user", "content": user_input}, {"role": "assistant", "content": reply}],
        user_id=user_id
    )
    return reply
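One detail worth factoring out of the chat function: turning a search response into the system-prompt context. A small helper (hypothetical, assuming the `{"results": [{"memory": ...}]}` response shape used above) keeps the prompt assembly in one place and handles the empty case:

```python
def build_context(search_response: dict) -> str:
    """Flatten a Mem0-style search response into a system-prompt block."""
    memories = [m["memory"] for m in search_response.get("results", [])]
    if not memories:
        return "No stored context for this user."
    return "User context:\n" + "\n".join(f"- {m}" for m in memories)


# Usage in chat(): context = build_context(mem0.search(user_input, filters={"user_id": user_id}))
print(build_context({"results": [{"memory": "vegetarian"}, {"memory": "nut allergy"}]}))
```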
Common pitfalls:
- If search returns nothing, call add() before searching. Also verify user_id matches exactly (case-sensitive) and use the filters={"user_id": "..."} syntax.
- When combining filter conditions, use OR instead, or query separately.
- Don't mix infer=True (the default) and infer=False for the same data. Stick to one mode.
- Import the Platform client with from mem0 import MemoryClient (or AsyncMemoryClient for async). Do not use from mem0 import Memory (that is the OSS self-hosted class).
- Search defaults: top_k=20, threshold=0.1, rerank=False. Adjust as needed for your use case.
If you're using SDK v2.x, note these differences:
- user_id is passed as a top-level kwarg to search() instead of inside filters
- defaults change to top_k=100, no threshold, rerank=True
- graph memory is enabled with enable_graph=True
See the migration guide for details.
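The v1-vs-v2 call-shape difference can be isolated in one place. This dispatch helper is hypothetical (it only builds the kwargs described above; pass the result to client.search(query, **kwargs)):

```python
def search_kwargs(sdk_major: int, user_id: str) -> dict:
    """Build search() keyword arguments for the given SDK major version."""
    if sdk_major >= 2:
        # v2.x: user_id is a top-level kwarg
        return {"user_id": user_id}
    # v1.x: user_id goes inside filters
    return {"filters": {"user_id": user_id}}


# Usage: client.search("dietary preferences", **search_kwargs(1, "alice"))
```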
For the latest docs beyond what's in the references, use the doc search tool:
python ${CLAUDE_SKILL_DIR}/scripts/mem0_doc_search.py --query "topic"
python ${CLAUDE_SKILL_DIR}/scripts/mem0_doc_search.py --page "/platform/features/graph-memory"
python ${CLAUDE_SKILL_DIR}/scripts/mem0_doc_search.py --index
No API key needed — searches docs.mem0.ai directly.
Language-specific deep references (Platform + OSS):
| Language | File |
|---|---|
| Python (MemoryClient + AsyncMemoryClient + Memory OSS) | client/python.md |
| TypeScript/Node.js (MemoryClient + Memory OSS) | client/node.md |
| Python vs TypeScript differences | client/differences.md |
Load these on demand for deeper detail:
| Topic | File |
|---|---|
| Quickstart (Python, TS, cURL) | references/quickstart.md |
| SDK guide (all methods, both languages) | references/sdk-guide.md |
| API reference (endpoints, filters, object schema) | references/api-reference.md |
| Architecture (pipeline, lifecycle, scoping, performance) | references/architecture.md |
| Platform features (retrieval, graph, categories, MCP, etc.) | references/features.md |
| Framework integrations (LangChain, CrewAI, OpenAI Agents, etc.) | references/integration-patterns.md |
| Use cases & examples (real-world patterns with code) | references/use-cases.md |
Related skills:
| Skill | When to use | Link |
|---|---|---|
| mem0-cli | Terminal commands, scripting, CI/CD, agent tool loops | local / GitHub |
| mem0-vercel-ai-sdk | Vercel AI SDK provider with automatic memory | local / GitHub |