# Directory Structure
```
├── .env
├── Dockerfile
├── example
│ ├── config-smithery.json
│ ├── config.json
│ ├── docker-config.json
│ ├── pydantic_ai_repl.py
│ └── README.md
├── LICENSE
├── pyproject.toml
├── README.md
├── smithery.yaml
└── src
└── mem0_mcp_server
├── __init__.py
├── config.json
├── http_entry.py
├── mcp.json
├── py.typed
├── schemas.py
└── server.py
```
# Files
--------------------------------------------------------------------------------
/.env:
--------------------------------------------------------------------------------
```
1 | MEM0_API_KEY=<your-api-key>
2 | OPENAI_API_KEY=<your-openai-api-key>
3 | MEM0_DEFAULT_USER_ID=<your-mem0-user-id>
4 |
```
--------------------------------------------------------------------------------
/example/README.md:
--------------------------------------------------------------------------------
```markdown
1 | # Pydantic AI Demo
2 |
3 | This directory contains a Pydantic AI agent to interactively test the Mem0 MCP server.
4 |
5 | ## Quick Start
6 |
7 | ```bash
8 | # Install the package
9 | pip install mem0-mcp-server
10 | # Or with uv
11 | uv pip install mem0-mcp-server
12 |
13 | # Set your API keys
14 | export MEM0_API_KEY="m0-..."
15 | export OPENAI_API_KEY="sk-openai_..."
16 |
17 | # Run the REPL
18 | python example/pydantic_ai_repl.py
19 | ```
20 |
21 | ## Using Different Server Configurations
22 |
23 | ### Local Server (default)
24 | ```bash
25 | python example/pydantic_ai_repl.py
26 | ```
27 |
28 | ### Docker Container
29 | ```bash
30 | # Start Docker container
31 | docker run --rm -d \
32 | --name mem0-mcp \
33 | -e MEM0_API_KEY="m0-..." \
34 | -p 8080:8081 \
35 | mem0-mcp-server
36 |
37 | # Run agent pointing to Docker
38 | export MEM0_MCP_CONFIG_PATH=example/docker-config.json
39 | export MEM0_MCP_CONFIG_SERVER=mem0-docker
40 | python example/pydantic_ai_repl.py
41 | ```
42 |
43 | ### Smithery Remote Server
44 | ```bash
45 | export MEM0_MCP_CONFIG_PATH=example/config-smithery.json
46 | export MEM0_MCP_CONFIG_SERVER=mem0-memory-mcp
47 | python example/pydantic_ai_repl.py
48 | ```
49 |
50 | ## What Happens
51 |
52 | 1. The script loads the configuration from `example/config.json` by default.
53 | 2. It starts or connects to the Mem0 MCP server.
54 | 3. A Pydantic AI agent (Mem0Guide) connects to the server.
55 | 4. You get an interactive REPL to test memory operations.
56 |
57 | ## Example Prompts
58 |
59 | - "Remember that I love tiramisu"
60 | - "Search for my food preferences"
61 | - "Update my project: the mobile app is now 80% complete"
62 | - "Show me all memories about project Phoenix"
63 | - "Delete memories from 2023"
64 |
65 | ## Config Files
66 |
67 | - `config.json` - Local server (default)
68 | - `docker-config.json` - Connect to Docker container on port 8080
69 | - `config-smithery.json` - Connect to Smithery remote server
70 |
71 | You can create custom configs by copying and modifying these files.
```
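
If you prefer a one-shot script over the interactive REPL above, the same pieces can be wired together directly with Pydantic AI. The sketch below mirrors what `example/pydantic_ai_repl.py` does; the `uvx` launch command, the model name, and the prompt are illustrative assumptions, not the only supported setup.

```python
"""One-shot Pydantic AI run against the Mem0 MCP server (sketch, no REPL)."""
import asyncio
import os

from pydantic_ai import Agent
from pydantic_ai.mcp import MCPServerStdio


async def main() -> None:
    # Assumes MEM0_API_KEY and OPENAI_API_KEY are already exported.
    server = MCPServerStdio(
        "uvx",
        args=["mem0-mcp-server"],
        env=os.environ.copy(),
        timeout=30,
    )
    agent = Agent(model="openai:gpt-4o-mini", toolsets=[server])
    async with server:
        async with agent:
            result = await agent.run("Remember that I love tiramisu")
            print(result.output)


if __name__ == "__main__":
    asyncio.run(main())
```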
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
```markdown
1 | # Mem0 MCP Server
2 |
3 | [PyPI](https://pypi.org/project/mem0-mcp-server/) [License](LICENSE) [Smithery](https://smithery.ai/server/@mem0ai/mem0-memory-mcp)
4 |
5 | `mem0-mcp-server` wraps the official [Mem0](https://mem0.ai) Memory API as a Model Context Protocol (MCP) server so any MCP-compatible client (Claude Desktop, Cursor, custom agents) can add, search, update, and delete long-term memories.
6 |
7 | ## Tools
8 |
9 | The server exposes the following tools to your LLM:
10 |
11 | | Tool | Description |
12 | | --------------------- | --------------------------------------------------------------------------------- |
13 | | `add_memory` | Save text or conversation history (or explicit message objects) for a user/agent. |
14 | | `search_memories` | Semantic search across existing memories (filters + limit supported). |
15 | | `get_memories` | List memories with structured filters and pagination. |
16 | | `get_memory` | Retrieve one memory by its `memory_id`. |
17 | | `update_memory` | Overwrite a memory's text once the user confirms the `memory_id`. |
18 | | `delete_memory` | Delete a single memory by `memory_id`. |
19 | | `delete_all_memories` | Bulk delete all memories in the confirmed scope (user/agent/app/run). |
20 | | `delete_entities` | Delete a user/agent/app/run entity (and its memories). |
21 | | `list_entities` | Enumerate users/agents/apps/runs stored in Mem0. |
22 |
23 | All responses are JSON strings returned directly from the Mem0 API.
24 |
25 | ## Usage Options
26 |
27 | There are three ways to use the Mem0 MCP Server:
28 |
29 | 1. **Python Package** - Install and run locally using `uvx` with any MCP client
30 | 2. **Docker** - Containerized deployment that creates an `/mcp` HTTP endpoint
31 | 3. **Smithery** - Remote hosted service for managed deployments
32 |
33 | ## Quick Start
34 |
35 | ### Installation
36 |
37 | ```bash
38 | uv pip install mem0-mcp-server
39 | ```
40 |
41 | Or with pip:
42 |
43 | ```bash
44 | pip install mem0-mcp-server
45 | ```
46 |
47 | ### Client Configuration
48 |
49 | Add this configuration to your MCP client:
50 |
51 | ```json
52 | {
53 | "mcpServers": {
54 | "mem0": {
55 | "command": "uvx",
56 | "args": ["mem0-mcp-server"],
57 | "env": {
58 | "MEM0_API_KEY": "m0-...",
59 | "MEM0_DEFAULT_USER_ID": "your-handle"
60 | }
61 | }
62 | }
63 | }
64 | ```
65 |
66 | ### Test with the Python Agent
67 |
68 | <details>
69 | <summary><strong>Click to expand: Test with the Python Agent</strong></summary>
70 |
71 | To test the server immediately, use the included Pydantic AI agent:
72 |
73 | ```bash
74 | # Install the package
75 | pip install mem0-mcp-server
76 | # Or with uv
77 | uv pip install mem0-mcp-server
78 |
79 | # Set your API keys
80 | export MEM0_API_KEY="m0-..."
81 | export OPENAI_API_KEY="sk-openai-..."
82 |
83 | # Clone and test with the agent
84 | git clone https://github.com/mem0ai/mem0-mcp.git
85 | cd mem0-mcp
86 | python example/pydantic_ai_repl.py
87 | ```
88 |
89 | **Using different server configurations:**
90 |
91 | ```bash
92 | # Use with Docker container
93 | export MEM0_MCP_CONFIG_PATH=example/docker-config.json
94 | export MEM0_MCP_CONFIG_SERVER=mem0-docker
95 | python example/pydantic_ai_repl.py
96 |
97 | # Use with Smithery remote server
98 | export MEM0_MCP_CONFIG_PATH=example/config-smithery.json
99 | export MEM0_MCP_CONFIG_SERVER=mem0-memory-mcp
100 | python example/pydantic_ai_repl.py
101 | ```
102 |
103 | </details>
104 |
105 | ## What You Can Do
106 |
107 | The Mem0 MCP server enables powerful memory capabilities for your AI applications:
108 |
109 | - "Remember that I'm allergic to peanuts and shellfish" - Add new health information to memory
110 | - "Store these trial parameters: 200 participants, double-blind, placebo-controlled study" - Save research data
111 | - "What do you know about my dietary preferences?" - Search and retrieve all food-related memories
112 | - "Update my project status: the mobile app is now 80% complete" - Modify an existing memory with new info
113 | - "Delete all memories from 2023, I need a fresh start" - Bulk remove outdated memories
114 | - "Show me everything I've saved about the Phoenix project" - List all memories for a specific topic
115 |
116 | ## Configuration
117 |
118 | ### Environment Variables
119 |
120 | - `MEM0_API_KEY` (required) – Mem0 platform API key.
121 | - `MEM0_DEFAULT_USER_ID` (optional) – default `user_id` injected into filters and write requests (defaults to `mem0-mcp`).
122 | - `MEM0_ENABLE_GRAPH_DEFAULT` (optional) – Enable graph memories by default (defaults to `false`).
123 | - `MEM0_MCP_AGENT_MODEL` (optional) – default LLM for the bundled agent example (defaults to `openai:gpt-5`).
124 |
125 | ## Advanced Setup
126 |
127 | <details>
128 | <summary><strong>Click to expand: Docker, Smithery, and Development</strong></summary>
129 |
130 | ### Docker Deployment
131 |
132 | To run with Docker:
133 |
134 | 1. Build the image:
135 |
136 | ```bash
137 | docker build -t mem0-mcp-server .
138 | ```
139 |
140 | 2. Run the container:
141 |
142 | ```bash
143 | docker run --rm -d \
144 | --name mem0-mcp \
145 | -e MEM0_API_KEY=m0-... \
146 | -p 8080:8081 \
147 | mem0-mcp-server
148 | ```
149 |
150 | 3. Monitor the container:
151 |
152 | ```bash
153 | # View logs
154 | docker logs -f mem0-mcp
155 |
156 | # Check status
157 | docker ps
158 | ```
159 |
160 | ### Running with Smithery Remote Server
161 |
162 | To connect to a Smithery-hosted server:
163 |
164 | 1. Install the MCP server (Smithery dependencies are now bundled):
165 |
166 | ```bash
167 | pip install mem0-mcp-server
168 | ```
169 |
170 | 2. Configure MCP client with Smithery:
171 | ```json
172 | {
173 | "mcpServers": {
174 | "mem0-memory-mcp": {
175 | "command": "npx",
176 | "args": [
177 | "-y",
178 | "@smithery/cli@latest",
179 | "run",
180 | "@mem0ai/mem0-memory-mcp",
181 | "--key",
182 | "your-smithery-key",
183 | "--profile",
184 | "your-profile-name"
185 | ],
186 | "env": {
187 | "MEM0_API_KEY": "m0-..."
188 | }
189 | }
190 | }
191 | }
192 | ```
193 |
194 | ### Development Setup
195 |
196 | Clone and run from source:
197 |
198 | ```bash
199 | git clone https://github.com/mem0ai/mem0-mcp.git
200 | cd mem0-mcp
201 | pip install -e ".[dev]"
202 |
203 | # Run locally
204 | mem0-mcp-server
205 |
206 | # Or with uv
207 | uv sync
208 | uv run mem0-mcp-server
209 | ```
210 |
211 | </details>
212 |
213 | ## License
214 |
215 | [Apache License 2.0](https://github.com/mem0ai/mem0-mcp/blob/main/LICENSE)
216 |
```
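
Because every tool returns a JSON string, you can also exercise the server without an LLM, straight from the MCP Python SDK's stdio client. A minimal sketch under the assumption that `uvx` is on PATH and `MEM0_API_KEY` is exported; the prompt text and limit are arbitrary examples.

```python
"""Call the Mem0 MCP tools directly over stdio (sketch, no LLM involved)."""
import asyncio
import json
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    params = StdioServerParameters(
        command="uvx",
        args=["mem0-mcp-server"],
        env=dict(os.environ),  # inherit MEM0_API_KEY etc. from the shell
    )
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Store a memory, then search for it; both tools return JSON strings.
            await session.call_tool("add_memory", {"text": "I love tiramisu"})
            result = await session.call_tool(
                "search_memories", {"query": "food preferences", "limit": 5}
            )
            print(json.loads(result.content[0].text))


asyncio.run(main())
```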
--------------------------------------------------------------------------------
/smithery.yaml:
--------------------------------------------------------------------------------
```yaml
1 | runtime: "python"
2 |
```
--------------------------------------------------------------------------------
/src/mem0_mcp_server/__init__.py:
--------------------------------------------------------------------------------
```python
1 | """Mem0 MCP server package."""
2 |
3 | from .server import main
4 |
5 | __all__ = ["main"]
6 |
```
--------------------------------------------------------------------------------
/example/docker-config.json:
--------------------------------------------------------------------------------
```json
1 | {
2 | "mcpServers": {
3 | "mem0-docker": {
4 | "type": "http",
5 | "url": "http://localhost:8080/mcp"
6 | }
7 | }
8 | }
9 |
```
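
Once the container from the README is running (host port 8080 mapped to the container's 8081), the same tools are reachable over streamable HTTP at the URL above. A quick connectivity check, assuming the MCP Python SDK's streamable-HTTP client; it only lists the available tools.

```python
"""Smoke-test the Dockerized Mem0 MCP server over streamable HTTP (sketch)."""
import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client


async def main() -> None:
    # Matches the "mem0-docker" entry in example/docker-config.json.
    async with streamablehttp_client("http://localhost:8080/mcp") as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])


asyncio.run(main())
```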
--------------------------------------------------------------------------------
/src/mem0_mcp_server/mcp.json:
--------------------------------------------------------------------------------
```json
1 | {
2 | "name": "Mem0 Memory",
3 | "description": "Full read/write access to your Mem0 long-term memory",
4 | "url": "stdio"
5 | }
6 |
```
--------------------------------------------------------------------------------
/Dockerfile:
--------------------------------------------------------------------------------
```dockerfile
1 | FROM python:3.12-slim
2 |
3 | ENV PYTHONUNBUFFERED=1 \
4 | UV_SYSTEM_PYTHON=1
5 |
6 | WORKDIR /app
7 |
8 | RUN pip install --no-cache-dir uv
9 |
10 | COPY pyproject.toml README.md ./
11 | COPY src ./src
12 |
13 | RUN uv pip install --system .
14 |
15 | ENV PORT=8081
16 |
17 | CMD ["python", "-m", "mem0_mcp_server.http_entry"]
18 |
```
--------------------------------------------------------------------------------
/src/mem0_mcp_server/config.json:
--------------------------------------------------------------------------------
```json
1 | {
2 | "mcpServers": {
3 | "mem0": {
4 | "command": "${MEM0_MCP_COMMAND:-uvx}",
5 | "args": ["${MEM0_MCP_BINARY:-mem0-mcp-server}"],
6 | "env": {
7 | "MEM0_API_KEY": "${MEM0_API_KEY}",
8 | "MEM0_DEFAULT_USER_ID": "${MEM0_DEFAULT_USER_ID:-mem0-mcp}"
9 | }
10 | }
11 | }
12 | }
13 |
```
--------------------------------------------------------------------------------
/example/config.json:
--------------------------------------------------------------------------------
```json
1 | {
2 | "mcpServers": {
3 | "mem0-local": {
4 | "command": "${MEM0_MCP_COMMAND:-python}",
5 | "args": ["-m", "mem0_mcp_server.server"],
6 | "env": {
7 | "MEM0_API_KEY": "${MEM0_API_KEY}",
8 | "MEM0_DEFAULT_USER_ID": "${MEM0_DEFAULT_USER_ID:-mem0-mcp}"
9 | },
10 | "timeout": "${MEM0_MCP_SERVER_TIMEOUT:-30}"
11 | }
12 | }
13 | }
```
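
Both config files rely on `${VAR:-default}` placeholders. Whether they are expanded depends on the loader your client uses; if yours passes them through literally, a small helper along these lines can substitute environment values before the JSON is parsed. The regex and the `expand_placeholders` name are illustrative, not part of the package.

```python
"""Expand ${VAR:-default} placeholders in an MCP config file (illustrative helper)."""
import json
import os
import re
from pathlib import Path

_PLACEHOLDER = re.compile(r"\$\{(?P<name>[A-Z0-9_]+)(?::-(?P<default>[^}]*))?\}")


def expand_placeholders(raw: str) -> str:
    """Replace ${VAR} / ${VAR:-default} with values from the environment."""
    def _sub(match: re.Match[str]) -> str:
        return os.environ.get(match.group("name"), match.group("default") or "")
    return _PLACEHOLDER.sub(_sub, raw)


config = json.loads(expand_placeholders(Path("example/config.json").read_text()))
print(config["mcpServers"]["mem0-local"]["env"])
```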
--------------------------------------------------------------------------------
/example/config-smithery.json:
--------------------------------------------------------------------------------
```json
1 | {
2 | "mcpServers": {
3 | "mem0-memory-mcp": {
4 | "command": "npx",
5 | "args": [
6 | "-y",
7 | "@smithery/cli@latest",
8 | "run",
9 | "@mem0ai/mem0-memory-mcp",
10 | "--key",
11 | "your-smithery-key-here",
12 | "--profile",
13 | "your-profile-name-here"
14 | ],
15 | "env": {
16 | "MEM0_API_KEY": "${MEM0_API_KEY}",
17 | "MEM0_DEFAULT_USER_ID": "${MEM0_DEFAULT_USER_ID:-mem0-mcp}"
18 | },
19 | "timeout": "${MEM0_MCP_SERVER_TIMEOUT:-30}"
20 | }
21 | }
22 | }
```
--------------------------------------------------------------------------------
/src/mem0_mcp_server/http_entry.py:
--------------------------------------------------------------------------------
```python
1 | """Production HTTP entry point for Smithery and other container hosts."""
2 |
3 | from __future__ import annotations
4 |
5 | import os
6 |
7 | from .server import create_server
8 |
9 |
10 | def main() -> None:
11 | server = create_server()
12 | # Ensure runtime overrides are respected if Smithery injects a different port/host.
13 | server.settings.host = os.getenv("HOST", server.settings.host)
14 | server.settings.port = int(os.getenv("PORT", server.settings.port))
15 | server.run(transport="streamable-http")
16 |
17 |
18 | if __name__ == "__main__":
19 | main()
20 |
```
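
For a quick local run of this HTTP entry point outside Docker, the `HOST`/`PORT` overrides it reads can be set before calling `main()`. A small sketch; the host and port values are just examples, and the imports deliberately come after the environment assignments so they are visible when `main()` reads them.

```python
"""Run the streamable-HTTP entry point locally on a custom host/port (sketch)."""
import os

# Set overrides before main() reads them.
os.environ["HOST"] = "127.0.0.1"
os.environ["PORT"] = "9000"

from mem0_mcp_server.http_entry import main

main()  # serves the MCP endpoint at http://127.0.0.1:9000/mcp
```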
--------------------------------------------------------------------------------
/pyproject.toml:
--------------------------------------------------------------------------------
```toml
1 | [build-system]
2 | requires = ["hatchling>=1.27.0"]
3 | build-backend = "hatchling.build"
4 |
5 | [project]
6 | name = "mem0-mcp-server"
7 | version = "0.2.1"
8 | description = "Model Context Protocol server that exposes the Mem0 long-term memory API as tools"
9 | readme = "README.md"
10 | license = {text = "Apache-2.0"}
11 | authors = [{name = "Mem0"}]
12 | requires-python = ">=3.10"
13 | keywords = [
14 | "mcp",
15 | "mem0",
16 | "memory",
17 | "agents",
18 | "tooling",
19 | "llm",
20 | "anthropic",
21 | "claude",
22 | ]
23 | classifiers = [
24 | "Intended Audience :: Developers",
25 | "License :: OSI Approved :: Apache Software License",
26 | "Programming Language :: Python",
27 | "Programming Language :: Python :: 3",
28 | "Programming Language :: Python :: 3.10",
29 | "Programming Language :: Python :: 3.11",
30 | "Programming Language :: Python :: 3.12",
31 | "Operating System :: OS Independent",
32 | "Topic :: Software Development :: Libraries",
33 | "Typing :: Typed",
34 | ]
35 |
36 | dependencies = [
37 | "mcp[cli]>=1.6.0",
38 | "mem0ai>=1.0.1",
39 | "python-dotenv>=1.2.1",
40 | "requests>=2.32.5",
41 | "pydantic-ai-slim[mcp]>=1.14.1",
42 | "smithery>=0.4.2",
43 | ]
44 |
45 | [project.urls]
46 | Homepage = "https://mem0.ai"
47 | Repository = "https://github.com/mem0ai/mem0-mcp"
48 | Documentation = "https://docs.mem0.ai"
49 |
50 | [project.optional-dependencies]
51 | agent = ["pydantic-ai-slim[mcp]>=1.14.1", "python-dotenv>=1.2.1"]
52 |
53 | [dependency-groups]
54 | dev = [
55 | "pytest>=8.3.4",
56 | "ruff>=0.7.0",
57 | "mypy>=1.18.2",
58 | ]
59 |
60 | [project.scripts]
61 | mem0-mcp-server = "mem0_mcp_server.server:main"
62 | dev = "smithery.cli.dev:main"
63 | start = "smithery.cli.start:main"
64 | playground = "smithery.cli.playground:main"
65 |
66 | [project.entry-points."mcp.servers"]
67 | mem0 = "mem0_mcp_server:mcp.json"
68 |
69 | [tool.smithery]
70 | server = "mem0_mcp_server.server:create_server"
71 |
72 | [tool.hatch.build.targets.wheel]
73 | packages = ["src/mem0_mcp_server"]
74 |
75 | [tool.hatch.build.targets.wheel.shared-data]
76 | "src/mem0_mcp_server/mcp.json" = "share/mcp/servers/mem0-mcp-server.json"
77 | "src/mem0_mcp_server/py.typed" = "mem0_mcp_server/py.typed"
78 | "src/mem0_mcp_server/config.json" = "share/mcp/configs/mem0-mcp-server.json"
79 |
80 | [tool.ruff]
81 | target-version = "py310"
82 | line-length = 100
83 |
84 | [tool.mypy]
85 | python_version = "3.10"
86 | strict = true
87 |
```
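
The `mcp.servers` entry-point group declared above lets host tooling discover the bundled `mcp.json` manifest without hard-coding paths. A hedged sketch of how a host might enumerate it using only the standard library (no Smithery-specific API is assumed):

```python
"""List installed MCP server manifests advertised via the `mcp.servers` entry-point group."""
from importlib.metadata import entry_points

for ep in entry_points(group="mcp.servers"):
    # For this package: name "mem0", value "mem0_mcp_server:mcp.json".
    print(ep.name, "->", ep.value)
```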
--------------------------------------------------------------------------------
/src/mem0_mcp_server/schemas.py:
--------------------------------------------------------------------------------
```python
1 | """Shared Pydantic models for the Mem0 MCP server."""
2 |
3 | from __future__ import annotations
4 |
5 | from typing import Any, Dict, Optional
6 |
7 | from pydantic import BaseModel, Field
8 |
9 |
10 | # Classic role/content message structure shared by all payloads; it does not change.
11 | class ToolMessage(BaseModel):
12 | role: str = Field(..., description="Role of the speaker, e.g., user or assistant.")
13 | content: str = Field(..., description="Full text of the utterance to store.")
14 |
15 |
16 | class ConfigSchema(BaseModel):
17 | """Session-level overrides used when hosting via Smithery or HTTP."""
18 |
19 | mem0_api_key: str = Field(..., description="Mem0 API key (required)")
20 | default_user_id: Optional[str] = Field(
21 | None, description="Default user_id injected into filters when unspecified."
22 | )
23 | enable_graph_default: Optional[bool] = Field(
24 | None, description="Default enable_graph toggle when clients omit the flag."
25 | )
26 |
27 |
28 | class AddMemoryArgs(BaseModel):
29 | text: Optional[str] = Field(
30 | None, description="Simple sentence to remember; converted into a user message when set."
31 | )
32 | messages: Optional[list[ToolMessage]] = Field(
33 | None,
34 | description=(
35 |             "Explicit role/content history for durable storage. Provide this OR `text`; "
36 |             "`messages` takes precedence when both are given."
37 | ),
38 | )
39 | user_id: Optional[str] = Field(None, description="Override for the Mem0 user ID.")
40 | agent_id: Optional[str] = Field(None, description="Optional agent identifier.")
41 | app_id: Optional[str] = Field(None, description="Optional app identifier.")
42 | run_id: Optional[str] = Field(None, description="Optional run identifier.")
43 | metadata: Optional[Dict[str, Any]] = Field(None, description="Opaque metadata to persist.")
44 | enable_graph: Optional[bool] = Field(
45 | None, description="Only set True if the user explicitly opts into graph storage."
46 | )
47 |
48 |
49 | # Filter-driven read/search argument models start here.
50 | class SearchMemoriesArgs(BaseModel):
51 | query: str = Field(..., description="Describe what you want to find.")
52 | filters: Optional[Dict[str, Any]] = Field(
53 | None, description="Additional filter clauses; user_id is injected automatically."
54 | )
55 | limit: Optional[int] = Field(None, description="Optional maximum number of matches.")
56 | enable_graph: Optional[bool] = Field(
57 | None, description="Set True only when the user asks for graph knowledge."
58 | )
59 |
60 |
61 | class GetMemoriesArgs(BaseModel):
62 | filters: Optional[Dict[str, Any]] = Field(
63 | None, description="Structured filters; user_id injected automatically."
64 | )
65 | page: Optional[int] = Field(None, description="1-indexed page number.")
66 | page_size: Optional[int] = Field(None, description="Number of memories per page.")
67 | enable_graph: Optional[bool] = Field(
68 | None, description="Set True only when the user wants graph knowledge."
69 | )
70 |
71 |
72 | class DeleteAllArgs(BaseModel):
73 | user_id: Optional[str] = Field(
74 | None, description="User scope to delete; defaults to server user."
75 | )
76 | agent_id: Optional[str] = Field(None, description="Optional agent scope filter.")
77 | app_id: Optional[str] = Field(None, description="Optional app scope filter.")
78 | run_id: Optional[str] = Field(None, description="Optional run scope filter.")
79 |
80 |
81 | class DeleteEntitiesArgs(BaseModel):
82 | user_id: Optional[str] = Field(None, description="Delete this user and all related memories.")
83 | agent_id: Optional[str] = Field(None, description="Delete this agent and its memories.")
84 | app_id: Optional[str] = Field(None, description="Delete this app and its memories.")
85 | run_id: Optional[str] = Field(None, description="Delete this run and its memories.")
86 |
```
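
These models are what the tools in `server.py` validate against before calling Mem0, and `model_dump(exclude_none=True)` is what turns them into the request payload. A small illustration with made-up values:

```python
"""Show how the argument models become Mem0 request payloads (illustrative values)."""
from mem0_mcp_server.schemas import AddMemoryArgs, ToolMessage

args = AddMemoryArgs(
    messages=[
        ToolMessage(role="user", content="I moved to San Francisco last week."),
        ToolMessage(role="assistant", content="Noted - you now live in San Francisco."),
    ],
    user_id="mem0-mcp",
    metadata={"topic": "relocation"},
)

# exclude_none drops every unset optional field, mirroring what server.py sends.
print(args.model_dump(exclude_none=True))
# {'messages': [...], 'user_id': 'mem0-mcp', 'metadata': {'topic': 'relocation'}}
```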
--------------------------------------------------------------------------------
/example/pydantic_ai_repl.py:
--------------------------------------------------------------------------------
```python
1 | """Standalone Pydantic AI REPL wired to the Mem0 MCP server.
2 |
3 | Run this script from the repo root after installing the package (e.g.,
4 | `pip install -e ".[agent]"`). It defaults to the bundled `example/config.json`
5 | so you can connect to the local `mem0_mcp_server.server` entry point without
6 | touching `uvx`.
7 | """
8 |
9 | from __future__ import annotations
10 |
11 | import asyncio
12 | import json
13 | import os
14 | import sys
15 | from pathlib import Path
16 |
17 | from dotenv import load_dotenv
18 | from pydantic_ai import Agent
19 | from pydantic_ai.messages import ModelMessage
20 | from pydantic_ai.mcp import MCPServerStdio, load_mcp_servers
21 |
22 | EXAMPLE_DIR = Path(__file__).resolve().parent
23 | PROJECT_ROOT = EXAMPLE_DIR.parent
24 |
25 | # Ensure `src/` is importable when running directly from the repo without
26 | # installing the editable package first. Safe no-op if already installed.
27 | SRC_PATH = PROJECT_ROOT / "src"
28 | if SRC_PATH.exists() and str(SRC_PATH) not in sys.path:
29 | sys.path.insert(0, str(SRC_PATH))
30 |
31 | BASE_DIR = EXAMPLE_DIR
32 | DEFAULT_CONFIG_PATH = BASE_DIR / "config.json"
33 | _env_config_raw = os.getenv("MEM0_MCP_CONFIG_PATH")
34 | if not _env_config_raw:
35 | CONFIG_PATH = DEFAULT_CONFIG_PATH
36 | else:
37 | CONFIG_PATH = Path(_env_config_raw).expanduser()
38 | CONFIG_SERVER_KEY = os.getenv("MEM0_MCP_CONFIG_SERVER", "mem0-local")
39 | DEFAULT_MODEL = os.getenv("MEM0_MCP_AGENT_MODEL", "openai:gpt-5")
40 | DEFAULT_TIMEOUT = int(os.getenv("MEM0_MCP_SERVER_TIMEOUT", "30"))
41 |
42 |
43 | def _require_env(var_name: str) -> str:
44 | value = os.getenv(var_name)
45 | if not value:
46 | raise RuntimeError(f"{var_name} must be set before running the agent.")
47 | return value
48 |
49 |
50 | def _select_server_index() -> int:
51 | """Return the index of the requested server key inside the config file."""
52 |
53 | try:
54 | config = json.loads(CONFIG_PATH.read_text())
55 | except FileNotFoundError:
56 | return -1
57 | servers = config.get("mcpServers") or {}
58 | if not servers:
59 | raise RuntimeError(f"No 'mcpServers' definitions found in {CONFIG_PATH}")
60 | keys = list(servers.keys())
61 | if CONFIG_SERVER_KEY not in servers:
62 | if CONFIG_SERVER_KEY:
63 | raise RuntimeError(
64 | f"Server '{CONFIG_SERVER_KEY}' not found in {CONFIG_PATH}. Available: {keys}"
65 | )
66 | return 0
67 | return keys.index(CONFIG_SERVER_KEY)
68 |
69 |
70 | def _load_server_from_config() -> MCPServerStdio | None:
71 | """Load the MCP server definition from config.json if present."""
72 |
73 | if not CONFIG_PATH.exists():
74 | return None
75 | index = _select_server_index()
76 | servers = load_mcp_servers(CONFIG_PATH)
77 | if not servers:
78 | raise RuntimeError(f"{CONFIG_PATH} did not produce any MCP servers.")
79 | if index >= len(servers):
80 | raise RuntimeError(
81 | f"Server index {index} is out of range for {CONFIG_PATH}; found {len(servers)} servers."
82 | )
83 | return servers[index]
84 |
85 |
86 | def build_server() -> MCPServerStdio:
87 | """Launch the Mem0 MCP server over stdio with inherited env vars."""
88 |
89 | env = os.environ.copy()
90 | _require_env("MEM0_API_KEY") # fail fast with a helpful error
91 |
92 | configured = _load_server_from_config()
93 | if configured:
94 | return configured
95 |
96 | server_path = PROJECT_ROOT / "src" / "mem0_mcp_server" / "server.py"
97 | return MCPServerStdio(
98 | sys.executable,
99 | args=[str(server_path)],
100 | env=env,
101 | timeout=DEFAULT_TIMEOUT,
102 | )
103 |
104 |
105 | def build_agent(server: MCPServerStdio) -> tuple[Agent, str]:
106 | """Create a Pydantic AI agent that can use the Mem0 MCP tools."""
107 |
108 | default_user = os.getenv("MEM0_DEFAULT_USER_ID", "mem0-mcp")
109 | system_prompt = (
110 | "You are Mem0Guide, a friendly assistant whose ONLY external actions are the Mem0 MCP tools.\n"
111 | f"Default to user_id='{default_user}' unless the user gives another value, and inject it into every filter.\n"
112 | "Operating loop:\n"
113 | " 1) Treat every new preference/fact/personal detail as durable—call add_memory right away (even if they never say “remember”) unless they opt out. "
114 | "When a new detail replaces an older one, summarize both so the latest truth is clear (e.g., “was planning Berlin; now relocating to San Francisco”).\n"
115 | " 2) Only run the search → list IDs → confirm → update/delete flow when the user references an existing memory or ambiguity would be risky.\n"
116 | " 3) For get/show/list requests, use a single get_memories or search_memories call and expand synonyms yourself.\n"
117 | " 4) For destructive bulk actions (delete_all_memories, delete_entities) ask for scope once; if the user immediately confirms, execute without re-asking.\n"
118 | " 5) Keep graph opt-in only.\n"
119 | "Act decisively: remember the latest confirmation context so you can honor a follow-up “yes/confirm” without repeating questions, run the best-fit tool, mention what you ran, summarize the outcome naturally, and suggest one concise next step. "
120 | "Mention memory_ids only when needed. Ask clarifying questions only when you truly lack enough info or safety is at risk."
121 | )
122 | model = os.getenv("MEM0_MCP_AGENT_MODEL", DEFAULT_MODEL)
123 | agent = Agent(model=model, toolsets=[server], system_prompt=system_prompt)
124 | return agent, model
125 |
126 |
127 | def _print_banner(model: str) -> None:
128 | print("Mem0 Pydantic AI agent ready. Type a prompt or 'exit' to quit.\n")
129 | print(f"Model: {model}")
130 | print("Tools: Mem0 MCP (add/search/get/update/delete)\n")
131 |
132 |
133 | async def chat_loop(agent: Agent, server: MCPServerStdio, model_name: str) -> None:
134 | """Interactive REPL that streams requests through the agent."""
135 |
136 | message_history: list[ModelMessage] = []
137 | async with server:
138 | async with agent:
139 | _print_banner(model_name)
140 | while True:
141 | try:
142 | user_input = input("You> ").strip()
143 | except (EOFError, KeyboardInterrupt):
144 | print("\nBye!")
145 | return
146 | if not user_input:
147 | continue
148 | if user_input.lower() in {"exit", "quit"}:
149 | print("Bye!")
150 | return
151 | result = await agent.run(user_input, message_history=message_history)
152 | message_history.extend(result.new_messages())
153 | print(f"\nAgent> {result.output}\n")
154 |
155 |
156 | async def main() -> None:
157 | load_dotenv()
158 | server = build_server()
159 | agent, model_name = build_agent(server)
160 | await chat_loop(agent, server, model_name)
161 |
162 |
163 | if __name__ == "__main__":
164 | asyncio.run(main())
165 |
```
--------------------------------------------------------------------------------
/src/mem0_mcp_server/server.py:
--------------------------------------------------------------------------------
```python
1 | """MCP server that exposes Mem0 REST endpoints as MCP tools."""
2 |
3 | from __future__ import annotations
4 |
5 | import json
6 | import logging
7 | import os
8 | from typing import Annotated, Any, Callable, Dict, Optional, TypeVar
9 |
10 | from dotenv import load_dotenv
11 | from mcp.server.fastmcp import Context, FastMCP
12 | from mcp.server.transport_security import TransportSecuritySettings
13 | from mem0 import MemoryClient
14 | from mem0.exceptions import MemoryError
15 | from pydantic import Field
16 |
17 | try:  # Support both package (`python -m mem0_mcp_server.server`) and script (`python src/mem0_mcp_server/server.py`) runs.
18 | from .schemas import (
19 | AddMemoryArgs,
20 | ConfigSchema,
21 | DeleteAllArgs,
22 | DeleteEntitiesArgs,
23 | GetMemoriesArgs,
24 | SearchMemoriesArgs,
25 | ToolMessage,
26 | )
27 | except ImportError: # pragma: no cover - fallback for script execution
28 | from schemas import (
29 | AddMemoryArgs,
30 | ConfigSchema,
31 | DeleteAllArgs,
32 | DeleteEntitiesArgs,
33 | GetMemoriesArgs,
34 | SearchMemoriesArgs,
35 | ToolMessage,
36 | )
37 |
38 | load_dotenv()
39 |
40 | logging.basicConfig(level=logging.INFO, format="%(levelname)s %(name)s | %(message)s")
41 | logger = logging.getLogger("mem0_mcp_server")
42 |
43 |
44 |
45 |
46 | T = TypeVar("T")
47 |
48 | try:
49 | from smithery.decorators import smithery
50 | except ImportError: # pragma: no cover - Smithery optional
51 |
52 | class _SmitheryFallback:
53 | @staticmethod
54 | def server(*args, **kwargs): # type: ignore[misc]
55 | def decorator(func: Callable[..., T]) -> Callable[..., T]: # type: ignore[type-var]
56 | return func
57 |
58 | return decorator
59 |
60 | smithery = _SmitheryFallback() # type: ignore[assignment]
61 |
62 |
63 | # Graph memory stays off by default; the default user_id falls back to "mem0-mcp" when nothing is set.
64 | ENV_API_KEY = os.getenv("MEM0_API_KEY")
65 | ENV_DEFAULT_USER_ID = os.getenv("MEM0_DEFAULT_USER_ID", "mem0-mcp")
66 | ENV_ENABLE_GRAPH_DEFAULT = os.getenv("MEM0_ENABLE_GRAPH_DEFAULT", "false").lower() in {
67 | "1",
68 | "true",
69 | "yes",
70 | }
71 |
72 | _CLIENT_CACHE: Dict[str, MemoryClient] = {}
73 |
74 |
75 | def _config_value(source: Any, field: str):
76 | if source is None:
77 | return None
78 | if isinstance(source, dict):
79 | return source.get(field)
80 | return getattr(source, field, None)
81 |
82 |
83 | def _with_default_filters(
84 | default_user_id: str, filters: Optional[Dict[str, Any]] = None
85 | ) -> Dict[str, Any]:
86 | """Ensure filters exist and include the default user_id at the top level."""
87 | if not filters:
88 | return {"AND": [{"user_id": default_user_id}]}
89 | if not any(key in filters for key in ("AND", "OR", "NOT")):
90 | filters = {"AND": [filters]}
91 |     has_user = '"user_id"' in json.dumps(filters, sort_keys=True)
92 | if not has_user:
93 | and_list = filters.setdefault("AND", [])
94 | if not isinstance(and_list, list):
95 | raise ValueError("filters['AND'] must be a list when present.")
96 | and_list.insert(0, {"user_id": default_user_id})
97 | return filters
98 |
99 |
100 | def _mem0_call(func, *args, **kwargs):
101 | try:
102 | result = func(*args, **kwargs)
103 | except MemoryError as exc: # surface structured error back to MCP client
104 | logger.error("Mem0 call failed: %s", exc)
105 |         # Return the error payload to the model instead of raising.
106 | return json.dumps(
107 | {
108 | "error": str(exc),
109 | "status": getattr(exc, "status", None),
110 | "payload": getattr(exc, "payload", None),
111 | },
112 | ensure_ascii=False,
113 | )
114 | return json.dumps(result, ensure_ascii=False)
115 |
116 |
117 | def _resolve_settings(ctx: Context | None) -> tuple[str, str, bool]:
118 | session_config = getattr(ctx, "session_config", None)
119 | api_key = _config_value(session_config, "mem0_api_key") or ENV_API_KEY
120 | if not api_key:
121 | raise RuntimeError(
122 | "MEM0_API_KEY is required (via Smithery config, session config, or environment) to run the Mem0 MCP server."
123 | )
124 |
125 | default_user = _config_value(session_config, "default_user_id") or ENV_DEFAULT_USER_ID
126 | enable_graph_default = _config_value(session_config, "enable_graph_default")
127 | if enable_graph_default is None:
128 | enable_graph_default = ENV_ENABLE_GRAPH_DEFAULT
129 |
130 | return api_key, default_user, enable_graph_default
131 |
132 |
133 | # Initialize and cache one Mem0 client per API key.
134 | def _mem0_client(api_key: str) -> MemoryClient:
135 | client = _CLIENT_CACHE.get(api_key)
136 | if client is None:
137 | client = MemoryClient(api_key=api_key)
138 | _CLIENT_CACHE[api_key] = client
139 | return client
140 |
141 |
142 | def _default_enable_graph(enable_graph: Optional[bool], default: bool) -> bool:
143 | if enable_graph is None:
144 | return default
145 | return enable_graph
146 |
147 |
148 | @smithery.server(config_schema=ConfigSchema)
149 | def create_server() -> FastMCP:
150 | """Create a FastMCP server usable via stdio, Docker, or Smithery."""
151 |
152 | # When running inside Smithery, the platform probes the server without user-provided
153 | # session config, so we defer the hard requirement for MEM0_API_KEY until a tool call.
154 | if not ENV_API_KEY:
155 | logger.warning(
156 | "MEM0_API_KEY is not set; Smithery health checks will pass, but every tool "
157 | "invocation will fail until a key is supplied via session config or env vars."
158 | )
159 |
160 | server = FastMCP(
161 | "mem0",
162 | host=os.getenv("HOST", "0.0.0.0"),
163 | port=int(os.getenv("PORT", "8081")),
164 | transport_security=TransportSecuritySettings(enable_dns_rebinding_protection=False),
165 | )
166 |
167 |     # Graph memory is disabled by default to keep queries simple and fast.
168 |     # Ask for graph explicitly (e.g., "enable graph when calling memory") in your system prompt to opt in per call.
169 |
170 | @server.tool(description="Store a new preference, fact, or conversation snippet. Requires at least one: user_id, agent_id, or run_id.")
171 | def add_memory(
172 | text: Annotated[
173 | str,
174 | Field(
175 | description="Plain sentence summarizing what to store. Required even if `messages` is provided."
176 | ),
177 | ],
178 | messages: Annotated[
179 | Optional[list[Dict[str, str]]],
180 | Field(
181 | default=None,
182 | description="Structured conversation history with `role`/`content`. "
183 | "Use when you have multiple turns.",
184 | ),
185 | ] = None,
186 | user_id: Annotated[
187 | Optional[str],
188 | Field(default=None, description="Override the default user scope for this write."),
189 | ] = None,
190 | agent_id: Annotated[
191 | Optional[str], Field(default=None, description="Optional agent identifier.")
192 | ] = None,
193 | app_id: Annotated[
194 | Optional[str], Field(default=None, description="Optional app identifier.")
195 | ] = None,
196 | run_id: Annotated[
197 | Optional[str], Field(default=None, description="Optional run identifier.")
198 | ] = None,
199 | metadata: Annotated[
200 | Optional[Dict[str, Any]],
201 | Field(default=None, description="Attach arbitrary metadata JSON to the memory."),
202 | ] = None,
203 | enable_graph: Annotated[
204 | Optional[bool],
205 | Field(
206 | default=None,
207 | description="Set true only if the caller explicitly wants Mem0 graph memory.",
208 | ),
209 | ] = None,
210 | ctx: Context | None = None,
211 | ) -> str:
212 | """Write durable information to Mem0."""
213 |
214 | api_key, default_user, graph_default = _resolve_settings(ctx)
215 | args = AddMemoryArgs(
216 | text=text,
217 | messages=[ToolMessage(**msg) for msg in messages] if messages else None,
218 | user_id=user_id if user_id else (default_user if not (agent_id or run_id) else None),
219 | agent_id=agent_id,
220 | app_id=app_id,
221 | run_id=run_id,
222 | metadata=metadata,
223 | enable_graph=_default_enable_graph(enable_graph, graph_default),
224 | )
225 | payload = args.model_dump(exclude_none=True)
226 | payload.setdefault("enable_graph", graph_default)
227 | conversation = payload.pop("messages", None)
228 | if not conversation:
229 | derived_text = payload.pop("text", None)
230 | if derived_text:
231 | conversation = [{"role": "user", "content": derived_text}]
232 | else:
233 | return json.dumps(
234 | {
235 | "error": "messages_missing",
236 | "detail": "Provide either `text` or `messages` so Mem0 knows what to store.",
237 | },
238 | ensure_ascii=False,
239 | )
240 | else:
241 | payload.pop("text", None)
242 |
243 | client = _mem0_client(api_key)
244 | return _mem0_call(client.add, conversation, **payload)
245 |
246 | @server.tool(
247 | description="""Run a semantic search over existing memories.
248 |
249 | Use filters to narrow results. Common filter patterns:
250 | - Single user: {"AND": [{"user_id": "john"}]}
251 | - Agent memories: {"AND": [{"agent_id": "agent_name"}]}
252 | - Recent memories: {"AND": [{"user_id": "john"}, {"created_at": {"gte": "2024-01-01"}}]}
253 | - Multiple users: {"AND": [{"user_id": {"in": ["john", "jane"]}}]}
254 | - Cross-entity: {"OR": [{"user_id": "john"}, {"agent_id": "agent_name"}]}
255 |
256 | user_id is automatically added to filters if not provided.
257 | """
258 | )
259 | def search_memories(
260 | query: Annotated[str, Field(description="Natural language description of what to find.")],
261 | filters: Annotated[
262 | Optional[Dict[str, Any]],
263 | Field(default=None, description="Additional filter clauses (user_id injected automatically)."),
264 | ] = None,
265 | limit: Annotated[
266 | Optional[int], Field(default=None, description="Maximum number of results to return.")
267 | ] = None,
268 | enable_graph: Annotated[
269 | Optional[bool],
270 | Field(
271 | default=None,
272 | description="Set true only when the user explicitly wants graph-derived memories.",
273 | ),
274 | ] = None,
275 | ctx: Context | None = None,
276 | ) -> str:
277 | """Semantic search against existing memories."""
278 |
279 | api_key, default_user, graph_default = _resolve_settings(ctx)
280 | args = SearchMemoriesArgs(
281 | query=query,
282 | filters=filters,
283 | limit=limit,
284 | enable_graph=_default_enable_graph(enable_graph, graph_default),
285 | )
286 | payload = args.model_dump(exclude_none=True)
287 | payload["filters"] = _with_default_filters(default_user, payload.get("filters"))
288 | payload.setdefault("enable_graph", graph_default)
289 | client = _mem0_client(api_key)
290 | return _mem0_call(client.search, **payload)
291 |
292 | @server.tool(
293 | description="""Page through memories using filters instead of search.
294 |
295 | Use filters to list specific memories. Common filter patterns:
296 | - Single user: {"AND": [{"user_id": "john"}]}
297 | - Agent memories: {"AND": [{"agent_id": "agent_name"}]}
298 | - Recent memories: {"AND": [{"user_id": "john"}, {"created_at": {"gte": "2024-01-01"}}]}
299 | - Multiple users: {"AND": [{"user_id": {"in": ["john", "jane"]}}]}
300 |
301 | Pagination: Use page (1-indexed) and page_size for browsing results.
302 | user_id is automatically added to filters if not provided.
303 | """
304 | )
305 | def get_memories(
306 | filters: Annotated[
307 | Optional[Dict[str, Any]],
308 | Field(default=None, description="Structured filters; user_id injected automatically."),
309 | ] = None,
310 | page: Annotated[
311 | Optional[int], Field(default=None, description="1-indexed page number when paginating.")
312 | ] = None,
313 | page_size: Annotated[
314 | Optional[int], Field(default=None, description="Number of memories per page (default 10).")
315 | ] = None,
316 | enable_graph: Annotated[
317 | Optional[bool],
318 | Field(
319 | default=None,
320 | description="Set true only if the caller explicitly wants graph-derived memories.",
321 | ),
322 | ] = None,
323 | ctx: Context | None = None,
324 | ) -> str:
325 | """List memories via structured filters or pagination."""
326 |
327 | api_key, default_user, graph_default = _resolve_settings(ctx)
328 | args = GetMemoriesArgs(
329 | filters=filters,
330 | page=page,
331 | page_size=page_size,
332 | enable_graph=_default_enable_graph(enable_graph, graph_default),
333 | )
334 | payload = args.model_dump(exclude_none=True)
335 | payload["filters"] = _with_default_filters(default_user, payload.get("filters"))
336 | payload.setdefault("enable_graph", graph_default)
337 | client = _mem0_client(api_key)
338 | return _mem0_call(client.get_all, **payload)
339 |
340 | @server.tool(
341 | description="Delete every memory in the given user/agent/app/run but keep the entity."
342 | )
343 | def delete_all_memories(
344 | user_id: Annotated[
345 | Optional[str], Field(default=None, description="User scope to delete; defaults to server user.")
346 | ] = None,
347 | agent_id: Annotated[
348 | Optional[str], Field(default=None, description="Optional agent scope to delete.")
349 | ] = None,
350 | app_id: Annotated[
351 | Optional[str], Field(default=None, description="Optional app scope to delete.")
352 | ] = None,
353 | run_id: Annotated[
354 | Optional[str], Field(default=None, description="Optional run scope to delete.")
355 | ] = None,
356 | ctx: Context | None = None,
357 | ) -> str:
358 | """Bulk-delete every memory in the confirmed scope."""
359 |
360 | api_key, default_user, _ = _resolve_settings(ctx)
361 | args = DeleteAllArgs(
362 | user_id=user_id or default_user,
363 | agent_id=agent_id,
364 | app_id=app_id,
365 | run_id=run_id,
366 | )
367 | payload = args.model_dump(exclude_none=True)
368 | client = _mem0_client(api_key)
369 | return _mem0_call(client.delete_all, **payload)
370 |
371 | @server.tool(description="List which users/agents/apps/runs currently hold memories.")
372 | def list_entities(ctx: Context | None = None) -> str:
373 | """List users/agents/apps/runs with stored memories."""
374 |
375 | api_key, _, _ = _resolve_settings(ctx)
376 | client = _mem0_client(api_key)
377 | return _mem0_call(client.users)
378 |
379 | @server.tool(description="Fetch a single memory once you know its memory_id.")
380 | def get_memory(
381 | memory_id: Annotated[str, Field(description="Exact memory_id to fetch.")],
382 | ctx: Context | None = None,
383 | ) -> str:
384 | """Retrieve a single memory once the user has picked an exact ID."""
385 |
386 | api_key, _, _ = _resolve_settings(ctx)
387 | client = _mem0_client(api_key)
388 | return _mem0_call(client.get, memory_id)
389 |
390 | @server.tool(description="Overwrite an existing memory’s text.")
391 | def update_memory(
392 | memory_id: Annotated[str, Field(description="Exact memory_id to overwrite.")],
393 | text: Annotated[str, Field(description="Replacement text for the memory.")],
394 | ctx: Context | None = None,
395 | ) -> str:
396 | """Overwrite an existing memory’s text after the user confirms the exact memory_id."""
397 |
398 | api_key, _, _ = _resolve_settings(ctx)
399 | client = _mem0_client(api_key)
400 | return _mem0_call(client.update, memory_id=memory_id, text=text)
401 |
402 | @server.tool(description="Delete one memory after the user confirms its memory_id.")
403 | def delete_memory(
404 | memory_id: Annotated[str, Field(description="Exact memory_id to delete.")],
405 | ctx: Context | None = None,
406 | ) -> str:
407 | """Delete a memory once the user explicitly confirms the memory_id to remove."""
408 |
409 | api_key, _, _ = _resolve_settings(ctx)
410 | client = _mem0_client(api_key)
411 | return _mem0_call(client.delete, memory_id)
412 |
413 | @server.tool(
414 | description="Remove a user/agent/app/run record entirely (and cascade-delete its memories)."
415 | )
416 | def delete_entities(
417 | user_id: Annotated[
418 | Optional[str], Field(default=None, description="Delete this user and its memories.")
419 | ] = None,
420 | agent_id: Annotated[
421 | Optional[str], Field(default=None, description="Delete this agent and its memories.")
422 | ] = None,
423 | app_id: Annotated[
424 | Optional[str], Field(default=None, description="Delete this app and its memories.")
425 | ] = None,
426 | run_id: Annotated[
427 | Optional[str], Field(default=None, description="Delete this run and its memories.")
428 | ] = None,
429 | ctx: Context | None = None,
430 | ) -> str:
431 | """Delete a user/agent/app/run (and its memories) once the user confirms the scope."""
432 |
433 | api_key, _, _ = _resolve_settings(ctx)
434 | args = DeleteEntitiesArgs(
435 | user_id=user_id,
436 | agent_id=agent_id,
437 | app_id=app_id,
438 | run_id=run_id,
439 | )
440 | if not any([args.user_id, args.agent_id, args.app_id, args.run_id]):
441 | return json.dumps(
442 | {
443 | "error": "scope_missing",
444 | "detail": "Provide user_id, agent_id, app_id, or run_id before calling delete_entities.",
445 | },
446 | ensure_ascii=False,
447 | )
448 | payload = args.model_dump(exclude_none=True)
449 | client = _mem0_client(api_key)
450 | return _mem0_call(client.delete_users, **payload)
451 |
452 | # Add a simple prompt for server capabilities
453 | @server.prompt()
454 | def memory_assistant() -> str:
455 | """Get help with memory operations and best practices."""
456 | return """You are using the Mem0 MCP server for long-term memory management.
457 |
458 | Quick Start:
459 | 1. Store memories: Use add_memory to save facts, preferences, or conversations
460 | 2. Search memories: Use search_memories for semantic queries
461 | 3. List memories: Use get_memories for filtered browsing
462 | 4. Update/Delete: Use update_memory and delete_memory for modifications
463 |
464 | Filter Examples:
465 | - User memories: {"AND": [{"user_id": "john"}]}
466 | - Agent memories: {"AND": [{"agent_id": "agent_name"}]}
467 | - Recent only: {"AND": [{"user_id": "john"}, {"created_at": {"gte": "2024-01-01"}}]}
468 |
469 | Tips:
470 | - user_id is automatically added to filters
471 | - Use "*" as wildcard for any non-null value
472 | - Combine filters with AND/OR/NOT for complex queries"""
473 |
474 | return server
475 |
476 |
477 | def main() -> None:
478 | """Run the MCP server over stdio."""
479 |
480 | server = create_server()
481 | logger.info("Starting Mem0 MCP server (default user=%s)", ENV_DEFAULT_USER_ID)
482 | server.run(transport="stdio")
483 |
484 |
485 | if __name__ == "__main__":
486 | main()
487 |
```
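
The `_with_default_filters` helper above is what guarantees every read goes through a user scope. A quick illustration of its behavior; it requires the package and its dependencies to be installed, and the example filters are arbitrary.

```python
"""Illustrate how _with_default_filters injects the default user scope (example inputs)."""
from mem0_mcp_server.server import _with_default_filters

# No filters at all -> wrap the default user in an AND clause.
print(_with_default_filters("mem0-mcp"))
# {'AND': [{'user_id': 'mem0-mcp'}]}

# A bare clause without AND/OR/NOT gets wrapped, then user_id is prepended.
print(_with_default_filters("mem0-mcp", {"created_at": {"gte": "2024-01-01"}}))
# {'AND': [{'user_id': 'mem0-mcp'}, {'created_at': {'gte': '2024-01-01'}}]}

# Filters that already mention user_id are left untouched.
print(_with_default_filters("mem0-mcp", {"AND": [{"user_id": "john"}]}))
# {'AND': [{'user_id': 'john'}]}
```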