# Memory MCP Server
Universal personal memory system for AI assistants — a Model Context Protocol (MCP) server that gives any AI tool persistent, searchable memory across sessions.
Works with Claude Desktop, Cursor, Windsurf, Cline, Roo Code, OpenCode, Continue and any MCP-compatible client.
100% local. No API keys. No cloud. Your memories stay on your machine.
## Features

- Hybrid Search — semantic vector search + full-text keyword search, combined for best results
- 100% Local — uses FastEmbed for embeddings, runs entirely on your machine
- Zero Config — `uvx memory-mcp-server` just works
- Universal — one server, all your AI tools share the same memory
- Structured — five memory types: `preference`, `project`, `workflow`, `knowledge`, `summary`
- Auto-setup — one command to configure all your AI tools
- Fast — SQLite + LanceDB, sub-second queries even with thousands of memories
## Quick Start

### Prerequisites
- Python 3.11+
- uv (recommended) or pip
### Install

```bash
# Using uv (recommended)
uv tool install memory-mcp-server

# Or with pip
pip install memory-mcp-server
```
### Auto-Configure All Your AI Tools

```bash
memory-mcp-setup setup
```
This detects your installed AI tools and adds `memory-mcp` to each one automatically.
### Or Configure Manually
See Manual Configuration below.
## Supported AI Tools

| Tool | Auto-Setup | Manual Config |
|------|:----------:|:-------------:|
| Claude Desktop | ✅ | ✅ |
| Cursor | ✅ | ✅ |
| Windsurf | ✅ | ✅ |
| Cline (VS Code) | ✅ | ✅ |
| Roo Code (VS Code) | ✅ | ✅ |
| OpenCode | ✅ | ✅ |
| Continue | — | ✅ |
| Any MCP Client | — | ✅ |
## Manual Configuration

### Claude Desktop

Add to `claude_desktop_config.json`:

- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%\Claude\claude_desktop_config.json`
```json
{
  "mcpServers": {
    "memory": {
      "command": "uvx",
      "args": ["memory-mcp-server"]
    }
  }
}
```
### Cursor

Add to `~/.cursor/mcp.json`:

```json
{
  "mcpServers": {
    "memory": {
      "command": "uvx",
      "args": ["memory-mcp-server"]
    }
  }
}
```
### Windsurf

Add to `~/.codeium/windsurf/mcp_config.json`:

```json
{
  "mcpServers": {
    "memory": {
      "command": "uvx",
      "args": ["memory-mcp-server"]
    }
  }
}
```
### Cline (VS Code)

Cline auto-detects MCP servers, or you can add the entry manually in Cline's MCP settings:

```json
{
  "mcpServers": {
    "memory": {
      "command": "uvx",
      "args": ["memory-mcp-server"],
      "disabled": false,
      "autoApprove": []
    }
  }
}
```
### Roo Code (VS Code)

Same format as Cline, in Roo's MCP settings:

```json
{
  "mcpServers": {
    "memory": {
      "command": "uvx",
      "args": ["memory-mcp-server"],
      "disabled": false,
      "autoApprove": []
    }
  }
}
```
### OpenCode

Add to `~/.config/opencode/opencode.json`:

```json
{
  "mcp": {
    "memory": {
      "type": "local",
      "command": ["uvx", "memory-mcp-server"],
      "enabled": true
    }
  }
}
```
### Any MCP Client (stdio transport)

```bash
uvx memory-mcp-server
```
The server communicates over stdio using the MCP protocol.
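For scripting or building your own client, the official MCP Python SDK can spawn the server as a subprocess and talk to it over stdio. A minimal sketch (assumes the `mcp` package is installed and `uvx` is on your PATH; tool names follow the table in the next section):

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main():
    # Launch memory-mcp-server as a subprocess, connected via stdio
    params = StdioServerParameters(command="uvx", args=["memory-mcp-server"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # List the tools the server exposes
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])


asyncio.run(main())
```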
## MCP Tools

The server exposes six tools:
| Tool | Description |
|------|-------------|
| memory_store | Store a new memory with kind, tags, priority |
| memory_search | Hybrid semantic + keyword search |
| memory_list | List memories with optional filters |
| memory_update | Update content, tags, or priority |
| memory_delete | Delete a memory by ID |
| memory_stats | Get total count and breakdown |
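As an illustration, a `memory_store` call might carry arguments like the following (illustrative only — the exact field names and accepted values are defined by the server's tool schema):

```json
{
  "tool": "memory_store",
  "arguments": {
    "content": "Prefers uv over pip for Python dependency management",
    "kind": "preference",
    "tags": ["python", "tooling"],
    "priority": "high"
  }
}
```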
## Memory Kinds
| Kind | Use For |
|------|---------|
| preference | Personal preferences: coding style, tools, conventions |
| project | Project-specific: architecture, tech stack, decisions |
| workflow | Processes: PR flow, deployment steps, review checklists |
| knowledge | Technical insights: gotchas, solutions, tips |
| summary | Session summaries: key decisions, outcomes |
## Teaching Your AI to Use Memory

Copy the contents of `SKILL.md` into your AI tool's system prompt, custom instructions, or rules file. This teaches the AI when and how to use the memory tools.
### Where to Put It
| Tool | Location |
|------|----------|
| Claude Desktop | Project Instructions or CLAUDE.md |
| Cursor | .cursor/rules/*.mdc or Settings → Rules |
| Windsurf | .windsurfrules |
| Cline | .clinerules |
| Roo Code | .roorules |
| OpenCode | .opencode/skills/memory-system/SKILL.md |
| Continue | .continue/rules/*.md |
## CLI Commands

```bash
# Auto-configure all detected AI tools
memory-mcp-setup setup

# Auto-configure a specific tool
memory-mcp-setup setup --tool cursor

# Preview config without writing (dry run)
memory-mcp-setup setup --dry-run

# Show config snippet for manual setup
memory-mcp-setup show-config --tool claude-desktop

# Health check
memory-mcp-setup doctor
```
## Environment Variables
| Variable | Default | Description |
|----------|---------|-------------|
| MEMORY_DATA_DIR | Platform-specific (see below) | Directory for memory database files |
| MEMORY_CACHE_DIR | System default | Cache directory for embedding model |
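Most MCP client configs accept an `env` block on the server entry, which lets you set these variables per tool. A sketch for a Claude Desktop-style config (the path is a placeholder — substitute your own):

```json
{
  "mcpServers": {
    "memory": {
      "command": "uvx",
      "args": ["memory-mcp-server"],
      "env": {
        "MEMORY_DATA_DIR": "/path/to/shared/memory-data"
      }
    }
  }
}
```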
### Default Data Directory
| Platform | Path |
|----------|------|
| macOS | ~/Library/Application Support/memory-mcp/data |
| Linux | ~/.local/share/memory-mcp/data |
| Windows | %APPDATA%\memory-mcp\data |
## Architecture

```
┌──────────────────────────────────────────────┐
│               AI Tool (Client)               │
│   Claude / Cursor / Windsurf / Cline / ...   │
└───────────────────┬──────────────────────────┘
                    │ MCP (stdio)
┌───────────────────▼──────────────────────────┐
│              memory-mcp-server               │
│                                              │
│  ┌─────────────┐   ┌──────────────────────┐  │
│  │   FastMCP   │   │   EmbeddingManager   │  │
│  │  (6 tools)  │   │   (FastEmbed/BGE)    │  │
│  └──────┬──────┘   └──────────┬───────────┘  │
│         │                     │              │
│  ┌──────▼─────────────────────▼───────────┐  │
│  │              MemoryStore               │  │
│  │                                        │  │
│  │  ┌──────────┐    ┌─────────────────┐   │  │
│  │  │  SQLite  │    │     LanceDB     │   │  │
│  │  │ metadata │    │  vector index   │   │  │
│  │  │  + FTS   │    │  (384-dim BGE)  │   │  │
│  │  └──────────┘    └─────────────────┘   │  │
│  └────────────────────────────────────────┘  │
└──────────────────────────────────────────────┘
```
- SQLite: stores memory metadata, supports full-text search via FTS5
- LanceDB: stores embedding vectors, supports fast approximate nearest neighbor search
- FastEmbed: runs BAAI/bge-small-en-v1.5 locally for 384-dimensional embeddings
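Hybrid search means merging two ranked lists — FTS5 keyword hits and LanceDB vector neighbors — into a single ranking. The server's exact fusion strategy isn't documented here; one common approach is reciprocal rank fusion, sketched below with hypothetical memory IDs:

```python
def reciprocal_rank_fusion(rankings, k=60):
    """Merge several ranked lists of IDs into one fused ranking.

    Each input list is ordered best-first; k damps the contribution
    of lower ranks (60 is the constant from the original RRF paper).
    """
    scores = {}
    for ranked in rankings:
        for rank, doc_id in enumerate(ranked):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank + 1)
    # Highest fused score first
    return sorted(scores, key=scores.get, reverse=True)


# Hypothetical results for one query:
keyword_hits = ["mem-1", "mem-3", "mem-7"]   # SQLite FTS5, best first
vector_hits = ["mem-1", "mem-9", "mem-3"]    # LanceDB ANN, best first

fused = reciprocal_rank_fusion([keyword_hits, vector_hits])
# mem-1 leads because both searches agree on it
```

Documents that appear near the top of both lists win, which is why a memory matching the query both lexically and semantically outranks one matching only a single signal.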
## Development

```bash
# Clone
git clone https://github.com/cmdparkour/memory-mcp-server.git
cd memory-mcp-server

# Install with dev dependencies
uv sync

# Run directly
uv run memory-mcp

# Run setup CLI
uv run memory-mcp-setup doctor
```
## FAQ

### Does it need an API key?
No. Everything runs locally — embedding model included.
### Does it support Chinese / non-English languages?

Partially. The bundled embedding model, bge-small-en-v1.5, is English-focused, so semantic search quality is best for English text; non-English memories are still stored and searchable. SQLite FTS5 keyword search works on CJK text, though match quality depends on the tokenizer.
### Can multiple AI tools share the same memory?
Yes — that's the whole point. All tools point to the same local database.
### Where is my data stored?
See Default Data Directory above. You can override the location with `MEMORY_DATA_DIR`.
### How do I back up my memories?
Copy the data directory. It contains a SQLite database and a LanceDB folder.
### How do I reset all memories?
Delete the data directory.
## License
MIT — see LICENSE.