MCP server by underratedf00l

MCP-Loci

Your AI forgets everything between sessions. Loci fixes that.

Loci is a persistent memory server for Model Context Protocol — giving Claude and other MCP-compatible assistants a cross-session memory that actually works. Store facts, recall them semantically, and synthesize a full portrait of everything your AI knows.

Python · MIT License · Built with FastMCP


What it does

| Tool | Description |
|---|---|
| remember | Store a named memory with type, description, and content |
| recall | Search memories: keyword, semantic, or hybrid |
| forget | Remove a memory by name or ID |
| synthesize | Cross-memory portrait: changes, uncertainties, and recommendations |
| health | Server status: memory count, embedding count, model state |

Hybrid search combines BM25 keyword matching (SQLite FTS5) with local semantic embeddings (all-MiniLM-L6-v2) — so recall works whether you remember the exact word or just the general idea.
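The fusion idea can be sketched in a few lines. This is an illustration only, not mcp-loci's actual implementation: the real server uses SQLite FTS5's BM25 and all-MiniLM-L6-v2 embeddings, while here a token-overlap score and plain cosine similarity stand in for them, and the blending weight `alpha` is a hypothetical parameter.

```python
from math import sqrt

def keyword_score(query: str, doc: str) -> float:
    """Fraction of query tokens found in the document (BM25 stand-in)."""
    q = set(query.lower().split())
    d = set(doc.lower().split())
    return len(q & d) / len(q) if q else 0.0

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors (semantic stand-in)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def hybrid_score(kw: float, sem: float, alpha: float = 0.5) -> float:
    """Blend keyword and semantic scores; alpha weights the keyword side."""
    return alpha * kw + (1 - alpha) * sem
```

Because the two signals are blended rather than intersected, a memory can rank highly on semantic similarity alone even when no query token appears in it, which is what makes "general idea" recall work.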


Quickstart

Install

pip install mcp-loci

Or with semantic search support (recommended):

pip install "mcp-loci[embeddings]"

Add to Claude Desktop

Edit ~/Library/Application Support/Claude/claude_desktop_config.json:

{
  "mcpServers": {
    "memory": {
      "command": "python",
      "args": ["-m", "mcp_loci.server"]
    }
  }
}

Restart Claude Desktop. That's it — Claude now has persistent memory across sessions.

Try it

In Claude:

"Remember that my preferred code style is no trailing commas and 4-space indentation."

Close Claude. Reopen it.

"What's my preferred code style?"

Claude remembers.


Memory types

Use types to organize and filter your memories:

| Type | Use for |
|---|---|
| user | Personal preferences, identity, background |
| feedback | Things Claude should always/never do |
| project | Active work context, goals, deadlines |
| reference | Where to find things: links, locations, sources |
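For example, a typed store call might look like the sketch below. The field names follow the tool table above (name, type, description, content), but the exact call shape is an assumption, not the server's documented signature:

```
remember(
    name="code-style",
    type="feedback",
    description="Formatting preferences",
    content="No trailing commas; 4-space indentation.",
)
```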


Semantic search

Install with [embeddings] to enable semantic recall. The first call loads the model (~90MB, local — no API key needed). Subsequent calls use a background-warmed model for instant response.

recall("what does adam prefer for formatting")
# → surfaces "preferred code style" memory even without exact keyword match

Set semantic: false for pure keyword search, or leave it on for hybrid (default).
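In the same call style as above, that toggle might look like this (the keyword-argument form is an assumption based on the option name):

```
recall("code style", semantic=False)  # keyword-only (FTS5) search
```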


Configuration

| Env var | Default | Description | |---|---|---| | MCP_MEMORY_DB_PATH | ~/.mcp-loci/memory.db | SQLite database location |
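To override the database location for Claude Desktop, you can set the variable via the standard `env` block in `claude_desktop_config.json`; the path shown is a placeholder:

```json
{
  "mcpServers": {
    "memory": {
      "command": "python",
      "args": ["-m", "mcp_loci.server"],
      "env": {
        "MCP_MEMORY_DB_PATH": "/path/to/custom/memory.db"
      }
    }
  }
}
```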


Development

git clone https://github.com/underratedf00l/MCP-Loci
cd MCP-Loci
pip install -e ".[dev,embeddings]"
pytest

28 tests (24 unit + 4 integration). All green.


Why "Loci"?

The method of loci is a 2,500-year-old memorization technique — you place memories in specific locations in an imagined space so you can walk back and find them. That's exactly what this does, at the data layer.


License

MIT — see LICENSE.

Quick setup
Installation guide for this server

Install the package (if needed)

uvx mcp-loci

Cursor configuration (mcp.json)

{
  "mcpServers": {
    "underratedf00l-mcp-loci": {
      "command": "uvx",
      "args": ["mcp-loci"]
    }
  }
}