
Krusch Memory MCP

A persistent, local-first Model Context Protocol (MCP) server that provides your IDE AI assistants (like Claude Desktop or Cursor) with long-term, semantic memory. Instead of your agent forgetting previous bugs, lessons, or project outcomes when you close the editor, it retrieves them using fast vector embeddings.


⚡ Quick Start

You must have Ollama running with the nomic-embed-text embedding model pulled:

ollama pull nomic-embed-text

1. Install the MCP globally:

npm install -g krusch-memory-mcp

(Or use the 1-command installer: curl -sL https://raw.githubusercontent.com/kruschdev/krusch_memory_mcp/main/install.sh | bash)

2. Run the interactive demo! Once installed, you can instantly verify that your Ollama connection and Database are working correctly:

krusch-memory-demo

This will spin up a temporary in-memory database, insert a mock memory, and retrieve it using vector search.

3. Add to Claude Desktop (claude_desktop_config.json):

{
  "mcpServers": {
    "krusch-memory": {
      "command": "krusch-memory",
      "args": [],
      "env": {
        "DB_MODE": "sqlite",
        "OLLAMA_URL": "http://localhost:11434",
        "EMBED_MODEL": "nomic-embed-text"
      }
    }
  }
}

4. Restart Claude Desktop. That's it!


🚀 Real-World Usage Examples

To use Krusch Memory effectively, just talk to your IDE agent as usual and ask it to document its findings.

Example 1: Documenting a bug fix

You: "That fixed the port conflict! Please save this to memory so we don't forget the fix."
Claude: [Calls add_memory] "I've saved a memory in the 'bugs' category noting that the backend port 5441 conflicts with our legacy DB and we should use 5442 instead."

Example 2: Recalling architectural decisions

You: "How did we decide to structure the user authentication last week?"
Claude: [Calls search_memory] "Looking at the 'priorities' and 'lessons' categories, I see we decided to use a singleton JWT factory to avoid circular dependencies."

Example 3: Utilizing Category Filtering

You: "What are my goals for today?"
Claude: [Calls search_memory with category='priorities'] "According to your priorities, you wanted to finish the CLI demo first."
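Under the hood, each of these interactions is an MCP tool call. A hypothetical wire-level request for the last example might look like the JSON below. The `tools/call` framing comes from the MCP specification, and the tool name and `category` parameter appear in the examples above; the exact argument key names (e.g. `query`) are assumptions for illustration.

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search_memory",
    "arguments": {
      "query": "goals for today",
      "category": "priorities"
    }
  }
}
```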

How Does it Handle Similar Memories?

If you add multiple slightly different memories over time, the MCP returns the Top 3 highest cosine-similarity matches. Because Krusch includes Exponential Temporal Decay, if you have two very similar memories, the newer one will have a slightly higher score, preventing your agent from hallucinating based on outdated facts.
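The scoring described above can be sketched as follows. This is an illustrative TypeScript snippet, not the package's actual implementation: the function names and the exact decay formula (similarity multiplied by e^(-rate × age in days)) are assumptions based on the DECAY_RATE default of 0.01.

```typescript
// Illustrative sketch of decay-weighted cosine-similarity ranking.
// NOT the package's real code; names and formula are assumptions.

function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

interface Memory {
  text: string;
  embedding: number[];
  createdAt: number; // epoch milliseconds
}

// Rank memories by cosine similarity, discounted exponentially by age,
// and return the top-K matches (the server returns the top 3).
function rankMemories(
  query: number[],
  memories: Memory[],
  decayRate = 0.01,
  topK = 3,
): { text: string; score: number }[] {
  const now = Date.now();
  return memories
    .map((m) => {
      const ageDays = (now - m.createdAt) / 86_400_000;
      const score =
        cosineSimilarity(query, m.embedding) * Math.exp(-decayRate * ageDays);
      return { text: m.text, score };
    })
    .sort((a, b) => b.score - a.score)
    .slice(0, topK);
}
```

With this scheme, two memories with identical embeddings differ only by the decay factor, so the newer one always ranks first.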


🗄️ Database Comparison: SQLite vs PostgreSQL

Krusch Memory offers two modes out of the box, controlled via the DB_MODE environment variable.

| Feature | SQLite (`sqlite`) | PostgreSQL (`postgres`) |
|---------|-------------------|--------------------------|
| Best For | Solo developers, lightweight setups. | Enterprise, high-volume swarms logging every action. |
| Dependencies | None (built into the node module). | Requires pgvector (Docker Compose provided). |
| Speed | 1-5 ms (for up to ~10k vectors). | Native HNSW C-index (instant at 100k+ vectors). |
| Setup | Zero config. | Requires a database connection string. |

(For instructions on migrating or configuring Postgres, see our Advanced Topics Guide).


🛠️ Configuration & Troubleshooting

Claude Config Properties

| Variable | Description | Default |
|----------|-------------|---------|
| `DB_MODE` | The database engine to use (`sqlite` or `postgres`). | `sqlite` |
| `OLLAMA_URL` | The endpoint for your local Ollama instance. | `http://localhost:11434` |
| `EMBED_MODEL` | The Ollama text-embedding model to use. | `nomic-embed-text` |
| `AUTO_TAG` | Whether to use a local LLM to extract tags from memories. | `false` |
| `TAG_MODEL` | The Ollama model to use for auto-tagging. | `llama3.2` |
| `DECAY_RATE` | Exponential decay rate applied to older memories. | `0.01` |
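For example, to enable local auto-tagging you might extend the `env` block of the Claude Desktop config shown in the Quick Start. This is a sketch using only the variables and defaults from the table above; the two `AUTO_TAG`/`TAG_MODEL` entries are the only changes.

```json
{
  "mcpServers": {
    "krusch-memory": {
      "command": "krusch-memory",
      "args": [],
      "env": {
        "DB_MODE": "sqlite",
        "OLLAMA_URL": "http://localhost:11434",
        "EMBED_MODEL": "nomic-embed-text",
        "AUTO_TAG": "true",
        "TAG_MODEL": "llama3.2"
      }
    }
  }
}
```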

Troubleshooting

  • Ollama API returned 404
    Cause: You haven't pulled the embedding model.
    Fix: Run ollama pull nomic-embed-text.
  • ECONNREFUSED 127.0.0.1:11434
    Cause: Ollama is not running.
    Fix: Start the Ollama desktop app, or run ollama serve.
  • Database is locked (SQLite)
    Cause: Multiple instances trying to write simultaneously.
    Fix: Krusch Memory is primarily designed for a single IDE instance in SQLite mode. If you run multiple agents simultaneously, use Postgres.

📖 Further Reading

See the docs/advanced-topics.md file for:

  • Migrating from SQLite to Postgres.
  • Swapping Embedding Models.
  • How Temporal Decay works.

🧪 Testing

Krusch Memory MCP uses the native Node.js test runner. You can run the test suite locally:

npm test

Note: The integration tests will gracefully skip the database insertion/search tests if you do not have Ollama running locally, to prevent CI/CD failures.

License

MIT License. Created by kruschdev.
