MCP server by Karanjot786
wiki-hub-mcp
Give your LLM a persistent, searchable wiki backed by a GitHub repo. Based on Karpathy's LLM Wiki pattern: your LLM reads sources and writes structured pages to your repo. When you ask questions, it searches the wiki and answers from compiled knowledge. No RAG pipeline, no vector database. Your knowledge grows as plain markdown files you own.
The idea
RAG pipelines re-derive the same knowledge every time you ask a question. This MCP takes a different approach: your LLM reads sources once and compiles the knowledge into a persistent wiki stored in a GitHub repo you own.
The wiki grows over time. Each page links to others. The LLM searches what's already there before reading anything new. Knowledge compounds with each session.
| Layer | What it is | Where it lives |
|---|---|---|
| Raw sources | Articles, papers, transcripts | pages/sources/ |
| Wiki | Entities, concepts, topics, syntheses | pages/entities/, pages/concepts/, pages/topics/ |
| Schema | Naming rules, page templates, ingest workflow | WIKI_SCHEMA.md |
Three operations drive the whole system:
| Operation | What the LLM does |
|---|---|
| Ingest | Calls wiki_add_source, reads the content, creates or updates pages |
| Query | Calls wiki_search, reads the results, answers from compiled knowledge |
| Lint | Calls wiki_lint, finds orphan pages and broken links, fixes them |
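Under the hood, each of these operations is a standard MCP tools/call request. A query, for example, reaches the server as a JSON-RPC message shaped like the sketch below (the method and envelope follow the MCP specification; the exact argument name for wiki_search is illustrative):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "wiki_search",
    "arguments": { "query": "transformer attention" }
  }
}
```

Your MCP client builds and sends these messages for you; you only see the tool names.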
When to use this
Good fit:
- You return to a topic repeatedly (research areas, technologies, people)
- You want to build up knowledge across many sessions
- You prefer owning your knowledge in plain markdown files
Not the right tool:
- One-off questions where you won't return to the topic
- Real-time data like prices or news; wiki knowledge gets stale
- Fully automated pipelines with no human in the loop
Why it works
LLMs don't get bored. They read the same source ten times and extract more detail each pass. A human wiki editor gives up after the third revision. Your LLM keeps going.
Obsidian integration
The wiki is plain markdown with YAML frontmatter. Clone your wiki repo locally and open the folder as an Obsidian vault. Use wiki_search in Claude for BM25 full-text search. Use Obsidian for browsing and graph view.
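Because pages are ordinary markdown files with YAML frontmatter, Obsidian picks up titles, tags, and wiki-links natively. A concept page might look like the sketch below; the actual frontmatter fields are defined by WIKI_SCHEMA.md in your repo, so treat these as illustrative:

```markdown
---
title: Transformer Architecture
type: concept
tags: [nlp, deep-learning]
---

# Transformer Architecture

Attention-based sequence model introduced in [[attention-is-all-you-need]].
```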
Prerequisites
- Node.js 20+
- GitHub CLI authenticated (gh auth login)
- A GitHub repo for your wiki:
gh repo create my-wiki --private
Installation
Add this config to your MCP config file:
{
"mcpServers": {
"wiki-hub": {
"command": "npx",
"args": ["-y", "wiki-hub-mcp"],
"env": {
"WIKI_GITHUB_OWNER": "your-github-username",
"WIKI_GITHUB_REPO": "my-wiki"
}
}
}
}
Prefer a setup wizard? Run this and pick your client:
npx -y wiki-hub-mcp install
1. Claude Desktop
2. Claude Code CLI
3. Cursor
4. Windsurf
5. VS Code / Copilot
6. Zed
7. Continue.dev
8. Cline
9. Roo Code
0. All clients
Claude Code
claude mcp add --scope user wiki-hub \
-e WIKI_GITHUB_OWNER=your-username \
-e WIKI_GITHUB_REPO=my-wiki \
-- npx -y wiki-hub-mcp
Verify: claude mcp list
Claude Desktop
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json
Add the standard config above. Restart Claude Desktop after saving.
Cursor
Click the Install in Cursor badge above, then add your env vars in Settings > MCP > wiki-hub-mcp > Edit.
Or create ~/.cursor/mcp.json manually:
{
"mcpServers": {
"wiki-hub": {
"type": "stdio",
"command": "npx",
"args": ["-y", "wiki-hub-mcp"],
"env": {
"WIKI_GITHUB_OWNER": "your-github-username",
"WIKI_GITHUB_REPO": "my-wiki"
}
}
}
}
Cursor requires "type": "stdio". Omitting it silently disables the server.
Use via Agent mode (Cmd+I, then Agent).
Windsurf
~/.codeium/windsurf/mcp_config.json
Windsurf uses "servers" as the root key, not "mcpServers".
{
"servers": {
"wiki-hub": {
"command": "npx",
"args": ["-y", "wiki-hub-mcp"],
"env": {
"WIKI_GITHUB_OWNER": "your-github-username",
"WIKI_GITHUB_REPO": "my-wiki"
}
}
}
}
Restart Windsurf. Tools appear in the Cascade panel.
VS Code / GitHub Copilot
Click the Install in VS Code badge above. It prompts for your GitHub owner and repo, then writes the config.
Or create .vscode/mcp.json in your workspace:
{
"servers": {
"wiki-hub": {
"type": "stdio",
"command": "npx",
"args": ["-y", "wiki-hub-mcp"],
"env": {
"WIKI_GITHUB_OWNER": "your-github-username",
"WIKI_GITHUB_REPO": "my-wiki"
}
}
}
}
Requires the GitHub Copilot extension with Agent mode enabled (github.copilot.chat.agent.enabled: true).
Zed
~/.config/zed/settings.json
{
"context_servers": {
"wiki-hub": {
"command": {
"path": "npx",
"args": ["-y", "wiki-hub-mcp"],
"env": {
"WIKI_GITHUB_OWNER": "your-github-username",
"WIKI_GITHUB_REPO": "my-wiki"
}
}
}
}
}
Continue.dev
Create ~/.continue/mcpServers/wiki-hub.yaml:
name: wiki-hub
command: npx
args:
- "-y"
- wiki-hub-mcp
env:
WIKI_GITHUB_OWNER: your-github-username
WIKI_GITHUB_REPO: my-wiki
Cline
Edit via Cline > MCP Servers > Edit Config, or open the file directly:
macOS: ~/Library/Application Support/Code/User/globalStorage/saoudrizwan.claude-dev/settings/cline_mcp_settings.json
Windows: %APPDATA%\Code\User\globalStorage\saoudrizwan.claude-dev\settings\cline_mcp_settings.json
Add the standard config above.
Roo Code
macOS: ~/Library/Application Support/Code/User/globalStorage/rooveterinaryinc.roo-cline/settings/cline_mcp_settings.json
Add the standard config above.
Codex (OpenAI)
codex mcp add wiki-hub -- npx -y wiki-hub-mcp
Or edit ~/.codex/config.toml:
[mcp_servers.wiki-hub]
command = "npx"
args = ["-y", "wiki-hub-mcp"]
[mcp_servers.wiki-hub.env]
WIKI_GITHUB_OWNER = "your-github-username"
WIKI_GITHUB_REPO = "my-wiki"
MCP Inspector (testing)
WIKI_GITHUB_OWNER=your-username WIKI_GITHUB_REPO=my-wiki \
npx @modelcontextprotocol/inspector npx -y wiki-hub-mcp
Open the printed URL. All 18 tools appear under the Tools tab.
Environment variables
| Variable | Required | Default | Description |
|---|---|---|---|
| WIKI_GITHUB_OWNER | Yes | | GitHub username or org |
| WIKI_GITHUB_REPO | Yes | | Repository name |
| WIKI_CACHE_PATH | No | ~/.wiki-mcp/cache.db | SQLite cache path |
Tools (18)
| Tool | Description |
|---|---|
| wiki_create_page | Create entity, concept, topic, or synthesis page |
| wiki_update_page | Update existing page content |
| wiki_get_page | Read page by exact path or title string (fuzzy match) |
| wiki_list_pages | List pages with type or tag filter |
| wiki_delete_page | Soft-delete page; returns backlinks needing updates |
| wiki_add_source | Fetch URL or text, store source page, return content for LLM |
| wiki_list_sources | List ingested sources |
| wiki_get_source | Read source summary page |
| wiki_search | BM25 full-text search |
| wiki_get_relevant_pages | Keyword-scored relevance search; find related pages before ingesting |
| wiki_sync_cache | Refresh local search cache from GitHub |
| wiki_append_log | Write to activity log |
| wiki_get_log | Read recent log entries |
| wiki_lint | Health check: orphans, shallow pages, missing citations, broken links |
| wiki_get_stats | Page counts by type, health score |
| wiki_flag_contradiction | Record conflicting claims |
| wiki_list_contradictions | List unresolved contradictions |
| wiki_resolve_contradiction | Mark contradiction resolved |
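wiki_search ranks pages with BM25 over the local cache. As a rough illustration of how BM25 orders results (this is not the server's actual implementation; k1 and b are the common default parameters):

```python
import math
from collections import Counter

def bm25_scores(query_terms, docs, k1=1.2, b=0.75):
    """Score each tokenized document against the query with Okapi BM25."""
    N = len(docs)
    avgdl = sum(len(d) for d in docs) / N
    # Document frequency: how many docs contain each query term.
    df = {t: sum(1 for d in docs if t in d) for t in query_terms}
    scores = []
    for doc in docs:
        tf = Counter(doc)
        score = 0.0
        for t in query_terms:
            if df[t] == 0:
                continue  # term appears nowhere; contributes nothing
            idf = math.log(1 + (N - df[t] + 0.5) / (df[t] + 0.5))
            denom = tf[t] + k1 * (1 - b + b * len(doc) / avgdl)
            score += idf * tf[t] * (k1 + 1) / denom
        scores.append(score)
    return scores

docs = [
    "transformer attention architecture".split(),
    "github repo setup guide".split(),
]
scores = bm25_scores("transformer attention".split(), docs)
# The page mentioning both query terms ranks above the unrelated one.
```

The point is that ranking happens locally over cached page text, which is why wiki_sync_cache exists: stale caches mean stale rankings.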
Example workflow
User: "Add this article to my wiki: https://arxiv.org/abs/1706.03762"
LLM calls:
1. wiki_add_source(type="url", url="https://arxiv.org/abs/1706.03762")
2. [reads returned content]
3. wiki_create_page(type="concept", title="Transformer Architecture", ...)
4. wiki_create_page(type="entity", title="Ashish Vaswani", ...)
5. wiki_update_page(path="pages/topics/nlp.md", ...)
6. wiki_append_log(operation="ingest", description="Added Attention Is All You Need paper")
Credits
Karpathy's LLM Wiki gist originated the pattern. Vannevar Bush described the concept in 1945 as the Memex: "a device in which an individual stores all his books, records, and communications, and which is mechanized so that it may be consulted with exceeding speed and flexibility."
License
MIT