# Loom MCP

A simple API loom for Claude web to use as a tool.

Loom is an MCP server that lets you build text collaboratively with base models. Instead of one long generation, you:
- Start with a seed that invokes the right style/voice
- Generate several short continuations
- Pick the best one
- Repeat
This exploits base models' strength (high-quality short continuations) while avoiding their weakness (long-form drift).
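To make the loop concrete, here is a minimal sketch of the same idea written directly against OpenRouter's text-completions endpoint, using the default model from the Models section. This is not the server's actual code; treat the request fields as an OpenAI-style completions call and the loop structure as illustration only.

```python
# Illustrative only: the MCP server wraps this kind of call for Claude.
import os
import requests

API_URL = "https://openrouter.ai/api/v1/completions"
HEADERS = {"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"}

text = ">>47382910: anyone else's parents just"  # seed that sets the voice

for _ in range(3):                      # a few rounds of branch-and-pick
    candidates = []
    for _ in range(4):                  # several short continuations per round
        resp = requests.post(API_URL, headers=HEADERS, json={
            "model": "moonshotai/kimi-k2",
            "prompt": text,
            "max_tokens": 60,
            "temperature": 1.0,
        })
        candidates.append(resp.json()["choices"][0]["text"])

    for i, c in enumerate(candidates):
        print(f"[{i}] {c!r}")
    text += candidates[int(input("pick: "))]   # keep only the chosen branch

print(text)
```
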
## Setup

```bash
# Clone and enter the directory
git clone https://github.com/YOUR_USERNAME/loom-mcp
cd loom-mcp

# Create a venv and install dependencies
uv venv
source .venv/bin/activate  # or .venv\Scripts\activate on Windows
uv sync

# Add your OpenRouter API key
echo "OPENROUTER_API_KEY=sk-or-..." > .env
```

## Running

```bash
# Start the server
python -m uvicorn server:app --host 0.0.0.0 --port 8000

# In another terminal, expose it to the internet (pick one):
ngrok http 8000
# or
npx localtunnel --port 8000
```

## Connect to Claude Web

- Go to claude.ai → Settings → Connectors → Add custom connector
- Name: `loom` (or whatever)
- URL: your ngrok/localtunnel HTTPS URL
- OAuth fields: leave blank
- Click Add, then Connect

## Usage

Once connected, ask Claude to use the loom tools:

- "Use loom_guide to show me how loom works"
- "Use loom_seed with this text: >>47382910: anyone else's parents just"
- "hey i just gave you a loom tool, you should check it out!"
- "loom for a short story about Bears. be picky and reroll often"

## Tools

| Tool | Description |
|------|-------------|
| `loom_seed(text)` | Create new tree with seed text as root |
| `loom_branch(n, model, max_tokens, temperature, min_p)` | Generate `n` completions from focused node |
| `loom_focus(node_id)` | Move to a node (use first 8 chars of ID) |
| `loom_context()` | Get full text from root to focused node |
| `loom_view(full)` | See tree structure |
| `loom_siblings(full)` | See alternative branches at same level |
| `loom_children(full)` | See continuations from focused node |
| `loom_insert(text)` | Add custom text as child |
| `loom_edit(text, node_id)` | Change a node's text |
| `loom_delete(node_id)` | Remove a node and descendants |
| `loom_guide()` | Show full usage guide |
| `loom_seeds()` | Show guide on writing good seeds |

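All of these tools operate on a single tree with one focused node, and `loom_context` is just the root-to-focus path. The sketch below is a minimal model of that structure; the field names, IDs, and example node text are assumptions for illustration, not the server's actual internals.

```python
# Illustrative model of the loom tree; not the server's real data structures.
import uuid
from dataclasses import dataclass, field

@dataclass
class Node:
    text: str
    parent: "Node | None" = None
    children: list["Node"] = field(default_factory=list)
    id: str = field(default_factory=lambda: uuid.uuid4().hex)

root = Node(">>47382910: anyone else's parents just")      # what loom_seed creates
child = Node(" refuse to answer the phone", parent=root)   # one loom_branch result (made up)
root.children.append(child)

focus = child                                              # loom_focus(child.id[:8])

def context(node: "Node | None") -> str:
    """Concatenate text from the root down to the focused node (what loom_context returns)."""
    parts = []
    while node is not None:
        parts.append(node.text)
        node = node.parent
    return "".join(reversed(parts))

print(context(focus))   # seed + chosen continuation
```
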
## Models

Default is `moonshotai/kimi-k2::deepinfra`. You can use any OpenRouter model:

- `meta-llama/llama-3.1-405b` - largest llama
- `anthropic/claude-haiku-4.5` - uses untitled trick automatically

Provider targeting: `model::provider` (e.g. `moonshotai/kimi-k2::deepinfra`)

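The `::provider` suffix has to be split off before the request reaches OpenRouter. The sketch below shows one way that can work; it is an assumption about the server's behavior rather than its actual code, relying only on OpenRouter's documented provider-routing `order` option.

```python
# Sketch of provider targeting: split "model::provider" and pin the request
# to that provider via OpenRouter's routing options. Assumed behavior, not
# the server's actual implementation.
def build_request(model_spec: str, prompt: str, **params) -> dict:
    model, _, provider = model_spec.partition("::")
    body = {"model": model, "prompt": prompt, **params}
    if provider:
        body["provider"] = {"order": [provider]}
    return body

print(build_request("moonshotai/kimi-k2::deepinfra", "seed text", max_tokens=60))
```
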
## State

Tree state is saved to `loom_state.json` in the working directory.

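The schema of that file is not documented here, but since it is plain JSON you can inspect it directly; the snippet below assumes only that the file is valid JSON.

```python
# Peek at the saved tree state. Assumes only that loom_state.json is valid JSON;
# the exact schema is not documented in this README.
import json
from pathlib import Path

state = json.loads(Path("loom_state.json").read_text())
print(json.dumps(state, indent=2)[:500])   # first ~500 characters
```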