MCP server by pontusab
mcp-typegen
MCP is the source of truth.
mcp-typegen is the projection. Generate typed TypeScript from any MCP server. One file per tool, organized by server — so your LLM can write code against it.
What it is
- A CLI + library that turns any MCP server into a typed servers/<name>/<tool>.ts tree
- Protocol-native — uses @modelcontextprotocol/sdk as the only source of truth
- Codegen only — zero runtime, zero sandbox coupling
Why you want this
// Before: 150k tokens of tool definitions in your prompt, one brittle call per tool
const txns = await callTool("transactions_list", { limit: 100 });
// After: autocomplete, typecheck, one file per tool, ~2k tokens to the LLM
import { transactionsList } from "./servers/midday";
const txns = await transactionsList({ limit: 100 });
Inspired by Anthropic's "code execution with MCP" — their published result was 150,000 tokens → 2,000 tokens (98.7% reduction) via progressive disclosure. mcp-typegen is the primitive that makes that pattern one command away for any TS codebase.
30-second quickstart
npx mcp-typegen --url https://api.example.com/mcp -o ./servers
import { callMCPTool } from "./servers/client";
import { listFiles } from "./servers/gdrive";
const files = await listFiles({ folderId: "root" });
Then open servers/client.ts and wire callMCPTool to your MCP transport — reference implementation below.
MCP is the source of truth
Your MCP server already defines every tool's name, input schema, output schema, and description. That's the contract. mcp-typegen does one job: project that contract into the shape an LLM writes code against.
- The server changes → you regenerate → broken calls won't compile. Zero drift.
- No SDK tax. Ship an MCP server, every TS codebase gets a typed client for free.
- No proprietary schema (unlike Composio, LangChain tools, hand-rolled OpenAPI adapters).
- Protocol-native. Same output works with any LLM, any AI SDK, any sandbox.
- Grows with MCP. When resources/prompts/sampling mature, existing servers light up — no re-integration.
If it's not in MCP, it's not our problem. If it is in MCP, you already have it.
flowchart LR
Server["Any MCP server (source of truth)"] -->|introspect| Typegen[mcp-typegen]
Typegen -->|projection 1| Types[Typed TS per tool]
Typegen -->|projection 2| Catalog[Progressive-disclosure catalog]
Typegen -->|projection 3| Docs[Per-tool TSDoc]
Types --> LLM[LLM writing code]
Catalog --> LLM
LLM --> Call[callMCPTool]
Call --> Server
How it works
- Connect to any MCP server with the official SDK (@modelcontextprotocol/sdk).
- Call listTools (with full cursor pagination) to get every tool's JSON schema.
- Each tool becomes a single TypeScript file with typed Input/Output interfaces and an exported async function.
- Each server folder gets an index.ts that re-exports every tool.
- Optionally emit a progressive-disclosure catalog (catalog.json or catalog.md) that the LLM reads first.
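The cursor pagination mentioned above can be sketched against the SDK's listTools shape. The helper name and the loose client type here are ours, not the package's — a sketch of the pattern, not mcp-typegen's actual internals:

```typescript
// Minimal page shape matching the MCP tools/list response (tools + optional nextCursor).
type ToolPage = { tools: { name: string }[]; nextCursor?: string };

// Follow nextCursor until the server stops returning one, collecting every tool.
async function listAllTools(
  client: { listTools(params: { cursor?: string }): Promise<ToolPage> },
): Promise<{ name: string }[]> {
  const tools: { name: string }[] = [];
  let cursor: string | undefined;
  do {
    const page = await client.listTools({ cursor });
    tools.push(...page.tools);
    cursor = page.nextCursor;
  } while (cursor !== undefined);
  return tools;
}
```

A server that paginates will hand back a nextCursor on each partial page; the loop terminates when the final page omits it.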
Output shape
<out>/
├── client.ts # callMCPTool<T>() stub — user implements
├── catalog.json # optional (--catalog json)
├── catalog.md # optional (--catalog markdown)
└── servers/
├── gdrive/
│ ├── getDocument.ts
│ ├── listFiles.ts
│ └── index.ts # re-exports
└── salesforce/
├── updateRecord.ts
├── query.ts
└── index.ts
Each tool file:
// servers/gdrive/getDocument.ts
import { callMCPTool } from "../../client.js";
export interface GetDocumentInput {
documentId: string;
fields?: string;
}
export interface GetDocumentOutput {
content: string;
}
/** Read a document from Google Drive. */
export async function getDocument(input: GetDocumentInput): Promise<GetDocumentOutput> {
return callMCPTool<GetDocumentOutput>("gdrive_getDocument", input);
}
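The Input/Output interfaces above are projected from each tool's JSON schema. As a toy sketch of that projection — handling only a few keywords, where the real generator covers far more of JSON Schema — the mapping looks roughly like:

```typescript
// Tiny illustrative subset of JSON Schema (not the package's actual codegen).
type Schema = {
  type?: string;
  properties?: Record<string, Schema>;
  required?: string[];
  items?: Schema;
};

// Project a schema node into TypeScript type syntax.
function schemaToTs(schema: Schema): string {
  switch (schema.type) {
    case "string": return "string";
    case "number":
    case "integer": return "number";
    case "boolean": return "boolean";
    case "array": return `${schemaToTs(schema.items ?? {})}[]`;
    case "object": {
      const req = new Set(schema.required ?? []);
      // Optional properties (not listed in `required`) get a `?` marker.
      const fields = Object.entries(schema.properties ?? {})
        .map(([k, v]) => `${k}${req.has(k) ? "" : "?"}: ${schemaToTs(v)};`)
        .join(" ");
      return `{ ${fields} }`;
    }
    default: return "unknown";
  }
}
```

Fed the getDocument input schema, this would emit a shape matching the GetDocumentInput interface above: documentId required, fields optional.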
Recipes
| I want to… | See |
|---|---|
| Run a complete code agent end-to-end | examples/full-agent/ |
| Use with Claude Code or any FS-capable runtime | examples/claude-code/ |
| Adapt for an ambient sandbox (secure-exec, Workers) | examples/secure-exec-flatten/ |
| Implement callMCPTool against my transport | below |
| Wire the generated tree into an LLM prompt | examples/full-agent/ |
API
Install:
bun add -d mcp-typegen
# or
npm install --save-dev mcp-typegen
Use programmatically:
import { writeFile } from "node:fs/promises";
import { resolve } from "node:path";
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { generate } from "mcp-typegen";
const client = new Client({ name: "my-app", version: "1.0.0" }, { capabilities: {} });
await client.connect(myTransport);
const { files, tools } = await generate({
servers: { midday: client },
catalog: { format: "markdown" },
});
// `files` is a `relative-path -> content` map. Write it however you like.
for (const [path, content] of Object.entries(files)) {
await writeFile(resolve("./generated", path), content);
}
generate(options)
| Option | Type | Default | Description |
|---|---|---|---|
| servers | Record<string, Client> | — | Map of folder name -> connected MCP client |
| clientImportPath | string | "../../client.js" | Where each tool file imports callMCPTool from |
| indexFiles | boolean | true | Emit per-server index.ts |
| sanitize | (name: string) => string | built-in | Override tool-name → identifier mapping |
| header | string | auto-warning | File banner prepended to each emitted file |
| catalog | CatalogOptions | undefined | Emit progressive-disclosure sidecar |
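For example, a sanitize override might camel-case raw MCP tool names into valid identifiers. This implementation is a hypothetical sketch, not the built-in mapping:

```typescript
// Hypothetical tool-name → identifier mapping to pass as the `sanitize` option.
// Collapses separator runs ("_", ".", "-", …) and uppercases the following char.
function sanitize(name: string): string {
  const camel = name.replace(/[^A-Za-z0-9]+(.)/g, (_, c: string) => c.toUpperCase());
  // TypeScript identifiers can't start with a digit.
  return /^[0-9]/.test(camel) ? "_" + camel : camel;
}
```

Passed as `sanitize` in the options above, it would turn a server's transactions_list into transactionsList in the generated files.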
Executor (types-only)
For sandbox authors. mcp-typegen ships zero runtime — this type is a shape to target:
import type { Executor, ExecuteResult } from "mcp-typegen";
interface ExecuteResult {
result: unknown;
error?: string;
logs?: string[];
}
interface Executor {
execute(
code: string,
fns: Record<string, (...args: unknown[]) => Promise<unknown>>,
): Promise<ExecuteResult>;
}
Mirrors Cloudflare's codemode Executor so sandboxes can be swapped.
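As a sanity check of that contract, here is a deliberately naive Executor that satisfies the shape. The interfaces are restated so the snippet compiles standalone; new Function offers no isolation whatsoever, so this is illustration only — real sandbox authors must isolate the code:

```typescript
// Restated from the types above so this compiles on its own.
interface ExecuteResult {
  result: unknown;
  error?: string;
  logs?: string[];
}

interface Executor {
  execute(
    code: string,
    fns: Record<string, (...args: unknown[]) => Promise<unknown>>,
  ): Promise<ExecuteResult>;
}

// UNSAFE toy executor: injects each fn plus a `log` helper as parameters,
// runs the code inside an async IIFE, and captures the result or the error.
const naiveExecutor: Executor = {
  async execute(code, fns) {
    const logs: string[] = [];
    try {
      const body = `return (async () => { ${code} })();`;
      const fn = new Function(...Object.keys(fns), "log", body);
      const result = await fn(...Object.values(fns), (m: string) => { logs.push(m); });
      return { result, logs };
    } catch (e) {
      return { result: undefined, error: String(e), logs };
    }
  },
};
```

A real implementation would swap the Function constructor for node:vm, isolates, or a Workers-style sandbox while keeping the same execute signature.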
CLI
| Flag | Description |
|---|---|
| --url <url> | Connect over HTTP (streamable, falls back to SSE) |
| --stdio "<command>" | Spawn an MCP server over stdio |
| --config <path> | Read .mcp.json / .cursor/mcp.json / .vscode/mcp.json |
| --folder <name> | Folder name for single-server usage (default: default) |
| -o, --out <dir> | Output directory (default: ./servers-gen) |
| --catalog <fmt> | Emit catalog sidecar: json or markdown |
| --detail <level> | Catalog detail: names, summary, or full |
| --header <K:V> | HTTP header (repeatable) |
| --client-import <p> | Override client import path in tool files |
| --no-index | Skip emitting per-server index.ts |
Examples
# single MCP server over HTTP
npx mcp-typegen --url https://api.midday.ai/mcp -o ./servers
# stdio-transported server
npx mcp-typegen --stdio "npx -y @modelcontextprotocol/server-memory" --folder memory -o ./servers
# multi-server from config
npx mcp-typegen --config .mcp.json -o ./servers
# with progressive-disclosure catalog
npx mcp-typegen --config .mcp.json --catalog markdown -o ./servers
# with auth header
npx mcp-typegen --url https://api.midday.ai/mcp --header "Authorization:Bearer $TOKEN" -o ./servers
Implementing callMCPTool
client.ts is emitted with a stub — you replace the body with a call to your MCP transport. Reference implementation for a stdio server:
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";
let cached: Client | undefined;
async function getClient(): Promise<Client> {
if (cached) return cached;
const client = new Client({ name: "agent", version: "0.0.0" }, { capabilities: {} });
const transport = new StdioClientTransport({
command: "npx",
args: ["-y", "@modelcontextprotocol/server-memory"],
});
await client.connect(transport);
cached = client;
return client;
}
export async function callMCPTool<T = unknown>(name: string, args: object): Promise<T> {
const client = await getClient();
const result = await client.callTool({ name, arguments: args as Record<string, unknown> });
if (result.isError) {
const text = Array.isArray(result.content)
? result.content.map((c) => (c as { text?: string }).text ?? "").join("\n")
: "Tool call failed";
throw new Error(text);
}
if (result.structuredContent !== undefined) return result.structuredContent as T;
const first = Array.isArray(result.content) ? result.content[0] : undefined;
const text = first && typeof first === "object" && "text" in first ? (first as { text?: string }).text : undefined;
if (!text) return null as T;
try { return JSON.parse(text) as T; } catch { return text as T; }
}
For HTTP or SSE transports, swap StdioClientTransport for StreamableHTTPClientTransport / SSEClientTransport.
CI workflow
# .github/workflows/codegen-check.yml
- run: bun run mcp-typegen --config .mcp.json -o ./servers
- run: git diff --exit-code servers/
If an MCP server changed and you forgot to regenerate, CI fails.
FAQ
How is this different from @cloudflare/codemode?
Portable: no Workers coupling. Filesystem-tree output rather than ambient globals. Ships zero runtime and zero AI SDK wrapper.
How is this different from mcp-client-gen?
Optimized for LLM code-mode, not human SDK ergonomics. Ships the progressive-disclosure catalog + Executor contract. Multi-server out of the box.
Does it ship a sandbox?
No. See examples/full-agent/ for a node:vm recipe you can copy and adapt.
Does it support resources and prompts?
Not in v0.1. Tools only. Track GitHub issues for updates.
Zod / runtime validation?
Out of scope. Bring your own validator — the JSON Schema is preserved in the catalog if you need it.
Why no ambient .d.ts mode?
Opinionated on Anthropic's filesystem shape. A ~30-line flatten recipe in examples/secure-exec-flatten/ covers ambient consumers.
How do I wire it into CI?
An npx mcp-typegen --config .mcp.json -o ./servers step plus a git diff --exit-code servers/ step. Done.
Does it work with stdio servers?
Yes, --stdio "node my-server.js".
Authentication?
Bring a configured Client (with auth) when using generate() as a library. CLI accepts --header for HTTP transports.
Is the Executor type required?
No. It's a shape to target for sandbox authors, not coupled to generate().
Not this package's job
- Not a sandbox (ship zero executor implementations)
- Not an agent framework (no LLM orchestration)
- Not an AI SDK wrapper
- Not an OAuth helper (bring a configured Client)
- Not runtime validation (Zod, Ajv — your choice)
- Not an ambient .d.ts emitter (opinionated on the filesystem tree)
License
MIT © Pontus Abrahamsson