Swagger MCP Service
🔄 Automatically Transform Any OpenAPI/Swagger API into MCP (Model Context Protocol) Tools
🌟 Overview
Swagger MCP Service is a powerful, zero-code solution that automatically converts any OpenAPI/Swagger specification into MCP (Model Context Protocol) tools. This enables Large Language Models (LLMs) to seamlessly interact with your existing REST APIs without writing any custom integration code.
Key Features
- 🚀 Zero-Code Integration - Just provide an OpenAPI spec URL, and the system handles everything
- 🔄 Dynamic Tool Generation - Automatically parses API endpoints and generates MCP tools
- 🤖 LLM-Ready - Works with OpenAI GPT models out of the box via LangChain
- 📋 Smart Parsing - Supports Swagger UI, ReDoc, and direct OpenAPI JSON endpoints
- ⚙️ Highly Configurable - Customize tool names, filters, system prompts, and more via YAML
- 🔌 Multi-Server Support - Connect to multiple MCP servers simultaneously (OpenAPI + third-party)
- 🆕 Auto Tool Discovery - Automatically fetches tool descriptions from third-party MCP servers
- 🧩 Extensible Architecture - Clean separation between server, client, and parser components
- 🌐 Beautiful Web Interface - Modern chat UI with real-time streaming responses
- 📊 Comprehensive Logging - Colored console logs with performance metrics
🏗️ Architecture
┌──────────────────────────────────────────────────────────────────────────┐
│ Swagger MCP Service │
├──────────────────────────────────────────────────────────────────────────┤
│ │
│ ┌─────────────┐ ┌────────────────┐ │
│ │ Browser │───────▶│ Web Interface │ (SSE Streaming) │
│ │ /User │◀───────│ (web_server.py)│ │
│ └─────────────┘ └────────┬───────┘ │
│ │ │
│ ┌─────────────┐ │ │
│ │ CLI │────┐ │ FastAPI + LangGraph │
│ │ (run.py) │ │ │ │
│ └─────────────┘ │ ▼ │
│ │ ┌──────────────┐ ┌──────────────────────┐ │
│ └───▶│ MCP Client │────▶│ LLM (GPT-4) │ │
│ │ (client.py) │◀────│ via LangChain │ │
│ └──────┬───────┘ └──────────────────────┘ │
│ │ │
│ │ stdio (multiple connections) │
│ ┌──────────┴──────────┐ │
│ ▼ ▼ │
│ ┌──────────────┐ ┌──────────────────┐ │
│ │ MCP Server │ │ 3rd Party MCP │ │
│ │ (server.py) │ │ (mcp-server-*) │ │
│ └──────┬───────┘ └──────────────────┘ │
│ │ │
│ │ Dynamic Tool Registration │
│ ▼ │
│ ┌─────────────────────┐ │
│ │ OpenAPI Parser │ │
│ │ (openapi_parser.py) │ │
│ └─────────┬───────────┘ │
│ │ │
│ │ Parse & Transform │
│ ▼ │
│ ┌──────────────────────┐ │
│ │ OpenAPI/Swagger │ │
│ │ Specification │ │
│ └──────────┬───────────┘ │
│ │ │
└──────────────────────┼────────────────────────────────────────────────────┘
│ HTTP Requests
▼
┌──────────────┐
│  Target API  │
│    Server    │
└──────────────┘
Component Overview
| Component | File | Description |
|-----------|------|-------------|
| **Run Script** | `run.py` | Entry point with CLI for validation, listing tools, and running the service |
| **Web Server** | `web_server.py` | 🌐 Beautiful web chat interface with real-time streaming responses |
| **MCP Server** | `server.py` | Dynamically registers MCP tools from OpenAPI spec and handles API calls |
| **MCP Client** | `client.py` | Connects to LLM via LangChain and provides interactive chat interface |
| **OpenAPI Parser** | `openapi_parser.py` | Parses OpenAPI/Swagger specs from URLs or files, extracts endpoints |
| **Configuration** | `config.yaml` | Central configuration for API source, LLM settings, and customization |
📦 Installation
Prerequisites
- Python 3.10 or higher
- An OpenAI API key (for the LLM client)
Step 1: Clone the Repository
git clone https://github.com/yourusername/swagger_mcp_service.git
cd swagger_mcp_service
Step 2: Install Dependencies
pip install -r requirements.txt
Step 3: Configure Environment Variables
Create a .env file in the project root:
OPENAI_API_KEY=your_openai_api_key_here
OPENAI_MODEL=gpt-4.1-mini # Optional: override model from config
⚙️ Configuration
All configuration is done through generic_mcp/config.yaml. Here's a breakdown of the key sections:
API Configuration
api:
# OpenAPI specification source (choose one)
openapi_url: "http://localhost:8000/openapi.json" # Direct URL
# openapi_url: "http://localhost:8000/docs" # Swagger UI page (auto-detected)
# openapi_file: "./openapi.json" # Local file (takes priority)
# Base URL for API calls (used if not defined in OpenAPI spec)
base_url: "http://localhost:8000"
# Request timeout in seconds
timeout: 30
MCP Servers Configuration (Multi-Server Support)
mcp_servers:
# OpenAPI/Swagger type server
- name: "My API Service"
type: "openapi"
enabled: true
openapi:
openapi_url: "http://localhost:8000/openapi.json"
base_url: "http://localhost:8000"
tool_generation:
include_all: true
snake_case_names: true
# Third-party MCP server (auto tool discovery!)
- name: "Fetch"
type: "external"
enabled: true
command: "uvx"
args: ["mcp-server-fetch"]
# 🆕 No need to write description/tools_description!
# They are automatically fetched from the MCP server
Third-Party MCP Servers
You can connect to any third-party MCP server. Tool descriptions are automatically discovered!
mcp_servers:
# Example: mcp-server-fetch
- name: "Fetch"
type: "external"
enabled: true
command: "uvx"
args: ["mcp-server-fetch"]
# Example: Filesystem server
- name: "Filesystem"
type: "external"
enabled: true
command: "npx"
args: ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/dir"]
# Example: GitHub server with environment variables
- name: "GitHub"
type: "external"
enabled: true
command: "npx"
args: ["-y", "@modelcontextprotocol/server-github"]
env:
GITHUB_PERSONAL_ACCESS_TOKEN: "${GITHUB_TOKEN}"
# Optional: Override auto-discovered descriptions
- name: "Custom Server"
type: "external"
enabled: true
command: "my-mcp-server"
args: []
description: "Custom description (optional)"
tools_description: | # Optional - overrides auto-discovery
- **tool_name**: Custom tool description
Tool Generation Options
tool_generation:
include_all: true # Include all endpoints
exclude_endpoints: [] # Exclude specific endpoints
snake_case_names: true # Convert names to snake_case
simplified_names: true # Simplify tool names
# tool_prefix: "myapi_" # Optional prefix for all tools
LLM Configuration
llm:
provider: "openai"
model: "gpt-4.1-mini"
temperature: 0
System Prompt Customization
system_prompt:
template: |
You are an AI assistant with access to the following API tools...
Available Variables:
- {api_name}: API title from OpenAPI spec
- {api_description}: API description
- {tools_summary}: Auto-generated tool documentation
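These placeholders behave like standard Python format fields. A minimal illustration of how the final prompt might be assembled (not the exact client.py code; the sample values are taken from the demo API):

```python
# Illustration only -- the real rendering lives in client.py.
template = (
    "You are an AI assistant for {api_name}.\n"
    "{api_description}\n\n"
    "Available tools:\n"
    "{tools_summary}"
)

system_prompt = template.format(
    api_name="Procurement System API",    # title field of the OpenAPI spec
    api_description="Enterprise procurement management system",
    tools_summary="- get_purchase_history: Query past purchase records",
)
print(system_prompt)
```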
🚀 Usage
Quick Start
Option 1: Web Interface (Recommended) 🌐
- Start your target API server (or use the included example):
cd api_swagger_example
uvicorn api_server:app --reload
- Configure the OpenAPI source in generic_mcp/config.yaml:
api:
openapi_url: "http://localhost:8000/openapi.json"
base_url: "http://localhost:8000"
- Launch the web server:
cd generic_mcp
python web_server.py
- Open your browser and visit: http://localhost:8080

The web interface provides a beautiful, modern chat experience with:
- 💬 Real-time Streaming - Watch responses generate token by token (see the client sketch below)
- 🔧 Tool Call Visualization - See when and how API tools are invoked
- 🎨 Dark Theme - Easy on the eyes, similar to Claude's interface
- 📝 Markdown Support - Rich formatting with code highlighting
- 📊 Session Management - Multiple conversation contexts

Web Server Options
# Launch web interface with default config
python web_server.py
# Use custom configuration file
python web_server.py /path/to/my-config.yaml
# Set custom port (default: 8080)
export MCP_WEB_PORT=3000
python web_server.py
# Override LLM model
export OPENAI_MODEL=gpt-4o
python web_server.py
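If you want to consume the stream outside the browser, the SSE response can be read with a plain HTTP client. The route and payload below are assumptions for illustration; check web_server.py for the actual endpoint and event schema:

```python
import json
import requests  # pip install requests

# Route and request body are hypothetical -- see web_server.py for the
# real endpoint name and event schema.
with requests.post(
    "http://localhost:8080/chat",
    json={"message": "Show me recent purchase history"},
    stream=True,
    timeout=60,
) as resp:
    for line in resp.iter_lines():
        if line.startswith(b"data: "):   # SSE frames arrive as "data: <payload>"
            print(json.loads(line[len(b"data: "):]))
```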
Option 2: Command-Line Interface
- Run the interactive client:
cd generic_mcp
python run.py
Command-Line Options
# Run interactive client (default)
python run.py
# Use custom configuration file
python run.py --config /path/to/my-config.yaml
# Validate configuration and OpenAPI spec
python run.py --validate
# List all available tools
python run.py --list-tools
# Run server only (for debugging)
python run.py --server-only
Example Interaction
============================================================
🤖 Procurement System API
============================================================
Enterprise procurement management system...
👤 You: Show me recent purchase history
🤖 Assistant:
| ID    | Item            | Qty | Price      | Supplier   |
|-------|-----------------|-----|------------|------------|
| PH001 | Laptop (Dell)   | 10  | NT$ 42,000 | Digital Co |
| PH002 | Laptop (Lenovo) | 5   | NT$ 52,000 | Tech Corp  |
------------------------------------------------------------
👤 You: Check inventory for Dell laptops
🤖 Assistant:
Dell Latitude 5540 Inventory Status:
- Available: 3 units
- Reserved: 2 units
- Location: Main Warehouse
📁 Project Structure
swagger_mcp_service/
├── README.md                 # This file
├── requirements.txt          # Python dependencies
├── LICENSE                   # MIT License
│
├── generic_mcp/              # Core MCP service
│   ├── __init__.py
│   ├── web_server.py         # 🌐 Web chat interface (FastAPI + SSE)
│   ├── run.py                # CLI entry point
│   ├── server.py             # MCP server implementation
│   ├── client.py             # MCP client with LangChain
│   ├── openapi_parser.py     # OpenAPI specification parser
│   ├── config.yaml           # Configuration file
│   └── templates/
│       └── index.html        # Web UI template
│
└── api_swagger_example/      # Example API server
    └── api_server.py         # FastAPI demo server
🔧 How It Works
1. OpenAPI Parsing
The OpenAPIParser class (sketched below):
- Loads OpenAPI specs from URLs (including Swagger UI pages) or local files
- Automatically detects and extracts OpenAPI JSON from documentation pages
- Supports both OpenAPI 3.x and Swagger 2.0 formats
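A rough sketch of the URL-detection part (simplified, not the actual OpenAPIParser implementation):

```python
import re
import requests  # pip install requests

def fetch_openapi_spec(url: str) -> dict:
    """Fetch an OpenAPI spec, following a Swagger UI/ReDoc page if needed.

    Simplified sketch -- the real OpenAPIParser handles more cases.
    """
    resp = requests.get(url, timeout=30)
    try:
        return resp.json()     # URL pointed straight at the JSON spec
    except ValueError:
        pass                   # not JSON: assume an HTML documentation page
    # Look for a spec URL embedded in the documentation page.
    match = re.search(r'["\']([^"\']*openapi\.json)["\']', resp.text)
    if match is None:
        raise ValueError(f"No OpenAPI spec found at {url}")
    spec_url = requests.compat.urljoin(url, match.group(1))
    return requests.get(spec_url, timeout=30).json()
```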
2. Dynamic Tool Generation
For each API endpoint, the system generates an MCP tool (see the naming sketch after this mapping):
OpenAPI Endpoint → MCP Tool
───────────────────────────────────────────────────
operationId / path+method → tool name (function name)
summary / description → tool description (docstring)
path parameters → required parameters (in="path")
query parameters → query parameters (in="query")
request body properties → body parameters
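For illustration, deriving a snake_case tool name from an endpoint might look like this (a sketch; the project's actual naming rules may differ):

```python
import re

def make_tool_name(method: str, path: str, operation_id: str | None = None) -> str:
    """Derive a snake_case tool name from operationId, or from path + method."""
    name = operation_id or f"{method.lower()}_{path}"
    name = re.sub(r"[{}/\-.]+", "_", name)                # separators -> underscores
    name = re.sub(r"(?<=[a-z0-9])(?=[A-Z])", "_", name)   # camelCase -> snake_case
    return re.sub(r"_+", "_", name).strip("_").lower()

print(make_tool_name("GET", "/purchase-history/{record_id}"))
# -> get_purchase_history_record_id
```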
3. Runtime Execution
When a tool is invoked (sketched below):
- Parameters are classified (path, query, body)
- Path parameters are substituted in the URL
- An HTTP request is made to the target API
- Response is returned as JSON to the LLM
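A simplified sketch of that request flow (a hypothetical helper, not the actual server.py code):

```python
import requests  # pip install requests

def call_endpoint(base_url: str, method: str, path: str,
                  path_params: dict, query_params: dict, body: dict | None):
    """Substitute path parameters, then issue the HTTP request. Illustrative only."""
    url = base_url + path.format(**path_params)   # "/items/{item_id}" -> "/items/42"
    resp = requests.request(
        method,
        url,
        params=query_params or None,
        json=body,        # None means no request body is sent
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()    # handed back to the LLM as JSON

# Hypothetical call against the demo API:
# call_endpoint("http://localhost:8000", "GET", "/inventory/{item_id}",
#               {"item_id": "42"}, {}, None)
```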
4. Web Interface Streaming
The web server uses Server-Sent Events (SSE) for real-time streaming:
- Token Streaming - LLM responses stream character by character
- Tool Events - `on_tool_start` and `on_tool_end` events show API calls
- Session Management - Multiple conversation contexts with unique IDs
- Colored Logging - Backend logs with timestamps and log levels
- Error Handling - Graceful error messages displayed in UI
🧪 Example API Server
The api_swagger_example/ directory contains a complete FastAPI demo server simulating an enterprise procurement system. It includes:
- 📋 Purchase History - Query past purchase records
- 📦 Inventory Management - Check stock and request items
- 🏢 Supplier Management - Query supplier information
- 🛒 Product Catalog - Browse products and prices
- 📝 Purchase Requests - Create and manage purchase requests
- 📄 Purchase Orders - Generate purchase orders
Running the Example
cd api_swagger_example
uvicorn api_server:app --reload --host 0.0.0.0 --port 8000
Access the Swagger UI at: http://localhost:8000/docs
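To sanity-check the demo server before wiring it into MCP, you can call an endpoint directly. The path below is a guess based on the feature list; the Swagger UI shows the real routes:

```python
import requests

# Endpoint path is an assumption -- browse http://localhost:8000/docs
# for the demo server's actual routes.
resp = requests.get("http://localhost:8000/purchase-history", timeout=10)
print(resp.status_code, resp.json())
```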
🔌 Integration with Other Systems
Using with External APIs
Simply update config.yaml to point to any OpenAPI-compliant API:
mcp_servers:
- name: "External API"
type: "openapi"
enabled: true
openapi:
openapi_url: "https://api.example.com/openapi.json"
base_url: "https://api.example.com"
Combining Multiple MCP Servers
You can combine OpenAPI servers with third-party MCP servers:
mcp_servers:
# Your internal API
- name: "Internal API"
type: "openapi"
enabled: true
openapi:
openapi_url: "http://localhost:8000/openapi.json"
# Web content fetching
- name: "Fetch"
type: "external"
enabled: true
command: "uvx"
args: ["mcp-server-fetch"]
# File system access
- name: "Filesystem"
type: "external"
enabled: true
command: "npx"
args: ["-y", "@modelcontextprotocol/server-filesystem", "./data"]
Connecting to Enterprise Systems
The service can integrate with:
- SAP systems (with OpenAPI gateway)
- Salesforce APIs
- AWS services with OpenAPI specs
- Any REST API with Swagger/OpenAPI documentation
- Third-party MCP servers from the community
🛡️ Error Handling
The system provides robust error handling:
- Connection Errors: Clear messages when API server is unreachable
- HTTP Errors: Detailed error responses with status codes
- Parsing Errors: Helpful suggestions when OpenAPI spec cannot be parsed
- Validation Mode: Pre-flight checks with the `--validate` flag
📝 Customization Examples
Filter Specific Endpoints
tool_generation:
include_all: false
include_endpoints:
- "get_purchase_history"
- "create_purchase_order"
Exclude Endpoints
tool_generation:
include_all: true
exclude_endpoints:
- "delete_all_data"
- "/admin/*"
Add Tool Prefix
tool_generation:
tool_prefix: "procurement_"
# Results in: procurement_get_inventory, procurement_create_order, etc.
🤝 Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
- Fork the repository
- Create your feature branch (`git checkout -b feature/AmazingFeature`)
- Commit your changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
📄 License
This project is licensed under the MIT License - see the LICENSE file for details.
🙏 Acknowledgments
- Model Context Protocol (MCP) - The protocol that makes this possible
- FastMCP - Simplified MCP server implementation
- LangChain - LLM application framework
- FastAPI - Modern Python web framework
Made with ❤️ for seamless API-to-LLM integration