# 🚀 DevOps MCP Hub
An MCP server by melisasvr.
A production-inspired Model Context Protocol (MCP) server that standardizes DevOps tooling (GitHub, Kubernetes, Prometheus, and Groq AI) into plug-and-play tools consumable by any MCP-compatible AI assistant, including Claude Desktop, Cursor, and VS Code Copilot. Built for developers who want real AI-powered incident response without the infrastructure overhead.
## ⚡ Quick Start

1. Create a virtual environment

```bash
python -m venv .venv
source .venv/bin/activate   # Windows: .venv\Scripts\activate
```

2. Install dependencies

```bash
pip install -r requirements.txt
```
No Docker. No Kubernetes cluster. No heavy runtimes. Runs in ~40 MB of RAM.
3. Set your API keys in .env

Create a `.env` file in the project root:

```env
GITHUB_TOKEN=ghp_your_personal_access_token
GROQ_API_KEY=gsk_your_groq_api_key
```

- GitHub token → https://github.com/settings/tokens (select the `repo` scope)
- Groq API key → https://console.groq.com (free tier available)
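The repo describes `config.py` as a Pydantic-validated loader; purely to illustrate what reading `.env` involves, here is a minimal stdlib sketch (not the project's actual loader, which you should prefer):

```python
import os

def load_env(path: str = ".env") -> None:
    """Minimal .env loader: KEY=VALUE lines, blank lines and '#' comments ignored.
    Existing environment variables are never overridden."""
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())
```

In practice, libraries like `python-dotenv` or `pydantic-settings` handle quoting and type validation for you; this sketch only shows the core idea.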
4. Run the server

stdio mode (Claude Desktop, Cursor, VS Code):

```bash
python server.py
```

HTTP mode (test client, REST tooling):

```bash
python server.py --http
# → Listening at http://127.0.0.1:8000/mcp
```
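Under the hood, MCP clients speak JSON-RPC 2.0 to that endpoint. As a rough sketch of the request shape only (the tool's argument names below are hypothetical, and a real session also performs an `initialize` handshake before calling tools):

```python
import json
from itertools import count

_ids = count(1)  # monotonically increasing JSON-RPC request ids

def tools_call_request(tool: str, arguments: dict) -> str:
    """Build the body of an MCP `tools/call` JSON-RPC 2.0 request."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": next(_ids),
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })
```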
## 🧪 Run the Demo Client
With the server running in HTTP mode, open a second terminal:
```bash
# Optional: point at a real GitHub repo
export TEST_REPO=octocat/Hello-World
export TEST_ISSUE=1

python test_client.py
```
The client walks through a complete "debug a failing deployment" scenario:
| Step | Tool | Action |
|------|------|--------|
| 1 | prometheus_query_metrics | Detect elevated 5xx error rates and CPU spikes |
| 2 | k8s_get_pod_logs | Identify root cause in pod logs |
| 3a | github_fetch_issue | Pull context from the related bug report |
| 3b | github_create_draft_pr | Open a targeted draft fix PR |
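To make the mock behavior in step 1 concrete, here is a hedged sketch of a keyword-routed Prometheus mock; the routing keywords, label names, and values are illustrative guesses, not the server's actual rules:

```python
import time

def mock_prometheus_query(query: str) -> dict:
    """Route on PromQL keywords and return a Prometheus-API-shaped instant vector."""
    if "cpu" in query:
        metric, value = {"pod": "payment-api-7d9f"}, "0.97"          # near CPU limit
    elif "5xx" in query or "status" in query:
        metric, value = {"status": "500", "service": "payment-api"}, "42.5"
    else:
        metric, value = {"job": "unknown"}, "0"
    return {
        "status": "success",
        "data": {
            "resultType": "vector",
            "result": [{"metric": metric, "value": [time.time(), value]}],
        },
    }
```

The return shape mirrors the real Prometheus HTTP API (`status` / `data.resultType` / `data.result`), which is what lets the mock stand in for a live server during the demo.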
## 🔌 Connect to Claude Desktop

Edit your Claude Desktop config at `~/Library/Application Support/Claude/claude_desktop_config.json` (macOS) or `C:\Users\<you>\AppData\Roaming\Claude\claude_desktop_config.json` (Windows):
```json
{
  "mcpServers": {
    "devops-mcp-hub": {
      "command": "python",
      "args": ["/absolute/path/to/devops_mcp_hub/server.py"]
    }
  }
}
```
Restart Claude Desktop and ask: "Fetch issue #1 from octocat/Hello-World". Claude will automatically call your server.
## 🛠️ Tools Reference
| Tool | Type | Description |
|------|------|-------------|
| github_fetch_issue | 🟢 Real | Fetches issue title, body, labels, assignees via PyGithub |
| github_create_draft_pr | 🟢 Real | Opens a draft PR, returns the URL |
| k8s_get_pod_logs | 🟡 Mock | Realistic logs: CrashLoopBackOff / OOMKilled / CPU throttle |
| prometheus_query_metrics | 🟡 Mock | Real Prometheus API response shape, routes on PromQL keywords |
| ai_analyze_incident | 🤖 Groq | LLM-powered root cause analysis with structured JSON output |
| audit_get_recent_calls | 🗄️ SQLite | Returns recent tool call history from the audit log |
| audit_get_stats | 🗄️ SQLite | Aggregate stats: total calls, error rate, avg latency per tool |
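The two audit tools read from a SQLite log populated by a decorator (per `database.py` in the project layout). Below is a minimal sketch of that pattern using an in-memory database and invented column names, not the project's actual schema:

```python
import functools
import sqlite3
import time

conn = sqlite3.connect(":memory:")  # the real project uses a file-backed DB
conn.execute("CREATE TABLE IF NOT EXISTS audit_log ("
             "tool TEXT, ok INTEGER, latency_ms REAL, ts REAL)")

def audited(fn):
    """Record each call's tool name, success flag, and latency."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        ok = 1
        try:
            return fn(*args, **kwargs)
        except Exception:
            ok = 0
            raise
        finally:
            latency = (time.perf_counter() - start) * 1000
            conn.execute("INSERT INTO audit_log VALUES (?, ?, ?, ?)",
                         (fn.__name__, ok, latency, time.time()))
    return wrapper

def audit_stats() -> dict:
    """Aggregate per-tool stats, in the spirit of audit_get_stats."""
    rows = conn.execute(
        "SELECT tool, COUNT(*), AVG(1 - ok), AVG(latency_ms) "
        "FROM audit_log GROUP BY tool").fetchall()
    return {tool: {"calls": calls, "error_rate": err, "avg_latency_ms": lat}
            for tool, calls, err, lat in rows}
```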
## 📂 Project Structure

```text
devops_mcp_hub/
├── server.py          # MCP server — all 7 tools
├── config.py          # Pydantic-validated config loader
├── database.py        # SQLite audit log + decorator
├── retry.py           # Exponential backoff utility
├── test_client.py     # Demo: debug a failing deployment
├── config.yaml        # Server settings
├── requirements.txt
├── .env               # Your secrets (never commit this)
├── .gitignore
└── README.md
```
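The exponential backoff utility in `retry.py` can be pictured as a small decorator. This is a generic sketch of the pattern, not the file's actual contents:

```python
import functools
import time

def retry(attempts: int = 3, base_delay: float = 0.5, factor: float = 2.0):
    """Retry a flaky call with exponentially growing sleeps between attempts
    (0.5s, 1s, 2s, ... by default); re-raise after the final failure."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            delay = base_delay
            for attempt in range(1, attempts + 1):
                try:
                    return fn(*args, **kwargs)
                except Exception:
                    if attempt == attempts:
                        raise
                    time.sleep(delay)
                    delay *= factor
        return wrapper
    return decorator
```

Production variants usually add jitter and only retry transient error classes (timeouts, HTTP 429/5xx) rather than every exception.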
## 💼 Why This is Portfolio-Worthy
1. MCP is the emerging standard for AI tool integration
- The Model Context Protocol is rapidly becoming the universal interface between AI assistants and external tools, analogous to what LSP did for language servers. Building an MCP server demonstrates fluency with where agentic AI tooling is heading.
2. One server, every AI client
- Traditional AI integrations require writing a custom plugin per model. MCP breaks that coupling. DevOps MCP Hub connects to Claude, Cursor, VS Code, and any future MCP client without a single line of model-specific code.
3. Production-grade patterns at minimal ops cost
- The server uses Pydantic config validation, SQLite audit logging, exponential backoff retries, and structured LLM output, all patterns used in production AI platform engineering.
4. Real AI integration with Groq
- The `ai_analyze_incident` tool sends real pod logs and metrics to Llama 3 via Groq and returns a structured incident report, demonstrating practical LLM integration beyond simple chat.
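Structured output from an LLM is only as useful as the validation behind it. Here is a sketch of the parse-and-validate step with a hypothetical report schema; the real tool's field names may differ:

```python
import json

# Hypothetical schema: the fields ai_analyze_incident actually returns may differ.
EXPECTED_KEYS = {"root_cause", "severity", "remediation"}

def parse_incident_report(raw: str) -> dict:
    """Parse the model's JSON reply and fail loudly on schema drift."""
    report = json.loads(raw)
    missing = EXPECTED_KEYS - report.keys()
    if missing:
        raise ValueError(f"incident report missing fields: {sorted(missing)}")
    return report
```

Validating immediately after the API call keeps malformed model output from silently propagating into downstream tools.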
## 🤝 Contributing
Contributions are welcome! Here's how to get started:
- Fork the repository
- Create a branch for your feature: `git checkout -b feature/your-feature-name`
- Make your changes and add tests if applicable
- Commit with a clear message: `git commit -m "feat: add PagerDuty incident tool"`
- Push to your fork: `git push origin feature/your-feature-name`
- Open a Pull Request describing what you changed and why
### Ideas for contributions
- Real Kubernetes integration using the `kubernetes` Python client
- Real Prometheus integration hitting http://localhost:9090
- PagerDuty or OpsGenie tool
- Slack notification tool
- Unit tests with `pytest`
- CI pipeline with GitHub Actions
### Code style
- Follow the existing docstring format on all tools
- Keep tools focused: one tool, one responsibility
- Never commit `.env` or real API keys
## 📋 Environment Variables
| Variable | Required | Description |
|----------|----------|-------------|
| GITHUB_TOKEN | For GitHub tools | Personal access token with repo scope |
| GROQ_API_KEY | For AI analysis | Free API key from console.groq.com |
| TEST_REPO | For test client | owner/repo to test GitHub calls |
| TEST_ISSUE | For test client | Issue number to fetch (default: 1) |
| TEST_HEAD_BRANCH | For test client | Branch name for draft PR creation |
## 📄 License

MIT License
Copyright (c) 2026 DevOps MCP Hub Contributors
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.