🤖 Modular FAQ Agent with MCP & Memory
An agentic FAQ system using LangGraph, Qwen 3, and Qdrant. Implements Model Context Protocol (MCP) for tool use and stateful memory. Modular, containerized, and ready for deployment.
🚀 Features
- Semantic Search: Uses `nomic-embed-text` embeddings with Qdrant for accurate FAQ retrieval.
- Conversational Memory: Built with LangGraph's `MemorySaver` to track session-based conversation context (a minimal sketch follows this feature list).
- MCP Integration: Tools are decoupled from the agent through a standard Model Context Protocol server.
- Streaming Response: Optional SSE support for real-time interaction.
- Containerized Architecture: Full stack deployment using Docker Compose.
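The sketch below shows how session-scoped memory is typically wired with LangGraph's `MemorySaver`. It is not the repository's actual code: the `ChatOllama` client, the model name, and the `search_faqs` placeholder tool are assumptions for illustration (the real project routes tool calls through an MCP server instead).

```python
# Hypothetical sketch of session-scoped memory with LangGraph's MemorySaver.
from langchain_ollama import ChatOllama
from langgraph.checkpoint.memory import MemorySaver
from langgraph.prebuilt import create_react_agent

llm = ChatOllama(model="qwen3", base_url="http://localhost:11434")  # model name is illustrative

def search_faqs(query: str) -> str:
    """Placeholder FAQ search tool; stands in for the MCP-hosted Qdrant search."""
    return "…retrieved FAQ snippets…"

agent = create_react_agent(llm, tools=[search_faqs], checkpointer=MemorySaver())

# The thread_id plays the role of the API's session_id: turns that share a
# thread_id share conversation history.
config = {"configurable": {"thread_id": "user_123"}}
result = agent.invoke(
    {"messages": [("user", "What is the company remote work policy?")]},
    config,
)
print(result["messages"][-1].content)
```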
📁 Project Structure
```
mcp_faq_agent/
├── API/                  # FastAPI routes and Pydantic models
│   ├── core/             # Business logic (get_content filtering)
│   └── endpoints.py      # Main API endpoints
├── agent/                # LangGraph agent orchestration
├── mcp_custom/           # MCP server hosting search tools
├── qdrant_ingestion/     # Scripts to populate the vector store
├── configurations/       # Settings, environment loading, and YAML prompts
└── utils/                # Shared helper functions
```
🛠️ Getting Started
1. Prerequisites
- Docker & Docker Compose
- (Optional) uv for local development
2. Deployment
Simply build and start the entire stack:
```bash
sudo docker compose up -d --build
```
This will spin up:
- `ollama-qwen-cpu`: The local LLM engine.
- `qdrant`: The vector database.
- `faq-api`: The FastAPI agent entry point.
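For orientation, a stack like this is usually laid out along the following lines. This is an illustrative sketch only, not the repository's actual `docker-compose.yml`; the images, ports, and build settings are assumptions, while the service names come from the list above.

```yaml
# Illustrative compose layout (assumed images, ports, and env handling).
services:
  ollama-qwen-cpu:
    image: ollama/ollama          # local LLM engine
    ports:
      - "11434:11434"
  qdrant:
    image: qdrant/qdrant          # vector database
    ports:
      - "6333:6333"
  faq-api:
    build: .                      # FastAPI agent entry point
    env_file: .env
    ports:
      - "8000:8000"
    depends_on:
      - ollama-qwen-cpu
      - qdrant
```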
3. Ingesting Data (Initial Setup)
To populate the database with your specific FAQs:
```bash
# Run the ingestion script (local context)
uv run python qdrant_ingestion/ingest_faqs.py
```
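The actual ingestion logic lives in `qdrant_ingestion/ingest_faqs.py`; the snippet below is only a hedged sketch of the usual pattern (embed each FAQ with `nomic-embed-text` via Ollama, then upsert into Qdrant). The collection name, payload fields, and URLs are illustrative assumptions.

```python
# Hypothetical ingestion sketch; not the repository's script.
# Assumes the `ollama` and `qdrant-client` Python packages and default local URLs.
import ollama
from qdrant_client import QdrantClient
from qdrant_client.models import Distance, PointStruct, VectorParams

faqs = [
    {"question": "What is the company remote work policy?", "answer": "…"},
    {"question": "What are the common benefits?", "answer": "…"},
]

client = QdrantClient(url="http://localhost:6333")
client.create_collection(
    collection_name="faqs",  # illustrative collection name
    vectors_config=VectorParams(size=768, distance=Distance.COSINE),  # nomic-embed-text is 768-dim
)

points = []
for idx, faq in enumerate(faqs):
    # Embed the question text with the nomic-embed-text model served by Ollama.
    emb = ollama.embeddings(model="nomic-embed-text", prompt=faq["question"])["embedding"]
    points.append(PointStruct(id=idx, vector=emb, payload=faq))

client.upsert(collection_name="faqs", points=points)
```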
📡 API Usage
POST /chat
Body:
```json
{
  "question": "What is the company remote work policy?",
  "session_id": "user_123"
}
```
Example curl:
```bash
curl -X POST http://localhost:8000/chat \
  -H "Content-Type: application/json" \
  -d '{"question": "What are the common benefits?", "session_id": "demo_1"}'
```
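Equivalently, from Python (a minimal sketch using `requests`; the exact response schema is defined by the API's Pydantic models under `API/`, so inspect the JSON rather than relying on the field names shown here):

```python
# Minimal sketch of calling the /chat endpoint with the requests library.
import requests

resp = requests.post(
    "http://localhost:8000/chat",
    json={"question": "What are the common benefits?", "session_id": "demo_1"},
    timeout=60,
)
resp.raise_for_status()
print(resp.json())  # response shape comes from the API's Pydantic models
```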
⚙️ Configuration
The system uses environment variables managed in `.env` (passed via Docker Compose):
- `LLM_MODEL`: The Ollama model to use.
- `OLLAMA_URL`: Connection URL for Ollama.
- `QDRANT_URL`: Connection URL for Qdrant.
- `PROMPT_PATH`: Location of the YAML prompt versioning file.
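An illustrative `.env` follows. The variable names are from the list above; the values are assumptions based on the default Ollama and Qdrant ports and the compose service names, so adjust them to your setup.

```env
# Assumed example values; not the repository's shipped configuration.
LLM_MODEL=qwen3
OLLAMA_URL=http://ollama-qwen-cpu:11434
QDRANT_URL=http://qdrant:6333
PROMPT_PATH=configurations/prompts.yaml
```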
📄 License
MIT
Quick Setup
Installation guide for this server.
Install command (package not published):
```bash
git clone https://github.com/iahiram/Modular-FAQ-Agent-with-MCP
```
Manual installation: see the README above for detailed setup instructions and any additional dependencies.
Cursor configuration (mcp.json):
```json
{
  "mcpServers": {
    "iahiram-modular-faq-agent-with-mcp": {
      "command": "git",
      "args": [
        "clone",
        "https://github.com/iahiram/Modular-FAQ-Agent-with-MCP"
      ]
    }
  }
}
```