
MCP server by JinLiye

Created 10/7/2025

LLM-MCP-RAG-py

Dependencies

pip install -r requirement.txt

Node.js also needs to be installed.
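Before installing, you can check that the required executables are on your PATH. A minimal sketch, assuming the standard command names `python`, `pip`, and `node` (on some systems the Python interpreter is `python3` instead):

```python
# Sketch: list required executables that are missing from PATH.
# Assumes the conventional names "python", "pip", and "node".
import shutil

def missing_prereqs() -> list:
    """Return the names of required executables not found on PATH."""
    required = ["python", "pip", "node"]
    return [name for name in required if shutil.which(name) is None]

print(missing_prereqs())  # empty list when everything is installed
```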

How To Run

  • Get a chat API key from OpenRouter

  • Get an embedding API key from SiliconFlow

  • Create a .env file and put the API keys in it

    Example .env:

    OPENAI_API_KEY=xxx
    OPENAI_BASE_URL=https://openrouter.ai/api/v1
    
    EMBEDDING_KEY=xxx
    EMBEDDING_BASE_URL=https://api.siliconflow.cn/v1
    
  • To start the main program, use the following command:

    python ./src/main.py
    

    To test the chat functionality module, run:

    python ./src/test_chat.py
    

    To test the MCP client functionality, execute:

    python ./src/testMCPClient.py
    

    To test the Agent module functionality, run:

    python ./src/Agent.py
    

    Note: When testing the Agent, make sure the MCP client is set up correctly. Change the directory path r'C:\Users\32114\Desktop\code\llm-mcp-rag-py\output' to a directory on your machine that the server should be allowed to access.
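The .env file above is plain KEY=VALUE lines. A minimal stdlib-only sketch of how such a file can be parsed; the project itself may load it differently (e.g. via python-dotenv), so this only illustrates the expected format:

```python
# Minimal .env parser sketch (stdlib only). Illustrates the KEY=VALUE
# format shown above; the real project may use a library instead.
def load_env(path: str = ".env") -> dict:
    """Parse simple KEY=VALUE lines, skipping blanks and # comments."""
    values = {}
    with open(path, encoding="utf-8") as fh:
        for raw in fh:
            line = raw.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            values[key.strip()] = value.strip()
    return values

# Example using two keys from the .env template above:
with open(".env", "w", encoding="utf-8") as fh:
    fh.write("OPENAI_API_KEY=xxx\n"
             "OPENAI_BASE_URL=https://openrouter.ai/api/v1\n")
print(load_env()["OPENAI_BASE_URL"])  # https://openrouter.ai/api/v1
```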


Acknowledgments

This project is based on the work of llm-mcp-rag, which was originally implemented in TypeScript. This implementation is a rewrite in Python.

Quick Setup

Install Package (if required)

uvx llm-mcp-rag-py

Cursor configuration (mcp.json)

{
  "mcpServers": {
    "jinliye-llm-mcp-rag-py": {
      "command": "uvx",
      "args": ["llm-mcp-rag-py"]
    }
  }
}
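If you script your editor setup, the same configuration can be emitted programmatically. A small sketch; the server name and arguments are taken verbatim from the config above:

```python
# Sketch: generate the Cursor mcp.json entry shown above from Python.
import json

config = {
    "mcpServers": {
        "jinliye-llm-mcp-rag-py": {
            "command": "uvx",
            "args": ["llm-mcp-rag-py"],
        }
    }
}
print(json.dumps(config, indent=2))
```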