# LLM-MCP-RAG-py

An MCP server by JinLiye.
## Dependencies

```shell
pip install -r requirement.txt
```

Node.js also needs to be installed.
## How To Run

- Get a chat API key from OpenRouter.
- Get an embedding API key from SiliconFlow.
- Create a `.env` file and put the API keys in it.

Example `.env`:

```
OPENAI_API_KEY=xxx
OPENAI_BASE_URL=https://openrouter.ai/api/v1
EMBEDDING_KEY=xxx
EMBEDDING_BASE_URL=https://api.siliconflow.cn/v1
```
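A minimal sketch of collecting those keys once they are loaded (e.g. via python-dotenv or `os.environ`); the project's actual loading code may differ, and `load_config` is an illustrative helper, not part of the repository:

```python
# Sketch only: assumes the keys are already in a dict (for example loaded
# with python-dotenv into os.environ). load_config is a hypothetical helper.
def load_config(env: dict) -> dict:
    """Return the four keys the .env example defines, failing loudly if any is missing."""
    required = ["OPENAI_API_KEY", "OPENAI_BASE_URL",
                "EMBEDDING_KEY", "EMBEDDING_BASE_URL"]
    missing = [k for k in required if not env.get(k)]
    if missing:
        raise RuntimeError(f"missing keys in .env: {missing}")
    return {k: env[k] for k in required}
```

Failing fast here gives a clearer error than an authentication failure deep inside an API call.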
To start the main program:

```shell
python ./src/main.py
```

To test the chat functionality module:

```shell
python ./src/test_chat.py
```

To test the MCP client functionality:

```shell
python ./src/testMCPClient.py
```

To test the Agent module:

```shell
python ./src/Agent.py
```

Note: when testing the Agent, make sure the MCP client is set up correctly. Modify the directory path `r'C:\Users\32114\Desktop\code\llm-mcp-rag-py\output'` to a directory on your machine that the agent should be allowed to access.
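One way to avoid editing that hard-coded path by hand is to resolve it from an environment variable. The helper and the `AGENT_OUTPUT_DIR` variable name below are hypothetical, not part of the project:

```python
import os
from pathlib import Path

# Hypothetical helper (not in the repository): resolve the agent's allowed
# output directory from an assumed AGENT_OUTPUT_DIR environment variable,
# falling back to ./output, instead of a hard-coded Windows path.
def resolve_output_dir(env=None) -> Path:
    env = dict(os.environ) if env is None else env
    raw = env.get("AGENT_OUTPUT_DIR", "./output")
    path = Path(raw).expanduser().resolve()
    path.mkdir(parents=True, exist_ok=True)  # create it if it does not exist
    return path
```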
## LLM / MCP / RAG

- LLM: Large Language Model
- MCP: Model Context Protocol
- RAG: Retrieval Augmented Generation
  - Chinese translation of an overview: https://www.yuque.com/serviceup/misc/cn-retrieval-augmented-generation-overview
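The core of the RAG retrieval step can be illustrated as ranking documents by cosine similarity between a query embedding and document embeddings. In this project the real vectors would come from the SiliconFlow embedding API; the vectors and helper names below are made up for illustration:

```python
import math

# Toy RAG retrieval sketch: embeddings are plain lists of floats here.
def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def top_k(query_vec, doc_vecs, k=1):
    """Indices of the k documents most similar to the query."""
    order = sorted(range(len(doc_vecs)),
                   key=lambda i: cosine(query_vec, doc_vecs[i]),
                   reverse=True)
    return order[:k]
```

The retrieved documents are then prepended to the prompt so the LLM can answer from them.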
## Acknowledgments

This project is based on llm-mcp-rag, which was originally implemented in TypeScript; this implementation is a rewrite in Python.