# MCP + LlamaIndex Example: Weather Agent with OpenWeather API
## Overview
This project demonstrates how to use LlamaIndex with MCP tool integration to enable LLM agents to fetch live weather information from the OpenWeather API.
## Features
- **Real-Time Weather Retrieval:** The agent dynamically accesses the OpenWeather API via an MCP server and can answer any location-based weather query.
- **LlamaIndex & MCP Integration:** Leverages the `FunctionAgent` workflow with external tool calls for flexible and scalable augmentation of LLM agents (see the sketch below).
- **Fully Customizable:** Adapt the agent to any external API simply by exposing new MCP tools, demonstrating rapid AI engineering capability.
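For a sense of how the pieces fit together, here is a minimal sketch of wiring MCP tools into a LlamaIndex `FunctionAgent`. The SSE endpoint, model name, and prompts are illustrative assumptions; the actual agent setup lives in `mcp_openweather.ipynb`.

```python
from llama_index.core.agent.workflow import FunctionAgent
from llama_index.llms.google_genai import GoogleGenAI
from llama_index.tools.mcp import BasicMCPClient, McpToolSpec


async def build_agent() -> FunctionAgent:
    # Connect to the running MCP server (the SSE URL/port are assumptions;
    # match them to how mcp_server.py is started).
    mcp_client = BasicMCPClient("http://127.0.0.1:8000/sse")
    tool_spec = McpToolSpec(client=mcp_client)

    # Convert the MCP tools (e.g. fetch_weather) into LlamaIndex tools.
    tools = await tool_spec.to_tool_list_async()

    # The FunctionAgent decides when to call the weather tool for a query.
    return FunctionAgent(
        name="WeatherAgent",
        description="Answers live weather questions via the OpenWeather MCP tool.",
        tools=tools,
        llm=GoogleGenAI(model="gemini-2.0-flash"),  # model name is an assumption
        system_prompt="Use the provided tools to answer weather questions.",
    )
```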
## Usage

```bash
# get the code
git clone https://github.com/teddytesfa/llamaindex-mcp-openweather-agent
cd llamaindex-mcp-openweather-agent

# create a virtual environment
python3 -m venv venv
source venv/bin/activate

# install dependencies
pip install "mcp>=1.9.1,<2" python-dotenv llama_index llama-index-tools-mcp llama-index-llms-google-genai
cp .env.example .env
# NOTE: edit the .env file to have the correct values!

# run the server
python mcp_server.py --server_type=sse
```
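The server script itself is not reproduced in this README, but a `fetch_weather` MCP tool built on the `mcp` package's FastMCP helper might look roughly like the sketch below. The environment-variable name, the use of `httpx`, metric units, and the response formatting are assumptions rather than the repository's exact code.

```python
import os

import httpx
from dotenv import load_dotenv
from mcp.server.fastmcp import FastMCP

load_dotenv()
OPENWEATHER_API_KEY = os.environ["OPENWEATHER_API_KEY"]  # assumed variable name

mcp = FastMCP("weather")


@mcp.tool()
async def fetch_weather(city: str) -> str:
    """Fetch the current weather for a city from the OpenWeather API."""
    async with httpx.AsyncClient() as client:
        resp = await client.get(
            "https://api.openweathermap.org/data/2.5/weather",
            params={"q": city, "appid": OPENWEATHER_API_KEY, "units": "metric"},
        )
        resp.raise_for_status()
        data = resp.json()
    return (
        f"{data['weather'][0]['description']}, "
        f"{data['main']['temp']}°C, {data['main']['humidity']}% humidity"
    )


if __name__ == "__main__":
    # The real script takes a --server_type flag; this sketch hard-codes SSE.
    mcp.run(transport="sse")
```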
## Setup

- Create a `.env` file containing your OpenAI API key and OpenWeather API key.
- Ensure a valid MCP server is running and exposes a `fetch_weather` tool using the OpenWeather API.
- Run the notebook (`mcp_openweather.ipynb`) and ask weather-related questions interactively (a minimal example follows).
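With the server running and an agent built (see the `FunctionAgent` sketch above), a notebook cell along these lines issues a query; the actual cells in `mcp_openweather.ipynb` may differ.

```python
# In a notebook cell (top-level await is available in Jupyter):
agent = await build_agent()  # hypothetical helper from the sketch above
response = await agent.run("What's the weather in Addis Ababa?")
print(response)
```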
## Example

**User:** What's the weather in Addis Ababa?

**Agent:** The current weather in Addis Ababa is clear sky, with a temperature of 21.7°C and 37% humidity.
Author: Teddy Tesfa | AI Engineer
See more at https://github.com/teddytesfa