MCP-based agentic AI system enabling LLMs to fetch and reason over real-time global weather data using standardized tool invocation. Built with MCP Server/Client, Llama 3, and external APIs to deliver low-latency, hallucination-free weather intelligence through protocol-driven workflows.
🌤️ Weather MCP Agent: Global Intelligence System
Next-Generation Agentic AI powered by the Model Context Protocol (MCP) and Llama 3.3 (via Groq). A real-time, multi-modal weather intelligence system that bridges the gap between Large Language Models and deterministic data tools.
🚀 Overview
The Weather MCP Agent is a state-of-the-art implementation of the Model Context Protocol (MCP), designed to demonstrate the future of AI interoperability. Unlike traditional chatbots that hallucinate data, this agent uses a standardized protocol to "connect" to live tools—fetching real-time weather forecasts, alerts, and atmospheric analytics for any city on Earth.
Built on Streamlit for a reactive UI and powered by Groq's LPU for near-instant inference, this system showcases how Agentic AI can orchestrate complex workflows (Geocoding -> Weather API -> Conversational Synthesis) in milliseconds.
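As a rough illustration of that workflow, the sketch below shows the tool side of the pipeline: geocode a city name, then fetch current conditions. It assumes Open-Meteo's public geocoding and forecast endpoints (no API key needed); the function names are illustrative and not taken from the app's source.

```python
# Minimal sketch of the tool side of the pipeline, assuming Open-Meteo's public
# geocoding and forecast endpoints (no API key required). Function names are
# illustrative only.
import requests

def geocode_city(city: str) -> tuple[float, float]:
    """Resolve a city name to latitude/longitude via Open-Meteo's geocoding API."""
    resp = requests.get(
        "https://geocoding-api.open-meteo.com/v1/search",
        params={"name": city, "count": 1},
        timeout=10,
    )
    resp.raise_for_status()
    hit = resp.json()["results"][0]
    return hit["latitude"], hit["longitude"]

def current_weather(city: str) -> dict:
    """Fetch current conditions: the raw data the agent then summarizes in prose."""
    lat, lon = geocode_city(city)
    resp = requests.get(
        "https://api.open-meteo.com/v1/forecast",
        params={"latitude": lat, "longitude": lon, "current_weather": "true"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["current_weather"]

if __name__ == "__main__":
    print(current_weather("Tokyo"))  # e.g. temperature, windspeed, weathercode
```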
🌐🎬 Live Demo
🚀 Try it now:
- Streamlit Profile - https://share.streamlit.io/user/ratnesh-181998
- Project Demo - https://weather-mcp-a2a-agent-to-agent3a95dbsjhgfhd3yussfz3e.streamlit.app/
🌟 Key Capabilities
- 🌍 Global Coverage: Instant weather intelligence for 100,000+ cities worldwide.
- ⚡ Hyper-Fast Inference: Runs Llama 3.3 70B on Groq LPUs for sub-second reasoning.
- 🔌 Standardized Tooling: Built 100% on the open-source MCP standard (see the server sketch after this list).
- 🗣️ Multi-Modal Input: Supports both Text and Voice (WebRTC) interaction.
- 🧠 Smart Context: Maintains conversation history and context-aware responses.
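To make the MCP claim concrete, here is a minimal sketch of how a weather tool could be exposed over the protocol using the FastMCP helper from the official `mcp` Python SDK. The tool body is a placeholder; the production server would plug in the live API calls described later in this README.

```python
# Minimal MCP server sketch using FastMCP from the official `mcp` Python SDK.
# The tool body is a placeholder; a real server would call the live weather APIs.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("weather")

@mcp.tool()
def get_current_weather(city: str) -> str:
    """Return a short summary of current conditions for the given city."""
    return f"Current weather for {city}: (live data would be fetched here)"

if __name__ == "__main__":
    # stdio transport lets an MCP client (e.g. the Streamlit host) spawn this server.
    mcp.run(transport="stdio")
```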
🎮 Interface & Features by Tab
The application is structured into 5 professional modules, each serving a specific purpose in the Agentic workflow:
Live Interface Preview
Figure: High-definition view of the Smart Weather Dashboard.

1. 🚀 Project Demo (Interactive Core)
The command center of the application.
- AI Chat Interface: Real-time conversation with the Agent.
- Quick Action Grid: One-click execution for 16+ common scenarios.
- Smart City Extraction: NLP-powered parsing that pulls city names out of complex queries.
- Voice Input: Speak naturally to the agent (voice path sketched below).
⚡ See it in Action:
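For the voice path specifically, a minimal local equivalent looks roughly like the sketch below. It uses the SpeechRecognition package rather than the app's browser-side WebRTC capture, so treat it as an approximation of the flow, not the app's actual code.

```python
# Rough local equivalent of the voice-input path using the SpeechRecognition
# package (the app itself captures audio in the browser via WebRTC).
# Requires a microphone and the PyAudio backend.
import speech_recognition as sr

def transcribe_from_mic() -> str:
    recognizer = sr.Recognizer()
    with sr.Microphone() as source:
        recognizer.adjust_for_ambient_noise(source, duration=0.5)
        audio = recognizer.listen(source)
    # Free Google Web Speech API; swap in another recognizer if preferred.
    return recognizer.recognize_google(audio)

if __name__ == "__main__":
    print("Say something like: what's the weather in Paris?")
    print("You said:", transcribe_from_mic())
```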
2. ℹ️ About Project (Educational Hub)
A detailed breakdown of the paradigm shift in AI.
- Evolution Timeline: Visualizing the shift from Static LLMs -> Tool-Use Agents -> MCP Ecosystems.
- Protocol Comparison: Why MCP is superior to proprietary plugin architectures.
- Interactive Simulations: Step-by-step walkthroughs of the agent's decision-making process.
3. 🛠️ Tech Stack (Under the Hood)
Transparency in engineering.
- AI Core: Llama 3.3 70B (Reasoning), LangChain (Orchestration).
- Frontend: Streamlit Async Runtime, Custom CSS theming.
- Connectivity: `mcp-use` client, `requests` library, RESTful APIs (Open-Meteo, NWS); how these pieces fit together is sketched below.
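The sketch below shows one plausible way the stack above connects: a Groq-hosted Llama 3.3 model bound to a weather tool via LangChain. The model id and the tool body are assumptions for illustration, not copied from `Weather_streamlit_app.py`.

```python
# Sketch of the reasoning layer: a Groq-hosted Llama 3.3 model bound to a weather
# tool via LangChain (package: langchain-groq). Model id and tool body are
# illustrative assumptions, not copied from Weather_streamlit_app.py.
# Requires GROQ_API_KEY in the environment (see Installation below).
from langchain_core.tools import tool
from langchain_groq import ChatGroq

@tool
def get_weather(city: str) -> str:
    """Get the current weather for a city."""
    return f"(weather payload for {city} from the MCP weather server)"

llm = ChatGroq(model="llama-3.3-70b-versatile", temperature=0)
llm_with_tools = llm.bind_tools([get_weather])

response = llm_with_tools.invoke("Do I need an umbrella in London today?")
print(response.tool_calls)  # expect a call like get_weather(city="London")
```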
4. 🏗️ Architecture (System Design)
Enterprise-grade visualization of the system.
- Data Flow: User -> Streamlit -> Agent -> MCP Client -> Tool -> Response.
- Graphviz Charts: Dynamically generated DAGs (Directed Acyclic Graphs) of the agent's logic (see the sketch after this list).
- Network Topology: Visualizing how the Host, Client, and Server interact.
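As an example of how such a DAG can be generated, the sketch below builds the data-flow graph above with the `graphviz` package; in a Streamlit app the resulting object can be rendered with `st.graphviz_chart`. Node labels are illustrative.

```python
# Sketch of the data-flow DAG above, built with the graphviz package.
# In Streamlit this object can be passed straight to st.graphviz_chart(flow).
import graphviz

flow = graphviz.Digraph("weather_agent_flow")
flow.attr(rankdir="LR")  # left-to-right layout

for src, dst in [
    ("User", "Streamlit UI"),
    ("Streamlit UI", "Agent (Llama 3.3)"),
    ("Agent (Llama 3.3)", "MCP Client"),
    ("MCP Client", "Weather Tool"),
    ("Weather Tool", "Response"),
]:
    flow.edge(src, dst)

print(flow.source)  # DOT source for the chart
```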
Detailed Graph Visualization
📂 Detailed Architecture Diagrams
- Diagram 1: Basic MCP Architecture
- Diagram 2: LLM + MCP Tool Selection Flow
- Diagram 3: Multi-Tool MCP Server
- Diagram 4: A2A Multi-Agent Architecture
- Diagram 5: MCP vs A2A Combined System
5. 📋 System Logs (Observability)
Production-ready monitoring.
- Real-time Event Stream: Live tracking of every thought, tool call, and API response.
- Status Codes: Visual indicators for `SUCCESS`, `ERROR`, and `INFO`.
- Audit Trails: Downloadable JSON/TXT logs for debugging and analytics (a minimal sketch follows).
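A stripped-down version of such an event log might look like the sketch below: structured events carrying a status code and timestamp, exportable as a JSON audit trail. Class and field names are illustrative, not the app's actual implementation.

```python
# Stripped-down sketch of the observability layer: structured events with a
# status code and timestamp, exportable as a JSON audit trail.
# Class and field names are illustrative, not the app's actual implementation.
import json
import time

class EventLog:
    def __init__(self) -> None:
        self.events: list[dict] = []

    def log(self, status: str, message: str) -> None:
        """status is one of SUCCESS, ERROR, INFO (as surfaced in the Logs tab)."""
        self.events.append({"ts": time.time(), "status": status, "message": message})

    def export_json(self, path: str) -> None:
        with open(path, "w") as fh:
            json.dump(self.events, fh, indent=2)

log = EventLog()
log.log("INFO", "Agent received query: weather in Mumbai")
log.log("SUCCESS", "MCP weather tool returned a forecast payload")
log.export_json("audit_trail.json")
```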
🛠️ Technology Stack
| Component | Technology | Purpose |
| :--- | :--- | :--- |
| Orchestration | LangChain | Manages the ReAct (Reason+Act) loop and prompt engineering. |
| Protocol | MCP (Model Context Protocol) | The universal standard for connecting AI models to external tools. |
| Inference Engine | Groq LPU | Provides the speed necessary for real-time agentic workflows. |
| LLM | Llama 3.3 70B | The "Brain" capable of complex tool selection and JSON parsing. |
| Frontend | Streamlit | Delivers a responsive, Python-native web interface. |
| Data Source | Open-Meteo API | Provides high-precision weather data without API keys. |
| Audio | SpeechRecognition / WebRTC | Handles voice-to-text conversion. |
⚙️ Installation & Local Setup
Follow these steps to run the agent on your local machine.
Prerequisites
- Python 3.10+
- A Groq API Key (Free)
1. Clone the Repository
git clone https://github.com/Ratnesh-181998/weather-mcp-a2a.git
cd weather-mcp-a2a
2. Set Up Virtual Environment
python -m venv venv
# Windows
venv\Scripts\activate
# Mac/Linux
source venv/bin/activate
3. Install Dependencies
pip install -r requirements.txt
4. Configure Secrets
Create a .env file in the root directory:
GROQ_API_KEY=your_actual_api_key_here
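The app needs this key at runtime. A typical loading pattern (shown here with `python-dotenv`; the actual script may read it differently) is:

```python
# Typical pattern for reading the key from .env (uses the python-dotenv package);
# the actual script may load it differently.
import os
from dotenv import load_dotenv

load_dotenv()  # reads .env in the project root into environment variables
groq_api_key = os.getenv("GROQ_API_KEY")
if not groq_api_key:
    raise RuntimeError("GROQ_API_KEY is missing; add it to .env before launching the app.")
```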
5. Run the App
streamlit run Weather_streamlit_app.py
🐳 Large File Support (Git LFS)
This repository may contain large assets (images/diagrams). We use Git LFS to manage them efficiently.
# Install Git LFS
git lfs install
# Track large files
git lfs track "*.png"
git lfs track "*.jpg"
# Push to remote
git add .
git commit -m "Add large visual assets"
git push origin main
🤝 Contributing
Contributions are welcome! Please follow these steps:
- Fork the repository.
- Create a feature branch (`git checkout -b feature/AmazingFeature`).
- Commit your changes (`git commit -m 'Add some AmazingFeature'`).
- Push to the branch (`git push origin feature/AmazingFeature`).
- Open a Pull Request.
📜 License
This project is licensed under the MIT License - see the LICENSE file for details.
📞 Contact & Community
Ratnesh Kumar Singh
Data Scientist (AI/ML Engineer, 4+ years of experience)
- 💼 LinkedIn: ratneshkumar1998
- 🐙 GitHub: Ratnesh-181998
- 🌐 Live Demo: Streamlit Cloud App
Project Links
- 📖 Documentation: GitHub Wiki
- 🐛 Issue Tracker: GitHub Issues
- 💬 Discussions: GitHub Discussions
