MCP server by Durannd
# Vertex AI MCP Server
A lightweight, zero-config MCP server to offload massive repository analysis from your local IDE to Google Cloud's Vertex AI (Gemini).
This server acts as a bridge, allowing any MCP-compatible client (like Gemini CLI, Cursor, Antigravity, etc.) to securely use your enterprise Google Cloud account for heavy-lifting AI tasks, powered by Gemini models.
## 💡 Key Feature: Zero-Token Offloading
The primary goal of this project is to enable "Zero-Token Offloading". Instead of your local tools reading and processing entire repositories—consuming your personal API keys and hitting local context window limits—this server does the work:
- It reads the repository from your local disk, respecting `.gitignore` and other ignore files.
- It intelligently packages the code along with your prompt.
- It sends the entire workload to Vertex AI, using your secure, enterprise-grade `gcloud` credentials.
This allows you to leverage the massive context windows of models like Gemini 3.1 Pro (2M+ tokens) and use your company's Google Cloud credits instead of paying out-of-pocket.
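To make the flow above concrete, here is a minimal TypeScript sketch of the packaging step. The helper names and the simplified suffix/prefix ignore check are illustrative assumptions; the real server performs proper `.gitignore` parsing.

```typescript
// Illustrative sketch only: the server's actual packaging logic differs,
// and real .gitignore matching is more involved than this helper.
interface RepoFile {
  path: string;
  content: string;
}

// Stand-in for patterns read from .gitignore and other ignore files.
const ignorePatterns = ["node_modules/", "dist/", ".env"];

function isIgnored(path: string): boolean {
  return ignorePatterns.some((p) =>
    p.endsWith("/")
      ? path.startsWith(p) || path.includes("/" + p)
      : path.endsWith(p)
  );
}

// Concatenate the surviving files and the user's prompt into one payload
// that fits in a large Vertex AI context window.
function packageRepo(files: RepoFile[], userPrompt: string): string {
  const body = files
    .filter((f) => !isIgnored(f.path))
    .map((f) => `--- FILE: ${f.path} ---\n${f.content}`)
    .join("\n\n");
  return `${userPrompt}\n\n${body}`;
}

const payload = packageRepo(
  [
    { path: "src/index.ts", content: "export const x = 1;" },
    { path: "node_modules/pkg/index.js", content: "// vendored code" },
  ],
  "Summarize this repository."
);
console.log(payload.includes("src/index.ts")); // true
console.log(payload.includes("node_modules")); // false
```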
## 🛠️ Prerequisites

- Node.js (v18 or higher)
- Google Cloud CLI (`gcloud`) installed on your machine.
- A Google Cloud Project with the Vertex AI API enabled.
## 🚀 Configuration (Choose your method)
This server is designed to be Zero-Config. It will try multiple ways to find your Google Cloud Project ID.
### 1. Automatic (Recommended)

Simply authenticate your terminal using the Google Cloud CLI. The server will automatically detect your project ID from these credentials.

```bash
gcloud auth application-default login
```
### 2. Global Config (For multiple projects)

If you work across multiple projects or want a persistent setup, create a file at `~/.vertex-mcp.json` (in your user's home directory):

```json
{
  "GOOGLE_CLOUD_PROJECT_ID": "your-project-id",
  "GOOGLE_CLOUD_LOCATION": "us-central1"
}
```
### 3. Environment Variables

You can also set the project ID directly in your MCP client's settings (e.g., in Gemini CLI's `settings.json` or Cursor's configuration).
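For clients that use the common `mcpServers` configuration shape (such as Gemini CLI's `settings.json`), the variables shown in the global-config example can instead be supplied in an `env` block. The path below is a placeholder:

```json
{
  "mcpServers": {
    "vertex-ai": {
      "command": "node",
      "args": ["/absolute/path/to/vertex-ai-mcp/dist/index.js"],
      "env": {
        "GOOGLE_CLOUD_PROJECT_ID": "your-project-id",
        "GOOGLE_CLOUD_LOCATION": "us-central1"
      }
    }
  }
}
```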
## 📦 Installation & Build

- Install dependencies:

  ```bash
  npm install
  ```

- Build the server:

  ```bash
  npm run build
  ```

  This will compile the TypeScript code into the `dist/` folder.
## 🔌 Connecting to a Client
This MCP server communicates using Stdio (Standard I/O), not a network port. Your client application will run the server as a background process.
### Example: Connecting the Gemini CLI
To add this server to the Gemini CLI, run the following command from the root of this project directory. This is the recommended and most robust method.
```bash
gemini mcp add vertex-ai "node C:\Users\nicae\OneDrive\Documentos\Projetos\vertex-ai-mcp\dist\index.js"
```
This command tells Gemini to use "vertex-ai" as the name for this server and specifies how to start it.
### Example: Connecting Other Clients (e.g., Antigravity)
Some clients may require an absolute path to the server's entry point.
1. Get the absolute path to the compiled `index.js` file inside the `dist/` folder.

2. Add the server configuration to your client. The example below is for Antigravity's `mcp_config.json` (note that backslashes must be escaped in JSON on Windows):

   ```json
   {
     "mcpServers": {
       "vertex-ai": {
         "command": "node",
         "args": [
           "C:\\path\\to\\your\\project\\vertex-ai-mcp\\dist\\index.js"
         ]
       }
     }
   }
   ```

3. Restart your client completely for the changes to take effect.
## 🐳 Docker Support
You can also run the server inside a Docker container.
### 1. Build the Image

```bash
docker build -t vertex-ai-mcp .
```
### 2. Run with GCP Credentials
To allow the container to access your Google Cloud credentials, mount your local ADC file as a volume.
**Windows (PowerShell):**

```powershell
docker run -i --rm `
  -v "${env:APPDATA}\gcloud\application_default_credentials.json:/.config/gcloud/application_default_credentials.json" `
  vertex-ai-mcp
```
**Linux/macOS:**

```bash
docker run -i --rm \
  -v "$HOME/.config/gcloud/application_default_credentials.json:/.config/gcloud/application_default_credentials.json" \
  vertex-ai-mcp
```
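If you use the Docker image, your MCP client can launch the container instead of invoking Node directly. A sketch of the corresponding configuration, assuming a Linux/macOS ADC path (adjust the host-side path for Windows as shown above):

```json
{
  "mcpServers": {
    "vertex-ai": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "-v", "/home/you/.config/gcloud/application_default_credentials.json:/.config/gcloud/application_default_credentials.json",
        "vertex-ai-mcp"
      ]
    }
  }
}
```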
## 🛠️ Available Tools
Once connected, your client will have access to the following tools:
- `ask_vertex_agent`: Sends a generic prompt to a Vertex AI model.
- `vertex_analyze_repo`: Reads an entire local directory, packages its content, and asks Vertex AI to analyze it based on your prompt.
- `vertex_analyze_ui_screenshots`: Reads a local folder of images and asks Vertex AI to perform a visual analysis.
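Under the hood, MCP clients invoke these tools with JSON-RPC `tools/call` requests over stdio. For illustration, a call to `ask_vertex_agent` might look like the following (the `prompt` argument name is an assumption, not taken from the server's actual tool schema):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "ask_vertex_agent",
    "arguments": {
      "prompt": "Summarize the tradeoffs of stdio vs HTTP transports."
    }
  }
}
```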