May 12, 2026 10 min read
MCP for GCP: how Cloud Horizon connects to Compute Engine and Cloud SQL without touching your network
The Model Context Protocol is an open standard for connecting AI assistants to live systems. We built a GCP MCP server that reads Compute Engine, Cloud SQL, and cost data with nothing but a service account key. The architecture, the security model, and why it matters for multi-cloud operations.
Every multi-cloud team eventually wants the same thing: ask a question about their infrastructure in plain English and get a precise, current answer. The Model Context Protocol (MCP) is the open standard that makes that possible. It is not a product. It is a contract between an AI assistant and an external system. Claude, Cursor, and any other MCP client can discover tools, call them, and reason about the results.
Cloud Horizon ships an MCP server for GCP that exposes Compute Engine, Cloud SQL, Cloud Storage, GKE, Billing, and Monitoring data as MCP tools. No agents run inside your VPC. No Terraform modules to deploy. The only credential is a GCP service account key with read-only scopes.
What MCP actually is
MCP has three primitives: tools (functions the AI can call), resources (data the AI can read), and prompts (pre-shaped queries the AI can run). A client (Claude Desktop, Cursor, or any custom integration) connects to an MCP server over stdio or SSE. The server advertises its tools. The client picks the right ones based on context.
For Cloud Horizon, this means Claude can ask: "What are my most expensive Compute Engine instances in us-central1?" The client discovers the `gcp_compute_list_instances` and `gcp_compute_get_metrics` tools, calls them, and presents the results in a structured table.
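To make this concrete, here is one common way to point Claude Desktop at a remote MCP server, using the `mcp-remote` npm bridge in `claude_desktop_config.json`. The server name and URL below are placeholders, not the actual Cloud Horizon endpoint:

```json
{
  "mcpServers": {
    "cloud-horizon-gcp": {
      "command": "npx",
      "args": ["mcp-remote", "https://your-cloud-horizon-host.example/mcp"]
    }
  }
}
```

Once the client connects, the GCP tools show up alongside any others you have configured, and the model selects among them by name and description.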
Architecture: why we chose Cloudflare Workers
The MCP server runs as a Cloudflare Pages Function, not a persistent VM. Each tool invocation is a stateless HTTP request. The service account key is stored in Cloudflare Secrets, never in code. The Function generates a short-lived access token from the key, calls the GCP API, and returns the result.
This matters for two reasons. First, there is nothing to patch. The Function is ephemeral and runs on Cloudflare's edge. Second, the blast radius is tiny. The service account has read-only IAM roles on a specific project. It cannot create, delete, or modify anything.
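The per-request token flow can be sketched as follows. Helper and type names here are illustrative, not the actual Cloud Horizon source; the shape of the claims object, however, is what Google's OAuth2 service-account flow expects:

```typescript
// Sketch of the per-request token flow (hypothetical helper names).
// The stored service account key is exchanged for a short-lived OAuth2
// access token on every invocation, so no long-lived token exists anywhere.

interface ServiceAccountKey {
  client_email: string;
  private_key: string; // PEM-encoded RSA key from the downloaded JSON blob
  token_uri: string;   // normally https://oauth2.googleapis.com/token
}

// Claims for the signed JWT that Google exchanges for an access token.
// The read-only scope caps what any leaked token could do.
function buildTokenClaims(key: ServiceAccountKey, nowSeconds: number) {
  return {
    iss: key.client_email,
    scope: "https://www.googleapis.com/auth/cloud-platform.read-only",
    aud: key.token_uri,
    iat: nowSeconds,
    exp: nowSeconds + 3600, // assertion lifetime is capped at one hour
  };
}

// In the Function, these claims are signed with crypto.subtle (RS256) and
// POSTed to token_uri as a urn:ietf:params:oauth:grant-type:jwt-bearer
// assertion; the returned access token authorizes the GCP API call.
```

Because the token is minted per request and scoped read-only, a compromised Function instance holds nothing worth stealing for more than an hour.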
Security model: least privilege by design
The service account we recommend has exactly three roles:
- `roles/compute.viewer` — see VM inventory, disks, networks
- `roles/cloudsql.viewer` — see SQL instances, connections, storage
- `roles/billing.viewer` — see cost data and forecasts
No Editor role. No Owner role. The account cannot start VMs, change firewall rules, or read secrets from Secret Manager. If the key were leaked, the worst case is an attacker reads your VM list. That is the design.
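Provisioning such an account takes a few `gcloud` commands. The project ID and account name below are placeholders; substitute your own:

```shell
# Create a dedicated read-only service account (names are illustrative).
gcloud iam service-accounts create horizon-readonly \
  --project=my-project \
  --display-name="Cloud Horizon read-only"

# Grant exactly the three viewer roles, nothing broader.
for role in roles/compute.viewer roles/cloudsql.viewer roles/billing.viewer; do
  gcloud projects add-iam-policy-binding my-project \
    --member="serviceAccount:horizon-readonly@my-project.iam.gserviceaccount.com" \
    --role="$role"
done

# Download the JSON key that will be stored in Cloudflare Secrets.
gcloud iam service-accounts keys create key.json \
  --iam-account=horizon-readonly@my-project.iam.gserviceaccount.com
```

Auditing the account later is equally simple: if it ever accumulates a role outside this list, something is wrong.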
What the tools expose
Each GCP service maps to one or two MCP tools:
| Tool | What it returns |
|---|---|
| `gcp_compute_list_instances` | VMs, zones, machine types, status, labels |
| `gcp_compute_get_metrics` | CPU, memory, disk, network for a specific instance |
| `gcp_sql_list_instances` | SQL instances, engines, versions, storage, connectivity |
| `gcp_sql_get_metrics` | Connection count, CPU, storage for a specific instance |
| `gcp_storage_list_buckets` | Buckets, storage classes, access patterns |
| `gcp_billing_get_costs` | MTD spend, daily trends, SKU-level breakdowns |
Each tool takes JSON parameters. `gcp_compute_get_metrics` requires `instance_name` and `zone`. `gcp_billing_get_costs` takes optional `start_date` and `end_date`. The rest is automatic.
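On the wire, an MCP tool invocation is a JSON-RPC 2.0 `tools/call` request. The argument values here are illustrative:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "gcp_compute_get_metrics",
    "arguments": {
      "instance_name": "web-1",
      "zone": "us-central1-a"
    }
  }
}
```

You never write this payload by hand: the MCP client constructs it from the tool schema the server advertises, which is why parameter names like `instance_name` and `zone` must be exact.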
Demo mode: no credentials required
We ship each tool with a demo mode. If no service account key is configured, the tool returns realistic mock data so you can explore the surface. The demo data is structurally identical to real GCP API responses, so when you connect your account, nothing changes except the values.
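A minimal sketch of that fallback, with hypothetical names and trimmed-down demo data (the real server exposes many more tools and fields):

```typescript
// Demo-mode fallback: if no service account key is configured, serve mock
// data with the same shape as the live Compute Engine response.

interface Env {
  GCP_SERVICE_ACCOUNT_KEY?: string;
}

interface InstanceSummary {
  name: string;
  zone: string;
  machineType: string;
  status: string;
}

// Mock inventory, structurally identical to what the live path returns.
const DEMO_INSTANCES: InstanceSummary[] = [
  { name: "web-1", zone: "us-central1-a", machineType: "e2-standard-4", status: "RUNNING" },
  { name: "batch-1", zone: "us-central1-b", machineType: "n2-highmem-8", status: "TERMINATED" },
];

async function listInstances(
  env: Env
): Promise<{ mode: "demo" | "live"; instances: InstanceSummary[] }> {
  if (!env.GCP_SERVICE_ACCOUNT_KEY) {
    // No key configured: explore the tool surface with realistic values.
    return { mode: "demo", instances: DEMO_INSTANCES };
  }
  // Live path: exchange the key for a token and call the Compute API
  // (omitted in this sketch).
  throw new Error("live mode requires a real service account key");
}
```

Because both branches return the same type, swapping demo data for live data changes values but never shapes, which is what makes the transition seamless.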
Connecting your account
Add a service account key to Cloudflare Secrets as `GCP_SERVICE_ACCOUNT_KEY`. The MCP server detects it automatically and switches from demo to live mode for that project. No restart, no code change.
The key itself is a JSON blob downloaded from the GCP Console. The account should have the viewer roles listed above on the project you want to monitor. You can scope it further with IAM conditions if you only want specific resource types.
What works today
- Compute Engine: list, metrics, status, labels
- Cloud SQL: list, connection metrics, storage
- Cloud Storage: buckets, classes, access patterns
- GKE: cluster list, node pools, pod counts
- Billing: MTD, trends, SKU breakdowns, forecasts
- Monitoring: alert policies, uptime checks
What is next
BigQuery datasets and query stats. Cloud Run service inventory. Cloud Functions cold-start metrics. Cloud Armor rule audit. Everything that can be read from the GCP API surface belongs in the MCP server eventually.
The server is open source at cloudevolvers/cloudsight. The MCP config lives at /.well-known/mcp.json.
If you use GCP and want to talk to your infrastructure, the MCP server is the fastest way to do it.