

The LLM API for DevOps and platform teams.

Wire AI into the boring parts of operations without sending production logs to a US gateway. Incident response, PR review, IaC drafting, runbook upkeep. OpenAI-compatible, six open-weights models, EU residency by default.

Six workflows that pay for themselves

Real things DevOps teams ship in week one. Each one cuts the same kind of toil that fills up sprint retros.

Incident response copilot

Pain
On-call gets paged, opens five tabs, copies log lines into a doc, writes a summary at 3am.

Fix
Pipe the alert payload into a chat completion. The model summarizes the blast radius, drafts the customer comms, and proposes the runbook step. Logs and prompts stay in the EU.

PagerDuty webhooks · Slack alerting · Datadog log API

Pull request review in CI

Pain
Senior engineers spend hours catching the same five mistakes in PRs. Junior PRs sit waiting.

Fix
A GitHub Action posts the diff to the API; the model returns a structured review with severity tags. You merge faster, and your seniors stop re-explaining the same patterns.

GitHub Actions · GitLab CI · Bitbucket Pipelines
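The CI step itself is small. A sketch, assuming a prior step has written the PR diff to a file; the file names and review prompt are placeholders to swap for your own:

```shell
# Build the review request. Assumes an earlier CI step wrote the diff,
# e.g. git diff origin/main...HEAD > pr.diff
jq -Rs '{
  model: "qwen-3-coder",
  messages: [
    {role: "system",
     content: "Review this diff. Return a list of findings, each tagged blocker, major, or minor."},
    {role: "user", content: .}
  ]
}' pr.diff > request.json

# Post it; the reply body is the structured review to attach as a PR comment.
curl -s https://api.cloudhorizons.ai/v1/chat/completions \
  -H "Authorization: Bearer $CLOUD_HORIZONS_KEY" \
  -H "Content-Type: application/json" \
  -d @request.json | jq -r '.choices[0].message.content'
```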

Terraform from intent

Pain
Writing a new module from scratch every time you need an S3 bucket with KMS, lifecycle, and replication.

Fix
Describe the desired posture in plain English; the model returns a Terraform module. Qwen 3 Coder is tuned for HCL output. You review, commit, ship.

Terraform · OpenTofu · Pulumi
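In practice that is one jq call and one POST. A sketch; the prompt, system instruction, and output file are illustrative, not prescriptive:

```shell
# Describe the posture; ask for HCL only.
PROMPT="S3 bucket with KMS encryption, a 90-day lifecycle to Glacier, and cross-region replication"

jq -n --arg p "$PROMPT" '{
  model: "qwen-3-coder",
  messages: [
    {role: "system", content: "Return a complete Terraform module in HCL. No prose."},
    {role: "user", content: $p}
  ]
}' > request.json

# The reply body is the module draft; review it before you commit.
curl -s https://api.cloudhorizons.ai/v1/chat/completions \
  -H "Authorization: Bearer $CLOUD_HORIZONS_KEY" \
  -H "Content-Type: application/json" \
  -d @request.json | jq -r '.choices[0].message.content' > main.tf
```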

Runbook drafting

Pain
Runbooks rot. The last person who knew how the failover worked left in 2024.

Fix
Feed it your architecture diagrams, recent incidents, and on-call notes; the model drafts a runbook you can edit. Refresh quarterly with a single CI job.

Confluence · Notion · Backstage TechDocs

Log triage at scale

Pain
The Datadog bill keeps growing because nobody has time to set up structured filters.

Fix
Stream logs to the embeddings endpoint and cluster by similarity; the model auto-categorizes the top five anomalies per hour. Cuts ingest cost by trimming the noise.

Datadog · Loki · Elastic

CI failure summarizer

Pain
GitHub Actions logs are 4MB of nothing followed by the actual error somewhere on line 17,000.

Fix
The Action posts the failed step's output to the API; the model returns the actual cause and the likely fix in two sentences. Pin it to PR comments.

GitHub Actions · CircleCI · Buildkite

Pick the right model for the job

Six open-weights models behind one API. Aliases like latest-coder track the current best per category, so you do not chase upgrades.

Stack               Model we recommend   What it does well
Kubernetes          glm-4.6              Tool calls into kubectl, draft Helm values
AWS / Azure / GCP   kimi-k2.5            Long-doc reasoning over tf state, IAM, billing exports
Observability       minimax-m2.5         Multi-language trace summaries, agent loops over Grafana
Code review         qwen-3-coder         Pure code generation and diff understanding

Plumb it into your stack

OpenAI-compatible means most integrations work by changing one config line. Here is the shape of an incident-response webhook calling the API directly.

# Pipe a PagerDuty incident into the API
# jq wraps the raw incident JSON as the user message.
curl -s "$INCIDENT_URL" \
  | jq '{
      model: "kimi-k2.5",
      messages: [
        {role: "system", content: "You are an SRE assistant. Output: blast radius, likely cause, proposed runbook step."},
        {role: "user", content: tostring}
      ]
    }' \
  | curl -s https://api.cloudhorizons.ai/v1/chat/completions \
      -H "Authorization: Bearer $CLOUD_HORIZONS_KEY" \
      -H "Content-Type: application/json" \
      -d @-

Why DevOps teams pick us

EU residency without giving up the OpenAI ecosystem

You keep the SDK, the patterns, the integrations. We change where the inference runs, who is on the hook for compliance, and how much it costs at the per-token level. Logs are yours, retention is yours, and training on your data is off by default and contractually excluded on team plans.
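For SDK-based tooling the switch is usually two environment variables; recent official OpenAI SDKs read both, assuming the endpoint shown in the example above:

```shell
# Point any OpenAI-compatible SDK or tool at Cloud Horizon instead of api.openai.com.
export OPENAI_BASE_URL="https://api.cloudhorizons.ai/v1"
export OPENAI_API_KEY="$CLOUD_HORIZONS_KEY"
```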

Join the waitlist