PMCP – Prometheus Model Context Protocol Server

About

PMCP is a Go-based Model Context Protocol server that bridges MCP clients with Prometheus, providing full HTTP API compatibility and enabling natural-language interactions for querying metrics and managing targets, rules, and TSDB operations.

Capabilities
PMCP is a Go-based implementation of the Model Context Protocol (MCP) that turns a standard Prometheus instance into a natural-language-friendly data source. By exposing the full Prometheus HTTP API through MCP, it allows AI assistants such as Claude Desktop to ask questions like "What was the CPU usage two hours ago?" or "Show me a 15-minute trend for memory usage on host ." The server translates those conversational queries into calls to the appropriate Prometheus endpoints, fetches the results, and returns them in a format that MCP clients can ingest directly.
The server solves a key pain point for developers and operators who want to embed observability insights directly into AI‑driven workflows. Rather than writing Grafana dashboards or crafting PromQL queries manually, an assistant can interpret user intent, generate the correct query string, and deliver actionable metrics—all while maintaining full type safety and API fidelity. This removes the need for custom adapters or SDKs, letting teams focus on higher‑level problem solving instead of boilerplate integration work.
PMCP’s core capabilities mirror the entire Prometheus HTTP API: instant and range queries, metadata discovery (label names/values), target and rule inspection, alerting configuration checks, and even TSDB administration commands like snapshots or series deletion. It supports multiple transport layers—HTTP for webhooks, Server‑Sent Events (SSE) for real‑time streaming, and stdio for local desktop clients—so it can fit into a variety of deployment topologies. The Go implementation ensures low latency, strong typing, and comprehensive error handling, which is critical when chaining multiple AI calls in a single workflow.
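Because PMCP mirrors the Prometheus HTTP API one-to-one, every response a client sees follows Prometheus's documented envelope (`status`, `data`, `error`). A sketch of decoding an instant-query (vector) result, with struct names of my own choosing:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// apiResponse mirrors the envelope shared by all Prometheus HTTP API responses.
type apiResponse struct {
	Status string          `json:"status"`
	Data   json.RawMessage `json:"data"`
	Error  string          `json:"error,omitempty"`
}

// vectorData holds the payload of an instant-query (vector) result.
type vectorData struct {
	ResultType string `json:"resultType"`
	Result     []struct {
		Metric map[string]string `json:"metric"`
		Value  [2]any            `json:"value"` // [unix timestamp, value as string]
	} `json:"result"`
}

// decodeVector unwraps the envelope and decodes a vector result.
func decodeVector(raw []byte) (*vectorData, error) {
	var env apiResponse
	if err := json.Unmarshal(raw, &env); err != nil {
		return nil, err
	}
	if env.Status != "success" {
		return nil, fmt.Errorf("query failed: %s", env.Error)
	}
	var data vectorData
	if err := json.Unmarshal(env.Data, &data); err != nil {
		return nil, err
	}
	return &data, nil
}

func main() {
	// A sample payload in the shape /api/v1/query returns.
	raw := []byte(`{"status":"success","data":{"resultType":"vector","result":[{"metric":{"__name__":"up","job":"prometheus"},"value":[1700000000,"1"]}]}}`)
	data, err := decodeVector(raw)
	if err != nil {
		panic(err)
	}
	for _, r := range data.Result {
		fmt.Printf("%s{job=%q} = %v\n", r.Metric["__name__"], r.Metric["job"], r.Value[1])
	}
}
```

Strict typing of this envelope is what gives the "strong typing and comprehensive error handling" mentioned above: a non-`success` status surfaces as a Go error rather than a silent empty result.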
In practice, PMCP shines in scenarios such as automated incident response, where an AI assistant can answer “What alerts are firing for ?” and then drill down into historical trends to surface root causes. It also powers conversational monitoring dashboards, allowing users to request “Show the average disk I/O on over the last week” without leaving the chat interface. Because it keeps a one‑to‑one mapping with Prometheus’s API, developers can confidently rely on the same query semantics and data models they already use in Grafana or PromQL scripts.
Overall, PMCP offers a clean, type‑safe bridge between AI assistants and Prometheus observability data. Its comprehensive feature set, transport flexibility, and strict API compatibility make it a standout tool for any team that wants to bring real‑time metrics into natural language conversations without reinventing the wheel.
Related Servers
MarkItDown MCP Server
Convert documents to Markdown for LLMs quickly and accurately
Context7 MCP
Real‑time, version‑specific code docs for LLMs
Playwright MCP
Browser automation via structured accessibility trees
BlenderMCP
Claude AI meets Blender for instant 3D creation
Pydantic AI
Build GenAI agents with Pydantic validation and observability
Chrome DevTools MCP
AI-powered Chrome automation and debugging
Explore More Servers
S3 MCP Server
Manage S3 buckets, objects, policies, and lifecycle from any MCP client
MCP OpenAPI Explorer
Explore APIs with Model Context Protocol
my-server MCP Server
Simple notes system powered by Model Context Protocol
McpDocs
Elixir docs via SSE MCP server
LangChain MCP Client Streamlit App
Interactive LLM playground with multi‑provider, tool‑enabled, file‑aware chat
Ancestry MCP Server
Interact with GEDCOM files via Model Context Protocol