About
A Model Context Protocol server that exposes read‑only VictoriaMetrics APIs, embedded documentation, and advanced query analysis for monitoring, debugging, and automation.
Capabilities

The VictoriaMetrics MCP Server bridges the gap between AI assistants and one of the fastest, most scalable time‑series databases available today. By exposing VictoriaMetrics’ read‑only API surface through the Model Context Protocol, it lets developers query metrics, inspect alerting rules, and explore instance configuration directly from conversational agents. This capability is especially valuable for teams that rely on AI‑driven tooling to accelerate monitoring, debugging, and incident response without leaving their preferred chat or IDE environment.
At its core, the server provides a rich set of tools that mirror the functionality found in VictoriaMetrics’ web UI (VMUI). Users can query metrics using PromQL, receive the raw data or a visual graph if the client supports rendering, and even export full series for offline analysis. The server also offers metadata discovery—listing available metrics, labels, and label values—to help engineers understand the shape of their data at a glance. For observability practitioners, this means instant access to cardinality statistics and usage patterns without manually navigating dashboards.
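As a rough illustration of the read‑only calls these tools build on, the sketch below queries a VictoriaMetrics instance’s Prometheus‑compatible HTTP API directly (instant query, label discovery, and TSDB cardinality status). The instance URL and the example query are placeholder assumptions, and the exact tool names exposed over MCP may differ.

import json
import urllib.parse
import urllib.request

VM_URL = "http://localhost:8428"  # placeholder VictoriaMetrics endpoint

def get_json(path, **params):
    # Call a read-only, Prometheus-compatible endpoint and decode the JSON body.
    query = urllib.parse.urlencode(params)
    with urllib.request.urlopen(f"{VM_URL}{path}?{query}") as resp:
        return json.load(resp)

# Instant PromQL query -- roughly what the MCP query tool wraps.
print(get_json("/api/v1/query", query="sum(rate(http_requests_total[5m]))")["data"]["result"])

# Metadata discovery: label names, then values for the "job" label.
print(get_json("/api/v1/labels")["data"])
print(get_json("/api/v1/label/job/values")["data"])

# Cardinality statistics from the TSDB status endpoint.
print(list(get_json("/api/v1/status/tsdb", topN=5)["data"].keys()))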
A standout feature is the rule analysis toolkit. The server can parse and evaluate alerting and recording rules, present their status, and trace which metrics trigger them. This aids in troubleshooting misconfigurations and validating rule logic before deployment. Additionally, the server can debug relabeling rules, downsampling strategies, and retention policies by simulating their effects against live data, providing a safety net for costly configuration changes.
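As a hedged sketch of what such rule inspection rests on: vmalert exposes a Prometheus‑compatible /api/v1/rules endpoint that lists each group’s alerting and recording rules along with their current health, and that data can also be fetched directly. The address below is a placeholder.

import json
import urllib.request

VMALERT_URL = "http://localhost:8880"  # placeholder vmalert address

with urllib.request.urlopen(f"{VMALERT_URL}/api/v1/rules") as resp:
    groups = json.load(resp)["data"]["groups"]

for group in groups:
    for rule in group["rules"]:
        # Each entry reports the rule's type (alerting or recording) and its evaluation health.
        print(group["name"], rule["name"], rule["type"], rule.get("health"))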
The server ships with embedded documentation and an offline search engine, allowing AI assistants to answer questions about API endpoints or configuration parameters on the fly. With VictoriaMetrics Cloud integration, users can manage both self‑hosted and cloud instances from a single conversational interface. The result is a seamless workflow where an engineer can ask, “What’s the memory usage trend for my front‑end pods over the last 24 hours?” and receive a ready‑made graph or CSV export—all within the same chat.
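Under the hood, a question like that maps onto a range query against the read‑only API. The metric name, label matcher, and endpoint in the sketch below are illustrative assumptions, not values fixed by the server.

import json
import time
import urllib.parse
import urllib.request

VM_URL = "http://localhost:8428"  # placeholder VictoriaMetrics endpoint

# Illustrative PromQL for "memory usage trend of front-end pods over the last 24 hours".
promql = 'avg by (pod) (container_memory_working_set_bytes{pod=~"frontend-.*"})'

end = int(time.time())
params = urllib.parse.urlencode({
    "query": promql,
    "start": end - 24 * 3600,  # 24 hours ago
    "end": end,
    "step": "5m",              # one sample every five minutes
})

with urllib.request.urlopen(f"{VM_URL}/api/v1/query_range?{params}") as resp:
    series = json.load(resp)["data"]["result"]

for s in series:
    print(s["metric"].get("pod"), len(s["values"]), "points")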
For developers building AI‑enhanced observability pipelines, the VictoriaMetrics MCP Server offers a plug‑and‑play solution that reduces context switching. By integrating this server into existing MCP ecosystems, teams can layer additional observability or documentation servers to create powerful, multi‑source knowledge bases that elevate the capabilities of their AI assistants.
Related Servers
n8n
Self‑hosted, code‑first workflow automation platform
FastMCP
TypeScript framework for rapid MCP server development
Activepieces
Open-source AI automation platform for building and deploying extensible workflows
MaxKB
Enterprise‑grade AI agent platform with RAG and workflow orchestration
Filestash
Web‑based file manager for any storage backend
MCP for Beginners
Learn Model Context Protocol with hands‑on examples