About
The Druid MCP Server offers a feature‑based, tool‑centric MCP interface for Apache Druid, enabling AI assistants to invoke management tools, read cluster resources, and apply prompt templates over STDIO, SSE, or streamable HTTP.
Capabilities

The Druid MCP Server is a purpose‑built Model Context Protocol gateway that exposes the full breadth of Apache Druid’s management and analytics capabilities to AI assistants. By translating Druid’s REST APIs into MCP‑compliant tools, resources, and prompts, the server enables conversational agents—Claude, ChatGPT, Gemini, or any MCP‑aware LLM—to orchestrate cluster operations and data exploration with natural language commands. This removes the need for developers to write custom scripts or build bespoke integrations; instead, they can delegate routine tasks such as ingestion configuration, segment analysis, and health monitoring directly to the AI.
At its core, the server follows a feature‑based package architecture: each module encapsulates a distinct functional area of Druid (e.g., ingestion, query planning, monitoring). Tools are automatically registered through annotations, producing JSON schemas that the MCP client can consume without manual configuration. Resources provide read‑only access to cluster metadata, while prompts offer templated guidance that can be customized per use case. The result is a self‑documenting, discoverable API surface that AI assistants can explore in real time via the built‑in MCP inspector interface.
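To make the self-documenting surface concrete, the sketch below shows the general shape of a `tools/list` response that an MCP client might receive from such a server. The tool name and schema contents are illustrative assumptions, not taken from the actual server; only the JSON-RPC envelope and the `tools`/`inputSchema` structure follow the MCP specification.

```python
# Hypothetical shape of an MCP tools/list response. The tool name
# "listDatasources" and its description are illustrative, not the
# server's actual registry; the envelope follows JSON-RPC 2.0 / MCP.
tools_list_response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "tools": [
            {
                "name": "listDatasources",  # hypothetical tool name
                "description": "List all Druid datasources in the cluster",
                "inputSchema": {  # JSON Schema derived from annotations
                    "type": "object",
                    "properties": {},
                    "required": [],
                },
            }
        ]
    },
}

# A client discovers the available tooling by walking these schemas
# at runtime, with no manual configuration on its side.
for tool in tools_list_response["result"]["tools"]:
    print(f"{tool['name']}: {tool['description']}")
```

Because each tool carries its own JSON schema, an MCP-aware assistant can decide which tool fits a request and how to fill its arguments without any out-of-band documentation.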
Key capabilities include:
- Multi‑transport support: STDIO, Server‑Sent Events (SSE), and streamable HTTP endpoints, all with optional OAuth authentication.
- Real‑time streaming: Long‑running queries and ingestion status updates can be pushed to the client as they occur, enabling interactive dashboards or chatbot responses that reflect current cluster state.
- Comprehensive error handling: Structured, meaningful error messages help AI agents diagnose issues and propose remediation steps.
- Customizable prompt templates: Developers can tailor guidance for common tasks—such as “Create a new ingestion spec” or “Analyze time‑series anomalies”—so the assistant speaks in domain‑specific language.
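As a minimal sketch of the transport side, the helper below builds the newline-delimited JSON-RPC `tools/call` message an MCP client would write to the server's stdin when using the STDIO transport. The tool name and arguments are hypothetical placeholders; real names come from the server's `tools/list` response.

```python
import json


def make_tool_call(request_id: int, tool_name: str, arguments: dict) -> str:
    """Build one newline-delimited JSON-RPC 2.0 tools/call message,
    as exchanged over MCP's STDIO transport."""
    msg = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }
    return json.dumps(msg) + "\n"


# Hypothetical tool name and task id, for illustration only.
line = make_tool_call(2, "getTaskStatus", {"taskId": "index_parallel_wiki"})
```

The same message body travels unchanged over the SSE and streamable-HTTP transports; only the framing around it differs.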
Typical use cases span the full lifecycle of a Druid deployment. A data engineer might ask an assistant to “Show me all active ingestion tasks and their progress,” receiving a concise table instantly. A product analyst could request “Run an anomaly detection query on the sales stream” and have the assistant generate, execute, and explain the results. In production environments, operators can rely on the server’s secure HTTP profile to embed Druid management into monitoring workflows or CI/CD pipelines, while still benefiting from AI‑driven explanations and suggestions.
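Under the hood, a request like "show me all active ingestion tasks" resolves to a call against Druid's Overlord task API. The sketch below assumes a local Overlord at port 8081 and a response payload shaped like the Overlord's task list; the summarization format is illustrative.

```python
from urllib.parse import urlencode

OVERLORD = "http://localhost:8081"  # assumed Overlord address


def running_tasks_url(base: str = OVERLORD) -> str:
    """URL behind 'show me all active ingestion tasks': Druid's
    Overlord task endpoint, filtered to running tasks."""
    return f"{base}/druid/indexer/v1/tasks?{urlencode({'state': 'running'})}"


def summarize_tasks(tasks: list[dict]) -> list[str]:
    """Condense an Overlord task-list response into the concise
    rows an assistant might display (field names per the Overlord
    task payload)."""
    return [f"{t['id']}  {t['type']}  {t['statusCode']}" for t in tasks]


# Sample payload shaped like an Overlord response (illustrative values).
sample = [
    {
        "id": "index_parallel_wiki_2024",
        "type": "index_parallel",
        "statusCode": "RUNNING",
    }
]
```

The MCP layer's job is exactly this translation: natural language in, a well-formed Druid REST call out, and a readable summary back.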
By marrying MCP’s declarative protocol with Druid’s powerful analytical engine, the Druid MCP Server offers developers a seamless bridge between conversational AI and real‑world data operations. It abstracts away low‑level API details, provides discoverable tooling, and delivers real‑time insights—all of which accelerate development cycles, reduce operational overhead, and empower teams to harness Druid’s full potential through natural language.
Related Servers
n8n
Self‑hosted, code‑first workflow automation platform
FastMCP
TypeScript framework for rapid MCP server development
Activepieces
Open-source AI automation platform for building and deploying extensible workflows
MaxKB
Enterprise‑grade AI agent platform with RAG and workflow orchestration
Filestash
Web‑based file manager for any storage backend
MCP for Beginners
Learn Model Context Protocol with hands‑on examples
Explore More Servers
FalkorDB MCP Server
Bridge AI models to graph databases via MCP
JobSpy MCP Server
AI‑powered job search across multiple platforms
Dify Workflow MCP Server
On-demand execution of custom Dify workflows
Web Mcp Server
Automated web scraping with BeautifulSoup, Gemini AI, and Selenium
Baseline MCP Server
Provide Web Platform Dashboard API status via MCP
k6 MCP Server
Run k6 load tests via Model Context Protocol