
Kestra Python MCP Server

MCP Server

Run Kestra AI tools via Docker with minimal setup

Updated 22 days ago

About

The Kestra Python MCP Server is a lightweight Docker-based solution that exposes Kestra AI tools (execution, flow, backfill, etc.) to MCP-compatible clients. It simplifies deployment by handling environment variables and supports both OSS and EE configurations.

Capabilities

Resources: Access data sources
Tools: Execute functions
Prompts: Pre-built templates
Sampling: AI model interactions

Kestra Python MCP Server

The Kestra Python MCP Server bridges AI assistants and the Kestra workflow engine. By exposing a standardized set of tools, such as flow execution, file handling, and key-value storage, through the Model Context Protocol (MCP), it lets developers turn complex data pipelines into first-class commands that an AI can invoke directly. This eliminates the need for custom adapters or manual API calls, allowing a conversational agent to trigger, monitor, and manipulate data workflows as if they were native features of the assistant.

At its core, the server runs in a Docker container, simplifying deployment and ensuring that all required Python dependencies are encapsulated. Once the server is registered in an MCP‑aware client (e.g., Claude, Cursor, or VS Code), the AI gains access to a curated toolbox. For example, the backfill tool can retroactively run jobs on historical data, while execution and replay allow real‑time control over running workflows. The files tool provides read/write access to the Kestra file system, and kv offers a lightweight key‑value store for temporary state. The namespace tool lets users scope operations to specific tenant contexts, which is essential in multi‑tenant environments.
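To make the registration step concrete, a Docker-based MCP server is typically wired into the client's configuration file. The snippet below is an illustrative sketch for a Claude Desktop-style client: the image name, environment variable, and URL are assumptions for illustration, so consult the project README for the exact values.

```json
{
  "mcpServers": {
    "kestra": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "-e", "KESTRA_BASE_URL",
        "ghcr.io/kestra-io/mcp-server-python:latest"
      ],
      "env": {
        "KESTRA_BASE_URL": "http://host.docker.internal:8080/api/v1"
      }
    }
  }
}
```

The client starts the container on demand and speaks MCP over the container's stdin/stdout, so no ports need to be published.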

Developers benefit from this tight integration in several real‑world scenarios. In data engineering pipelines, an AI assistant can be asked to “rerun the last failed job” or “list all files in the staging bucket,” and the server translates those natural‑language requests into precise Kestra API calls. In DevOps, the restart and resume tools enable automated recovery of stalled workflows without leaving the chat interface. For analysts, the ability to backfill historical data or query key‑value stores directly from an assistant accelerates experimentation and reduces context switching between tools.
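As a sketch of that translation step, the helpers below show how requests like "rerun the last failed job" could map onto Kestra REST calls. The endpoint paths follow Kestra's public API, but they and the helper names are illustrative assumptions rather than the server's actual implementation; the functions only build requests and send nothing.

```python
# Illustrative sketch: mapping assistant requests onto Kestra REST calls.
# Endpoint paths mirror Kestra's public API but are assumptions here;
# these helpers only construct requests, they do not send them.

KESTRA_URL = "http://localhost:8080"  # assumed local Kestra OSS instance


def restart_request(execution_id: str) -> dict:
    """Build the request that reruns a failed execution
    (e.g. "rerun the last failed job")."""
    return {
        "method": "POST",
        "url": f"{KESTRA_URL}/api/v1/executions/{execution_id}/restart",
    }


def trigger_request(namespace: str, flow_id: str) -> dict:
    """Build the request that starts a new execution of a flow."""
    return {
        "method": "POST",
        "url": f"{KESTRA_URL}/api/v1/executions/{namespace}/{flow_id}",
    }
```

An MCP tool handler would assemble a request like this from the assistant's parsed intent, send it with an HTTP client, and return the execution status to the conversation.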

What sets the Kestra MCP Server apart is its flexible configuration. OSS users can authenticate via username/password or API tokens, while Enterprise Edition (EE) users unlock additional tool groups reserved for EE deployments. Unwanted tool groups can be disabled through environment variables, ensuring that the assistant's surface area matches organizational policy. Moreover, because the server is containerized, it runs on any host that supports Docker, making it straightforward to integrate into CI/CD pipelines or local development setups.
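The snippet below sketches what such environment-driven gating might look like. The variable name KESTRA_MCP_DISABLED_TOOLS and the deny-list semantics are hypothetical, not the server's documented configuration; only the tool group names come from the description above.

```python
# Hypothetical sketch of environment-driven tool gating; the variable
# name and deny-list behavior are assumptions, not the server's real logic.

ALL_TOOL_GROUPS = {"backfill", "execution", "files", "kv", "namespace", "replay"}


def enabled_tool_groups(env: dict) -> set:
    """Return the tool groups left enabled after applying a comma-separated
    deny-list from the (assumed) KESTRA_MCP_DISABLED_TOOLS variable."""
    raw = env.get("KESTRA_MCP_DISABLED_TOOLS", "")
    disabled = {g.strip() for g in raw.split(",") if g.strip()}
    return ALL_TOOL_GROUPS - disabled
```

A deny-list keeps the default surface area complete while letting operators trim it down, which matches the policy-driven configuration described above.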

In summary, the Kestra Python MCP Server turns a complex workflow orchestrator into an intuitive command set for AI assistants. By exposing execution, monitoring, and data‑access tools through MCP, it empowers developers to harness the full power of Kestra directly from conversational interfaces—streamlining operations, enhancing productivity, and enabling new automation possibilities across data‑centric teams.