MCPSERV.CLUB
lum8rjack

Caddy MCP Server

MCP Server

Control Caddy via Model Context Protocol

Stale (55) · 3 stars · 0 views · Updated 23 days ago

About

The Caddy MCP Server exposes an MCP interface to manage a Caddy instance, offering tools for retrieving, updating, and converting configurations across JSON, Caddyfile, YAML, or Nginx formats, and monitoring upstream proxy statuses.

Capabilities

Resources – Access data sources
Tools – Execute functions
Prompts – Pre-built templates
Sampling – AI model interactions

Overview

caddy‑mcp is a lightweight Model Context Protocol (MCP) server that turns a running Caddy web server into an AI‑friendly API. By exposing the full power of the Caddy HTTP Admin API through MCP tools, it allows AI assistants to inspect, modify, and migrate site configurations on the fly. The server can communicate over stdio, Server‑Sent Events (SSE), or an HTTP stream, giving developers flexibility in how they embed it into existing workflows.
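Under the hood, the MCP tools sit on top of Caddy's standard admin endpoints. As a minimal sketch of what retrieval and update ultimately boil down to, assuming Caddy's admin API is listening on its default address (`localhost:2019`):

```python
import json
from urllib import request

# Default admin address; adjust if your Caddy instance binds elsewhere.
CADDY_ADMIN = "http://localhost:2019"

def build_load_request(config: dict) -> request.Request:
    """Build the POST /load request that replaces the running configuration."""
    return request.Request(
        CADDY_ADMIN + "/load",
        data=json.dumps(config).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def get_config() -> dict:
    """Fetch the live configuration, mirroring the retrieval tool."""
    with request.urlopen(CADDY_ADMIN + "/config/") as resp:
        return json.load(resp)

def load_config(config: dict) -> None:
    """Apply a full JSON configuration, mirroring the dynamic-update tool."""
    request.urlopen(build_load_request(config))
```

The MCP server wraps calls like these in structured tools, so an assistant never touches the raw admin endpoint directly.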

What Problem Does It Solve?

Managing a Caddy instance traditionally requires editing files, restarting the process, and manually testing changes. When integrating with AI assistants—such as Claude or other LLMs—developers need a programmatic way to query and alter the server without exposing raw admin endpoints. caddy‑mcp bridges this gap by offering a structured set of tools that conform to MCP, enabling secure, version‑controlled interactions. It also provides format conversion utilities so that teams can work in their preferred configuration language (Caddyfile, YAML, or Nginx) and let the assistant translate it into Caddy’s native JSON format.
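On the conversion side, Caddy itself exposes an `/adapt` admin endpoint that turns a Caddyfile into native JSON without loading it; the server's conversion tools are presumed to wrap similar logic. A sketch of that round trip, again assuming the default admin address:

```python
import json
from urllib import request

CADDY_ADMIN = "http://localhost:2019"  # default admin address

def build_adapt_request(caddyfile: str) -> request.Request:
    """Build the POST /adapt request that adapts a Caddyfile to Caddy JSON."""
    return request.Request(
        CADDY_ADMIN + "/adapt",
        data=caddyfile.encode(),
        headers={"Content-Type": "text/caddyfile"},
        method="POST",
    )

def adapt_caddyfile(caddyfile: str) -> dict:
    """Ask a running Caddy instance to adapt the Caddyfile; the response
    body contains the resulting JSON config (plus any adapter warnings)."""
    with request.urlopen(build_adapt_request(caddyfile)) as resp:
        return json.load(resp)
```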

Core Features & Value

  • Configuration Retrieval – Fetches the live Caddy configuration as JSON, giving AI assistants a snapshot of the current state for analysis or documentation.
  • Dynamic Updates – Accepts a full JSON payload, allowing an assistant to apply new settings or roll back changes without manual file edits.
  • Format Conversion – Three dedicated tools convert Caddyfile, Nginx, or YAML configs into JSON, making it trivial for an assistant to accept user‑friendly input and produce a deployable Caddy configuration.
  • Upstream Monitoring – Reports the health of reverse‑proxy targets, enabling AI assistants to diagnose load‑balancing issues or suggest optimizations.
  • Transport Agnostic – Supports stdio, SSE, and HTTP streaming, so the server can be embedded in CLI tools, web dashboards, or remote orchestration services.
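To illustrate the monitoring feature, here is a small helper for interpreting the kind of data Caddy's `/reverse_proxy/upstreams` endpoint returns: a list of objects carrying an upstream address and counters such as `num_requests` and `fails`. The parsing is only a sketch of what an assistant might do with the tool's output:

```python
def failing_upstreams(upstreams: list[dict]) -> list[str]:
    """Return addresses of upstreams with active failure counts.

    `upstreams` is the parsed body of GET /reverse_proxy/upstreams,
    a list of objects such as:
        {"address": "10.0.0.1:80", "num_requests": 3, "fails": 0}
    """
    return [u["address"] for u in upstreams if u.get("fails", 0) > 0]

# Example with sample data in Caddy's documented response shape:
sample = [
    {"address": "10.0.0.1:80", "num_requests": 3, "fails": 0},
    {"address": "10.0.0.2:80", "num_requests": 1, "fails": 2},
]
print(failing_upstreams(sample))  # ["10.0.0.2:80"]
```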

Real‑World Use Cases

  • AI‑Driven Site Configuration – An assistant can ask a developer for desired routing rules, generate the corresponding Caddyfile or JSON, and push it live via the dynamic update tool.
  • Migration Assistance – When moving from Nginx to Caddy, the tool lets an assistant translate existing configs and highlight differences.
  • Operational Monitoring – Integrate into a health‑check routine that an AI can interpret and trigger alerts or auto‑scaling actions.
  • Education & Documentation – The server can expose the current configuration as a reference for newcomers, while the assistant explains each directive in plain language.

Integration with AI Workflows

Because caddy‑mcp follows the MCP specification, any compliant LLM can invoke its tools directly. A typical workflow involves:

  1. The assistant asks for a configuration change or status check.
  2. It calls the relevant MCP tool (for example, a configuration update or an upstream status check).
  3. The server returns the result over the chosen transport.
  4. The assistant formats the response, presents it to the user, and optionally applies updates.
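On the wire, step 2 is a JSON‑RPC 2.0 request, since MCP is built on JSON‑RPC. The following sketch serializes the `tools/call` message an MCP client would send; the tool name used here is illustrative, not taken from the server's actual tool list:

```python
import itertools
import json

_request_ids = itertools.count(1)

def mcp_tool_call(name: str, arguments: dict) -> str:
    """Serialize an MCP 'tools/call' request as a JSON-RPC 2.0 message."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": next(_request_ids),
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    })

# Hypothetical tool name, for illustration only:
message = mcp_tool_call("get_config", {})
```

The chosen transport (stdio, SSE, or HTTP stream) only changes how this message is delivered, not its shape.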

This tight coupling eliminates the need for custom adapters or manual API calls, streamlining development and reducing cognitive load.

Unique Advantages

  • Custom Module Support – The README explains how to build the server with additional Caddy modules, ensuring the MCP server remains functional even with a heavily customized build.
  • Transport Flexibility – By supporting both streaming and non‑streaming transports, the server fits into diverse deployment environments—from local dev to cloud‑native microservices.
  • Unified Toolset – All configuration and monitoring operations are consolidated under one MCP endpoint, simplifying permission management and audit logging.

In summary, caddy‑mcp empowers developers to harness AI assistants for intelligent, real‑time management of Caddy servers, turning configuration tasks into conversational interactions while preserving the full capabilities of the underlying HTTP Admin API.