MCP Containerd
by jokemanfire

MCP Server

Rust-powered MCP server for containerd CRI operations

About

A Rust-based Model Context Protocol server that exposes all containerd CRI interfaces, including the runtime and image services, enabling seamless container management via MCP.

Capabilities

Resources: access data sources
Tools: execute functions
Prompts: pre-built templates
Sampling: AI model interactions

Overview

The MCP Containerd server bridges the Model Context Protocol (MCP) with containerd, the popular container runtime. By exposing container lifecycle and image-management operations through a standardized MCP interface, it lets AI assistants query and manipulate containers as if they were native tools. This eliminates the need for custom integration code in each client and provides a unified way to orchestrate containers directly from conversational agents.

The server is built on the Rust Model Context Protocol (RMCP) library, which gives it a robust foundation for RPC communication. It connects to the local containerd socket by default, but can also be configured to use a streamable-HTTP transport. Once running, the service presents three logical groups of capabilities: runtime, image, and version. The runtime group implements the standard CRI (Container Runtime Interface) operations, such as creating, starting, stopping, and deleting pod sandboxes and containers. The image group exposes CRUD operations on container images: listing available images, pulling new ones, inspecting their status, and deleting them. The version service reports the CRI version supported by containerd, which is useful for compatibility checks.
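To make the interaction concrete, here is a minimal sketch of a client driving the server over stdio with raw JSON-RPC 2.0, the framing MCP uses. The binary name mcp-containerd and the tool name list_containers are assumptions for illustration; a real client should discover the actual tool names from the tools/list response.

```rust
use std::io::{BufRead, BufReader, Write};
use std::process::{Command, Stdio};

// Write one newline-delimited JSON-RPC message to the server's stdin.
fn send(stdin: &mut impl Write, msg: &str) -> std::io::Result<()> {
    writeln!(stdin, "{msg}")
}

// Read one newline-delimited JSON-RPC message from the server's stdout.
fn recv(stdout: &mut impl BufRead) -> std::io::Result<String> {
    let mut line = String::new();
    stdout.read_line(&mut line)?;
    Ok(line)
}

fn main() -> std::io::Result<()> {
    // Spawn the server; the binary name is an assumption for this sketch.
    let mut child = Command::new("mcp-containerd")
        .stdin(Stdio::piped())
        .stdout(Stdio::piped())
        .spawn()?;
    let mut stdin = child.stdin.take().unwrap();
    let mut stdout = BufReader::new(child.stdout.take().unwrap());

    // MCP handshake: initialize, read the reply, then confirm readiness.
    send(&mut stdin, r#"{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"demo","version":"0.1"}}}"#)?;
    println!("init: {}", recv(&mut stdout)?);
    send(&mut stdin, r#"{"jsonrpc":"2.0","method":"notifications/initialized"}"#)?;

    // Discover the runtime, image, and version tools the server exposes.
    send(&mut stdin, r#"{"jsonrpc":"2.0","id":2,"method":"tools/list"}"#)?;
    println!("tools: {}", recv(&mut stdout)?);

    // Invoke a runtime tool; "list_containers" is a placeholder name.
    send(&mut stdin, r#"{"jsonrpc":"2.0","id":3,"method":"tools/call","params":{"name":"list_containers","arguments":{}}}"#)?;
    println!("result: {}", recv(&mut stdout)?);
    Ok(())
}
```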

For developers building AI workflows, this server is a drop-in component that turns container management into a first-class tool. An AI assistant can ask for a list of running containers, pull an image from a registry, or execute a command inside a container, all through simple tool calls. The simple-chat-client example demonstrates this pattern: the assistant issues a natural-language request, the client translates it into an MCP tool invocation, and the server returns a JSON payload. This workflow scales to more complex scenarios such as automated testing, continuous-integration pipelines, or on-demand resource provisioning driven by conversational prompts.
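Under the usual MCP convention, the JSON payload mentioned above arrives wrapped in a text content block inside the tools/call result. Below is a small sketch of unwrapping it with serde_json; the response shape and the assumption that the text block itself contains JSON are illustrative, not taken from the server's documentation.

```rust
use serde_json::Value;

// Pull the server's JSON payload out of a tools/call response, assuming
// the standard MCP result shape:
// {"result":{"content":[{"type":"text","text":"..."}]}}
fn extract_payload(response: &str) -> Option<Value> {
    let msg: Value = serde_json::from_str(response).ok()?;
    let text = msg["result"]["content"][0]["text"].as_str()?;
    // The text block itself is assumed to carry JSON for this server.
    serde_json::from_str(text).ok()
}

fn main() {
    // An illustrative response, not captured from the real server.
    let response = r#"{"jsonrpc":"2.0","id":3,"result":{"content":[{"type":"text","text":"{\"containers\":[]}"}]}}"#;
    if let Some(payload) = extract_payload(response) {
        println!("containers: {}", payload["containers"]);
    }
}
```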

Unique advantages of MCP Containerd include its language-agnostic interface (any client that speaks MCP can interact with it), zero-configuration defaults for local deployments, and the ability to leverage existing containerd tooling without rewriting logic. Because it implements the full CRI interface, it can interoperate with Kubernetes and other orchestrators that run on containerd underneath. This makes it well suited to edge deployments, serverless environments, or any setting where an AI assistant needs to manage containerized workloads on the fly.