About
A lightweight, fault‑tolerant MCP server that exposes a gRPC API for introspecting Linkerd2 service meshes and applying policy mutations, built atop Linkerd and Linkerd‑viz.
Overview
The Linkerd2 MCP server provides a lightweight, fault‑tolerant control plane that exposes the Model Context Protocol (MCP) over gRPC for Linkerd‑powered service meshes. By leveraging the existing Linkerd and Linkerd‑viz stacks, it offers developers a unified API to introspect high‑level mesh topology and mutate policy objects—such as authorization policies—directly from an AI assistant or any MCP‑compatible client. This eliminates the need to interact with Kubernetes manifests manually, enabling automated, programmatic mesh management that scales with dynamic workloads.
The server solves a common pain point for teams using AI assistants to manage distributed systems: the lack of a standard, declarative interface that can both read and write mesh state. Traditional approaches require parsing YAML manifests or running ad-hoc CLI commands, both of which are error‑prone and difficult to embed in conversational agents. MCP abstracts these operations into a simple, strongly typed protocol that can be queried or updated with minimal overhead. The result is a consistent, secure channel for AI agents to reason about the mesh and apply changes in real time.
Key capabilities include:
- Mesh Graph Retrieval: Clients can request an aggregated, in‑memory representation of services, routes, and traffic policies. This snapshot is kept up‑to‑date via a Redis-backed delta mechanism, ensuring low latency and high availability.
- Policy Mutation: Through dedicated gRPC methods, agents can apply or update Linkerd authorization policies by supplying a JSON spec. The server validates and reconciles changes against the cluster, automatically propagating them to the underlying Kubernetes CRDs (a sketch of this reconciliation step follows this list).
- Fault Tolerance & Scalability: Redis (Valkey) is used for snapshot storage and leader election, allowing multiple server instances to run in a highly available configuration without state conflicts.
- Open‑Source Extensibility: The project is built in Go 1.22+, with protobuf-defined API contracts and a Helm chart for easy deployment. This modular design lets teams extend the server with additional MCP methods or integrate custom telemetry.
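As a concrete illustration of the policy-mutation path, the Go sketch below shows how a client-supplied JSON AuthorizationPolicy spec could be decoded and reconciled into the cluster with server-side apply through client-go's dynamic client. The function name, field manager, and example resource names are assumptions made for illustration; the project's actual reconciliation code is not shown in the source.

```go
// Hypothetical sketch of the policy-mutation path: a JSON AuthorizationPolicy
// spec supplied over gRPC is decoded and reconciled into the cluster with
// server-side apply. Function and field-manager names are assumptions, not
// taken from the project.
package main

import (
	"context"
	"encoding/json"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/apis/meta/v1/unstructured"
	"k8s.io/apimachinery/pkg/runtime/schema"
	"k8s.io/client-go/dynamic"
	"k8s.io/client-go/rest"
)

// Linkerd's AuthorizationPolicy CRD lives in the policy.linkerd.io group.
var authzPolicyGVR = schema.GroupVersionResource{
	Group:    "policy.linkerd.io",
	Version:  "v1alpha1",
	Resource: "authorizationpolicies",
}

// applyAuthorizationPolicy decodes a client-supplied JSON spec, performs a
// minimal validation, and reconciles it via server-side apply so that
// repeated calls with the same spec are idempotent.
func applyAuthorizationPolicy(ctx context.Context, dyn dynamic.Interface, namespace string, specJSON []byte) error {
	obj := &unstructured.Unstructured{}
	if err := json.Unmarshal(specJSON, &obj.Object); err != nil {
		return fmt.Errorf("invalid policy spec: %w", err)
	}
	if obj.GetKind() != "AuthorizationPolicy" || obj.GetName() == "" {
		return fmt.Errorf("spec must be a named AuthorizationPolicy")
	}
	_, err := dyn.Resource(authzPolicyGVR).Namespace(namespace).Apply(
		ctx, obj.GetName(), obj, metav1.ApplyOptions{FieldManager: "linkerd-mcp", Force: true},
	)
	return err
}

func main() {
	cfg, err := rest.InClusterConfig() // the server is assumed to run in-cluster
	if err != nil {
		panic(err)
	}
	dyn, err := dynamic.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}

	// Example payload an MCP client might submit (names are illustrative).
	spec := []byte(`{
	  "apiVersion": "policy.linkerd.io/v1alpha1",
	  "kind": "AuthorizationPolicy",
	  "metadata": {"name": "allow-frontend"},
	  "spec": {
	    "targetRef": {"group": "policy.linkerd.io", "kind": "Server", "name": "web-http"},
	    "requiredAuthenticationRefs": [
	      {"group": "policy.linkerd.io", "kind": "MeshTLSAuthentication", "name": "frontend-identity"}
	    ]
	  }
	}`)
	if err := applyAuthorizationPolicy(context.Background(), dyn, "default", spec); err != nil {
		panic(err)
	}
}
```

Server-side apply keeps repeated submissions of the same spec idempotent, which suits an agent that may retry a mutation after a dropped connection.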
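The snapshot-storage and leader-election behavior described above could be modeled with a SET NX lock and a shared snapshot key, as in the minimal go-redis sketch below (go-redis also speaks to Valkey). The key names, TTL, and server address are assumptions, not values taken from the project.

```go
// Hypothetical sketch of the Redis/Valkey-backed snapshot store and leader
// election described above. Key names, TTL, and address are assumptions.
package main

import (
	"context"
	"fmt"
	"time"

	"github.com/redis/go-redis/v9"
)

const (
	leaderKey   = "linkerd-mcp:leader"        // assumed key name
	snapshotKey = "linkerd-mcp:mesh-snapshot" // assumed key name
	leaderTTL   = 15 * time.Second
)

// tryAcquireLeadership attempts to become the writer for the shared snapshot.
// SET NX with a TTL means only one instance holds the lock at a time, and a
// crashed leader is replaced automatically once the TTL expires (a real
// implementation would also renew the lock while the leader is healthy).
func tryAcquireLeadership(ctx context.Context, rdb *redis.Client, instanceID string) (bool, error) {
	return rdb.SetNX(ctx, leaderKey, instanceID, leaderTTL).Result()
}

// publishSnapshot stores the serialized mesh graph so follower instances can
// serve reads without querying the Kubernetes API themselves.
func publishSnapshot(ctx context.Context, rdb *redis.Client, graphJSON []byte) error {
	return rdb.Set(ctx, snapshotKey, graphJSON, 0).Err()
}

// loadSnapshot is what a follower (or a freshly started instance) calls to
// answer mesh-graph queries from the shared store.
func loadSnapshot(ctx context.Context, rdb *redis.Client) ([]byte, error) {
	return rdb.Get(ctx, snapshotKey).Bytes()
}

func main() {
	ctx := context.Background()
	rdb := redis.NewClient(&redis.Options{Addr: "valkey:6379"}) // assumed address

	isLeader, err := tryAcquireLeadership(ctx, rdb, "instance-a")
	if err != nil {
		panic(err)
	}
	if isLeader {
		// The leader would rebuild the graph from Linkerd-viz and publish it.
		_ = publishSnapshot(ctx, rdb, []byte(`{"services":[]}`))
	}
	snap, err := loadSnapshot(ctx, rdb)
	if err != nil && err != redis.Nil {
		panic(err)
	}
	fmt.Printf("leader=%v snapshot=%s\n", isLeader, snap)
}
```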
In practice, Linkerd2 MCP is invaluable for AI‑driven DevOps scenarios. A conversational assistant can ask, "Show me the current traffic split for a given service," receive a structured graph, and then request, "Block all traffic from a particular namespace to this service." The assistant translates the user's intent into an MCP mutation, which the server validates and applies immediately. This workflow is ideal for dynamic environments such as canary releases, A/B testing, or rapid rollback of misbehaving services.
Overall, the Linkerd2 MCP server bridges the gap between declarative Kubernetes policies and conversational AI tooling. By providing a single, robust protocol for both introspection and mutation, it empowers developers to build smarter, more automated mesh management workflows that respond instantly to changing business requirements.
Related Servers
Netdata
Real‑time infrastructure monitoring for every metric, every second.
Awesome MCP Servers
Curated list of production-ready Model Context Protocol servers
JumpServer
Browser‑based, open‑source privileged access management
OpenTofu
Infrastructure as Code for secure, efficient cloud management
FastAPI-MCP
Expose FastAPI endpoints as MCP tools with built‑in auth
Pipedream MCP Server
Event‑driven integration platform for developers
Explore More Servers
Strava MCP Server
MCP server with Strava OAuth integration
MCP Server
Model Context Protocol for lightweight server communication
SourceSage
Language-agnostic code memory for LLMs
ZAP-MCP Server
AI‑powered OWASP ZAP integration via MCP
Omni Server
A Python MCP server for learning and prototyping
Simple MCP Sample Server
Lightweight MCP server for text tools and profile data