Harvester MCP Server
by starbops

AI‑powered Kubernetes control for Harvester clusters

Updated Mar 25, 2025

About

A Go implementation of the Model Context Protocol that lets Claude Desktop, Cursor and other LLM assistants issue natural‑language CRUD commands to Harvester HCI clusters, translating them into Kubernetes API calls and returning human‑readable responses.

Capabilities

Resources: Access data sources
Tools: Execute functions
Prompts: Pre-built templates
Sampling: AI model interactions
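
For orientation, each capability category above corresponds to a family of MCP request methods. The Go sketch below is purely illustrative: the method strings come from the MCP specification, but the router, types, and placeholder results are hypothetical and not taken from this project.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// request is a minimal view of an incoming MCP JSON-RPC message.
type request struct {
	JSONRPC string          `json:"jsonrpc"`
	ID      int             `json:"id"`
	Method  string          `json:"method"`
	Params  json.RawMessage `json:"params"`
}

// dispatch maps MCP methods onto the four capability areas listed above.
// The string results stand in for real handlers in this sketch.
func dispatch(req request) (string, error) {
	switch req.Method {
	case "resources/list", "resources/read":
		return "resources: expose cluster data as readable resources", nil
	case "tools/list", "tools/call":
		return "tools: execute functions such as CRUD calls against the cluster", nil
	case "prompts/list", "prompts/get":
		return "prompts: serve pre-built prompt templates", nil
	case "sampling/createMessage":
		return "sampling: ask the client-side model for a completion", nil
	default:
		return "", fmt.Errorf("unsupported method %q", req.Method)
	}
}

func main() {
	out, err := dispatch(request{JSONRPC: "2.0", ID: 1, Method: "tools/call"})
	if err != nil {
		panic(err)
	}
	fmt.Println(out)
}
```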

Harvester MCP Server

The Harvester MCP Server is a purpose‑built Model Context Protocol (MCP) implementation that bridges AI assistants—such as Claude Desktop and Cursor—with Harvester HCI clusters. By exposing a rich set of Kubernetes core resources and Harvester‑specific CRDs over MCP, it allows natural‑language commands from an LLM to be translated into concrete API calls against a live cluster. This removes the need for developers or operators to manually craft commands, enabling a conversational workflow where an assistant can list virtual machines, inspect network status, or delete pods with simple prompts.
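To make that bridging concrete, consider how a single tool invocation such as "list the VMs in namespace prod" could resolve into a Kubernetes API call. The sketch below is a minimal illustration, assuming Harvester virtual machines are surfaced as KubeVirt VirtualMachine custom resources and using client-go's dynamic client; the handler shape and function names are hypothetical, not the project's own code.

```go
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/runtime/schema"
	"k8s.io/client-go/dynamic"
	"k8s.io/client-go/tools/clientcmd"
)

// listVMs is a hypothetical tool handler: given a namespace extracted from a
// natural-language request, it lists Harvester virtual machines (exposed as
// KubeVirt VirtualMachine custom resources) and returns their names.
func listVMs(ctx context.Context, kubeconfig, namespace string) ([]string, error) {
	cfg, err := clientcmd.BuildConfigFromFlags("", kubeconfig)
	if err != nil {
		return nil, err
	}
	dyn, err := dynamic.NewForConfig(cfg)
	if err != nil {
		return nil, err
	}

	// Harvester VMs are KubeVirt VirtualMachines; the group/version here is
	// an assumption and may differ by Harvester release.
	gvr := schema.GroupVersionResource{
		Group:    "kubevirt.io",
		Version:  "v1",
		Resource: "virtualmachines",
	}

	list, err := dyn.Resource(gvr).Namespace(namespace).List(ctx, metav1.ListOptions{})
	if err != nil {
		return nil, err
	}

	names := make([]string, 0, len(list.Items))
	for _, item := range list.Items {
		names = append(names, item.GetName())
	}
	return names, nil
}

func main() {
	names, err := listVMs(context.Background(), clientcmd.RecommendedHomeFile, "default")
	if err != nil {
		panic(err)
	}
	fmt.Println("VMs:", names)
}
```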

At its core the server follows a clear request‑to‑response pipeline. When an assistant sends a query, the MCP client forwards it to the server, which parses the intent and determines the target resource type. A formatter registry then selects an appropriate human‑readable renderer—either a generic Kubernetes formatter or one tailored to Harvester resources such as VMs, volumes, and images. The server issues the corresponding API call to the cluster, receives raw JSON, passes it through the formatter, and returns a concise text summary optimized for LLM consumption. This design keeps the assistant’s output natural while preserving all actionable details, making it suitable for both quick overviews and deep dives.
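The formatter registry can be pictured as a lookup keyed by resource kind, with a generic Kubernetes renderer as the fallback. The following sketch illustrates that pattern under that assumption; the type names and the fields it reads (including printableStatus on the VM status) are illustrative rather than taken from the project's code.

```go
package main

import "fmt"

// Formatter renders one raw API object (already decoded into a generic map)
// into a concise, human-readable line suitable for an LLM response.
type Formatter func(obj map[string]any) string

// registry maps a resource kind to its dedicated formatter. Kinds without an
// entry fall back to the generic Kubernetes formatter below.
var registry = map[string]Formatter{
	"VirtualMachine": func(obj map[string]any) string {
		meta, _ := obj["metadata"].(map[string]any)
		status, _ := obj["status"].(map[string]any)
		return fmt.Sprintf("VM %v (status=%v)", meta["name"], status["printableStatus"])
	},
}

// genericFormatter handles any resource that has no dedicated renderer.
func genericFormatter(obj map[string]any) string {
	meta, _ := obj["metadata"].(map[string]any)
	return fmt.Sprintf("%v %v/%v", obj["kind"], meta["namespace"], meta["name"])
}

// format selects the kind-specific renderer, falling back to the generic one.
func format(kind string, obj map[string]any) string {
	if f, ok := registry[kind]; ok {
		return f(obj)
	}
	return genericFormatter(obj)
}

func main() {
	vm := map[string]any{
		"kind":     "VirtualMachine",
		"metadata": map[string]any{"name": "web-01", "namespace": "prod"},
		"status":   map[string]any{"printableStatus": "Running"},
	}
	fmt.Println(format("VirtualMachine", vm)) // uses the Harvester-specific renderer

	pod := map[string]any{
		"kind":     "Pod",
		"metadata": map[string]any{"name": "api-5d9", "namespace": "default"},
	}
	fmt.Println(format("Pod", pod)) // falls back to the generic formatter
}
```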

Key capabilities include full CRUD support for core Kubernetes objects (pods, deployments, services, namespaces, and nodes) as well as custom resource definitions, alongside Harvester‑specific entities such as virtual machines, images, volumes, and networks. The server automatically groups resources by namespace or status, provides summary tables for rapid assessment, and can display detailed views when requested. These features reduce cognitive load for users who need to manage complex infrastructure without leaving their conversational context.
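The grouping and summary-table behavior can be approximated with the standard library alone: collect objects per namespace, then render a compact, aligned table. The sketch below is an illustration of that idea, not the server's actual output format, and the type and function names are hypothetical.

```go
package main

import (
	"fmt"
	"os"
	"sort"
	"text/tabwriter"
)

// resource is a minimal stand-in for an object returned by the cluster API.
type resource struct {
	Namespace string
	Name      string
	Status    string
}

// summarize groups resources by namespace and prints one compact table per
// group, mirroring the kind of overview the server returns to the assistant.
func summarize(kind string, items []resource) {
	byNS := map[string][]resource{}
	for _, r := range items {
		byNS[r.Namespace] = append(byNS[r.Namespace], r)
	}

	namespaces := make([]string, 0, len(byNS))
	for ns := range byNS {
		namespaces = append(namespaces, ns)
	}
	sort.Strings(namespaces)

	w := tabwriter.NewWriter(os.Stdout, 0, 4, 2, ' ', 0)
	defer w.Flush()
	for _, ns := range namespaces {
		fmt.Fprintf(w, "%s in %s\t(count: %d)\n", kind, ns, len(byNS[ns]))
		for _, r := range byNS[ns] {
			fmt.Fprintf(w, "  %s\t%s\n", r.Name, r.Status)
		}
	}
}

func main() {
	summarize("VirtualMachine", []resource{
		{Namespace: "prod", Name: "web-01", Status: "Running"},
		{Namespace: "prod", Name: "db-01", Status: "Stopped"},
		{Namespace: "dev", Name: "test-vm", Status: "Running"},
	})
}
```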

Typical use cases span from DevOps automation—where a team can ask an assistant to “list all VMs in the prod namespace” or “delete the oldest pod in deployment X”—to educational scenarios, where newcomers can learn cluster operations through guided dialogue. In CI/CD pipelines, an AI assistant could trigger resource creation or cleanup steps, while in monitoring workflows it can surface alerts and status summaries on demand.

What sets Harvester MCP Server apart is its tight integration with Harvester HCI, exposing a curated set of resources that are most relevant to virtualized workloads. The formatter registry ensures consistent, readable output across disparate resource types, and the server’s Go implementation offers high performance with minimal overhead. For developers building AI‑augmented tooling, this server provides a plug‑and‑play interface that turns any MCP‑compliant assistant into an instant, conversational Kubernetes controller tailored for Harvester environments.