MCPSERV.CLUB
weibaohui

K8M

MCP Server

AI‑powered lightweight Kubernetes dashboard for multi‑cluster management

674 stars · Updated 13 days ago

About

K8M is a compact, AI‑driven console that simplifies Kubernetes cluster administration. It integrates ChatGPT‑powered features for resource insights, log analysis, and command suggestions, and it exposes an MCP server so that large language models can drive cluster operations.

Capabilities

  • Resources: access data sources
  • Tools: execute functions
  • Prompts: pre-built templates
  • Sampling: AI model interactions

k8m Dashboard

k8m is an AI‑driven, lightweight Kubernetes dashboard that consolidates cluster management into a single executable. It tackles the common pain points of managing multiple Kubernetes environments—complex configuration, scattered tooling, and steep learning curves—by embedding an intelligent layer that interacts directly with cluster APIs while offering a user‑friendly web interface. The platform is built on Go for performance and on Baidu’s AMIS framework for a responsive UI, ensuring that operations remain fast even under heavy load.
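
Much of the "single executable" convenience comes down to packaging: a Go service can compile its web assets straight into the binary with the standard embed package. The following is a minimal, hypothetical sketch of that pattern, not K8M's actual code; the ui/dist path and the port are assumptions.

```go
package main

import (
	"embed"
	"io/fs"
	"log"
	"net/http"
)

// Hypothetical layout: the compiled front-end bundle lives in ui/dist
// and is baked into the binary at build time.
//
//go:embed ui/dist
var uiAssets embed.FS

func main() {
	// Serve the embedded files from the root of the site.
	sub, err := fs.Sub(uiAssets, "ui/dist")
	if err != nil {
		log.Fatal(err)
	}
	http.Handle("/", http.FileServer(http.FS(sub)))

	// The port is an assumption for illustration, not K8M's documented default.
	log.Println("dashboard listening on :8080")
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```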

The server brings AI into the operational loop in several tangible ways. It ships with built‑in large‑language models such as Qwen2.5-Coder-7B and DeepSeek-R1-Distill-Qwen-7B, and it lets developers plug in private models via Ollama or other hosts. These models power features such as explanations of selected terms, automatic YAML translation, and intelligent command suggestions that surface directly in the dashboard. Through its k8s‑gpt integration, it also offers Chinese‑language guidance for resource descriptions and troubleshooting. The AI layer is exposed through MCP, providing 49 predefined Kubernetes tools that can be combined into more than a hundred composite operations, making it straightforward for other AI assistants to orchestrate complex cluster tasks.
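
To make the MCP exposure concrete, the sketch below shows how a client could invoke one of those tools with a plain JSON-RPC 2.0 "tools/call" request, the method the MCP specification uses for tool invocation. It deliberately skips the MCP session handshake a real client performs, and the endpoint URL, the tool name list_k8s_pod, and its arguments are assumptions for illustration; the actual tool names come from k8m's MCP documentation.

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"io"
	"log"
	"net/http"
)

// callTool sends a single MCP "tools/call" request as a JSON-RPC 2.0 message.
// A real MCP client would first run the initialize handshake; this is a
// simplified, single-shot illustration.
func callTool(endpoint, tool string, args map[string]any) ([]byte, error) {
	req := map[string]any{
		"jsonrpc": "2.0",
		"id":      1,
		"method":  "tools/call",
		"params": map[string]any{
			"name":      tool,
			"arguments": args,
		},
	}
	body, err := json.Marshal(req)
	if err != nil {
		return nil, err
	}
	resp, err := http.Post(endpoint, "application/json", bytes.NewReader(body))
	if err != nil {
		return nil, err
	}
	defer resp.Body.Close()
	return io.ReadAll(resp.Body)
}

func main() {
	// Hypothetical endpoint and tool name: ask a pod-listing tool for the
	// pods in the "default" namespace.
	out, err := callTool("http://localhost:8080/mcp", "list_k8s_pod",
		map[string]any{"namespace": "default"})
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(string(out))
}
```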

Key capabilities include:

  • Multi‑cluster discovery and in‑cluster registration: automatically scans kubeconfig directories, registers hosts, and applies fine‑grained RBAC per cluster or namespace (see the discovery sketch after this list).
  • Unified permissions: a single identity controls both AI interactions and actual cluster actions, preventing privilege escalation.
  • Pod file management: browse, edit, upload, and download files inside containers with real‑time log streaming and shell execution.
  • API exposure: generate API keys, view Swagger docs, and integrate third‑party services through MCP.
  • Operational automation: scheduled inspections, Lua‑based rule engines, and notifications to DingTalk, WeChat, or Feishu.
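
As a rough illustration of what the kubeconfig scanning in the first bullet involves, the hypothetical Go sketch below walks a directory, parses each file with client-go, and prints every context it finds. The ~/.kube directory is an assumption, and K8M's real registration logic is more involved.

```go
package main

import (
	"fmt"
	"log"
	"os"
	"path/filepath"

	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumed location of the kubeconfig files to register.
	dir := filepath.Join(os.Getenv("HOME"), ".kube")

	entries, err := os.ReadDir(dir)
	if err != nil {
		log.Fatal(err)
	}
	for _, e := range entries {
		if e.IsDir() {
			continue
		}
		path := filepath.Join(dir, e.Name())
		cfg, err := clientcmd.LoadFromFile(path)
		if err != nil {
			continue // skip files that are not valid kubeconfigs
		}
		// Each context is a candidate cluster to register in the dashboard.
		for name, ctx := range cfg.Contexts {
			fmt.Printf("%s: context %q -> cluster %q (namespace %q)\n",
				e.Name(), name, ctx.Cluster, ctx.Namespace)
		}
	}
}
```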

In practice, a DevOps team can deploy k8m on an existing cluster and immediately start querying the AI for help with resource limits, troubleshooting pods, or generating Helm charts. A data scientist can use the AI to translate complex YAML into plain language, while a security officer benefits from the tight integration of cluster permissions with AI actions. Because the server is fully open source and supports multiple databases, CI/CD pipelines can spin up k8m instances on demand, expose them via API keys, and let AI assistants perform day‑to‑day operations without manual intervention.
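
As one hypothetical example of that last workflow, a pipeline step could call a k8m instance's HTTP API with a generated key, along the lines of the sketch below. The base URL, request path, and Authorization header format are assumptions and should be checked against the instance's Swagger docs.

```go
package main

import (
	"fmt"
	"io"
	"log"
	"net/http"
	"os"
)

func main() {
	// Both the base URL and the Authorization scheme are assumptions;
	// the real endpoints are listed in the instance's Swagger documentation.
	base := os.Getenv("K8M_URL")    // e.g. an internal dashboard URL
	token := os.Getenv("K8M_TOKEN") // API key generated in the dashboard

	req, err := http.NewRequest("GET", base+"/api/pods?namespace=default", nil)
	if err != nil {
		log.Fatal(err)
	}
	req.Header.Set("Authorization", "Bearer "+token)

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	body, _ := io.ReadAll(resp.Body)
	fmt.Println(resp.Status)
	fmt.Println(string(body))
}
```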