
UML-MCP Diagram Generation Server

Generate UML, Mermaid, D2 diagrams via MCP protocol

About

UML-MCP is a Python-based server that implements the Model Context Protocol (MCP), enabling AI assistants and other applications to generate UML, Mermaid, D2, and other diagram types in SVG, PNG, PDF, or custom formats, with seamless integration into MCP-compatible clients.

Capabilities

  • Resources: Access data sources
  • Tools: Execute functions
  • Prompts: Pre-built templates
  • Sampling: AI model interactions

UML‑MCP: Diagram Generation for AI Workflows

UML‑MCP fills a gap that many AI‑powered development environments still have: the ability to turn natural‑language or code‑based descriptions into visual models without leaving the assistant. By exposing a full Model Context Protocol interface, it lets LLMs such as Claude or GPT‑4o request diagram rendering on demand and receive the resulting image or file directly in the chat. This removes the need for manual copy‑paste into external editors, streamlines documentation pipelines, and enables richer explanations of system architecture, user flows, or data models within a single conversational thread.

The server is built around a modular core that implements the MCP specification. When an AI assistant sends a tool request, UML‑MCP selects the appropriate generator (UML Class, Sequence, Activity, Mermaid, D2, BPMN, C4, or any of the supported PlantUML extensions) and forwards the textual description to either a local rendering engine or an external service such as Kroki or PlantUML. The result is returned in the requested format (SVG, PNG, or PDF) and can be stored in a configured output directory or streamed back inline. Environment variables let developers point at local renderer instances for faster turnaround or at cloud‑hosted services for scalability.
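
From a client's perspective, the exchange follows the standard MCP pattern regardless of which generator ends up doing the rendering. The sketch below uses the official Python MCP SDK to launch the server over stdio, list its tools, and request a diagram; the entry‑point script, the KROKI_SERVER variable, and the tool and argument names are illustrative assumptions, not the project's documented API.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Hypothetical launch configuration: the entry-point script, the KROKI_SERVER
# variable, and the tool/argument names below are assumptions for illustration.
server = StdioServerParameters(
    command="python",
    args=["mcp_server.py"],                         # assumed server entry point
    env={"KROKI_SERVER": "http://localhost:8000"},  # assumed local-renderer setting
)

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover the diagram tools the server exposes.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            # Ask for a class diagram; the rendered output (or a path/URL to it)
            # comes back in the result's content blocks.
            result = await session.call_tool(
                "generate_class_diagram",            # assumed tool name
                arguments={
                    "code": "class User {\n  +name: str\n}",
                    "output_format": "svg",
                },
            )
            print(result.content)

asyncio.run(main())
```

An assistant host such as Claude Desktop or Cursor performs the same steps internally; the snippet simply makes the protocol exchange explicit.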

Key capabilities include:

  • Wide diagram coverage: From classic UML to modern graph languages, giving teams the right visual tool for every scenario.
  • Format diversity: SVG for vector fidelity, PNG for quick previews, PDF for printable documentation.
  • Seamless MCP integration: The server speaks the same protocol that most LLM assistants understand, requiring no custom adapters.
  • Editor hooks: Ready‑made integration scripts for popular IDEs such as Cursor, or a simple JSON configuration that any MCP‑compliant editor can consume (see the configuration sketch after this list).
  • Local and remote rendering: Switch between on‑premise PlantUML/Kroki or the public services with a single environment variable.
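
For editors that take a declarative MCP configuration, registration typically looks like the snippet below. The `mcpServers` key is the convention used by clients such as Cursor and Claude Desktop; the script path and the PLANTUML_SERVER variable are placeholders, not values taken from the UML‑MCP documentation.

```json
{
  "mcpServers": {
    "uml-mcp": {
      "command": "python",
      "args": ["/path/to/uml-mcp/mcp_server.py"],
      "env": {
        "PLANTUML_SERVER": "http://localhost:8080"
      }
    }
  }
}
```

Swapping that env entry between a local renderer and a public service is how the single‑environment‑variable switch mentioned above would typically be exercised.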

Typical use cases span the full software lifecycle. Architects can ask an assistant to “draw a class diagram for the user service” and instantly receive a clean SVG embedded in the conversation. DevOps teams can request “generate a deployment diagram” that includes container and network layers, while documentation writers can pull the same diagrams into Markdown or Confluence pages without manual export steps. Because the server operates over standard input/output, it can run as a background service in CI pipelines or containerized environments, making it an attractive component for automated documentation generation or for teaching tools that illustrate code behavior in real time.

What sets UML‑MCP apart is its focus on integration over novelty. It does not reinvent diagramming engines; instead, it provides a unified MCP façade that abstracts away the differences between PlantUML, Kroki, Mermaid, and others. This abstraction lets developers write a tool invocation once and have it work across all supported diagram types, dramatically reducing the friction of switching between modeling languages. The server’s lightweight core and clear configuration also mean it can be dropped into existing MCP ecosystems, such as the Smithery platform or any custom LLM orchestration layer, with minimal effort.