
CogniGraph MCP Server


Generate mind maps, relationship graphs, and AI‑powered knowledge maps

Active (70)
0 stars
0 views
Updated Apr 30, 2025

About

A Model Context Protocol server that converts Markdown to mindmaps, renders Mermaid relationship diagrams, and uses an OpenAI‑compatible API to produce AI‑generated knowledge graphs. It supports local MCP clients such as Claude Desktop, Cherry Studio, DeepChat and HyperChat.

Capabilities

Resources: Access data sources
Tools: Execute functions
Prompts: Pre-built templates
Sampling: AI model interactions

Example Cherry Studio configuration (screenshot)

CogniGraph MCP Server is a lightweight, command‑line driven MCP server that bridges natural‑language AI assistants with powerful diagramming tools. It addresses a common pain point: transforming raw Markdown or Mermaid syntax into interactive visual artifacts (mind maps, relationship graphs, and knowledge graphs) without leaving the AI workflow. Because these capabilities are exposed through the standard MCP interface, developers can invoke diagram generation directly from Claude Desktop, Cherry Studio, DeepChat, or any other MCP‑compatible client, keeping the user experience seamless and scriptable.
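As a rough sketch of what that looks like from a client's perspective, the snippet below uses the official MCP Python SDK to launch the server over stdio, list its tools, and invoke one of them. The launch command, environment variable name, and tool name are illustrative assumptions, not the project's documented values.

```python
# Minimal sketch: connecting to the CogniGraph server over stdio with the
# official MCP Python SDK. The launch command ("cognigraph-mcp"), the env
# variable name, and the tool name are hypothetical placeholders; use the
# values from the project's README instead.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server_params = StdioServerParameters(
    command="cognigraph-mcp",          # hypothetical launch command
    args=[],
    env={"OPENAI_API_KEY": "sk-..."},  # hypothetical variable name
)

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover the tools the server exposes.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            # Invoke one of them with a small Markdown document.
            result = await session.call_tool(
                "generate_mindmap",  # hypothetical tool name
                {"markdown": "# Project\n## Goals\n## Architecture"},
            )
            print(result.content)

asyncio.run(main())
```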

At its core, the server offers four distinct tools. The first turns plain Markdown into a self‑contained HTML or SVG mind map and returns the rendered content immediately, making it ideal for quick previews or inline display in chat interfaces. The second builds on the first by persisting the output to disk, honoring a configurable default output directory or falling back to the user's home folder. The third accepts Mermaid source and produces a visual relationship graph, again with optional persistence. The fourth brings AI into the loop: it sends Markdown to an OpenAI‑compatible API, receives Mermaid code that captures the inferred entities and relations, renders it to SVG or PNG, and saves the result. This last tool requires an API key but otherwise shares the same environment configuration as its siblings.
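To make the last tool's flow concrete, here is a rough sketch of the described pipeline (Markdown in, Mermaid out, rendered to SVG), not the server's actual implementation. The model name, prompt wording, output file names, and the use of the Mermaid CLI (`mmdc` from @mermaid-js/mermaid-cli) are assumptions.

```python
# Illustrative sketch of the AI knowledge-graph flow: send Markdown to an
# OpenAI-compatible API, receive a Mermaid graph definition, render it to SVG.
import subprocess
from pathlib import Path

from openai import OpenAI

def markdown_to_knowledge_graph(markdown: str, out_dir: Path) -> Path:
    client = OpenAI()  # reads OPENAI_API_KEY / OPENAI_BASE_URL from the environment
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model; the server makes this configurable
        messages=[
            {"role": "system",
             "content": "Extract entities and relations from the Markdown "
                        "and return only a Mermaid graph definition."},
            {"role": "user", "content": markdown},
        ],
    )
    mermaid_source = response.choices[0].message.content

    mmd_path = out_dir / "knowledge_graph.mmd"
    svg_path = out_dir / "knowledge_graph.svg"
    mmd_path.write_text(mermaid_source, encoding="utf-8")

    # Render the Mermaid definition to SVG with the Mermaid CLI (mmdc).
    subprocess.run(["mmdc", "-i", str(mmd_path), "-o", str(svg_path)], check=True)
    return svg_path
```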

The server’s design emphasizes flexibility and minimal friction. All tools are stateless except for their optional output paths, so the same binary can run in containerized or local environments without modification. Environment variables provide sensible defaults for API credentials, model selection, and base URLs, supporting both cloud providers and local LLM hosts such as Ollama. Because the tools operate through standard JSON payloads, developers can compose complex pipelines: generate a mind map from Markdown, feed the output into another tool, or chain multiple tools to build a knowledge graph that updates automatically as documentation evolves.
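A minimal sketch of such environment-driven configuration is shown below; the variable names are hypothetical stand-ins, and the defaults assume Ollama's OpenAI-compatible endpoint on localhost.

```python
# Sketch of environment-driven configuration with sensible defaults.
# The COGNIGRAPH_* names are hypothetical; consult the project's docs for
# the real ones. The base-URL default targets Ollama's OpenAI-compatible
# endpoint at http://localhost:11434/v1.
import os
from dataclasses import dataclass

@dataclass
class CogniGraphConfig:
    api_key: str
    base_url: str
    model: str
    output_dir: str

def load_config() -> CogniGraphConfig:
    return CogniGraphConfig(
        api_key=os.environ.get("COGNIGRAPH_API_KEY", "ollama"),  # local hosts accept any key
        base_url=os.environ.get("COGNIGRAPH_BASE_URL", "http://localhost:11434/v1"),
        model=os.environ.get("COGNIGRAPH_MODEL", "llama3.1"),
        output_dir=os.environ.get("COGNIGRAPH_OUTPUT_DIR", os.path.expanduser("~")),
    )
```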

Real‑world use cases abound. Technical writers can convert project README files into navigable mind maps that appear in their AI‑assisted editor. Knowledge managers can generate up‑to‑date relationship graphs from evolving documentation, integrating them into corporate wikis. Data scientists can ask an AI assistant to extract concepts from a research paper and instantly visualize the resulting knowledge graph, all within the same chat window. The server’s tight coupling with MCP clients means that these workflows remain consistent across platforms, whether on Windows, macOS, or Linux.

In summary, CogniGraph MCP Server turns the abstract concept of “visualizing knowledge” into a concrete, programmable service. By unifying Markdown parsing, Mermaid rendering, and AI inference under a single MCP interface, it empowers developers to enrich their AI assistants with diagrammatic intelligence—making complex information both accessible and actionable.