By growilabs

Growi MCP Server


Connect AI models to Growi knowledge bases

Active (95) · 8 stars · 0 views · Updated 16 days ago

About

The Growi MCP Server bridges large language models with Growi wiki content, enabling search, retrieval, and management of pages, tags, comments, and share links across multiple Growi instances for context‑aware AI responses.

Capabilities

  • Resources – Access data sources
  • Tools – Execute functions
  • Prompts – Pre-built templates
  • Sampling – AI model interactions

GROWI MCP Server Dashboard

Overview

The @growi/mcp‑server is a Model Context Protocol (MCP) implementation that bridges large language models with the GROWI knowledge‑base platform. By exposing a rich set of page, tag, comment, revision and share‑link operations over the MCP interface, it lets AI assistants query, create, update, and manage wiki content in real time. This eliminates the need for custom integrations or manual data extraction, enabling developers to build context‑aware conversational agents that can pull the latest corporate documentation directly from GROWI.

Problem Solved

Organizations often maintain vast amounts of internal knowledge in wikis like GROWI, but LLMs struggle to retrieve accurate, up‑to‑date information without an explicit connector. The MCP server solves this by providing a standardized, secure API that translates LLM prompts into authenticated GROWI REST calls. It supports multiple GROWI instances, allowing a single AI service to span production, staging, and development wikis without hard‑coding URLs or tokens.
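To make the translation step concrete, here is a minimal sketch of how an MCP tool call might be turned into an authenticated GROWI REST request. The endpoint path and the `access_token` query parameter are assumptions for illustration, not taken from the actual @growi/mcp-server source.

```typescript
// Hypothetical sketch: translating a search tool call into a GROWI API URL.
// The endpoint and auth parameter below are illustrative assumptions.
interface GrowiApp {
  baseUrl: string;
  apiToken: string;
}

function buildSearchRequest(app: GrowiApp, query: string): string {
  const url = new URL("/_api/v3/search", app.baseUrl); // assumed endpoint
  url.searchParams.set("q", query);
  url.searchParams.set("access_token", app.apiToken); // per-app credential
  return url.toString();
}

// Each configured instance carries its own base URL and token,
// which is how a single server can span multiple wikis.
const prod: GrowiApp = { baseUrl: "https://wiki.example.com", apiToken: "prod-token" };
console.log(buildSearchRequest(prod, "security policy"));
```

Because every app object bundles its own token, no URL or credential needs to be hard-coded in the assistant itself.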

What It Does

  • Search & Retrieval – Find relevant pages with keyword queries and return full page content or metadata.
  • Content Management – Create, update, duplicate, rename, publish, and delete pages in bulk through dedicated tools.
  • Tag & Comment Handling – Retrieve or modify tags, list available tags, and fetch comments for discussion threads.
  • Revision History – Access edit histories and inspect specific revisions.
  • Share Links – Generate, list, or delete share links for quick external access.
  • User Activity – Pull recent pages authored by a particular user to surface personal knowledge bases.

All these capabilities are exposed as MCP tools, so an AI assistant can invoke them directly in conversation, receiving structured responses that the model can incorporate into its reply.
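The invoke-and-return-structured-content pattern can be sketched as a tiny tool registry. The tool name and response shape below are illustrative assumptions, not the server's actual identifiers:

```typescript
// Illustrative sketch of MCP-style tool dispatch. "searchPages" and the
// result shape are hypothetical stand-ins for the server's real tools.
type ToolResult = { content: { type: "text"; text: string }[] };
type ToolHandler = (args: Record<string, unknown>) => ToolResult;

// A tiny registry standing in for the server's real tool set.
const tools: Record<string, ToolHandler> = {
  searchPages: (args) => ({
    content: [{ type: "text", text: JSON.stringify({ query: args.q, hits: [] }) }],
  }),
};

function callTool(name: string, args: Record<string, unknown>): ToolResult {
  const handler = tools[name];
  if (!handler) throw new Error(`Unknown tool: ${name}`);
  return handler(args);
}

const result = callTool("searchPages", { q: "onboarding" });
console.log(result.content[0].text); // → {"query":"onboarding","hits":[]}
```

The assistant never sees the registry; it only sees the tool definitions and the structured text content that comes back, which it can then fold into its reply.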

Key Features

  • Multi‑app support – Configure dozens of GROWI instances via environment variables; the server automatically selects the appropriate one based on context.
  • Fine‑grained access control – Each app uses its own API token, ensuring that the assistant only accesses data it is permitted to read or modify.
  • Bulk operations – Several tools accept arrays of targets, enabling efficient batch management.
  • Comprehensive metadata – Page listings include path, author, timestamps, and publish status for richer context.
  • Real‑time updates – Because calls are made against the live GROWI API, assistants always work with the latest content.
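A multi-app setup might look like the following environment layout. The variable names are hypothetical; consult the @growi/mcp-server documentation for the actual ones:

```shell
# Hypothetical per-app environment variables; actual names may differ.
export GROWI_BASE_URL_PROD="https://wiki.example.com"
export GROWI_API_TOKEN_PROD="***"
export GROWI_BASE_URL_STAGING="https://wiki-staging.example.com"
export GROWI_API_TOKEN_STAGING="***"
```

Each app gets its own token, so access control stays scoped per instance.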

Use Cases

  • Internal FAQ Bots – An assistant can search the wiki for answers, then present or edit content directly.
  • Onboarding Guides – New employees receive instant access to relevant pages, and the bot can create personalized onboarding documents.
  • Knowledge Audits – Automatically list outdated or orphaned pages, tag them for review, and schedule cleanup tasks.
  • Collaboration Support – Generate share links on demand during meetings, or pull recent comments to surface discussion points.
  • Documentation Generation – An LLM can draft new pages, populate templates, and publish them without leaving the chat.

Integration with AI Workflows

In an MCP‑enabled environment, a developer simply registers the GROWI server in the client configuration. The assistant then receives tool definitions and can call them using natural language prompts such as “Find the latest security policy” or “Create a new page titled ‘API Migration Guide’.” The server translates these calls into authenticated GROWI API requests, returns structured JSON, and the model can weave the results into its response. Because all operations are defined by the MCP schema, developers can add or remove capabilities without changing the AI’s core logic.
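Registration typically amounts to a short entry in the MCP client configuration. The exact command, arguments, and environment variable names below are assumptions sketched in the common client-config style:

```json
{
  "mcpServers": {
    "growi": {
      "command": "npx",
      "args": ["@growi/mcp-server"],
      "env": {
        "GROWI_BASE_URL": "https://wiki.example.com",
        "GROWI_API_TOKEN": "your-api-token"
      }
    }
  }
}
```

Once registered, the client fetches the tool definitions automatically; no further wiring is needed on the assistant side.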

Unique Advantages

  • Zero‑code connector – No custom adapters or SDKs are needed; the MCP server handles authentication, pagination, and error handling automatically.
  • Scalable multi‑environment support – A single instance can serve production, staging, and development wikis, simplifying CI/CD pipelines.
  • Open‑source and lightweight – Built on Node.js with minimal dependencies, making it easy to deploy in containerized or serverless environments.
  • Rich tooling for knowledge base hygiene – Built‑in functions for tagging, revision history, and share links give developers full control over content lifecycle directly from the AI layer.

By integrating GROWI with MCP, organizations empower their AI assistants to become true knowledge workers, able to search, curate, and publish institutional knowledge the moment it is needed.