rawr-ai

Graphiti MCP Server

Multi‑project knowledge graph extraction with Neo4j

About

Graphiti MCP Server extracts entities and relationships from text into a shared Neo4j database, supporting multiple isolated projects via Docker containers. It offers a root server and per‑project MCP endpoints for easy integration.

Capabilities

  • Resources – Access data sources
  • Tools – Execute functions
  • Prompts – Pre-built templates
  • Sampling – AI model interactions

Graphiti MCP Server – A Multi‑Project Knowledge Graph Engine

Graphiti is a Model Context Protocol (MCP) server that turns natural‑language text into structured knowledge graphs stored in Neo4j. By extending the original getzep/graphiti example, this fork focuses on developer experience and multi‑project support. It lets several independent knowledge‑graph projects share a single Neo4j instance while keeping each project’s extraction rules, entities, and models isolated. The root server exposes a unified MCP endpoint on port 8000, and each project runs its own MCP instance in a Docker container starting at port 8001. This architecture enables teams to spin up dozens of context‑aware AI assistants that can query a shared graph without interfering with one another.
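
To make that layout concrete, here is a minimal client-side sketch using the official MCP Python SDK: it opens SSE connections to the root endpoint on port 8000 and to one project endpoint on port 8001, then prints the tools each exposes. The localhost host and the /sse path are assumptions for illustration; substitute your deployment's actual endpoints.

  import asyncio

  from mcp import ClientSession
  from mcp.client.sse import sse_client

  async def list_tools(url: str) -> None:
      # Open an SSE connection to one MCP endpoint and print the tools it exposes.
      async with sse_client(url) as (read_stream, write_stream):
          async with ClientSession(read_stream, write_stream) as session:
              await session.initialize()
              result = await session.list_tools()
              print(url, "->", [tool.name for tool in result.tools])

  async def main() -> None:
      # Port 8000 is the root server; project servers start at 8001 (hosts/paths assumed).
      await list_tools("http://localhost:8000/sse")
      await list_tools("http://localhost:8001/sse")

  asyncio.run(main())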

What Problem Does It Solve?

Modern AI assistants often need to reason over domain knowledge that is continuously updated from documents, logs, or user queries. Storing this knowledge in a graph database allows for efficient traversal and inference, but managing multiple knowledge graphs, each with its own schema and extraction logic, quickly becomes unwieldy. Graphiti removes this friction with a single orchestration layer: one Docker Compose file launches a root MCP server and any number of project containers. Each container comes up with its own extraction rules, entity definitions, and model settings, so projects never clash even though they share the same Neo4j backend. This addresses the common pain points of configuration drift, accidental data contamination, and hard‑to‑debug crashes that plague monolithic graph services.

Core Features Explained

  • Project Isolation – Every project has its own extraction rules, entity templates, and model parameters. A faulty prompt or schema change in one project triggers only that container’s restart, leaving the others unaffected.
  • Auto‑Discovery for Editors – The server writes its listening ports into the editor's MCP configuration, allowing IDEs and tools that support the MCP protocol to discover and connect automatically.
  • Hot Reload – Modify a project's YAML configuration or entity definitions and reload that project's container to apply the changes without rebuilding the image.
  • Centralized Neo4j – All projects write to a single Neo4j instance, simplifying backup and scaling while keeping graph data logically separated by project (see the sketch after this list).
  • Safety Controls – Production‑grade password enforcement prevents accidental exposure of the Neo4j database, and a flag offers a quick way to wipe all data when needed.
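
As a rough illustration of that logical separation, the sketch below uses the official Neo4j Python driver to count the entities belonging to a single project inside the shared database. The connection details and the name of the project‑scoping property (group_id here) are assumptions; check how your deployment actually tags nodes per project.

  from neo4j import GraphDatabase

  # URI, credentials, and the scoping property name ("group_id") are
  # illustrative assumptions; match them to your own deployment.
  driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "change-me"))

  def count_project_entities(project: str) -> int:
      # Count only the nodes tagged with this project's identifier.
      query = "MATCH (n) WHERE n.group_id = $project RETURN count(n) AS total"
      with driver.session() as session:
          record = session.run(query, project=project).single()
          return record["total"]

  print(count_project_entities("finance-assistant"))
  driver.close()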

Real‑World Use Cases

  • Domain‑Specific AI Assistants – Build separate assistants for finance, healthcare, or software engineering that each pull from their own knowledge graph while sharing a common database infrastructure.
  • Rapid Prototyping – Spin up new projects with minimal configuration, experiment with different extraction models, and iterate quickly without affecting existing assistants.
  • Enterprise Knowledge Management – Consolidate corporate documents into multiple, isolated knowledge graphs (e.g., HR policies vs. product specs) and expose them through a unified MCP interface.
  • Educational Platforms – Create course‑specific knowledge graphs for students and instructors, allowing AI tutors to query only the relevant domain.

Integration with AI Workflows

An MCP‑compatible client (such as Claude or any other tool that understands the Model Context Protocol) connects to a Graphiti instance over its SSE endpoint. The client can then send prompts that trigger entity extraction, relationship inference, and graph queries. Because each project's MCP server is isolated, developers can tailor the extraction logic per project (e.g., custom entity templates or prompt prefixes) without worrying about cross‑project contamination. The root server aggregates the projects' status endpoints and automatically updates the editor's MCP configuration, streamlining development cycles.
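
As a rough end‑to‑end sketch of that flow, the example below (again using the MCP Python SDK) connects to one project's SSE endpoint, calls a tool to extract entities from a short piece of text, and then queries the resulting graph. The endpoint path, tool names, and argument keys are illustrative assumptions; list the tools on your own server to get the real names and schemas.

  import asyncio

  from mcp import ClientSession
  from mcp.client.sse import sse_client

  PROJECT_URL = "http://localhost:8001/sse"  # assumed project endpoint

  async def main() -> None:
      async with sse_client(PROJECT_URL) as (read_stream, write_stream):
          async with ClientSession(read_stream, write_stream) as session:
              await session.initialize()

              # Feed a piece of text to the project's extraction pipeline
              # (tool name and argument keys are hypothetical).
              await session.call_tool(
                  "add_episode",
                  {"name": "meeting-notes", "episode_body": "Acme hired Jane as CTO."},
              )

              # Query the graph that the extraction step populated
              # (tool name is hypothetical).
              result = await session.call_tool(
                  "search_nodes", {"query": "Who is Acme's CTO?"}
              )
              print(result.content)

  asyncio.run(main())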

Standout Advantages

Graphiti’s multi‑project design eliminates the need for separate Neo4j instances per assistant, cutting infrastructure costs while preserving logical separation. Its hot‑reload and crash containment features make it resilient in production, and the auto‑discovery mechanism removes manual configuration steps for developers. Together, these qualities make Graphiti an attractive choice for teams building sophisticated AI assistants that rely on rich, continuously evolving knowledge graphs.