About
A lightweight Python server that stores and retrieves agentic memory in a Neo4j graph database, enabling Claude Desktop to remember past interactions across sessions and share memories with other MCP clients.
Overview
The AI Engineer Neo4j Memory MCP Demo demonstrates how an external memory store can be wired into Claude Desktop using the Model Context Protocol. The server connects to a Neo4j graph database and automatically persists every interaction with Claude as structured nodes and relationships. This addresses the core limitation of stateless chat models, which lose context between sessions, and lets developers build richer, continuity‑aware AI assistants without reinventing persistence logic.
When a user starts a conversation, the MCP server retrieves the relevant facts from Neo4j and injects them into Claude’s prompt. After the dialogue, any new observations are written back as entities and relationships, so knowledge grows incrementally. This read‑modify‑write cycle makes the assistant “agentic”: it can reference its own memory, update it on the fly, and recall information across separate sessions, or even across different clients that share the same database.
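Under the hood, this cycle amounts to a read query before the conversation and write queries afterward. The sketch below illustrates the idea with the official neo4j Python driver; the connection details, the Entity label, the RELATED relationship, and the helper names load_memory and save_observation are assumptions made for this example, not the demo server's actual API or schema.

```python
# Illustrative sketch of the read-modify-write memory cycle (not the server's real API).
# The URI, credentials, Entity label, and RELATED relationship are assumptions.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

def load_memory(topic: str) -> list[dict]:
    """Read step: fetch stored facts about a topic before the conversation starts."""
    with driver.session() as session:
        result = session.run(
            "MATCH (s:Entity)-[r:RELATED]->(o:Entity) "
            "WHERE s.name CONTAINS $topic "
            "RETURN s.name AS subject, r.type AS relation, o.name AS object",
            topic=topic,
        )
        return [record.data() for record in result]

def save_observation(subject: str, relation: str, obj: str) -> None:
    """Write step: persist a new fact as two nodes joined by a relationship."""
    with driver.session() as session:
        session.run(
            "MERGE (s:Entity {name: $subject}) "
            "MERGE (o:Entity {name: $obj}) "
            "MERGE (s)-[:RELATED {type: $relation}]->(o)",
            subject=subject, obj=obj, relation=relation,
        )

# Example cycle: read before the chat, write back what was learned afterwards.
print(load_memory("Ada"))
save_observation("Ada", "worksAt", "Initech")
driver.close()
```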
Key capabilities include:
- Graph‑backed memory: Facts are stored as typed nodes (e.g., Person, Organization) and linked by semantic relationships (worksAt, attended). This structure supports complex queries such as “show all projects a person has worked on” or “list all events related to a company” (see the query sketch after this list).
- Automatic persistence: Every turn of the conversation is logged without manual intervention, reducing boilerplate for developers.
- Cross‑client access: Any MCP‑compatible client (Claude Desktop, Cursor, Windsurf) can read from or write to the same Neo4j instance, allowing a unified knowledge base for multiple AI tools.
- Customizable prompts: The server can inject system messages that instruct Claude on how to retrieve and update memory, giving fine‑grained control over the agent’s behavior.
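As a concrete illustration of the graph queries mentioned in the first bullet, the snippet below runs a one‑hop traversal with the neo4j Python driver. The Person and Project labels and the WORKED_ON relationship are assumed for this sketch; the demo's actual schema may differ.

```python
# Hedged sketch of a "projects a person has worked on" query.
# Person/Project labels and WORKED_ON are assumptions, not the demo's schema.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

with driver.session() as session:
    records = session.run(
        "MATCH (p:Person {name: $name})-[:WORKED_ON]->(proj:Project) "
        "RETURN proj.name AS project ORDER BY project",
        name="Ada",
    )
    for record in records:
        print(record["project"])

driver.close()
```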
Typical use cases involve building long‑running assistants that remember user preferences, project histories, or domain knowledge. For example, a software engineering assistant can recall stack traces from past bugs and suggest fixes based on prior solutions. In research settings, a knowledge‑graph assistant can track citations and concepts across papers, enabling seamless literature reviews. Because Neo4j excels at traversing relationships, developers can expose powerful navigation features, such as “show me all downstream dependencies of this module”, directly to the AI, enriching user interactions with graph‑powered insights.
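That dependency‑navigation question maps naturally onto a variable‑length path query. A short sketch, again assuming a Module label and DEPENDS_ON relationships that are not part of the demo's documented schema:

```python
# Sketch: find every module that transitively depends on a given one.
# Module nodes and DEPENDS_ON relationships are illustrative assumptions.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

with driver.session() as session:
    records = session.run(
        "MATCH (m:Module {name: $name})<-[:DEPENDS_ON*1..]-(downstream:Module) "
        "RETURN DISTINCT downstream.name AS module",
        name="auth-service",
    )
    print(sorted(record["module"] for record in records))

driver.close()
```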
In summary, this MCP server turns Claude into a memory‑aware agent that persists knowledge in a scalable graph database. It removes the friction of managing state, promotes interoperability across tools, and unlocks new possibilities for intelligent assistants that truly learn from every conversation.
Related Servers
- MarkItDown MCP Server: Convert documents to Markdown for LLMs quickly and accurately
- Context7 MCP: Real‑time, version‑specific code docs for LLMs
- Playwright MCP: Browser automation via structured accessibility trees
- BlenderMCP: Claude AI meets Blender for instant 3D creation
- Pydantic AI: Build GenAI agents with Pydantic validation and observability
- Chrome DevTools MCP: AI-powered Chrome automation and debugging
Explore More Servers
- Mcp With Semantic Kernel: Integrate MCP tools into Semantic Kernel for seamless AI function calling
- Portfolio Manager MCP Server: AI‑powered investment portfolio management and analysis
- MCP Access Point: Bridge HTTP services to MCP clients without code changes
- MCP Tools: Simplify MCP integration for clients and servers
- MCP Ping-Pong Server: FastAPI‑based demo of remote MCP calls via API or SSE
- HarmonyOS MCP Server: Control HarmonyOS devices via Model Context Protocol