About
This Python-based MCP server stores and retrieves user context in a local knowledge graph, allowing Claude to remember facts across conversations. It supports entities, relations, and observations with CRUD operations via a simple API.
Overview
The Mcp Memory Py server provides a lightweight, persistent knowledge‑graph backend that lets Claude and other Model Context Protocol clients remember user‑specific facts across sessions. By exposing a set of CRUD‑style tools that manipulate entities, relations, and observations, it turns transient chat context into a durable memory store. This solves the common problem of stateless AI assistants that forget prior conversations, enabling richer, more personalized interactions.
The server is built around three core concepts: entities, relations, and observations. Entities are named nodes (e.g., John_Smith or Anthropic) that carry a type label and an arbitrary list of observations. Relations are directed edges in active voice (e.g., works_at) that connect two entities, capturing how they interact. Observations are atomic strings attached to a single entity, representing discrete facts that can be added or removed independently. Together these structures form a graph that can be queried, updated, and traversed through the MCP tool set.
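To make these concepts concrete, the sketch below models them as plain Python dataclasses. The field names (name, entity_type, observations, source, target, relation_type) are illustrative assumptions rather than the server's actual storage schema.

```python
# Illustrative data model for the knowledge graph (field names are
# assumptions, not the server's exact schema).
from dataclasses import dataclass, field


@dataclass
class Entity:
    name: str                    # unique node name, e.g. "John_Smith"
    entity_type: str             # type label, e.g. "person"
    observations: list[str] = field(default_factory=list)  # atomic facts


@dataclass
class Relation:
    source: str                  # name of the source entity
    target: str                  # name of the target entity
    relation_type: str           # active-voice verb, e.g. "works_at"


graph = {
    "entities": [
        Entity("John_Smith", "person", ["Prefers morning meetings"]),
        Entity("Anthropic", "organization"),
    ],
    "relations": [
        Relation("John_Smith", "Anthropic", "works_at"),
    ],
}
```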
Key capabilities include:
- Entity and relation lifecycle management (create, delete, bulk operations) with built‑in deduplication.
- Fine‑grained observation handling that allows adding or removing single facts without affecting other data.
- Graph retrieval and search tools that let an assistant fetch the entire memory, locate specific nodes by keyword, or pull a subset of related entities (see the client sketches below).
- Atomic operations that preserve data integrity; for example, attempting to add an observation to a non‑existent entity fails cleanly.
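As a rough usage sketch, the snippet below uses the official MCP Python SDK to launch the server over stdio and persist a few facts. The launch command (python -m mcp_memory_py) and the tool names and argument shapes (create_entities, create_relations, add_observations) are assumptions modeled on the reference knowledge-graph memory server; consult the server's documentation for the exact names.

```python
# Hedged sketch: persisting facts through MCP tool calls from a Python client.
# Launch command, tool names, and argument shapes are assumptions modeled on
# the reference knowledge-graph memory server.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Assumed entry point; substitute the actual command for Mcp Memory Py.
server = StdioServerParameters(command="python", args=["-m", "mcp_memory_py"])


async def remember_facts() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Create two entities with initial observations.
            await session.call_tool("create_entities", arguments={
                "entities": [
                    {"name": "John_Smith", "entityType": "person",
                     "observations": ["Prefers morning meetings"]},
                    {"name": "Anthropic", "entityType": "organization",
                     "observations": []},
                ]
            })

            # Connect them with a directed, active-voice relation.
            await session.call_tool("create_relations", arguments={
                "relations": [
                    {"from": "John_Smith", "to": "Anthropic",
                     "relationType": "works_at"},
                ]
            })

            # Attach a single new fact to an existing entity.
            await session.call_tool("add_observations", arguments={
                "observations": [
                    {"entityName": "John_Smith",
                     "contents": ["Working on an MCP integration"]},
                ]
            })


asyncio.run(remember_facts())
```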
In real‑world scenarios, developers can use this server to build context‑aware assistants that remember user preferences, project details, or domain knowledge across conversations. For example, a virtual research assistant could retain information about a user’s publication history, collaborators, and ongoing projects, enabling it to suggest relevant papers or schedule meetings without repeated prompts. Likewise, a customer support bot could persist user ticket history and product usage patterns to deliver faster, more tailored assistance.
Integration into AI workflows is straightforward: an MCP‑enabled client calls the appropriate tool to persist new facts, then later queries or updates the graph as needed. The server’s stateless HTTP interface means it can run alongside other MCP services, scale independently, and be swapped out for alternative backends (e.g., a graph database) without changing client code. Its Python implementation also offers easy extensibility for custom logic or persistence layers.
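In a later session, the same pattern can pull remembered context back out before the assistant responds. Here read_graph and search_nodes are again assumed tool names mirroring the reference memory server.

```python
# Hedged sketch: recalling facts in a later session.
# read_graph and search_nodes are assumed tool names.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server = StdioServerParameters(command="python", args=["-m", "mcp_memory_py"])


async def recall_facts() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Fetch the whole graph, e.g. to prime a system prompt.
            graph = await session.call_tool("read_graph", arguments={})

            # Or look up only the nodes matching a keyword.
            matches = await session.call_tool("search_nodes",
                                              arguments={"query": "Anthropic"})
            print(graph.content, matches.content)


asyncio.run(recall_facts())
```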
Overall, Mcp Memory Py stands out by combining a simple, graph‑based memory model with robust MCP tooling, giving developers an efficient way to add long‑term context to conversational AI applications.
Related Servers
MindsDB MCP Server
Unified AI-driven data query across all sources
Homebrew Legacy Server
Legacy Homebrew repository split into core formulae and package manager
Daytona
Secure, elastic sandbox infrastructure for AI code execution
SafeLine WAF Server
Secure your web apps with a self‑hosted reverse‑proxy firewall
mediar-ai/screenpipe
Open-source 24/7 screen and audio capture for context-aware AI
Skyvern
Automate browser-based workflows with LLMs and computer vision