About
Memory MCP is an MCP server that stores, retrieves, and manages LLM conversation memories using MongoDB. It offers context window caching, relevance scoring, tag-based search, and summary creation for efficient conversation management.
Capabilities
Memory MCP Server
Memory MCP is a knowledge‑graph management server built on the Model Context Protocol. It solves the common pain point of storing, querying, and evolving structured knowledge for AI assistants: instead of hard‑coding facts or relying on external databases, developers can create a self‑contained, persistable graph that the assistant can interrogate and augment on the fly.
The server exposes a rich set of MCP endpoints that mirror typical graph‑database operations: dedicated tools add or remove entity nodes, companion tools create and delete the edges that describe how those entities are connected, and observation APIs attach contextual notes to individual nodes. A further endpoint provides a snapshot of the entire knowledge base. The integration layer is polished: it supports the MCP tools, resources, and sampling capabilities, meaning a Claude or other AI client can invoke graph queries as tools, request persisted resources (such as short stories generated from the graph), or sample new content directly through the protocol.
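A tool invocation like those described above travels over the protocol as a JSON‑RPC 2.0 request to the `tools/call` method. The sketch below builds such a request; the tool name and argument shape are illustrative placeholders, since this page does not list the server's actual tool names:

```python
import json

def make_tool_call(request_id, tool_name, arguments):
    """Build a JSON-RPC 2.0 request for the MCP tools/call method."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

# "create_entities" is a hypothetical tool name for illustration only.
req = make_tool_call(
    1,
    "create_entities",
    {"entities": [{"name": "Ada Lovelace", "entityType": "person"}]},
)
print(json.dumps(req))
```

In a real session the client would write this message to the server's stdio or HTTP transport and read back a matching JSON‑RPC response.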
Key capabilities include:
- Interactive, CLI, and JSON‑configurable startup – developers can launch the server in a mode that best fits their workflow, from double‑clicking on Windows to piping JSON into the binary for automated environments.
- Automatic persistence – all graph changes are written in JSON‑lines format to a configurable file, and the server also persists its own settings to disk. Missing directories are created automatically.
- Story generation resource – the server can produce short stories that reflect the current knowledge graph. Clients access these through a general story resource or a topic‑specific endpoint, enabling creative applications such as dynamic narrative generation or contextual storytelling.
- Flexible configuration – port and memory file path can be set through command‑line flags, environment variables, or a JSON payload, giving developers full control over deployment.
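The configuration bullet above names three sources (command‑line flags, environment variables, JSON payload) but not their precedence. A common convention is flags over environment over JSON; the sketch below assumes that order, and the flag and variable names (`MEMORY_MCP_PORT`, `MEMORY_MCP_FILE`) are invented for illustration:

```python
def resolve_config(cli_args, env, json_payload):
    """Resolve settings with assumed precedence: CLI flag > env var > JSON payload.

    All flag, env-var, and key names here are illustrative, not the
    server's documented names.
    """
    def pick(flag, env_var, json_key, default):
        if cli_args.get(flag) is not None:
            return cli_args[flag]
        if env_var in env:
            return env[env_var]
        if json_key in json_payload:
            return json_payload[json_key]
        return default

    return {
        "port": int(pick("port", "MEMORY_MCP_PORT", "port", 8080)),
        "memory_file": pick("memory_file", "MEMORY_MCP_FILE", "memoryFile", "memory.jsonl"),
    }

cfg = resolve_config(
    {"port": None, "memory_file": "graph.jsonl"},   # CLI: only the file path
    {"MEMORY_MCP_PORT": "9001"},                    # env: only the port
    {"port": 8000, "memoryFile": "ignored.jsonl"},  # JSON: lowest priority
)
# cfg["port"] == 9001 (env wins over JSON); cfg["memory_file"] == "graph.jsonl" (CLI wins)
```

Layering the sources this way lets automated environments pipe a JSON payload while an operator can still override a single value with a flag.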
Typical use cases span from conversational agents that need to maintain stateful world knowledge, to educational tools that evolve a curriculum graph as users interact, to internal tooling where an AI assistant can browse and modify company data without touching a database. Because the server speaks MCP, it plugs seamlessly into existing AI workflows: a client can send a tool call to add a new concept, then immediately query related entities or request a short story about the newly added topic. The result is a tightly coupled, self‑contained knowledge engine that empowers AI assistants to reason over persistent data without external dependencies.
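The automatic persistence underpinning this workflow is an append‑and‑replay change log in JSON‑lines format. A minimal sketch of that model follows; the record schema (`op`/`name` fields) is invented here for illustration, not taken from the server:

```python
import json
import os
import tempfile

def append_change(path, record):
    """Append one graph change as a single JSON line, creating missing dirs."""
    os.makedirs(os.path.dirname(path) or ".", exist_ok=True)
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")

def replay(path):
    """Rebuild the current node set by replaying the change log in order."""
    nodes = set()
    with open(path, encoding="utf-8") as f:
        for line in f:
            rec = json.loads(line)
            if rec["op"] == "add_node":
                nodes.add(rec["name"])
            elif rec["op"] == "remove_node":
                nodes.discard(rec["name"])
    return nodes

with tempfile.TemporaryDirectory() as d:
    log = os.path.join(d, "memory.jsonl")
    append_change(log, {"op": "add_node", "name": "Ada"})
    append_change(log, {"op": "add_node", "name": "Babbage"})
    append_change(log, {"op": "remove_node", "name": "Babbage"})
    nodes = replay(log)  # {"Ada"}
```

Because each change is one self‑describing line, the file is human‑inspectable and the full graph state can always be recovered by replaying it from the start.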
Related Servers
MarkItDown MCP Server
Convert documents to Markdown for LLMs quickly and accurately
Context7 MCP
Real‑time, version‑specific code docs for LLMs
Playwright MCP
Browser automation via structured accessibility trees
BlenderMCP
Claude AI meets Blender for instant 3D creation
Pydantic AI
Build GenAI agents with Pydantic validation and observability
Chrome DevTools MCP
AI-powered Chrome automation and debugging
Explore More Servers
Ton MCP Server
Connect AI to your Ton wallet effortlessly
Project Opener MCP Server
Open and manage project files from Claude Desktop
Geolocation MCP Server
Fetch walkability, transit, and bike scores for any location
Gridly MCP Server
Manage Gridly projects, databases, grids and views via MCP
Jewish-Interest MCP Projects
A curated hub of Jewish content for AI integration
Linux Command MCP
Secure remote execution of Linux commands via Model Context Protocol