samwang0723

MCP Memory with Redis Graph

MCP Server

Graph‑based memory storage for LLM conversations


About

MCP Memory provides a suite of tools to create, store, query, and relate conversational memories using Redis Graph. It enables complex knowledge graphs for long‑term LLM context management.
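
As a hedged illustration of the graph‑based approach, the sketch below stores a user preference as nodes and a relation in Redis Graph via a raw GRAPH.QUERY command from node‑redis. The graph key ("memory"), node labels, and property names are assumptions for this example, not the project's actual schema.

```typescript
// Illustrative only: persist "user likes Thai food" as a small subgraph.
// Graph key, labels, and properties are hypothetical.
import { createClient } from "redis";

async function rememberPreference(userId: string, cuisine: string) {
  const client = createClient({ url: "redis://localhost:6379" });
  await client.connect();

  // MERGE keeps writes idempotent: nodes and the edge are created once.
  // (A real implementation should sanitize inputs rather than interpolate.)
  await client.sendCommand([
    "GRAPH.QUERY",
    "memory",
    `MERGE (u:User {id: '${userId}'})
     MERGE (c:Cuisine {name: '${cuisine}'})
     MERGE (u)-[:LIKES]->(c)`,
  ]);

  await client.quit();
}
```

Relations like LIKES are what let later queries traverse from a user to everything connected to them, which is how the graph supports long‑term context.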

Capabilities

  • Resources – access data sources
  • Tools – execute functions
  • Prompts – pre‑built templates
  • Sampling – AI model interactions

MCP Memory – Persistent, Contextual Knowledge for AI Assistants

MCP Memory solves a core limitation of conversational AI: the inability to retain user‑specific information across sessions. Traditional chat models treat each interaction as stateless, which forces developers to re‑pass context or rebuild memory on every request. MCP Memory replaces this pattern with a lightweight, vector‑based knowledge base that stores and retrieves user memories on demand. By embedding every piece of text with a semantic model, the server can match new queries to past statements even when wording differs, delivering truly personalized experiences without compromising speed.
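
The matching step reduces to vector similarity. The toy sketch below, with short placeholder vectors standing in for the real 1024‑dimensional embeddings, shows why two differently worded statements can still score as related.

```typescript
// Cosine similarity: ~1.0 for near-identical directions, ~0 for unrelated ones.
// Real embeddings have 1024 dimensions; these are toy placeholders.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// "I love Thai food" and "big fan of Thai cuisine" embed to nearby vectors,
// so they match even though they share almost no keywords.
const memory = [0.82, 0.31, 0.47];
const query = [0.79, 0.28, 0.52];
console.log(cosineSimilarity(memory, query)); // ≈ 1.0 → strong match
```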

The server exposes a simple MCP interface that lets any compliant client—Cursor, Claude, Windsurf, or custom agents—store and query memories. When a user shares preferences, goals, or prior interactions, the client sends the text to MCP Memory. Workers AI processes it with an open‑source embedding model, producing a 1024‑dimensional vector. This vector is indexed in Cloudflare Vectorize for rapid similarity search, while the raw text and metadata are persisted in Cloudflare D1. A Durable Object orchestrates stateful operations, ensuring consistency across concurrent requests and protecting each user’s namespace. Retrieval follows the same path: a query is embedded, Vectorize returns the most similar vectors, and D1 supplies the original passages. The result is a ranked list of relevant memories that can be injected back into the prompt or used to shape downstream actions.
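
A hedged sketch of that round trip as a Cloudflare Worker follows. The binding names (AI, VECTORIZE, DB), the choice of embedding model, and the D1 table schema are assumptions for illustration, not the project's actual code.

```typescript
// Sketch of the store/retrieve path. Types come from @cloudflare/workers-types;
// binding names, model choice, and the table schema are hypothetical.
export interface Env {
  AI: Ai;
  VECTORIZE: Vectorize;
  DB: D1Database;
}

export async function storeMemory(env: Env, userId: string, text: string) {
  // 1. Embed with Workers AI (bge-large-en-v1.5 outputs 1024 dimensions).
  const { data } = await env.AI.run("@cf/baai/bge-large-en-v1.5", { text: [text] });
  const id = crypto.randomUUID();

  // 2. Index the vector in Vectorize under a per-user namespace.
  await env.VECTORIZE.upsert([{ id, values: data[0], namespace: userId }]);

  // 3. Persist the raw text and metadata in D1.
  await env.DB.prepare("INSERT INTO memories (id, user_id, text) VALUES (?, ?, ?)")
    .bind(id, userId, text)
    .run();
}

export async function recallMemories(env: Env, userId: string, query: string, topK = 5) {
  // Same path in reverse: embed the query, find neighbors, fetch passages.
  const { data } = await env.AI.run("@cf/baai/bge-large-en-v1.5", { text: [query] });
  const { matches } = await env.VECTORIZE.query(data[0], { topK, namespace: userId });

  const ids = matches.map((m) => m.id);
  const placeholders = ids.map(() => "?").join(", ");
  const { results } = await env.DB.prepare(
    `SELECT text FROM memories WHERE id IN (${placeholders})`
  ).bind(...ids).all();
  return results; // ranked passages to splice into the prompt
}
```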

Key capabilities include:

  • Semantic search – finds conceptually related memories, not just keyword matches.
  • Fast, scalable retrieval – Vectorize delivers sub‑millisecond similarity queries at scale.
  • Stateful coordination – Durable Objects maintain per‑user namespaces and enforce isolation.
  • Standardized MCP protocol – seamless integration with any MCP‑aware assistant (see the protocol sketch after this list).
  • Secure, rate‑limited infrastructure – built on Cloudflare’s TLS‑protected network with per‑user namespaces and configurable limits.
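
For the protocol bullet above, here is a hypothetical sketch of how store and query operations could be exposed as MCP tools with the TypeScript SDK. The tool names and schemas are illustrative, not the server's documented surface.

```typescript
// Hypothetical MCP tool surface; names and schemas are illustrative.
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const server = new McpServer({ name: "mcp-memory", version: "0.1.0" });

server.tool(
  "store_memory",
  { userId: z.string(), text: z.string() },
  async ({ userId, text }) => {
    // The real server would embed, index, and persist the text here.
    return { content: [{ type: "text", text: `Stored memory for ${userId}` }] };
  }
);

server.tool(
  "search_memory",
  { userId: z.string(), query: z.string() },
  async ({ userId, query }) => {
    // The real server would embed the query and run a similarity search.
    return { content: [{ type: "text", text: `Top matches for "${query}"` }] };
  }
);

// Any MCP-aware client (Cursor, Claude, Windsurf, ...) can now call the tools.
await server.connect(new StdioServerTransport());
```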

Typical use cases range from personalized recommendation engines (remembering a user’s favorite cuisines) and adaptive tutoring systems (tracking progress over time) to enterprise knowledge bases that evolve with employee interactions. Developers can embed MCP Memory into a workflow where an assistant first queries the memory store, augments its prompt with retrieved snippets, and then generates a response that feels contextually aware. Because the server handles all embedding, indexing, and retrieval, developers avoid re‑implementing these complex components.
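
That query–augment–generate loop is small enough to show inline. The sketch below reuses the hypothetical Env and recallMemories helper from the earlier Worker sketch; llmGenerate is a placeholder for whatever model call the client actually makes.

```typescript
// Illustrative retrieval-augmented flow; llmGenerate is a placeholder,
// and Env / recallMemories come from the earlier hypothetical sketch.
declare function llmGenerate(prompt: string): Promise<string>;

async function answerWithMemory(env: Env, userId: string, question: string) {
  // 1. Query the memory store for snippets related to the question.
  const memories = await recallMemories(env, userId, question);
  const context = memories.map((m: any) => `- ${m.text}`).join("\n");

  // 2. Augment the prompt with what the assistant already knows.
  const prompt =
    `Known facts about this user:\n${context}\n\n` +
    `Answer with that context in mind:\n${question}`;

  // 3. Generate a response that feels contextually aware.
  return llmGenerate(prompt);
}
```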

MCP Memory’s standout advantage is its combination of semantic richness, low latency, and built‑in security. By leveraging Cloudflare’s edge network, the server delivers consistent performance worldwide while keeping user data isolated and protected. A free tier that covers most users makes it an attractive starting point for prototyping, or for scaling a conversational AI that truly “remembers” its users.