Vic563

MemGPT MCP Server


Memory‑powered LLM chat server with multi‑provider support

Updated Jan 14, 2025

About

MemGPT MCP Server is a TypeScript MCP server that stores conversation history and lets you chat with OpenAI, Anthropic, OpenRouter, or Ollama models. It offers tools for sending messages, retrieving or clearing memory, and switching providers or models.

Capabilities

  • Resources: Access data sources
  • Tools: Execute functions
  • Prompts: Pre-built templates
  • Sampling: AI model interactions

MemGPT MCP Server

The MemGPT MCP Server bridges the gap between conversational AI assistants and persistent, provider‑agnostic memory. In many AI workflows, an assistant can respond accurately only if it has access to the full context of a conversation. Traditional LLM APIs treat each request as stateless, discarding prior exchanges unless the developer manually stitches them together. MemGPT solves this by maintaining an internal conversation history that can be queried, cleared, or switched across multiple model providers on the fly. Developers using Claude or other MCP‑compatible assistants can now embed long‑term memory into their applications without writing custom state‑management code.

At its core, the server exposes a small set of intuitive tools. A chat tool forwards user messages to the currently selected LLM provider, automatically appending the relevant past memories. A memory‑retrieval tool returns the conversation history, supporting both limited and unlimited retrieval through a single parameter. A clear‑memory tool wipes the history, allowing fresh starts when needed. Finally, provider‑ and model‑switching tools persist the chosen provider and model across sessions, so a single assistant can toggle between OpenAI, Anthropic, OpenRouter, or local Ollama models without changing the client code.
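
As a rough illustration, the sketch below calls these tools from a TypeScript MCP client built on the official SDK. The tool names ("chat", "get_memory"), their argument keys, and the install path are assumptions drawn from the descriptions above rather than the server's verified schema:

```typescript
// Hedged sketch: invoking the server's tools from an MCP client.
// Tool names, argument keys, and the path below are assumptions.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

const transport = new StdioClientTransport({
  command: "node",
  args: ["/path/to/memgpt-mcp-server/build/index.js"], // hypothetical path
});

const client = new Client({ name: "example-client", version: "1.0.0" });
await client.connect(transport);

// Send a message; the server appends the stored conversation history
// before forwarding the request to the currently selected provider.
const reply = await client.callTool({
  name: "chat",
  arguments: { message: "Summarize what we discussed yesterday." },
});
console.log(reply);

// Retrieve the ten most recent memory entries.
const recent = await client.callTool({
  name: "get_memory",
  arguments: { limit: 10 },
});
console.log(recent);
```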

Key capabilities include:

  • Multi‑provider support: Switch seamlessly between OpenAI, Anthropic, OpenRouter, and Ollama models.
  • Model granularity: Pick specific Claude 3/3.5 variants or any OpenAI model, with the ability to reference OpenRouter’s “provider/model” syntax.
  • Memory persistence: Store and retrieve conversation history with timestamps, ensuring chronological integrity.
  • Unlimited memory retrieval: Fetch the entire chat log by setting the retrieval limit to null, or cap it to a configurable subset for performance (see the sketch after this list).
  • Provider‑agnostic API: The same tool interface works regardless of the underlying LLM, simplifying client logic.
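
Continuing the client connection from the earlier sketch, the example below exercises the last two capabilities. The tool names (use_provider, use_model, get_memory, clear_memory) and argument keys are again assumptions rather than the server's documented schema:

```typescript
// Continues the "client" from the previous sketch; names are assumptions.

// Switch to a local Ollama model; the selection persists across sessions.
await client.callTool({
  name: "use_provider",
  arguments: { provider: "ollama" },
});
await client.callTool({
  name: "use_model",
  arguments: { model: "llama3.1" },
});

// Fetch the entire chat log by passing a null limit...
const fullHistory = await client.callTool({
  name: "get_memory",
  arguments: { limit: null },
});
console.log(fullHistory);

// ...or wipe it entirely for a fresh start.
await client.callTool({ name: "clear_memory", arguments: {} });
```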

In practice, this server is invaluable for building chatbots that need to remember user preferences, maintain context across sessions, or audit conversations. A customer‑support bot can keep track of a user’s past tickets; an educational tutor can remember earlier lessons; a code‑generation assistant can recall prior prompts to avoid repetition. By exposing memory as an MCP tool, developers can compose complex workflows—such as chaining with a data‑retrieval tool—to create richer, contextually aware interactions.

The MemGPT MCP Server integrates effortlessly into existing AI pipelines. A Claude Desktop user simply adds the server’s configuration to their local config file, providing API keys for each provider. Once running, any MCP‑enabled assistant can call the server’s tools via standard tool invocation syntax. Because the server handles provider switching and memory management internally, developers can focus on higher‑level logic, confident that context will be preserved across provider changes and long conversations.
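
For reference, a Claude Desktop entry for this server might look roughly like the JSON below; the server key, file path, and environment variable names are assumptions to be adapted from the project's README:

```json
{
  "mcpServers": {
    "memgpt": {
      "command": "node",
      "args": ["/path/to/memgpt-mcp-server/build/index.js"],
      "env": {
        "OPENAI_API_KEY": "sk-...",
        "ANTHROPIC_API_KEY": "sk-ant-...",
        "OPENROUTER_API_KEY": "sk-or-..."
      }
    }
  }
}
```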