About
This MCP server extracts prompts from your local Cursor IDE, generates text embeddings using a local Ollama model, stores them in LanceDB, and exposes a FastAPI endpoint for vector similarity search—ideal for RAG workflows.
Capabilities
Overview
The Cursor History MCP is a dedicated Model Context Protocol server that turns the rich chat logs generated by the Cursor IDE into an intelligent, searchable knowledge base. By vectorizing each message with embeddings and storing them in LanceDB, the server enables AI assistants to retrieve relevant conversation snippets quickly and accurately. This capability addresses a common pain point for developers: the difficulty of recalling or reusing specific parts of long, multi‑turn dialogues that contain code snippets, design decisions, or debugging steps.
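The retrieval step described above boils down to nearest-neighbor search over embedding vectors. As a rough illustration of the kind of lookup LanceDB performs (using toy hand-written vectors in place of real Ollama embeddings), a cosine-similarity search might look like:

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine of the angle between two vectors; 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def nearest(query: list[float], store: dict[str, list[float]]) -> str:
    """Return the key of the stored vector most similar to the query."""
    return max(store, key=lambda k: cosine_similarity(query, store[k]))

# Toy "embeddings" standing in for the vectors an Ollama model would produce.
store = {
    "auth flow discussion": [0.9, 0.1, 0.0],
    "CSS layout question":  [0.1, 0.8, 0.3],
}
print(nearest([0.85, 0.15, 0.05], store))  # → auth flow discussion
```

In production, a vector database replaces the linear scan with an approximate index, but the similarity metric and ranking logic are the same idea.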
For developers building AI‑augmented workflows, the server offers a lightweight API built on FastAPI. It exposes two primary endpoints, one for query‑based retrieval and one for bulk access, and integrates seamlessly with any assistant that supports MCP. Because the service is self‑hosted and containerized, teams can keep all chat data on-premises, satisfying strict privacy or compliance requirements while still benefiting from the power of local LLMs via Ollama.
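A client call to the retrieval endpoint can be sketched as below. The `/search` path, port, and JSON field names are illustrative assumptions, since the page does not name the actual routes; the request is constructed but not sent:

```python
import json
import urllib.request

def build_search_request(base_url: str, query: str, limit: int = 5) -> urllib.request.Request:
    """Construct (but do not send) a POST request for a hypothetical
    /search endpoint; payload shape is an assumption for illustration."""
    payload = json.dumps({"query": query, "limit": limit}).encode("utf-8")
    return urllib.request.Request(
        url=f"{base_url}/search",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_search_request("http://localhost:8000", "authentication flow")
print(req.full_url, req.get_method())  # → http://localhost:8000/search POST
```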
Key features include:
- Vectorized search: Embeddings transform natural‑language queries into high‑dimensional vectors, allowing LanceDB to perform fast nearest‑neighbor lookups. This yields more relevant results than simple keyword matching.
- Local LLM integration: The server can forward retrieved snippets to an Ollama‑powered model for context augmentation, enabling retrieval‑augmented generation (RAG) without external API calls.
- FastAPI performance: The asynchronous framework delivers low-latency responses, essential for real‑time assistant interactions.
- Docker support: A single container image simplifies deployment across development, staging, and production environments.
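The RAG flow in the features above amounts to stitching retrieved snippets into the prompt sent to the local model. A minimal sketch of that context augmentation step (the prompt layout is a hypothetical choice, not the server's actual template):

```python
def build_rag_prompt(question: str, snippets: list[str]) -> str:
    """Prepend retrieved chat excerpts to the user's question so a local
    LLM (e.g. one served by Ollama) can answer with historical context."""
    context = "\n".join(f"- {s}" for s in snippets)
    return (
        "Use the following excerpts from past Cursor chats as context:\n"
        f"{context}\n\n"
        f"Question: {question}"
    )

prompt = build_rag_prompt(
    "How did we decide to handle token refresh?",
    ["We chose rotating refresh tokens.",
     "Access tokens expire after 15 minutes."],
)
print(prompt)
```

The assembled string would then be passed to the model's generation endpoint, keeping the entire retrieve-then-generate loop on local infrastructure.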
Real‑world use cases abound: a developer asking the assistant to “show me where I last discussed authentication flow” will receive precise excerpts from past chats; a team lead can retrieve historical decisions about architecture to inform new meetings; or an AI tutor can pull earlier explanations of a concept to reinforce learning. In each scenario, the server transforms static chat logs into a dynamic knowledge source that AI tools can query on demand.
Because the MCP server exposes its capabilities via standard Model Context Protocol interfaces, it plugs into any Claude‑compatible assistant or other MCP‑aware client. Developers can chain the endpoint into a prompt template, feed results to a local LLM for summarization, or even combine it with other vector stores. This composability makes the Cursor History MCP a versatile component in modern AI‑driven development pipelines, offering instant access to conversational context without sacrificing data ownership or performance.
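Combining this server with other vector stores, as mentioned above, can be as simple as merging score-ranked result lists before handing them to the assistant. A sketch under the assumption that each store returns (snippet, score) pairs:

```python
def merge_ranked(*result_lists: list[tuple[str, float]], top_k: int = 3) -> list[str]:
    """Merge results from several vector stores, keeping each snippet's best
    score, and return the top_k snippets by descending similarity."""
    best: dict[str, float] = {}
    for results in result_lists:
        for snippet, score in results:
            best[snippet] = max(score, best.get(snippet, float("-inf")))
    ranked = sorted(best, key=best.get, reverse=True)
    return ranked[:top_k]

# Hypothetical hits from the Cursor history server and a second store.
cursor_hits = [("auth flow notes", 0.92), ("db schema chat", 0.75)]
docs_hits = [("auth flow notes", 0.88), ("deployment guide", 0.81)]
print(merge_ranked(cursor_hits, docs_hits))
# → ['auth flow notes', 'deployment guide', 'db schema chat']
```

Deduplicating by best score is one reasonable policy; score normalization across stores is a real concern in practice, since different embedding models produce incomparable similarity scales.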