About
RagDocs is an MCP server that stores, indexes, and retrieves documents using vector embeddings. It supports automatic chunking, metadata tagging, and semantic search powered by Qdrant with Ollama or OpenAI embeddings.
Overview
The RagDocs MCP server fills a critical gap for developers building AI assistants that need to browse, understand, and retrieve information from large bodies of text. By coupling a vector‑based search engine (Qdrant) with modern embedding providers such as Ollama or OpenAI, RagDocs turns raw documentation into a searchable knowledge base that an AI can query with natural language. This eliminates the need to hard‑code answers or maintain brittle rule sets, allowing assistants to surface up‑to‑date, contextually relevant content from any set of documents.
At its core, RagDocs exposes four simple yet powerful tools: add_document, search_documents, list_documents, and delete_document. The add operation automatically splits incoming text into manageable chunks, generates embeddings, and stores them in Qdrant along with rich metadata like title, domain, or timestamps. The search tool performs semantic similarity queries against this vector space, returning the most relevant passages and allowing fine‑grained filtering (by domain, code presence, or date ranges). Listing provides pagination and grouping capabilities so developers can audit or reorganize the corpus, while deletion keeps the knowledge base fresh. Because the server abstracts away the complexities of vector storage and embedding generation, developers can focus on defining higher‑level flows in their AI assistants.
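As an illustration, requests to these tools might be shaped like the sketch below. The argument names (text, metadata, query, filter, limit) are assumptions chosen for readability, not the server's documented schema; consult the RagDocs README for the real parameters.

```python
# Sketch of MCP tool-call payloads for RagDocs. Argument names are
# illustrative assumptions, not the server's documented schema.

def make_add_document(text, title, domain):
    """Build an add_document call; the server chunks and embeds the text."""
    return {
        "tool": "add_document",
        "arguments": {
            "text": text,
            "metadata": {"title": title, "domain": domain},
        },
    }

def make_search_documents(query, domain=None, limit=5):
    """Build a search_documents call with optional domain filtering."""
    args = {"query": query, "limit": limit}
    if domain is not None:
        args["filter"] = {"domain": domain}
    return {"tool": "search_documents", "arguments": args}

add_call = make_add_document(
    "Refund requests must be filed within 30 days.",
    title="Refund Policy", domain="support")
search_call = make_search_documents(
    "How long do I have to request a refund?", domain="support")
```

The same pattern extends to list_documents (pagination arguments) and delete_document (a document identifier).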
RagDocs is especially valuable for scenarios that demand up‑to‑date, domain‑specific knowledge—such as internal help desks, product documentation assistants, or compliance bots. By feeding the server with company policies, user manuals, or codebases, an AI can answer questions about internal processes or retrieve relevant code snippets without exposing the underlying data to external services. The dual support for free, local Ollama embeddings and paid OpenAI models gives teams flexibility: they can start entirely on premises or scale to the higher‑accuracy embeddings of OpenAI when needed.
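The provider choice described above can be captured in a small startup shim like the one below. The environment variable names, default Ollama endpoint, and model names are illustrative assumptions, not RagDocs' actual configuration keys.

```python
import os

# Pick an embedding backend at startup. Variable names and models here are
# illustrative assumptions; check the RagDocs README for the real keys.
def embedding_config(env=None):
    env = os.environ if env is None else env
    provider = env.get("EMBEDDING_PROVIDER", "ollama")
    if provider == "openai":
        return {
            "provider": "openai",
            "model": "text-embedding-3-small",
            "api_key": env.get("OPENAI_API_KEY", ""),
        }
    # Default: free, local embeddings served by Ollama.
    return {
        "provider": "ollama",
        "base_url": env.get("OLLAMA_URL", "http://localhost:11434"),
        "model": env.get("OLLAMA_MODEL", "nomic-embed-text"),
    }

local_cfg = embedding_config(env={})
cloud_cfg = embedding_config(env={"EMBEDDING_PROVIDER": "openai",
                                  "OPENAI_API_KEY": "sk-placeholder"})
```

A team can start with the local default and switch providers by setting a single environment variable, without touching indexing or search code.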
Integration into existing AI workflows is straightforward. An MCP client can invoke add_document whenever new documentation is published, keeping the vector index in sync. During runtime, a conversational agent can call search_documents with the user’s query and use the returned passages as context for generation, ensuring that responses are grounded in the latest documents. Because RagDocs communicates over the standard MCP protocol, it can be combined with other MCP servers—such as file systems or external APIs—to create a fully autonomous knowledge‑retrieval pipeline.
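The retrieve-then-generate loop above can be sketched as follows. StubClient stands in for a real MCP client, and the tool argument names and result fields are assumptions for illustration.

```python
# Sketch of grounding an answer in RagDocs search results. StubClient
# stands in for a real MCP client; argument names and result fields are
# illustrative assumptions.

class StubClient:
    """Returns canned passages shaped the way search_documents might."""
    def call_tool(self, name, arguments):
        if name == "search_documents":
            return [{"text": "Refunds are issued within 30 days.",
                     "score": 0.91}]
        raise ValueError(f"unknown tool: {name}")

def build_grounded_prompt(client, user_query, limit=3):
    """Fetch relevant passages and prepend them as context for generation."""
    passages = client.call_tool("search_documents",
                                {"query": user_query, "limit": limit})
    context = "\n".join(p["text"] for p in passages)
    return f"Context:\n{context}\n\nQuestion: {user_query}"

prompt = build_grounded_prompt(StubClient(), "What is the refund window?")
```

In production the stub is replaced by an MCP client session, and the assembled prompt is handed to the language model so its answer stays grounded in the indexed documents.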
What sets RagDocs apart is its seamless blend of open‑source tooling with enterprise‑grade features. Automatic chunking, metadata enrichment, and configurable similarity thresholds lower the barrier to entry for developers unfamiliar with vector search. Meanwhile, the ability to run locally on Docker or connect to Qdrant Cloud provides both cost‑effective dev environments and scalable production deployments. In sum, RagDocs equips AI assistants with robust, semantic search over arbitrary text, enabling richer, more accurate interactions without compromising data privacy or requiring deep machine‑learning expertise.
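The automatic chunking mentioned above can be approximated by a naive fixed-size splitter with overlap, sketched below. The chunk size and overlap values are illustrative, not RagDocs' actual defaults, and real chunkers typically also respect sentence or paragraph boundaries.

```python
def chunk_text(text, chunk_size=200, overlap=40):
    """Split text into overlapping fixed-size chunks (a naive stand-in
    for automatic chunking; overlap preserves context across chunk
    boundaries so a passage split mid-thought is still retrievable)."""
    if chunk_size <= overlap:
        raise ValueError("chunk_size must exceed overlap")
    chunks = []
    step = chunk_size - overlap
    for start in range(0, len(text), step):
        chunk = text[start:start + chunk_size]
        if chunk:
            chunks.append(chunk)
        if start + chunk_size >= len(text):
            break
    return chunks

# 500 characters with size 200 / overlap 40 yields three chunks.
chunks = chunk_text("x" * 500, chunk_size=200, overlap=40)
```

Each chunk would then be embedded and stored in Qdrant as its own point, carrying the parent document's metadata.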
Related Servers
MarkItDown MCP Server
Convert documents to Markdown for LLMs quickly and accurately
Context7 MCP
Real‑time, version‑specific code docs for LLMs
Playwright MCP
Browser automation via structured accessibility trees
BlenderMCP
Claude AI meets Blender for instant 3D creation
Pydantic AI
Build GenAI agents with Pydantic validation and observability
Chrome DevTools MCP
AI-powered Chrome automation and debugging
Explore More Servers
Agent Care MCP Server
AI‑powered EMR integration for Cerner and Epic
Genai Everyday MCP Server
Your everyday GenAI companion for prompts, code, and ideas
Rust Docs MCP Server
On‑demand, up-to-date Rust crate documentation for LLMs
Confluence Wiki MCP Server Extension
Seamless Confluence integration for AI chat tools
Google Workspace MCP Server
Securely bridge Google Workspace with AI clients
Skynet-MCP
Hierarchical AI agent network with MCP integration