MCPSERV.CLUB
Kurogoma4D

File Search MCP

MCP Server

Instant full-text search across your filesystem

Stale (50) · 14 stars · 1 view · Updated 16 days ago

About

A Rust‑powered Model Context Protocol server that indexes text files in a directory using Tantivy, enabling fast keyword searches and file content retrieval with relevance scoring.

Capabilities

  • Resources – access data sources
  • Tools – execute functions
  • Prompts – pre-built templates
  • Sampling – AI model interactions

File Search MCP

The File Search MCP addresses a common bottleneck in AI‑driven development: quickly locating relevant information inside large codebases, documentation sets, or any collection of text files. By exposing a full‑text search engine over the filesystem via the Model Context Protocol, it lets assistants such as Claude or other MCP‑compatible clients perform instant queries without manual indexing or custom tooling. This removes the friction of “searching my own files” and enables richer, context‑aware interactions.

At its core, the server leverages Tantivy, a high‑performance Rust search library, to build an in‑memory index of all readable text files in a user‑specified directory. Binary files are automatically filtered out, ensuring that searches remain fast and relevant. When a query is issued, Tantivy returns hits ranked by relevance scores, and the server packages each result with its file path. Additionally, a dedicated File Content Reader tool lets users retrieve the full text of any matched file simply by supplying its path, streamlining the workflow from discovery to inspection.
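The listing doesn't spell out how binary files are detected, so the following is a hedged sketch of a common heuristic rather than the server's actual rule: treat a file as binary if an initial sample contains a NUL byte or fails to decode as UTF-8 (tolerating a truncated final character at the end of the sample).

```rust
// Hypothetical text-vs-binary heuristic, written here for illustration only;
// File Search MCP's real filtering logic may differ.
fn looks_like_text(sample: &[u8]) -> bool {
    // NUL bytes essentially never appear in plain text files.
    if sample.contains(&0) {
        return false;
    }
    // Require the sample to be valid UTF-8, allowing one incomplete
    // multi-byte sequence at the very end (the sample may cut a char in half).
    match std::str::from_utf8(sample) {
        Ok(_) => true,
        Err(e) => e.error_len().is_none(), // None = truncated trailing sequence
    }
}

fn main() {
    assert!(looks_like_text(b"fn main() {}\n"));
    assert!(!looks_like_text(b"\x7fELF\x02\x01\x01\x00")); // ELF magic + NUL
    println!("heuristic ok");
}
```

Checking only a fixed-size prefix (say, the first 8 KiB) keeps the filter cheap even on large files, which matters when indexing a whole directory tree on startup.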

Key capabilities include:

  • Full‑text search across nested directories – no need for external indexing services.
  • Smart file detection – only text files are indexed, keeping the index lightweight.
  • Relevance‑scored results – each hit comes with a confidence score, helping assistants surface the most useful matches.
  • MCP compatibility – the server implements RMCP, making it plug‑in ready for any AI platform that supports the protocol.
  • Rapid startup – the in‑memory index is built on demand, so developers can spin it up locally without persistent storage concerns.
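Because the server speaks standard MCP, a client invokes its search through an ordinary JSON-RPC `tools/call` request. The message below is a sketch of what such a call might look like; the tool name `search` and the `query` argument key are assumptions for illustration, not taken from the server's published schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search",
    "arguments": { "query": "connection timeout" }
  }
}
```

The response would carry the relevance-scored hits described above, each paired with its file path, which the client can then feed to the File Content Reader tool to pull the full text of a match.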

Typical use cases range from debugging large repositories—searching for specific error messages or function definitions—to research assistants combing through locally stored academic papers, and even knowledge‑base bots that need to reference internal documentation. By integrating directly into AI workflows, developers can ask high‑level questions like “Show me all instances of this error message in the utils module” and receive precise file paths and excerpts without leaving the assistant interface.

What sets this MCP apart is its blend of performance, simplicity, and protocol‑first design. Written in Rust for speed and safety, it delivers instant search results while remaining a drop‑in component for any system that speaks MCP. This makes it an invaluable bridge between raw filesystem data and conversational AI, enabling developers to harness the power of their own code and documentation in a truly interactive way.