By Alihkhawaher

Everything Search MCP Server

MCP Server

Instant file search via the Everything Search Engine


About

An MCP server that integrates with the Everything Search Engine, providing powerful file search through a Model Context Protocol tool. It supports full-text search, regular expressions, and advanced filtering, with configurable result sorting.

Capabilities

  • Resources: Access data sources
  • Tools: Execute functions
  • Prompts: Pre-built templates
  • Sampling: AI model interactions

Everything Search MCP Server

The Everything Search MCP Server bridges the gap between local file systems and AI assistants by exposing the lightning‑fast Everything Search Engine through the Model Context Protocol. Developers can now let Claude or other MCP‑enabled agents perform instant, full‑text searches across thousands of files without leaving the AI environment. This eliminates manual file navigation and enables context‑aware code reviews, documentation lookup, or rapid data discovery directly from conversational prompts.

What It Solves

Local file search is traditionally a command‑line or GUI task, requiring the user to remember exact paths or use brittle pattern matching. In AI workflows, an assistant needs quick access to the developer’s workspace to answer questions like “Which files contain this function?” or “Show me all recent changes in the test folder.” The Everything Search MCP Server provides a unified, declarative API that turns these ad‑hoc queries into reproducible tool calls. By leveraging Everything’s HTTP interface, the server delivers sub‑second results even on large codebases.
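
Under the hood, a query to Everything's HTTP interface is an ordinary GET request with a handful of query-string options. The snippet below is a minimal, illustrative sketch in Python: it assumes Everything's built-in HTTP server is enabled and listening on a local port, and the port number and option names reflect Everything's documented HTTP parameters rather than the exact calls this MCP server makes.

    import json
    import urllib.parse
    import urllib.request

    def query_everything(term: str, count: int = 25, port: int = 8011) -> list:
        """Sketch of a JSON query against Everything's built-in HTTP server."""
        params = urllib.parse.urlencode({
            "search": term,              # the query string
            "json": 1,                   # return JSON instead of HTML
            "path_column": 1,            # include each hit's parent path
            "size_column": 1,            # include file size in bytes
            "date_modified_column": 1,   # include modification time
            "count": count,              # cap the number of results
        })
        with urllib.request.urlopen(f"http://localhost:{port}/?{params}") as resp:
            return json.loads(resp.read())["results"]

    for hit in query_everything("utils *.js", count=5):
        print(hit.get("path", ""), hit.get("name", ""))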

Core Capabilities

  • Full‑text and path search across the entire file system, optionally limited to a narrower scope.
  • Advanced filters: case sensitivity, whole‑word matching, regular expressions, and path‑only searches.
  • Result customization: cap the number of returned results; sort by name, path, size, or modification date; and choose ascending or descending order.
  • Human‑friendly output: file sizes are formatted (e.g., “2 MB”), dates are rendered in a readable format, and full paths are returned for seamless navigation.

These features allow developers to craft precise queries while keeping the tool’s interface simple enough for natural language requests.
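
For illustration, the options above might map onto a tool-argument payload like the following Python dictionary. The parameter names here are assumptions chosen to mirror the capability list, not the server's documented schema.

    # Hypothetical argument payload for one search call; consult the server's
    # schema for the real parameter names.
    search_arguments = {
        "query": "config parser",        # full-text / path query
        "scope": "C:\\Projects\\app",    # optional directory to limit the search
        "case_sensitive": False,
        "whole_word": False,
        "regex": False,
        "path_only": False,              # match against paths rather than names
        "max_results": 50,               # cap the result count
        "sort_by": "date_modified",      # name | path | size | date_modified
        "sort_order": "descending",
    }

A natural-language request such as "find the newest config files under the app folder" would reduce to a payload of roughly this shape.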

Use Cases

  • Codebase exploration: Quickly locate functions, classes, or configuration files across a monorepo.
  • Debugging assistance: Find all instances of an error message or stack trace fragment.
  • Documentation retrieval: Pull README files, changelogs, or policy documents without leaving the chat.
  • Compliance checks: Search for sensitive keywords (e.g., API keys) across source directories; a sketch follows this list.
  • Automation scripts: Combine search results with other MCP tools (e.g., file editors or linters) to build end‑to‑end pipelines.
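
As a concrete illustration of the compliance-check case, a scan for hard-coded credentials could be expressed as a single regex query. The field names below follow the hypothetical payload shown earlier and are not the server's documented schema.

    # Hypothetical compliance scan: flag likely hard-coded credentials in source.
    compliance_scan = {
        "query": r"(api[_-]?key|secret|token)\s*=",  # crude credential pattern
        "regex": True,
        "scope": "C:\\Projects\\app\\src",
        "max_results": 200,
    }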

Integration with AI Workflows

An MCP client simply invokes the tool, passing a JSON payload that mirrors the server's arguments. The assistant translates a natural-language request (“Show me all JavaScript files in the utils folder”) into a structured query, sends it to the server, and presents the formatted results back to the user. Because the server operates over HTTP on a configurable port, it can run locally or in a sandboxed environment, ensuring that file access stays within the developer’s permissions.
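
A minimal client-side sketch using the MCP Python SDK over stdio is shown below. The launch command, tool name, and argument names are assumptions for illustration; if the server is exposed over an HTTP transport instead, the SDK's corresponding client would replace stdio_client.

    import asyncio

    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    async def main() -> None:
        # Launch command and tool/argument names are placeholders; check the
        # server's README for the real values.
        server = StdioServerParameters(command="everything-search-mcp", args=[])

        async with stdio_client(server) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()
                result = await session.call_tool(
                    "search",
                    arguments={"query": "utils *.js", "max_results": 25},
                )
                for item in result.content:
                    print(item)

    asyncio.run(main())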

Unique Advantages

  • Speed: The Everything Search Engine maintains a live index of the file system, so queries return in milliseconds even on large repositories.
  • Flexibility: The API surfaces all of Everything’s powerful options, from regex to whole‑word matching, without burdening callers with low‑level details.
  • Simplicity: A single MCP tool keeps the integration lightweight while offering a rich feature set.
  • Extensibility: Developers can adjust the server’s port or add authentication layers to fit diverse security models; a minimal configuration sketch follows.
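
For example, the Everything port and optional HTTP credentials could be supplied through environment variables and attached to each request. The variable names below are hypothetical; Everything's HTTP server does support basic authentication, but how this MCP server surfaces that setting may differ.

    import base64
    import os
    import urllib.request

    # Hypothetical environment variables for illustration only.
    EVERYTHING_PORT = int(os.environ.get("EVERYTHING_PORT", "8011"))
    EVERYTHING_USER = os.environ.get("EVERYTHING_USER")
    EVERYTHING_PASS = os.environ.get("EVERYTHING_PASS")

    def build_request(query_string: str) -> urllib.request.Request:
        """Build an Everything request, adding basic auth when configured."""
        req = urllib.request.Request(
            f"http://localhost:{EVERYTHING_PORT}/?{query_string}"
        )
        if EVERYTHING_USER and EVERYTHING_PASS:
            token = base64.b64encode(
                f"{EVERYTHING_USER}:{EVERYTHING_PASS}".encode()
            ).decode()
            req.add_header("Authorization", f"Basic {token}")
        return req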

By turning a local search engine into an AI‑friendly service, the Everything Search MCP Server empowers assistants to act as immediate, context‑aware helpers—streamlining development, debugging, and documentation workflows in a single, cohesive experience.