About
A Python-based MCP server that enables AI agents to interact with Lean projects through the Language Server Protocol. It provides diagnostics, goal states, hover documentation, and external search tools for efficient theorem proving.
Overview
The Lean LSP MCP is a specialized Model Context Protocol server that bridges AI assistants with the Lean theorem prover through the Language Server Protocol (LSP). It enables conversational agents such as Claude, GPT‑4o, or other LLMs to interrogate Lean projects, retrieve diagnostics, explore goal states, and invoke theorem‑search tools—all within a single chat interface. By exposing Lean’s rich semantic information as MCP resources and tools, the server turns a complex proof environment into an interactive API that LLMs can query in natural language.
Problem Solved
Developers and researchers often struggle to get AI assistants to understand the intricacies of Lean proofs. Traditional LSP clients require manual navigation, command‑line interaction, or IDE overlays to surface information such as goal contexts, type hierarchies, and theorem dependencies. The Lean LSP MCP eliminates this friction by providing a unified, agent‑friendly endpoint that exposes all LSP capabilities as structured MCP resources. This allows an AI assistant to request, for example, the current proof goals or hover documentation without needing to run separate tooling or parse raw Lean output.
Core Functionality
- Rich Lean Interaction: The server exposes diagnostics, goal states, term information, hover documentation, and other LSP features as MCP tools. An AI can ask for the type of a term or request the list of open goals and receive machine‑readable responses that drive further reasoning or code generation (see the client sketch after this list).
- External Search Integration: Built‑in search tools let the assistant query Lean's theorem libraries, find relevant lemmas, or locate missing proofs. These searches are exposed as simple MCP commands that return structured results.
- Seamless Client Support: The server is pre‑configured for popular IDEs and AI platforms, including VS Code (agent mode), Cursor, and Claude Code. Clients can add the MCP server with a single command or configuration file, after which they can invoke Lean operations directly from chat.
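As a rough illustration of this workflow, the sketch below connects to the server over stdio using the MCP Python SDK, lists the available tools, and requests the goal state at a cursor position. The launch command (`uvx lean-lsp-mcp`) and the tool name and arguments (`lean_goal`, `file_path`, `line`) are assumptions for illustration, not confirmed by this page.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Launch command is an assumption; adjust to however the server is installed.
    params = StdioServerParameters(command="uvx", args=["lean-lsp-mcp"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover the Lean tools the server exposes (diagnostics, goals, hover, search, ...).
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            # Hypothetical call: ask for the proof goals at a given file position.
            result = await session.call_tool(
                "lean_goal",
                {"file_path": "MyProject/Basic.lean", "line": 42},
            )
            for item in result.content:
                print(item.text)


asyncio.run(main())
```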
Use Cases
- Automated Proof Construction: An LLM can iteratively request the current goal, propose a tactic or lemma, and apply it via the MCP toolset (a loop of this kind is sketched after this list). This workflow supports “auto proof” features in chat interfaces.
- Proof Exploration and Documentation: Developers can ask the assistant to explain a theorem, list its dependencies, or fetch hover information for any identifier. The server returns precise type signatures and documentation strings.
- Library Search: When a user needs to find an existing lemma that matches a pattern, the assistant can invoke the bundled search tools through MCP, receiving ranked results that can be inserted directly into the proof script.
- IDE Integration: In VS Code or Cursor, a user can switch to agent mode and let the assistant suggest edits, fix diagnostics, or generate new Lean code—all powered by the MCP server.
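A minimal sketch of the iterative proof loop, assuming the same Python MCP client as above. The tool names (`lean_goal`, `lean_diagnostic_messages`), their arguments, and the launch command are assumptions; the LLM call and file editing are stubbed out.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


def propose_tactic(goal_text: str) -> str:
    """Placeholder for an LLM call that suggests the next tactic."""
    return "simp"


async def prove(file_path: str, line: int) -> None:
    params = StdioServerParameters(command="uvx", args=["lean-lsp-mcp"])  # assumed launch command
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            for _ in range(5):  # bound the number of attempts
                goal = await session.call_tool("lean_goal", {"file_path": file_path, "line": line})
                if not goal.content:
                    break  # no open goals reported; stop iterating
                tactic = propose_tactic(goal.content[0].text)
                # Apply `tactic` to the file here (editing is out of scope for this sketch),
                # then check whether the attempt introduced new errors.
                diags = await session.call_tool(
                    "lean_diagnostic_messages", {"file_path": file_path}
                )
                print(tactic, diags.content)


asyncio.run(prove("MyProject/Basic.lean", 42))
```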
Unique Advantages
- Agent‑First Design: Unlike generic LSP servers, Lean LSP MCP is intentionally exposed as an MCP service. It serializes responses in JSON and provides a clean command interface, making it trivial for LLMs to parse and act upon the data.
- Zero‑Configuration Search Tools: The bundled search utilities are ready to use out of the box, eliminating the need for separate installation or configuration of external theorem‑search engines.
- Optimized Build Handling: The server automatically builds the project from the project root, ensuring that the language server has up‑to‑date information. This pre‑build step is also exposed through MCP, allowing the assistant to trigger a rebuild when necessary (see the sketch below).
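A small sketch of how an agent might request that rebuild explicitly. Whether the rebuild is exposed as a tool or a resource is not stated here; the sketch assumes a tool named `lean_build` and the same launch command as above.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def rebuild() -> None:
    params = StdioServerParameters(command="uvx", args=["lean-lsp-mcp"])  # assumed launch command
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Hypothetical tool: rebuild the project so later queries see fresh state.
            result = await session.call_tool("lean_build", {})
            for item in result.content:
                print(item.text)


asyncio.run(rebuild())
```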
By integrating Lean’s powerful proof engine with an LLM‑friendly protocol, the Lean LSP MCP transforms theorem proving from a manual, IDE‑centric activity into an interactive, conversational experience. It empowers developers to harness AI assistance for rapid proof development, exploration, and maintenance across Lean projects.