About
An MCP server that starts an LSP client and exposes hover, completion, diagnostics, code actions, and other LSP features to large language models in a format they can consume.
Capabilities
LSP MCP Server
The LSP MCP Server bridges the gap between large language models (LLMs) and Language Server Protocol (LSP) tooling. By exposing LSP capabilities—hover information, completions, diagnostics, and code actions—as MCP tools and resources, it lets an AI assistant query a language server as if it were a native part of its own context. This solves the problem of LLMs producing generic, sometimes inaccurate code suggestions when they lack language‑specific semantics or project‑wide type information.
At its core, the server starts an LSP client that connects to any standard LSP implementation. It then exposes a set of MCP tools covering hover information, completions, diagnostics, code actions, and LSP lifecycle management. These tools translate LLM requests into the appropriate LSP messages, forward them to the language server, and return responses in a format the assistant can embed directly into generated text. Companion resources provide real‑time, subscription‑based access to diagnostics and hover data, enabling the assistant to keep its suggestions in sync with the evolving codebase.
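As a concrete illustration, here is a minimal TypeScript sketch of an MCP client talking to such a server over stdio. The launch command, the `get_info_on_location` tool name, and the argument names are assumptions made for the example; a real client should call `listTools()` to see what the server actually exposes.

```typescript
// Minimal sketch of an MCP client querying the LSP MCP server for hover info.
// The launch command, tool name, and argument names below are illustrative
// assumptions; discover the real ones via listTools().
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Spawn the LSP MCP server, which in turn starts the language server.
  const transport = new StdioClientTransport({
    command: "lsp-mcp", // hypothetical launch command
    args: ["typescript", "typescript-language-server", "--stdio"],
  });

  const client = new Client({ name: "example-client", version: "1.0.0" });
  await client.connect(transport);

  // List the tools the server actually exposes before calling any of them.
  const { tools } = await client.listTools();
  console.log(tools.map((t) => t.name));

  // Ask for hover information at a file position (assumed tool/argument names).
  const hover = await client.callTool({
    name: "get_info_on_location",
    arguments: {
      file_path: "/path/to/project/src/index.ts",
      language_id: "typescript",
      line: 12,
      column: 5,
    },
  });
  console.log(hover.content);
}

main().catch(console.error);
```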
Key capabilities include:
- Runtime LSP management: Tools to start, restart, and configure the LSP server without restarting the MCP server itself.
- Granular logging: Eight severity levels with colorized console output, allowing developers to tune visibility during debugging or production runs.
- Comprehensive diagnostics: Real‑time error, warning, and informational messages that the assistant can surface or use to refine code actions (a usage sketch follows this list).
- Code action integration: The ability to fetch and apply automated fixes directly from the LSP, giving the assistant a powerful corrective tool.
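Continuing with the connected `client` from the earlier sketch, the snippet below exercises two of these capabilities: log-level control and on-demand diagnostics. The `set_log_level` and `get_diagnostics` tool names and their argument shapes are assumptions for illustration.

```typescript
// Continuation of the earlier sketch; tool names and argument shapes are assumed.
// Raise verbosity to one of the eight severity levels while debugging.
await client.callTool({
  name: "set_log_level",
  arguments: { level: "debug" },
});

// Pull the current diagnostics for a file on demand.
const diagnostics = await client.callTool({
  name: "get_diagnostics",
  arguments: { file_path: "/path/to/project/src/index.ts" },
});

// Each diagnostic typically carries a range, severity, and message that the
// assistant can surface to the user or feed into a code-action request.
console.log(JSON.stringify(diagnostics.content, null, 2));
```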
Real‑world scenarios that benefit most include:
- Code completion in conversational IDEs: An assistant can ask for suggestions at a cursor position and receive context‑aware completions that respect project dependencies.
- On‑the‑fly debugging: By subscribing to diagnostics, the assistant can alert users to errors as they type and offer fix suggestions (see the subscription sketch after this list).
- Language‑agnostic tooling: Because the MCP server can be configured for any LSP, developers working across multiple languages can use a single AI interface without custom adapters.
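For the debugging scenario, a subscription-style sketch (again continuing the client from the first example) might look like the following. The `lsp-diagnostics://` URI scheme is an assumption; consult the server's resource list for the actual URIs.

```typescript
// Continuation of the earlier sketch; the resource URI scheme is an assumption.
import { ResourceUpdatedNotificationSchema } from "@modelcontextprotocol/sdk/types.js";

const uri = "lsp-diagnostics:///path/to/project/src/index.ts";

// Ask the server to push a notification whenever diagnostics for this file change.
await client.subscribeResource({ uri });

client.setNotificationHandler(ResourceUpdatedNotificationSchema, async () => {
  // Re-read the resource to pick up the latest diagnostics payload.
  const { contents } = await client.readResource({ uri });
  console.log(contents[0]);
});
```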
Integrating the LSP MCP Server into an AI workflow is straightforward: the assistant calls the exposed tools with file paths, positions, and optional root directories. The server handles LSP lifecycle events, ensuring that the language server is correctly initialized and shut down as needed. This seamless integration empowers developers to leverage mature LSP features—such as type‑checking, refactoring, and static analysis—within conversational AI, leading to more accurate code generation, reduced bugs, and a smoother development experience.
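A typical session, under the same assumptions about tool names as the sketches above, might therefore look like this:

```typescript
// Sketch of a typical session flow; start_lsp, open_document, and close_document
// are assumed tool names based on the capabilities described above.
await client.callTool({
  name: "start_lsp",
  arguments: { root_dir: "/path/to/project" }, // optional project root
});

await client.callTool({
  name: "open_document",
  arguments: { file_path: "/path/to/project/src/index.ts", language_id: "typescript" },
});

// ...hover, completion, diagnostics, and code-action calls happen here...

await client.callTool({
  name: "close_document",
  arguments: { file_path: "/path/to/project/src/index.ts" },
});

// Closing the client lets the server shut the language server down cleanly.
await client.close();
```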
Related Servers
- MarkItDown MCP Server: Convert documents to Markdown for LLMs quickly and accurately
- Context7 MCP: Real‑time, version‑specific code docs for LLMs
- Playwright MCP: Browser automation via structured accessibility trees
- BlenderMCP: Claude AI meets Blender for instant 3D creation
- Pydantic AI: Build GenAI agents with Pydantic validation and observability
- Chrome DevTools MCP: AI-powered Chrome automation and debugging
Explore More Servers
- Police UK API MCP Server: Access UK police data with 21 ready‑to‑use tools
- Alpaca Trading MCP Server: Connect AI agents to Alpaca trading and market data
- Whisper King MCP Server: A lightweight MCP server for whispering data
- Maven Dependencies MCP Server: Instant Maven dependency checks and latest version lookup
- BigQuery MCP Server: Empower AI agents to explore BigQuery data effortlessly
- Microsoft Dynamics 365 MCP Server: Connect Claude Desktop to Dynamics 365 via Model Context Protocol