About
A VSCode extension that exposes the editor as an MCP server, enabling LLM clients like Claude Desktop to edit code, run terminal commands, preview URLs, and manage debugging sessions directly within VSCode.

Overview
The VSCode as MCP Server extension turns your local VSCode instance into a full‑featured Model Context Protocol (MCP) server. By exposing a rich set of built‑in tools and diagnostics, it lets AI assistants such as Claude Desktop interact with your development environment in real time. This addresses a common pain point for developers who rely on cloud‑based coding assistants: metered, pay‑per‑token API usage for every edit and command. With a self‑hosted MCP server, all LLM interactions, including code generation, debugging, and terminal commands, run against the developer’s own editor, avoiding metered API costs while preserving privacy.
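As a rough sketch of how a client reaches such a server, the snippet below uses the official MCP TypeScript SDK to connect over stdio and list the available tools. The `vscode-as-mcp-server` launch command and the client metadata are illustrative assumptions, not details taken from this page.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Assumption: the extension is reachable through an `npx vscode-as-mcp-server`
// stdio relay, as is common for locally hosted MCP servers.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["vscode-as-mcp-server"],
});

// Arbitrary client identity for the example.
const client = new Client(
  { name: "example-client", version: "0.1.0" },
  { capabilities: {} },
);

await client.connect(transport);

// Discover the editing, terminal, preview, and debugging tools the server exposes.
const { tools } = await client.listTools();
console.log(tools.map((tool) => tool.name));
```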
At its core, the server provides a suite of tools that mirror everyday VSCode actions. The code‑editing tools allow an LLM to propose changes as diffs, which developers can review and accept or reject directly in the editor. Real‑time diagnostics are streamed back to the model, enabling instant feedback loops for type errors or linting issues. Terminal operations let the assistant run arbitrary shell commands inside VSCode’s integrated terminal, with support for background execution and timeout controls. The preview tool opens URLs, such as local development servers, in VSCode’s built‑in browser, streamlining front‑end workflows. Additionally, the server can manage debug sessions and file operations through simple tool calls.
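Continuing the client sketch above, a terminal tool call might look like the following. The tool name `execute_command` and its `command`, `background`, and `timeout` arguments are assumptions meant to illustrate the call shape, not a confirmed schema.

```typescript
// Run a shell command in VSCode's integrated terminal via an MCP tool call.
// Tool name and argument names below are illustrative assumptions.
const result = await client.callTool({
  name: "execute_command",
  arguments: {
    command: "npm test",
    background: false, // wait for completion instead of detaching
    timeout: 60_000,   // give up if the command runs longer than 60 seconds
  },
});

// The server streams terminal output (and any diagnostics) back as content blocks.
console.log(result.content);
```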
For teams that use multiple VSCode windows, the extension offers multi‑instance switching: a quick status‑bar toggle lets you designate which window hosts the active MCP server. An experimental relay feature further extends the server’s reach by exposing other MCP extensions (e.g., GitHub Copilot) to external clients, effectively creating a hub for all available tools.
Real‑world use cases include automated code reviews where an assistant proposes refactors, interactive debugging sessions that set breakpoints and step through logic, or local build‑and‑test loops that run commands in the integrated terminal. Developers integrating AI into their workflow benefit from a seamless bridge: the assistant can read the current codebase, propose edits, run tests, and even open a preview, all without leaving VSCode. This tight coupling reduces context switching, speeds up iteration cycles, and keeps sensitive code on the local machine.
Unique advantages of this server lie in its affordability, self‑hosting model, and deep integration with VSCode’s native capabilities. By turning the editor into an MCP server, it removes reliance on metered cloud services while still offering the same powerful AI interactions. The extension’s roadmap hints at further enhancements—such as selective server exposure, web‑view approvals, and auto‑toggle features—ensuring that it will continue to evolve alongside the needs of AI‑augmented development.
Related Servers
n8n
Self‑hosted, code‑first workflow automation platform
FastMCP
TypeScript framework for rapid MCP server development
Activepieces
Open-source AI automation platform for building and deploying extensible workflows
MaxKB
Enterprise‑grade AI agent platform with RAG and workflow orchestration
Filestash
Web‑based file manager for any storage backend
MCP for Beginners
Learn Model Context Protocol with hands‑on examples
Explore More Servers
Powerpoint MCP Server
Create and edit PowerPoint decks programmatically
Unusual Whales MCP Server
MCP interface for Unusual Whales stock data API
GitHub Trending MCP Server
Real‑time GitHub trending data via simple API
Shodan MCP
Unleash Shodan’s power via the Model Context Protocol
MCP LLM Sandbox
Validate Model Context Protocol servers with live LLM chat testing
LaunchDarkly MCP Server
Feature flag management via Model Context Protocol