About
Nchan MCP Transport is a high‑performance, low‑latency API gateway that bridges Anthropic’s Model Context Protocol with WebSocket and SSE. It enables Claude plugins, LLM agents, and external APIs to communicate in real time using Nginx + Nchan and FastAPI.
Capabilities

Nchan MCP Transport is a purpose‑built gateway that turns Anthropic’s Model Context Protocol (MCP) into a high‑throughput, real‑time channel for AI assistants such as Claude. It bridges the gap between MCP’s native HTTP+SSE interface and WebSocket‑based clients, providing a robust, scalable transport layer that can handle thousands of concurrent connections with minimal latency. By leveraging the proven Nginx + Nchan stack for pub/sub and FastAPI for business logic, the server delivers a lightweight yet powerful backend that scales horizontally without sacrificing performance.
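To make the Nginx + Nchan role concrete, here is a minimal sketch of what the pub/sub layer can look like. The location paths and channel naming are assumptions for illustration, not the project's actual configuration; `nchan_publisher` and `nchan_subscriber` are standard Nchan directives.

```nginx
http {
  server {
    listen 80;

    # The FastAPI backend publishes tool results into a per-session channel.
    # (Paths and the channel scheme here are illustrative assumptions.)
    location ~ ^/internal/publish/(\w+)$ {
      nchan_publisher;
      nchan_channel_id $1;
    }

    # Clients subscribe on the same channel; Nchan serves the message stream
    # over WebSocket or EventSource depending on what the client requests.
    location ~ ^/mcp/(\w+)$ {
      nchan_subscriber;
      nchan_channel_id $1;
    }
  }
}
```

Because Nchan holds the long-lived client connections, the FastAPI backend only needs short HTTP requests to publish, which is what lets the gateway scale to many concurrent subscribers.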
At its core, the gateway offers dual‑protocol support: it automatically detects whether a client prefers WebSocket or Server‑Sent Events and negotiates the appropriate channel. This flexibility matters for developers who need to integrate Claude into environments where WebSockets are the de facto standard (e.g., browser extensions, real‑time dashboards) but still want compatibility with existing SSE‑only workflows. MCP compliance is baked into the transport, so every JSON‑RPC 2.0 message sent by a client is faithfully routed to the appropriate tool or resource on the server.
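The protocol detection described above can be sketched as a small header check: a WebSocket client sends an `Upgrade: websocket` handshake, while an SSE client advertises `Accept: text/event-stream`. This is a minimal illustration of the negotiation logic, not the gateway's actual implementation.

```python
def negotiate_transport(headers: dict[str, str]) -> str:
    """Pick a transport from lowercase-keyed request headers.

    Prefers WebSocket when the client asks for a protocol upgrade,
    otherwise falls back to Server-Sent Events.
    """
    upgrade = headers.get("upgrade", "").lower()
    connection = headers.get("connection", "").lower()
    if upgrade == "websocket" and "upgrade" in connection:
        return "websocket"
    # SSE clients identify themselves via the Accept header; it is also
    # the safe default for plain HTTP clients.
    if "text/event-stream" in headers.get("accept", ""):
        return "sse"
    return "sse"
```

In practice the same decision is made per request, so a browser extension and an SSE-only script can connect to the same endpoint and each get the channel type they expect.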
Key capabilities include OpenAPI‑driven tool registration, which lets any RESTful service be exposed as an MCP tool with a single decorator. The server also supports asynchronous execution of long‑running tasks, streaming progress updates back to the client as push notifications. This is particularly valuable for AI agents that need to perform complex computations or external API calls without blocking the conversation flow. Docker‑ready deployment and a lightweight Python SDK (HTTMCP) make it easy to spin up the gateway in production or CI environments.
Real‑world use cases span from building Claude plugins that need to stream live data, to creating LLM agents that orchestrate multiple internal services in real time. Developers can also expose internal APIs to Claude through a single OpenAPI specification, eliminating the need for custom adapters. The gateway’s low‑latency pub/sub layer ensures that even under heavy load, responses arrive quickly and reliably, which is essential for user‑facing AI applications that demand instant feedback.
In summary, Nchan MCP Transport transforms the MCP ecosystem into a high‑performance, WebSocket‑friendly platform. It empowers AI developers to build scalable, real‑time integrations with Claude and other LLM agents while keeping the development workflow simple through OpenAPI integration, asynchronous task handling, and Docker‑friendly deployment.
Related Servers
MarkItDown MCP Server
Convert documents to Markdown for LLMs quickly and accurately
Context7 MCP
Real‑time, version‑specific code docs for LLMs
Playwright MCP
Browser automation via structured accessibility trees
BlenderMCP
Claude AI meets Blender for instant 3D creation
Pydantic AI
Build GenAI agents with Pydantic validation and observability
Chrome DevTools MCP
AI-powered Chrome automation and debugging