MCPSERV.CLUB
ConechoAI

Nchan MCP Transport

MCP Server

Real‑time WebSocket/SSE gateway for Anthropic’s MCP

26 stars · 0 views
Updated Aug 9, 2025

About

Nchan MCP Transport is a high‑performance, low‑latency API gateway that bridges Anthropic’s Model Context Protocol with WebSocket and SSE. It enables Claude plugins, LLM agents, and external APIs to communicate in real time using Nginx + Nchan and FastAPI.

Capabilities

Resources: access data sources
Tools: execute functions
Prompts: pre-built templates
Sampling: AI model interactions

Nchan MCP Transport Demo

Nchan MCP Transport is a purpose‑built gateway that turns Anthropic’s Model Context Protocol (MCP) into a high‑throughput, real‑time channel for AI assistants such as Claude. It bridges the gap between MCP’s native HTTP+SSE interface and WebSocket‑based clients, providing a robust, scalable transport layer that can handle thousands of concurrent connections with minimal latency. By leveraging the proven Nginx + Nchan stack for pub/sub and FastAPI for business logic, the server delivers a lightweight yet powerful backend that scales horizontally without sacrificing performance.

At its core, the gateway offers dual‑protocol support: it automatically detects whether a client prefers WebSocket or Server‑Sent Events and negotiates the appropriate channel. This flexibility matters for developers who need to integrate Claude into environments where WebSockets are the de facto standard (e.g., browser extensions, real‑time dashboards) but still want to maintain compatibility with existing SSE‑only workflows. MCP compliance is built into the transport layer, so every JSON‑RPC 2.0 message a client sends is faithfully routed to the appropriate tool or resource on the server.
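Whatever channel is negotiated, the payload on the wire is the same JSON‑RPC 2.0 envelope. A minimal sketch of such a message as a client might construct it (the tool name `get_weather` and its arguments are made‑up examples, not tools shipped by this project):

```python
import json

# A JSON-RPC 2.0 request as sent over the negotiated WebSocket or SSE
# channel. "tools/call" is the MCP method for invoking a tool; the tool
# name and arguments below are illustrative placeholders.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "get_weather",
        "arguments": {"city": "Berlin"},
    },
}

# Serialize for transmission; the gateway routes it to the matching tool.
wire_message = json.dumps(request)
```

The gateway's job is purely to move this envelope between the client's channel and the backend, so the same message works regardless of which transport was negotiated.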

Key capabilities include OpenAPI‑driven tool registration, allowing any RESTful service to be exposed as an MCP tool with a single decorator. The server also supports asynchronous execution of long‑running tasks, pushing progress updates back to the client as notifications. This is particularly valuable for AI agents that need to perform complex computations or external API calls without blocking the conversation flow. Docker‑ready deployment and a lightweight Python SDK (HTTMCP) make it easy to spin up the gateway in production or CI environments.
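The decorator‑plus‑progress pattern can be sketched in plain Python. This is a hypothetical illustration of the pattern only: the registry, the `tool` decorator, and the `notify` callback below are invented for this sketch and may not match HTTMCP's actual API.

```python
import asyncio

# Hypothetical tool registry; HTTMCP's real registration API may differ.
TOOL_REGISTRY = {}

def tool(name):
    """Register an async function as an MCP tool under the given name."""
    def decorator(fn):
        TOOL_REGISTRY[name] = fn
        return fn
    return decorator

@tool("summarize_report")
async def summarize_report(report_id, notify):
    # A long-running task that streams progress instead of blocking the
    # conversation; `notify` stands in for the gateway's push channel.
    for pct in (25, 50, 75, 100):
        await asyncio.sleep(0)          # simulate a unit of work
        await notify({"progress": pct})
    return {"report_id": report_id, "summary": "done"}

async def demo():
    updates = []

    async def notify(msg):
        updates.append(msg)

    result = await TOOL_REGISTRY["summarize_report"]("r-42", notify)
    return updates, result

updates, result = asyncio.run(demo())
```

The key design point is that progress flows out through the pub/sub channel while the coroutine is still running, so the client sees incremental updates rather than a single delayed response.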

Real‑world use cases span from building Claude plugins that need to stream live data, to creating LLM agents that orchestrate multiple internal services in real time. Developers can also expose internal APIs to Claude through a single OpenAPI specification, eliminating the need for custom adapters. The gateway’s low‑latency pub/sub layer ensures that even under heavy load, responses arrive quickly and reliably, which is essential for user‑facing AI applications that demand instant feedback.
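The pub/sub layer underneath is plain Nchan, configured in nginx. A minimal fragment might look like the following; the `nchan_publisher`, `nchan_subscriber`, and `nchan_channel_id` directives are standard Nchan, but the URL paths and channel naming here are illustrative rather than taken from this project:

```nginx
# Clients attach here over WebSocket or SSE; Nchan negotiates the transport.
location ~ /mcp/sub/(\w+)$ {
    nchan_subscriber;
    nchan_channel_id $1;
}

# The FastAPI backend publishes responses and progress updates here.
location ~ /mcp/pub/(\w+)$ {
    nchan_publisher;
    nchan_channel_id $1;
}
```

Because Nchan handles connection fan‑out inside nginx, the Python backend only ever makes short HTTP publishes, which is what keeps latency low under heavy concurrent load.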

In summary, Nchan MCP Transport transforms the MCP ecosystem into a high‑performance, WebSocket‑friendly platform. It empowers AI developers to build scalable, real‑time integrations with Claude and other LLM agents while keeping the development workflow simple through OpenAPI integration, asynchronous task handling, and Docker‑friendly deployment.