
MCP HTTP Proxy

MCP Server

Bridge MCP stdio to HTTP and SSE

Updated Apr 11, 2025

About

An Express‑based proxy that spawns an MCP server as a child process, exposing its stdio interface over HTTP endpoints and Server‑Sent Events for real‑time communication. Ideal for web apps, CLI tools, or any HTTP client needing MCP access without SDKs.

Capabilities

  • Resources – Access data sources
  • Tools – Execute functions
  • Prompts – Pre‑built templates
  • Sampling – AI model interactions

Overview

The MCP HTTP Proxy is a lightweight Node.js service that turns any standard MCP (Model Context Protocol) server into a web‑friendly endpoint. By spawning the MCP process as a child and mediating all communication through HTTP or Server‑Sent Events (SSE), it removes the need for clients to understand low‑level JSON‑RPC interactions. This makes it trivial to expose a rich set of AI tooling—tools, prompts, resources, and sampling—to web front‑ends, command‑line utilities, or other HTTP clients.

Solving the “MCP plumbing” problem

Developers building AI assistants often need to run an MCP server locally or in a container and then expose its capabilities to other services. The native MCP protocol communicates over stdio, which is inconvenient for HTTP‑centric ecosystems. The proxy abstracts this plumbing: it starts the MCP process, keeps its lifecycle under control, and translates incoming HTTP requests into JSON‑RPC messages that the MCP server understands. Likewise, it streams outgoing JSON‑RPC responses and asynchronous events back to clients via SSE. This eliminates the boilerplate of writing an MCP SDK integration or managing child processes for every consumer.
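The translation step can be sketched as a pair of small helpers: one that frames an HTTP request as a newline‑delimited JSON‑RPC 2.0 message for the child's stdin, and one that parses messages coming back on stdout. The helper names are illustrative, not the proxy's actual API; the `tools/call` method name follows the MCP specification.

```javascript
// Sketch (assumed helpers): framing an HTTP call as a JSON-RPC 2.0 line
// for the MCP child's stdin, and parsing newline-delimited responses
// from its stdout.
let nextId = 1;

// Build a single JSON-RPC request, terminated by a newline so the child
// can split messages on line boundaries.
function frameRequest(method, params) {
  return JSON.stringify({ jsonrpc: "2.0", id: nextId++, method, params }) + "\n";
}

// Split a stdout chunk into complete JSON-RPC messages (one per line),
// ignoring empty lines.
function parseResponses(buffer) {
  return buffer
    .split("\n")
    .filter((line) => line.trim().length > 0)
    .map((line) => JSON.parse(line));
}
```

In the real proxy these helpers would sit between the Express route handlers and the child process's stdio streams.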

Core capabilities in plain language

  • HTTP endpoints – Clients can invoke tools or query resources with simple GET/POST calls. The proxy handles URL‑parameter parsing and JSON‑RPC framing automatically.
  • Raw JSON‑RPC – For advanced use cases, a dedicated endpoint lets callers send any JSON‑RPC payload directly to the MCP server, offering full control without adding a new SDK.
  • SSE stream – Long‑running tool calls or log streams are pushed to browsers or services in real time, enabling responsive UIs and efficient monitoring.
  • Web dashboard – A minimal built‑in interface lists available tools, lets users trigger calls manually, and displays raw request/response traffic for debugging.
  • Process management – The proxy spawns the MCP child process once and keeps it alive, restarting automatically if needed. Clients never have to worry about starting or stopping the underlying server.
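The SSE delivery described above boils down to encoding each JSON‑RPC message from the child's stdout as a Server‑Sent Events frame. A minimal sketch, assuming a default event name of `message` (the proxy's actual event types may differ):

```javascript
// Sketch (assumed helper): encode a JSON-RPC message as an SSE frame.
// JSON.stringify never emits raw newlines, so a single "data:" line is
// enough; multi-line payloads would need one "data:" line per line.
function toSseFrame(msg, eventName = "message") {
  return `event: ${eventName}\ndata: ${JSON.stringify(msg)}\n\n`;
}
```

Each frame would be written to every connected client's response stream, with the blank line marking the end of the event.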

Real‑world use cases

  • Browser‑based AI assistants – A web app can call the proxy's HTTP endpoints to execute a tool and immediately receive the result via SSE, all without installing an MCP SDK in the browser.
  • Command‑line utilities – Scripts written in Python, Go, or shell can call the proxy's endpoints to trigger tool calls from CI/CD pipelines or local workflows.
  • Hybrid applications – A backend service can forward user requests to the proxy, aggregate responses from multiple tools, and then feed them back into a larger AI pipeline.
  • Debugging & monitoring – The built‑in dashboard lets developers inspect tool listings, send test commands, and watch live logs to diagnose issues quickly.
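From the client side, triggering a tool through the proxy is an ordinary HTTP request. The sketch below assumes a `/rpc` endpoint accepting raw JSON‑RPC payloads; the actual route names and port are assumptions, so check the proxy's documentation for the real paths.

```javascript
// Sketch (assumed client): build an MCP "tools/call" payload and POST it
// to a hypothetical /rpc endpoint on the proxy.
function buildToolCall(name, args) {
  return {
    jsonrpc: "2.0",
    id: Date.now(), // simple unique-enough id for a one-off script
    method: "tools/call",
    params: { name, arguments: args },
  };
}

// Requires Node 18+, where fetch is a global.
async function callTool(baseUrl, name, args) {
  const res = await fetch(`${baseUrl}/rpc`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildToolCall(name, args)),
  });
  return res.json();
}
```

The same payload shape works from curl, Python, or any other HTTP client, which is the point of the proxy.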

Unique advantages

Unlike generic JSON‑RPC proxies, this server is tightly coupled to MCP's interface, ensuring that every message adheres to the protocol's expectations. It also avoids pulling an MCP SDK into the proxy itself, keeping dependencies minimal and giving developers fine‑grained control over how requests are serialized or streamed. The combination of HTTP simplicity, SSE real‑time delivery, and a lightweight dashboard makes the MCP HTTP Proxy an ideal bridge between powerful AI toolchains and conventional web or CLI ecosystems.