About
A dual‑port server that handles standard MCP requests on port 8080 and provides WebSocket subscriptions on port 8765, delivering instant push notifications when new data is available.
Overview
The MCP WebSocket Server is a hybrid solution that unifies the traditional request‑response model of MCP with real‑time push capabilities. It addresses a common pain point for developers building AI assistants: the need to keep context and data in sync across multiple clients without resorting to inefficient polling or complex message queues. By exposing a standard MCP endpoint on port 8080 and a WebSocket hub on port 8765, the server lets AI agents fetch data in the familiar MCP style while also subscribing to live updates whenever that data changes.
At its core, the server receives ordinary MCP requests—built‑in or custom methods—and returns responses in the expected JSON format. The added WebSocket layer lets clients issue a simple subscribe message and receive asynchronous notifications whenever new data is available. This push mechanism eliminates the latency introduced by polling loops, reduces network overhead, and keeps conversational state fresh for all connected assistants. The architecture is fully asynchronous, leveraging Python's asyncio to handle thousands of concurrent connections with minimal resource consumption.
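A rough sketch of the two message shapes described above—an MCP-style JSON request and the WebSocket subscribe message—might look like this. The method and field names (`get_data`, `action`, `topic`) are illustrative assumptions, not the server's actual API:

```python
import json

def handle_mcp_request(raw: str, data_store: dict) -> str:
    """Dispatch an ordinary MCP-style request and return a JSON response.

    Assumes a JSON-RPC-like payload with "id", "method", and "params";
    "get_data" is a hypothetical custom method for illustration.
    """
    request = json.loads(raw)
    if request.get("method") == "get_data":
        result = data_store.get(request["params"]["key"])
    else:
        result = None
    return json.dumps({"id": request.get("id"), "result": result})

def make_subscribe_message(topic: str) -> str:
    """The 'simple message' a WebSocket client would send to start receiving pushes."""
    return json.dumps({"action": "subscribe", "topic": topic})
```

The point is the split: one-off reads go through the request/response path, while the subscribe message registers a client for every future change.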
Key capabilities include:
- Dual‑port operation: Separate MCP and WebSocket services simplify deployment and allow existing MCP clients to continue functioning unchanged.
- Push notifications: Whenever the underlying data store is updated, every subscribed client receives an immediate message, ensuring consistency across distributed agents.
- Scalable async design: Non‑blocking I/O means the server can grow with demand without needing additional threads or processes.
- Extensibility: Developers can plug in custom data sources, authentication layers, or transform logic without altering the core protocol handling.
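The push path behind these capabilities can be sketched with nothing but the standard library: a hub keeps one `asyncio.Queue` per subscriber and fans every update out to all of them. The class and method names here are illustrative; in the real server something like this would sit behind the WebSocket handler on port 8765:

```python
import asyncio

class PushHub:
    """Minimal fan-out hub: every data-store update is pushed to all subscribers."""

    def __init__(self) -> None:
        self._subscribers: set[asyncio.Queue] = set()

    def subscribe(self) -> asyncio.Queue:
        """Register a new client and return its private delivery queue."""
        q: asyncio.Queue = asyncio.Queue()
        self._subscribers.add(q)
        return q

    def publish(self, update: str) -> None:
        # Non-blocking fan-out: every connected client sees the update
        # immediately, with no polling loop on the client side.
        for q in self._subscribers:
            q.put_nowait(update)

async def demo() -> list[str]:
    hub = PushHub()
    a, b = hub.subscribe(), hub.subscribe()
    hub.publish("data-changed")
    return [await a.get(), await b.get()]
```

Because delivery is just a queue put, publishing stays O(subscribers) with no extra threads, which is what makes the non-blocking design scale.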
Typical use cases arise in AI‑driven dashboards, collaborative editing tools, or multi‑assistant environments where context must be shared in real time. For example, a financial analytics assistant can pull historical data via MCP while simultaneously receiving live market updates through the WebSocket channel, allowing it to provide up‑to‑date insights without re‑initiating a request. In collaborative coding assistants, multiple developers can see code changes instantly as the server pushes updates to all connected agents.
Integration into existing AI workflows is straightforward. A Claude or other MCP‑compatible assistant can be configured to send standard requests to the HTTP endpoint for one‑off data retrieval, and then open a WebSocket connection to listen for change events. The server’s lightweight design means it can be deployed behind a reverse proxy or within a container orchestration platform, making it suitable for both prototype projects and production deployments. Its open‑source nature invites community contributions, ensuring that the tool can evolve alongside emerging AI assistant use cases.
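Wiring an assistant up to both channels might look like the configuration fragment below. Only the two ports come from the server's description; the key names (`mcpServers`, `url`, `events`) are assumptions for illustration, not a documented schema:

```json
{
  "mcpServers": {
    "websocket-server": {
      "url": "http://localhost:8080",
      "events": "ws://localhost:8765"
    }
  }
}
```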
Related Servers
MarkItDown MCP Server
Convert documents to Markdown for LLMs quickly and accurately
Context7 MCP
Real‑time, version‑specific code docs for LLMs
Playwright MCP
Browser automation via structured accessibility trees
BlenderMCP
Claude AI meets Blender for instant 3D creation
Pydantic AI
Build GenAI agents with Pydantic validation and observability
Chrome DevTools MCP
AI-powered Chrome automation and debugging
Explore More Servers
Sanity MCP Server
Integrate Sanity.io with Claude Desktop effortlessly
BigGo MCP Server
Price comparison and product discovery via BigGo APIs
Mcp Veo2 Video Generation Server
Generate videos from text or images using Google Veo2
ChatSum MCP Server
Summarize chat conversations with ease
Slowtime MCP Server
Secure time‑based operations with fuzzed timing and interval encryption
GibsonAI MCP Server
Powerful database tooling via natural language