About
A Python demo that runs two Model Context Protocol servers—one math server via stdio and one weather server via Server‑Sent Events—and connects them with a LangChain agent to answer mixed queries.
Capabilities
MCP Multi‑Server Demo with SSE Transport
The MCP Multi‑Server Demo showcases how an AI assistant can orchestrate a diverse set of external tools by connecting to multiple MCP servers that use different transport protocols. By combining a lightweight stdio‑based math server with an HTTP‑based Server‑Sent Events (SSE) weather server, the demo demonstrates that a single agent can seamlessly query distinct services without being bound to one communication channel.
At its core, the system provides an agent‑oriented workflow. The main application launches the weather server as a separate process, then creates a multi‑server client that aggregates both the math and weather MCP endpoints. LangChain’s agent framework consumes this aggregated client, treating each server as a collection of tools that can be invoked on demand. When the user poses a question such as “what’s (3 + 5) × 12?”, the agent automatically selects the math tool; for “what is the weather in NYC?” it hands off to the weather tool. This dynamic routing eliminates the need for hard‑coded API calls and allows developers to plug in new services with minimal friction.
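The aggregation step described above can be sketched as a transport configuration mapping, one entry per server. This is an illustrative shape only, loosely modeled on how multi‑server MCP clients are typically configured; the file paths, port, and server names are placeholders, not taken from the demo itself:

```python
# Hypothetical multi-server configuration: each entry names a server
# and declares its transport. "math" runs locally over stdio, while
# "weather" is reached over HTTP using Server-Sent Events (SSE).
# Paths, port, and names are placeholders for this sketch.
server_config = {
    "math": {
        "transport": "stdio",
        "command": "python",
        "args": ["math_server.py"],  # lightweight local math server
    },
    "weather": {
        "transport": "sse",
        "url": "http://localhost:8000/sse",  # weather server SSE endpoint
    },
}

# A multi-server client would consume this mapping and present every
# server's tools to the LangChain agent as a single flat tool list,
# so the agent never needs to know which transport backs a given tool.
```

Keeping transport details in one declarative mapping is what makes the routing feel seamless: swapping a server from stdio to SSE changes a config entry, not the agent code.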
Key capabilities of the demo include:
- Transport agnosticism: By supporting both stdio and SSE, developers can choose the most appropriate channel for each service—low‑latency pipes for local utilities and scalable HTTP streams for remote data sources.
- Tool discovery & invocation: The MCP server exposes metadata (resource names, parameters, and descriptions) that the agent uses to present a natural language interface. This removes boilerplate code for argument parsing and error handling.
- Extensibility: Adding a new tool is as simple as creating another MCP server and registering it with the multi‑server client. The agent automatically inherits the new capabilities, enabling rapid iteration.
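To make the discovery‑and‑invocation capability concrete, the sketch below models it in plain Python: tools publish a name, description, and parameter schema, and a caller picks one by inspecting that metadata. The registry, tool name, and schema format are invented for this illustration and do not mirror any specific MCP library API:

```python
# Minimal model of MCP-style tool discovery and invocation.
# Each tool exposes metadata (name, description, parameter schema)
# that an agent can inspect before deciding what to call.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Tool:
    name: str
    description: str
    params: dict
    fn: Callable

registry: dict[str, Tool] = {}

def register(tool: Tool) -> None:
    """Add a tool to the registry, as a server would on startup."""
    registry[tool.name] = tool

# A sample tool, standing in for the demo's math server.
register(Tool(
    name="multiply",
    description="Multiply two integers.",
    params={"a": "int", "b": "int"},
    fn=lambda a, b: a * b,
))

def discover() -> list[dict]:
    """Return the metadata an agent would use to choose a tool."""
    return [{"name": t.name, "description": t.description, "params": t.params}
            for t in registry.values()]

def invoke(name: str, **kwargs):
    """Invoke a registered tool by name with keyword arguments."""
    return registry[name].fn(**kwargs)
```

Because the metadata travels with the tool, the agent needs no hand‑written argument parsing: `invoke("multiply", a=8, b=12)` is resolved entirely from what the server advertised.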
Real‑world scenarios that benefit from this architecture include:
- Enterprise chatbots that need to access both internal analytics (via stdio) and public data feeds (via SSE).
- IoT control systems where local devices expose MCP interfaces over serial connections while cloud services provide updates through SSE.
- Developer tools that combine code execution, static analysis, and live documentation lookup into a single conversational interface.
By unifying heterogeneous services under the MCP umbrella, this demo illustrates how developers can build robust, modular AI assistants that adapt to varied transport mechanisms and evolving tool sets.
Related Servers
- MarkItDown MCP Server: Convert documents to Markdown for LLMs quickly and accurately
- Context7 MCP: Real‑time, version‑specific code docs for LLMs
- Playwright MCP: Browser automation via structured accessibility trees
- BlenderMCP: Claude AI meets Blender for instant 3D creation
- Pydantic AI: Build GenAI agents with Pydantic validation and observability
- Chrome DevTools MCP: AI‑powered Chrome automation and debugging
Explore More Servers
- CodePortal MCP Server: Organize code projects and access AI locally
- PIF Self‑Modifying MCP Server: Dynamic tool creation and formal reasoning on the fly
- Delve MCP Server: AI‑powered Go debugging via Delve
- DeepSeek Thinking Claude 3.5 Sonnet MCP: Two‑stage reasoning and response generation in one server
- Fetch MCP: Quickly retrieve web content and YouTube transcripts
- Ollama MCP Server: Seamless Ollama integration via Model Context Protocol