MCPSERV.CLUB
restful3

Dummy MCP Server


Simple SSE‑based Model Context Protocol (MCP) demo server

Updated Jun 21, 2025

About

A lightweight FastMCP 1.0.0 server that exposes two tools—echo and dummy—over Server‑Sent Events, ideal for testing MCP clients in a Dockerized environment.

Capabilities

  • Resources – Access data sources
  • Tools – Execute functions
  • Prompts – Pre‑built templates
  • Sampling – AI model interactions

Overview

The Dummy MCP Server is a lightweight, ready‑to‑run example built with the FastMCP framework. It demonstrates how an MCP server can expose simple synchronous and streaming tools over Server‑Sent Events (SSE) while running inside a Docker container. By providing two intentionally minimal tools—echo and dummy—the server shows how developers can quickly prototype tool integrations, test AI assistant workflows, and verify communication patterns before moving to more complex services.

What Problem Does It Solve?

Many AI developers need a fast, isolated environment to validate MCP interactions without pulling in heavy dependencies or dealing with authentication overhead. The Dummy MCP Server addresses this by offering a ready‑to‑run container that listens on port 8002 and exposes a single endpoint. It eliminates the need for custom server scaffolding, allowing teams to focus on designing prompts, orchestrating tool calls, and debugging message flows. Because the server runs in Docker Compose, it can be spun up with a single command and shut down cleanly, making it ideal for CI/CD pipelines or local experimentation.
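A minimal Compose file for this setup might look like the following sketch; the service name and build context are assumptions, and only the port 8002 mapping comes from the description above—the repository's actual file is authoritative:

```yaml
# Hypothetical docker-compose.yml sketch; service name and build context
# are assumptions. Only the 8002 port mapping is taken from the description.
services:
  dummy-mcp:
    build: .
    ports:
      - "8002:8002"
```

With a file like this in place, `docker compose up -d` starts the server and `docker compose down` shuts it down cleanly.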

Server Functionality and Value

At its core, the server implements two distinct tool types:

  • echo – A synchronous, blocking tool that echoes a received string back to the caller. It is useful for quick sanity checks, logging, or as a placeholder when a real backend service is not yet available.
  • dummy – An SSE‑streaming tool that sends a message three times with one‑second pauses. This showcases how long‑running or incremental responses can be streamed to an AI assistant, enabling real‑time feedback or progress updates.
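Stripped of the framework, the behavior of the two tools can be sketched in plain Python. The three repeats and one‑second pause come from the description above; the parameter names are assumptions:

```python
import time
from typing import Iterator

def echo(message: str) -> str:
    # Synchronous tool: return the received string unchanged.
    return message

def dummy(message: str, repeats: int = 3, delay: float = 1.0) -> Iterator[str]:
    # Streaming tool: emit the message `repeats` times with a `delay`-second
    # pause after each one, mimicking an incremental SSE response.
    for _ in range(repeats):
        yield message
        time.sleep(delay)
```

A client consuming dummy would therefore receive three events roughly one second apart.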

Both tools are registered with the server application and automatically become available to any MCP client that connects. The server's SSE implementation follows the FastMCP convention, so clients only need to point their SSE endpoint configuration at the server's address. No authentication is required, which lowers the barrier to entry for learning and testing.
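The registration pattern can be illustrated with a simplified stand‑in for the framework's decorator. FastMCP's real API differs; the registry below is purely illustrative:

```python
from typing import Callable, Dict

# Illustrative tool registry; FastMCP manages this via its own decorator.
TOOLS: Dict[str, Callable] = {}

def tool(fn: Callable) -> Callable:
    # Register a function under its name so connected clients can discover it.
    TOOLS[fn.__name__] = fn
    return fn

@tool
def echo(message: str) -> str:
    return message

# A connected client discovers registered tools by name and invokes them.
available = sorted(TOOLS)
result = TOOLS["echo"]("hello")
```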

Key Features in Plain Language

  • FastMCP 1.0.0 Integration – Builds on a stable release of the FastMCP framework with minimal configuration.
  • SSE Streaming Support – Enables real‑time, event‑based communication without WebSocket overhead.
  • Docker & Docker Compose Ready – A one‑file Docker Compose setup provides instant deployment, networking, and port mapping.
  • Tool Registration Example – Shows how to add new tools by simply creating a Python module and registering it in the main application.
  • Extensible Architecture – The server can be extended to include authentication, richer tool logic, or additional endpoints without changing the core.
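The SSE transport the features above rely on is just a line‑oriented text format. A single frame can be produced as follows—a sketch of the standard Server‑Sent Events wire format, not FastMCP's internal code:

```python
from typing import Optional

def sse_frame(data: str, event: Optional[str] = None) -> str:
    # Build one Server-Sent Events frame: an optional "event:" line,
    # a "data:" line, and a blank line terminating the frame.
    lines = []
    if event is not None:
        lines.append(f"event: {event}")
    lines.append(f"data: {data}")
    return "\n".join(lines) + "\n\n"
```

Because each frame is self‑delimiting plain text over ordinary HTTP, no WebSocket handshake or upgrade is needed.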

Use Cases and Real‑World Scenarios

  • Rapid Prototyping – Quickly spin up a local MCP server to test new tool designs or prompt strategies.
  • CI/CD Integration – Run the container in a build pipeline to validate tool availability before deploying production services.
  • Educational Demonstrations – Use the streaming example to illustrate how AI assistants can consume progressive data streams.
  • n8n Workflow Integration – The README includes a guide for connecting the server to n8n, enabling automated workflows that trigger MCP tool calls as part of broader data pipelines.

Integration with AI Workflows

Developers can connect any MCP‑compatible client—such as Claude, GPT‑4o, or custom agents—to the Dummy MCP Server by configuring the SSE endpoint and specifying which tools to expose. Because the server follows standard MCP conventions, clients automatically discover tool schemas and can invoke them with JSON payloads. The echo tool is perfect for quick, synchronous responses, while the dummy tool demonstrates streaming capabilities that can be leveraged in conversational agents needing real‑time updates or progress notifications.
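Such an invocation is plain JSON‑RPC under the MCP convention. The sketch below builds a hypothetical tools/call payload for the echo tool; the argument key "message" is an assumption, not taken from the server's actual schema:

```python
import json

# Hypothetical MCP tools/call request; the argument key "message"
# is assumed for illustration, not read from the server's schema.
payload = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "echo", "arguments": {"message": "hello"}},
}
wire = json.dumps(payload)
```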

Unique Advantages

  • Zero Setup Overhead – No code changes required to run the server; Docker Compose handles everything.
  • Clear Separation of Concerns – Tools are isolated modules, making it straightforward to swap out or extend functionality.
  • Demonstrates Both Sync and Stream – Provides a concise example of how MCP can handle both instant and continuous responses.
  • Developer‑Friendly Documentation – The README walks through Docker usage, n8n integration, and tool details, reducing friction for new users.

In summary, the Dummy MCP Server is a practical playground that embodies best practices in MCP server design while remaining simple enough for quick experimentation. It serves as both a learning tool and a reusable component in larger AI‑powered systems.