MCP FastAPI Ping-Pong Server
by kimtth

MCP Server

Fast, thread-safe MCP ping-pong demo with API and SSE

Updated Apr 9, 2025

About

An experimental FastAPI server that showcases Model Context Protocol (MCP) calls via REST endpoints and Server-Sent Events, allowing users to send ping/pong/count commands and receive real-time responses.

Capabilities

  • Resources: access data sources
  • Tools: execute functions
  • Prompts: pre-built templates
  • Sampling: AI model interactions

MCP Ping‑Pong UI

The MCP Ping‑Pong Server is a lightweight, educational implementation of the Model Context Protocol (MCP) that demonstrates how AI assistants can interact with external services through a FastAPI backend. It showcases the core MCP concepts—tool invocation, prompt retrieval, and session‑aware command handling—in a single, easy‑to‑run server. By exposing both RESTful endpoints and Server‑Sent Events (SSE) streams, the example illustrates two common transport mechanisms for real‑time AI workflows.

At its heart, the server implements three simple MCP tools: ping, pong, and count. When a client sends the string “ping”, the server replies with “pong”; conversely, sending “pong” returns “ping”. The count command maintains a per‑session counter that increments with each invocation, demonstrating how state can be preserved across multiple calls. These tools are wrapped in FastAPI routes and also exposed via SSE, allowing developers to experiment with both synchronous HTTP calls and asynchronous event streams. Session management is thread‑safe, ensuring that concurrent requests from multiple AI assistants do not interfere with one another.
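A minimal sketch of that dispatch logic might look like the following. The function name handle_command and the dict-based session store are illustrative assumptions, not the repository's actual structure:

```python
import threading
from collections import defaultdict

# Hypothetical in-memory session store. The lock guards the per-session
# counters so concurrent "count" calls cannot race on the same value.
_lock = threading.Lock()
_counters: defaultdict[str, int] = defaultdict(int)

def handle_command(session_id: str, command: str) -> str:
    """Dispatch one ping-pong command for a given session."""
    if command == "ping":
        return "pong"
    if command == "pong":
        return "ping"
    if command == "count":
        with _lock:
            _counters[session_id] += 1
            return str(_counters[session_id])
    raise ValueError(f"unknown command: {command!r}")
```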

For developers building AI‑powered applications, this server serves as a sandbox for testing MCP integration without needing to set up a full production environment. By exposing a simple API, it allows rapid prototyping of toolchains: an AI assistant can retrieve the “ping‑pong” prompt, send a command, and receive a deterministic response—all within milliseconds. The SSE variant is particularly useful for streaming scenarios where an assistant needs to process a sequence of responses in real time, such as live data feeds or interactive dialogue.
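To make that round trip concrete, a client call might look like the sketch below. The base URL, the /call path, and the JSON payload shape are assumptions for illustration, since the repository defines its own routes:

```python
import httpx  # pip install httpx

BASE_URL = "http://localhost:8000"  # assumed local dev address

def call_tool(command: str, session_id: str = "demo") -> str:
    # Hypothetical REST endpoint and payload shape.
    resp = httpx.post(
        f"{BASE_URL}/call",
        json={"session_id": session_id, "command": command},
        timeout=5.0,
    )
    resp.raise_for_status()
    return resp.text

print(call_tool("ping"))   # expected: "pong"
print(call_tool("count"))  # expected: "1", then "2" on the next call
```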

Key features include:

  • Dual transport support: REST and SSE, giving flexibility for different client architectures (an SSE sketch follows this list).
  • Thread‑safe session handling: Ensures consistent state across concurrent calls, a critical requirement for multi‑user AI assistants.
  • Modular MCP integration: The tool definitions are easily extendable, encouraging developers to add more complex logic or external API calls.
  • Educational focus: The concise codebase and accompanying UI make it ideal for workshops, tutorials, or exploratory learning about MCP.
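For the dual-transport point above, the SSE side can be served with FastAPI's StreamingResponse, as in this sketch; the /sse path and the three-event loop are illustrative only:

```python
import asyncio

from fastapi import FastAPI
from fastapi.responses import StreamingResponse

app = FastAPI()

async def event_stream(command: str):
    # Frame each message in SSE format ("data: ...\n\n"); a real server
    # would push tool output as it is produced.
    for i in range(3):
        yield f"data: {command} event {i}\n\n"
        await asyncio.sleep(1.0)

@app.get("/sse")  # placeholder path, not necessarily the repo's route
async def sse(command: str = "ping"):
    return StreamingResponse(event_stream(command), media_type="text/event-stream")
```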

Typical use cases include:

  • Rapid prototyping of new AI tools before moving to a full‑blown service.
  • Testing integration pipelines where an assistant must coordinate multiple backends (e.g., a weather API, database lookup, or custom logic).
  • Demonstrating real‑time AI interactions in classrooms or conferences, showing how assistants can push updates via SSE.
  • Benchmarking the latency and reliability of MCP calls under different network conditions (a simple probe is sketched below).
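For the benchmarking case, a crude latency probe against a locally running instance might look like this; the endpoint path and payload are again assumed rather than taken from the project:

```python
import statistics
import time

import httpx

URL = "http://localhost:8000/call"  # placeholder endpoint

samples: list[float] = []
with httpx.Client() as client:
    for _ in range(100):
        start = time.perf_counter()
        client.post(URL, json={"session_id": "bench", "command": "ping"})
        samples.append((time.perf_counter() - start) * 1000.0)

# n=20 quantiles yield 19 cut points; index 18 approximates the 95th percentile.
print(f"p50: {statistics.median(samples):.1f} ms")
print(f"p95: {statistics.quantiles(samples, n=20)[18]:.1f} ms")
```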

Because it is built on FastAPI—a modern, high‑performance framework—developers can easily hook the server into existing Python ecosystems, add authentication layers, or deploy it behind a reverse proxy. The MCP Ping‑Pong Server thus provides both a practical tool for developers and a clear, runnable example of how the Model Context Protocol can be leveraged to bridge AI assistants with external services.
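As one example of the authentication idea, FastAPI dependencies make it straightforward to gate every route behind an API key. The header name and key handling below are illustrative, not part of the project:

```python
from fastapi import Depends, FastAPI, Header, HTTPException

API_KEY = "change-me"  # placeholder; load from environment/config in practice

def require_api_key(x_api_key: str = Header(...)) -> None:
    # FastAPI maps the X-API-Key request header to this parameter.
    if x_api_key != API_KEY:
        raise HTTPException(status_code=401, detail="invalid API key")

app = FastAPI(dependencies=[Depends(require_api_key)])

@app.post("/call")  # every route now requires the key
def call_tool(payload: dict) -> dict:
    command = payload.get("command")
    return {"result": "pong" if command == "ping" else "ping"}
```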