MCPSERV.CLUB
liuhaotian9420

Mcpy CLI

MCP Server

Fast, command‑line MCP service builder for Python functions


About

Mcpy CLI is a lightweight tool that turns Python scripts or modules into Model Context Protocol services. It offers quick packaging, auto‑routing, and one‑click deployment with support for event persistence and caching.

Capabilities

Resources
Access data sources
Tools
Execute functions
Prompts
Pre-built templates
Sampling
AI model interactions

Mcpy CLI Server Demo

Overview

Mcpy CLI is a lightweight, command‑line toolkit that turns ordinary Python scripts into fully‑featured Model Context Protocol (MCP) services. By scanning a directory of modules, it automatically discovers callable functions and exposes them as MCP tools without the need for manual routing or boilerplate code. This solves a common pain point in AI‑assistant development: the friction of wiring up custom logic to an MCP server. Instead of writing FastAPI endpoints or manually registering RPC handlers, developers can simply point at their codebase and have a ready‑to‑use MCP service running in seconds.
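Discovery of this kind can be sketched in a few lines with the standard library; this is an illustrative sketch, not Mcpy CLI's actual implementation, and the filtering rules (skip private names, skip imports) are assumptions.

```python
import inspect
import json
import types


def discover_tools(module: types.ModuleType) -> dict:
    """Collect a module's public, locally defined functions as candidate tools."""
    tools = {}
    for name, obj in inspect.getmembers(module, inspect.isfunction):
        if name.startswith("_"):
            continue  # private helpers are not exposed
        if obj.__module__ != module.__name__:
            continue  # skip functions merely imported into the module
        tools[name] = obj
    return tools


# Any importable module works; here the stdlib json module stands in
# for a user codebase: its public functions become candidate tools.
tools = discover_tools(json)
print(sorted(tools))
```

Each discovered callable's signature and docstring can then serve as the tool's schema and description, which is why no manual registration is needed.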

The server is valuable for developers because it abstracts away the plumbing of an MCP implementation. It bundles a FastMCP runtime, handles HTTP/JSON‑RPC framing, and provides optional features such as session caching and event‑store persistence—all configurable via straightforward command‑line flags. This means teams can prototype, iterate, and deploy AI tool integrations with minimal overhead, focusing on business logic rather than infrastructure.
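The JSON‑RPC framing that gets abstracted away looks roughly like the sketch below. The `tools/call` method and envelope shape follow the MCP specification; the tool name and arguments are purely illustrative.

```python
import json

# A minimal JSON-RPC 2.0 request as an MCP client would send it
# when invoking a tool (tool name and arguments are illustrative).
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "word_count", "arguments": {"text": "hello mcp world"}},
}
wire = json.dumps(request)  # what actually travels over HTTP

# A conforming server echoes the request id and wraps the tool's
# output in a result envelope.
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {"content": [{"type": "text", "text": "3"}]},
}

decoded = json.loads(wire)
print(decoded["method"], response["result"]["content"][0]["text"])
```

Handling this framing, plus transport details and error envelopes, is exactly the boilerplate the bundled FastMCP runtime takes off the developer's plate.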

Key capabilities include:

  • Automatic tool discovery and routing: Two architecture modes—composed (single host with prefixed tool names) and routed (per‑module sub‑services)—allow developers to choose the namespace strategy that best fits their project structure.
  • Zero‑config deployment: A single run command launches a local server, while a packaging command bundles the code into a deployable directory with a start script for production environments.
  • Flexible transport: The default streamable‑HTTP protocol is ideal for most workloads, with legacy SSE support retained for backward compatibility.
  • Stateful features: Optional event‑store persistence (via SQLite) and in‑memory session caching enable long‑running or stateful toolchains without additional infrastructure.
  • Developer tooling integration: The generated service can be inspected and interacted with through MCP Inspector or any MCP‑compatible client such as CherryStudio, making testing and debugging straightforward.
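The difference between the two architecture modes can be sketched as a naming exercise; the module names, function names, and the underscore separator below are hypothetical, not Mcpy CLI's actual conventions.

```python
# Hypothetical project: two modules, each exporting a few functions.
modules = {
    "math_utils": ["add", "multiply"],
    "text_utils": ["word_count"],
}

# Composed mode: one host service, every tool name prefixed with its
# module so the flat namespace stays collision-free.
composed = {
    f"{mod}_{fn}": (mod, fn) for mod, fns in modules.items() for fn in fns
}

# Routed mode: one sub-service per module, each keeping short,
# unprefixed tool names within its own namespace.
routed = {mod: {fn: (mod, fn) for fn in fns} for mod, fns in modules.items()}

print(sorted(composed))
print({mod: sorted(tools) for mod, tools in routed.items()})
```

Composed mode suits small projects that want a single endpoint; routed mode keeps large codebases organized by mounting each module as its own sub‑service.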

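Event‑store persistence of the kind described above can be approximated with nothing but the standard library's `sqlite3` module; the table layout and event shape here are assumptions for illustration, not Mcpy CLI's actual schema.

```python
import json
import sqlite3

# In-memory database for the sketch; a file path would persist
# events across server restarts.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE events (id INTEGER PRIMARY KEY, session TEXT, payload TEXT)"
)


def append_event(session: str, event: dict) -> None:
    """Append one JSON-encoded event to a session's ordered log."""
    conn.execute(
        "INSERT INTO events (session, payload) VALUES (?, ?)",
        (session, json.dumps(event)),
    )
    conn.commit()


def replay(session: str) -> list:
    """Return a session's events in insertion order, decoded from JSON."""
    rows = conn.execute(
        "SELECT payload FROM events WHERE session = ? ORDER BY id", (session,)
    )
    return [json.loads(payload) for (payload,) in rows]


append_event("s1", {"tool": "add", "args": [2, 3]})
append_event("s1", {"tool": "add", "result": 5})
print(replay("s1"))
```

An ordered, replayable log like this is what lets long‑running or resumable toolchains recover their state without any external infrastructure.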
Typical use cases span from rapid prototyping of custom NLP utilities to building production‑grade AI assistants that combine multiple domain experts (math, text, data) into a single MCP endpoint. By reducing setup time from hours to minutes and eliminating boilerplate, Mcpy CLI empowers developers to iterate faster and deliver richer AI experiences.