MCPSERV.CLUB
oatpp

oatpp-mcp

MCP Server

Anthropic Model Context Protocol server for Oat++

47 stars · 2 views · Updated 15 days ago

About

oatpp-mcp implements Anthropic’s Model Context Protocol (MCP) in the Oat++ framework, enabling automatic tool generation from API controllers and providing prompts, resources, and tools over STDIO or HTTP SSE.

Capabilities

Resources
Access data sources
Tools
Execute functions
Prompts
Pre-built templates
Sampling
AI model interactions

oatpp-mcp Demo

Overview

oatpp-mcp is an implementation of Anthropic's Model Context Protocol (MCP) built on top of the Oat++ web framework. It bridges the gap between modern C++ REST APIs and large-language-model (LLM) assistants, allowing developers to expose existing API endpoints as tools that an LLM can invoke directly. By automatically translating API controller endpoints into MCP-compatible tools, it eliminates manual boilerplate and keeps the API surface in sync with the assistant's capabilities.
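For illustration, the kind of controller oatpp-mcp can expose as tools is a plain Oat++ `ApiController`. The macros below (`ENDPOINT`, `ENDPOINT_INFO`, `PATH`) are standard Oat++ codegen; the controller and endpoint names are hypothetical examples, and the block is a sketch that requires the oatpp library to compile:

```cpp
#include "oatpp/web/server/api/ApiController.hpp"
#include "oatpp/core/macro/codegen.hpp"
#include "oatpp/core/macro/component.hpp"

#include OATPP_CODEGEN_BEGIN(ApiController)

// A plain Oat++ controller; oatpp-mcp can turn endpoints
// like this into MCP tools automatically.
class UserController : public oatpp::web::server::api::ApiController {
public:
  UserController(OATPP_COMPONENT(std::shared_ptr<ObjectMapper>, objectMapper))
    : oatpp::web::server::api::ApiController(objectMapper)
  {}

  ENDPOINT_INFO(getUser) {
    info->summary = "Fetch a user by id"; // surfaced as the tool's description
  }
  ENDPOINT("GET", "/users/{id}", getUser,
           PATH(String, id)) {
    return createResponse(Status::CODE_200, "user:" + *id);
  }
};

#include OATPP_CODEGEN_END(ApiController)
```

Because the tool is generated from the endpoint declaration itself, adding a new `ENDPOINT` is all that is needed to make it callable by the assistant.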

The server supports two transport mechanisms: STDIO for local, process-level communication and HTTP Server-Sent Events (SSE) for web-based or distributed scenarios. Teams can pick whichever channel suits their deployment, whether they are running an LLM locally or hosting a cloud-based service. All core MCP features are exposed: prompts give the assistant context-aware starting instructions, resources supply static or dynamic data to reference, and tools represent callable API endpoints.

Key capabilities include:

  • Automatic tool generation from Oat++ API controllers, ensuring that any new endpoint is instantly available to the assistant without extra code.
  • Built‑in support for common MCP services such as prompts (e.g., a code‑review prompt), resources (file access, configuration data), and tools (logging, database queries).
  • Dual transport modes that let developers test locally via STDIO or deploy over HTTP SSE for real‑time, event‑driven interactions.
  • Seamless integration with existing Oat++ projects, requiring only the inclusion of the module and a few method calls to register prompts, resources, and tools.
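The registration flow described above might look roughly like this. This is a sketch only: the `oatpp::mcp::Server` class and the `addPrompt` / `addResource` / `addTool` / `stdioListen` names follow the pattern of the project's published examples, but exact headers and signatures should be verified against the current oatpp-mcp README:

```cpp
#include "oatpp-mcp/Server.hpp" // include path assumed from the module name

int main() {
  oatpp::mcp::Server server;

  // Names below are illustrative placeholders for user-defined
  // prompt, resource, and tool implementations.
  server.addPrompt(std::make_shared<prompts::CodeReview>());  // code-review prompt
  server.addResource(std::make_shared<resources::File>());    // file-access resource
  server.addTool(std::make_shared<tools::Logger>());          // logging tool

  server.stdioListen(); // or serve the same capabilities over HTTP SSE
  return 0;
}
```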

Real-world use cases range from automated code-review assistants that query a repository's API for file contents to customer-support bots that pull product data from an internal catalog via REST endpoints. In each scenario, the MCP server translates LLM requests into concrete HTTP calls, aggregates the responses, and streams them back to the assistant in a structured format. This reduces latency, keeps data up to date, and lets developers maintain a single source of truth for business logic.
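Concretely, an LLM's tool invocation arrives as an MCP `tools/call` request, which the server maps onto the matching endpoint. The tool name and arguments below are hypothetical; the envelope follows the MCP specification:

```json
{
  "jsonrpc": "2.0",
  "id": 7,
  "method": "tools/call",
  "params": {
    "name": "getUser",
    "arguments": { "id": "42" }
  }
}
```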

By leveraging oatpp‑mcp, teams can rapidly prototype AI‑powered workflows that interact with complex C++ backends. The automatic tool generation and transport versatility give developers a powerful, low‑overhead solution for embedding LLM capabilities directly into their existing services.