
OpenAPI‑MCP Server

Generate MCP tools from any OpenAPI spec in Docker

140 stars
Updated 24 days ago

About

OpenAPI‑MCP is a Dockerized Model Context Protocol server that reads Swagger or OpenAPI files and automatically produces MCP tool definitions, enabling AI agents to interact with any API without custom coding.

Capabilities

  • Resources – access data sources
  • Tools – execute functions
  • Prompts – pre-built templates
  • Sampling – AI model interactions


OpenAPI‑MCP is a Docker‑based Model Context Protocol server that automatically turns any Swagger or OpenAPI specification into a fully functional MCP toolset. By pointing the server at an OpenAPI or Swagger spec, developers can expose the entire API surface to AI assistants without writing custom adapters or boilerplate code. The server parses the spec, generates a tool schema for each operation, and proxies calls to the underlying API while keeping sensitive credentials hidden from the client.
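
Conceptually, each operation in the spec becomes one MCP tool: the operationId supplies the tool name, the summary becomes the description, and the parameters are folded into a JSON Schema that the client sees as the tool's inputSchema. The Python snippet below is a minimal sketch of that mapping under those assumptions; the function and fallback naming are illustrative, not the server's actual internals.

```python
def operation_to_tool(path: str, method: str, operation: dict) -> dict:
    """Illustrative mapping of a single OpenAPI operation to an MCP tool definition."""
    # Tool name: prefer the operationId, fall back to a name derived from method + path.
    name = operation.get("operationId") or f"{method}_{path.strip('/').replace('/', '_')}"

    # Fold the operation's parameters into a JSON Schema for the tool input.
    properties, required = {}, []
    for param in operation.get("parameters", []):
        properties[param["name"]] = param.get("schema", {"type": "string"})
        if param.get("required"):
            required.append(param["name"])

    return {
        "name": name,
        "description": operation.get("summary") or operation.get("description", ""),
        "inputSchema": {"type": "object", "properties": properties, "required": required},
    }


# Example: one GET operation from a pet-store style spec.
operation = {
    "operationId": "getPetById",
    "summary": "Find pet by ID",
    "parameters": [{"name": "petId", "in": "path", "required": True, "schema": {"type": "integer"}}],
}
print(operation_to_tool("/pet/{petId}", "get", operation))
```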

This approach solves a common pain point for AI‑powered workflows: integrating external services in a secure, standardized way. Traditional methods require manually mapping endpoints to tool definitions and updating that mapping whenever the API evolves. OpenAPI‑MCP eliminates that maintenance burden by generating tools on demand, automatically reflecting any changes in the specification. It also handles authentication transparently: API keys can be injected into headers, query parameters, or cookies from environment variables or files, so the AI assistant never sees raw credentials.
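
The credential handling can be pictured as a small injection step on the proxy side: the key is read from the container's environment and attached to the outgoing request wherever the spec's security scheme expects it, so it never appears in the tool schema or in the model's context. The snippet below is a minimal sketch of that idea; the API_KEY variable name and the helper are hypothetical, not the server's documented configuration.

```python
import os

def inject_api_key(request: dict, location: str, name: str) -> dict:
    """Attach an API key read from the proxy's environment (hypothetical
    variable name API_KEY) to the outgoing request, server-side only."""
    key = os.environ["API_KEY"]  # assumption: the key is supplied via the container environment
    if location == "header":
        request.setdefault("headers", {})[name] = key
    elif location == "query":
        request.setdefault("params", {})[name] = key
    elif location == "cookie":
        request.setdefault("cookies", {})[name] = key
    return request


# The AI client only ever issues the tool call; the credential is added here before forwarding.
outgoing = {"method": "GET", "url": "https://api.example.com/v1/metrics"}
outgoing = inject_api_key(outgoing, location="header", name="X-Api-Key")
```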

Key capabilities include:

  • Full OpenAPI v2 and v3 support – the server understands both Swagger 2.0 and modern OpenAPI 3.x formats, making it compatible with most public and private APIs.
  • Automatic schema creation – operation parameters, request bodies, and responses are translated into MCP schemas, providing rich type information for the client.
  • Selective exposure – developers can filter operations or tags to expose only what is needed, keeping the toolset focused and secure (see the sketch after this list).
  • Custom request headers – additional headers (e.g., for tracing or alternate auth mechanisms) can be injected via an environment variable.
  • Dockerized deployment – the entire service runs in a container, simplifying installation and scaling across environments.
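
The selective exposure mentioned above can be pictured as an allow-list over the spec's standard tags field: only operations whose tags intersect the allow-list are turned into tools. The snippet below sketches that filter in isolation; it is not the server's actual flag or option syntax.

```python
def select_operations(spec: dict, allowed_tags: set) -> list:
    """Minimal sketch of selective exposure: keep only operations whose tags
    intersect an allow-list, so the generated toolset stays small and safe."""
    selected = []
    for path, methods in spec.get("paths", {}).items():
        for method, operation in methods.items():
            if allowed_tags & set(operation.get("tags", [])):
                selected.append((method.upper(), path, operation.get("operationId")))
    return selected


spec = {
    "paths": {
        "/metrics": {"get": {"operationId": "listMetrics", "tags": ["metrics"]}},
        "/admin/users": {"delete": {"operationId": "deleteUser", "tags": ["admin"]}},
    }
}

# Only the read-only metrics operation is exposed; the admin endpoint stays hidden.
print(select_operations(spec, allowed_tags={"metrics"}))  # [('GET', '/metrics', 'listMetrics')]
```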

In practice, OpenAPI‑MCP enables scenarios such as:

  • A customer support AI that can query a ticketing system, create tickets, or retrieve status updates simply by referencing the system’s OpenAPI spec (see the sketch after this list).
  • A data‑analysis assistant that pulls metrics from a cloud monitoring API without exposing the API key to the model.
  • Rapid prototyping of new AI agents that need to interact with third‑party services; developers can swap in a different OpenAPI file and immediately get a complete toolset for the new service.
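
From the assistant's point of view, each generated tool is just a named call with structured arguments, carried over MCP's standard tools/call request. The example below sketches what the ticketing scenario might look like on the wire; the createTicket tool and its arguments are invented for illustration and depend entirely on the spec being served.

```python
import json

# Hypothetical tool generated from a ticketing system's "createTicket" operation.
tool_call = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "createTicket",
        "arguments": {"subject": "Login page returns 500", "priority": "high"},
    },
}

# The MCP server translates this into the corresponding HTTP request against the
# ticketing API, attaching credentials server-side before forwarding it.
print(json.dumps(tool_call, indent=2))
```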

By bridging standard API documentation with MCP’s tool abstraction, OpenAPI‑MCP gives developers a low‑friction, secure, and automated way to enrich AI assistants with external data and functionality.