MCPSERV.CLUB
criteo

OpenAPI to MCP Server

MCP Server

Generate strongly typed tools from OpenAPI specs

Stale (55) · 22 stars · 0 views · Updated 12 days ago

About

Transforms an OpenAPI specification into a set of MCP-compatible, strongly typed tools, supporting authentication and custom naming strategies for seamless API integration.

Capabilities

  • Resources: access data sources
  • Tools: execute functions
  • Prompts: pre-built templates
  • Sampling: AI model interactions

GitHub · Demo

Overview

The openapi-to-mcp server solves a common friction point for developers who want to expose RESTful APIs as AI‑ready tools. Traditionally, developers must wire an AI assistant to each endpoint by hand, writing adapters that translate JSON payloads into the tool‑call format expected by the Model Context Protocol (MCP). This process is repetitive and error‑prone, especially for large or evolving APIs. openapi-to-mcp bridges that gap by automatically converting an OpenAPI (or Swagger) specification into a fully functional MCP server. The result is a set of strongly typed tools that mirror the original API’s operations, ready for instant consumption by Claude or any MCP‑compliant assistant.
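As a concrete illustration (the spec below is hypothetical, and the resulting tool shape is an approximation rather than output copied from the project), consider a single OpenAPI operation:

```yaml
# Hypothetical OpenAPI fragment, not taken from any real API.
paths:
  /users/{id}:
    get:
      operationId: getUser
      summary: Fetch a single user by ID
      parameters:
        - name: id
          in: path
          required: true
          schema:
            type: string
      responses:
        "200":
          description: The requested user
```

Under an operation‑ID naming strategy, this would surface to the assistant as a tool named getUser whose input schema requires a single string parameter, id, mirroring the contract above.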

At its core, the server parses the OpenAPI document—whether fetched from a URL or read from a local file—and generates tool definitions for each operation. It preserves the original parameter schemas, request bodies, and response types, ensuring that calls made through the assistant are validated against the API contract. Authentication is handled transparently: developers can configure bearer tokens, OAuth 2 flows, or host overrides directly in the MCP configuration, so protected endpoints become available to the assistant without exposing credentials or writing custom logic.
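A sketch of what such a configuration might look like, using an OAuth 2 client‑credentials grant with a host override (the key names here are illustrative assumptions, not the project's documented settings):

```json
{
  "oauth2": {
    "grantType": "client_credentials",
    "tokenUrl": "https://auth.example.com/oauth/token",
    "clientId": "my-client",
    "clientSecret": "${CLIENT_SECRET}"
  },
  "host": "https://api.internal.example.com"
}
```

Under this kind of setup, the server would acquire a token before each call (or on expiry) and route requests to the overridden host, so the assistant never handles the credentials itself.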

Key capabilities include:

  • Automatic tool naming with several strategies (extension, operation ID, verb‑and‑path) to accommodate diverse OpenAPI designs.
  • Full support for OAuth 2 with client‑credentials, password, and refresh‑token grants, allowing secure access to APIs that require dynamic token acquisition.
  • Custom instructions that the MCP server can advertise, giving assistants context about how to use the generated tools.
  • Strong typing of inputs and outputs, leveraging the OpenAPI schema to enforce correct data shapes at runtime.
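To make the naming strategies above more tangible, here is a minimal sketch of how a verb‑and‑path strategy could derive tool names from HTTP operations. This is an illustrative reimplementation, and the server's actual naming rules may differ:

```python
import re


def verb_and_path_name(method: str, path: str) -> str:
    """Derive a tool name from an HTTP verb and path.

    Illustrative sketch of a verb-and-path naming strategy,
    e.g. GET /users/{id} -> "get_users_id".
    """
    # Strip braces from path parameters and drop empty segments.
    segments = [re.sub(r"[{}]", "", s) for s in path.strip("/").split("/") if s]
    return "_".join([method.lower()] + segments)


print(verb_and_path_name("GET", "/users/{id}"))  # -> get_users_id
print(verb_and_path_name("POST", "/orders"))     # -> post_orders
```

A strategy like this keeps names stable even when an OpenAPI document omits operationId fields, which is why having several strategies matters for real‑world specs.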

Real‑world use cases abound. A developer maintaining a microservices architecture can expose all services to an AI assistant, enabling natural‑language queries like “List all users who signed up last week” that are translated into precise API calls. In DevOps, an AI can automatically trigger build pipelines or fetch deployment status by calling the corresponding GitHub or Azure DevOps endpoints. Even in data science, a notebook assistant can invoke an internal data API to retrieve metrics or upload artifacts without manual wrapper code.

Integration into existing AI workflows is straightforward: the MCP server runs as a lightweight process (or container) and registers itself in the client’s configuration. Once registered, the assistant discovers the tools through MCP discovery and can invoke them as part of a conversation. Because the server handles serialization, validation, and authentication internally, developers can focus on building higher‑level logic rather than plumbing.
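Registration typically amounts to a short entry in the client's configuration file. The snippet below follows the common mcpServers shape used by Claude Desktop, but the command name and arguments are assumptions about how this particular server is launched, not documented usage:

```json
{
  "mcpServers": {
    "petstore": {
      "command": "openapi-to-mcp",
      "args": ["https://petstore3.swagger.io/api/v3/openapi.json"]
    }
  }
}
```

Once the client restarts, MCP discovery lists the generated tools and the assistant can invoke them in conversation with no further wiring.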

The standout advantage of openapi-to-mcp is its zero‑code, schema‑driven approach. By turning an OpenAPI spec into a ready‑to‑use MCP server, it eliminates boilerplate adapters, guarantees consistency with the API contract, and scales effortlessly as the underlying service evolves. This makes it an indispensable component for any team looking to unlock their APIs with conversational AI.