MCPSERV.CLUB
nihal1294

OpenAPI → MCP Server

Convert OpenAPI specs into a Node.js MCP proxy in seconds

Stale (55) · 1 star · 2 views · Updated Jul 3, 2025

About

This Python CLI tool generates a ready‑to‑run Node.js/TypeScript MCP server that maps each OpenAPI v3 operation to an MCP tool, acting as a proxy translating MCP calls into HTTP requests.

Capabilities

  • Resources: access data sources
  • Tools: execute functions
  • Prompts: pre-built templates
  • Sampling: AI model interactions

OpenAPI → MCP Server Demo

The OpenAPI → MCP server bridges the gap between traditional REST APIs and modern AI assistants that rely on the Model Context Protocol. By parsing an OpenAPI v3 specification—whether hosted locally or served over HTTP—the tool automatically generates a fully‑functional Node.js/TypeScript MCP server. Each API operation (GET, POST, PUT, DELETE, PATCH) becomes an MCP tool, complete with a JSON‑Schema derived from the operation’s parameters and request bodies. This eliminates manual work for developers who need to expose existing services to Claude, Gemini, or other MCP‑compatible assistants.
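To make the operation-to-tool mapping concrete, here is a minimal sketch of how a single OpenAPI v3 operation could become an MCP tool definition with a derived JSON Schema. The interfaces and the `toolFromOperation` helper are illustrative assumptions for this page, not the generator's actual API.

```typescript
// Sketch: derive an MCP tool definition from one OpenAPI operation.
// Types are simplified; real OpenAPI parameter objects carry more fields.

interface OpenApiOperation {
  operationId: string;
  summary?: string;
  parameters?: { name: string; in: string; required?: boolean; schema: object }[];
}

interface McpTool {
  name: string;
  description: string;
  inputSchema: { type: "object"; properties: Record<string, object>; required: string[] };
}

function toolFromOperation(op: OpenApiOperation): McpTool {
  const properties: Record<string, object> = {};
  const required: string[] = [];
  for (const p of op.parameters ?? []) {
    properties[p.name] = p.schema;          // each parameter's schema becomes a JSON-Schema property
    if (p.required) required.push(p.name);  // required parameters propagate to the input schema
  }
  return {
    name: op.operationId,                   // operationId doubles as the tool name
    description: op.summary ?? op.operationId,
    inputSchema: { type: "object", properties, required },
  };
}

// Example: a GET /pets/{petId} operation
const tool = toolFromOperation({
  operationId: "getPetById",
  summary: "Fetch a pet by id",
  parameters: [{ name: "petId", in: "path", required: true, schema: { type: "integer" } }],
});
```

An assistant that lists this server's tools would then see `getPetById` with a schema requiring an integer `petId`, without anyone having written that tool by hand.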

For AI developers, the server offers a ready‑made proxy that translates MCP tool calls into standard HTTP requests. It supports configurable transports (stdio or Server‑Sent Events), customizable ports, and environment‑based configuration for the target API base URL and authentication headers. Error handling is also automated: HTTP status codes are mapped to MCP error codes, ensuring that assistants receive meaningful feedback when something goes wrong. The generated project includes strict TypeScript settings, linting and formatting tools, and unit and integration tests—all of which help maintain quality and ease future modifications.
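The proxy and error-mapping idea can be sketched as follows. The specific status-to-code mapping and the `callTool` helper are assumptions for illustration; the constants shown (`-32602`, `-32603`) are the standard JSON-RPC codes for invalid params and internal errors, which MCP builds on.

```typescript
// Sketch: forward an MCP tool call as an HTTP request, then map the
// HTTP status onto a JSON-RPC/MCP error code. Helper names are illustrative.

const McpErrorCode = {
  InvalidParams: -32602, // e.g. HTTP 400/422: the caller's arguments were bad
  InternalError: -32603, // e.g. HTTP 5xx: the upstream API failed
} as const;

function mapHttpStatus(status: number): number | null {
  if (status >= 200 && status < 300) return null; // success: no MCP error
  if (status === 400 || status === 422) return McpErrorCode.InvalidParams;
  return McpErrorCode.InternalError;              // everything else surfaces as internal
}

async function callTool(
  baseUrl: string,
  pathTemplate: string,                 // e.g. "/pets/{petId}"
  args: Record<string, string>,
  authHeader?: string,
): Promise<{ result?: unknown; error?: number; message?: string }> {
  // Substitute path parameters like {petId} with the tool-call arguments.
  const path = pathTemplate.replace(/\{(\w+)\}/g, (_, name) => encodeURIComponent(args[name]));
  const res = await fetch(baseUrl + path, {
    headers: authHeader ? { Authorization: authHeader } : {},
  });
  const error = mapHttpStatus(res.status);
  if (error !== null) return { error, message: `HTTP ${res.status}` };
  return { result: await res.json() };
}
```

The key point is the clean split: the assistant only ever sees MCP tool results or MCP error codes, never raw HTTP plumbing.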

Key capabilities include:

  • Automatic tool creation from OpenAPI definitions, reducing boilerplate and the risk of human error.
  • Dynamic input schemas that reflect complex parameter types, arrays, enums, and basic resolution of local references.
  • Extensible transport options that let you run the server in diverse environments, from local development to cloud functions.
  • Secure configuration via an environment file, keeping credentials out of source control while still allowing quick setup.
  • Robust logging with JSON output, facilitating monitoring and debugging in production.
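The structured-logging bullet above amounts to emitting one JSON object per log line, which log aggregators can parse directly. A minimal sketch (field names are illustrative, not the generated server's exact format):

```typescript
// Sketch: structured JSON logging, one JSON object per line.
function logJson(
  level: "info" | "error",
  message: string,
  extra: Record<string, unknown> = {},
): Record<string, unknown> {
  const entry = {
    timestamp: new Date().toISOString(), // machine-sortable timestamp
    level,
    message,
    ...extra,                            // arbitrary context, e.g. tool name, status
  };
  console.log(JSON.stringify(entry));    // single-line JSON, easy to grep or ship
  return entry;
}

const entry = logJson("info", "tool invoked", { tool: "getPetById", status: 200 });
```

Because each line is self-describing JSON, production tooling can filter by `level` or `tool` without fragile regex parsing.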

Typical use cases span from internal tooling to public API exposure: a company can quickly convert its Swagger‑defined services into MCP tools, enabling chat‑based interfaces for data retrieval or command execution. Researchers can prototype conversational agents that interact with experimental APIs without writing custom adapters. Moreover, the server’s auto‑generated tests provide confidence that the translation layer remains correct as APIs evolve.

In summary, the OpenAPI → MCP server turns static API contracts into dynamic, assistant‑ready services with minimal effort. It streamlines the integration of existing REST endpoints into AI workflows, offers a clean separation between API logic and assistant communication, and delivers a production‑grade codebase out of the box—making it an indispensable asset for developers looking to empower their applications with conversational AI.