MCPSERV.CLUB
pyroprompts

Any Chat Completions MCP Server

MCP Server

Bridge any OpenAI‑compatible chat API to Claude Desktop

Active (70)
141 stars
3 views
Updated 17 days ago

About

A TypeScript MCP server that forwards chat requests to any OpenAI‑compatible provider (OpenAI, Perplexity, Groq, xAI, PyroPrompts, etc.) using environment variables for configuration. It exposes a single chat tool that can be added to Claude Desktop or other MCP clients.
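As a rough sketch, registering this server in Claude Desktop might look like the following configuration entry. The package invocation and the environment variable names (`AI_CHAT_KEY`, `AI_CHAT_NAME`, `AI_CHAT_MODEL`, `AI_CHAT_BASE_URL`) are assumptions based on the description above, not confirmed by this page:

```json
{
  "mcpServers": {
    "chat-openai": {
      "command": "npx",
      "args": ["-y", "any-chat-completions-mcp"],
      "env": {
        "AI_CHAT_KEY": "sk-...",
        "AI_CHAT_NAME": "OpenAI",
        "AI_CHAT_MODEL": "gpt-4o",
        "AI_CHAT_BASE_URL": "https://api.openai.com/v1"
      }
    }
  }
}
```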

Capabilities

  • Resources: Access data sources
  • Tools: Execute functions
  • Prompts: Pre-built templates
  • Sampling: AI model interactions

Screenshot: Claude Desktop Home with Chat Tools

The Any Chat Completions MCP Server bridges the gap between Claude and any LLM provider that offers an OpenAI‑compatible chat completion API. By exposing a single tool, it allows developers to plug in services such as OpenAI, Perplexity, Groq, xAI, PyroPrompts and more without modifying their existing Claude workflows. This server essentially translates the MCP tool invocation into a standard chat completion request, then streams the response back to Claude in real time.

For developers building AI‑augmented applications, this capability is invaluable. It removes the need to write custom adapters for each provider; instead, a single MCP server instance can be reused across multiple models by simply changing environment variables. This modularity means you can experiment with new LLMs, switch between models for cost or performance reasons, and even run multiple providers side‑by‑side within the same Claude session. The server’s TypeScript implementation ensures type safety and fast development cycles, while its compatibility with the OpenAI SDK guarantees minimal friction for teams already familiar with that ecosystem.
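To make the translation step concrete, here is a minimal sketch of how an MCP tool invocation could be mapped onto an OpenAI-compatible chat completion request body. The names (`ChatToolArgs`, `buildCompletionRequest`) and the default model are illustrative assumptions, not the server's actual internals:

```typescript
// Hypothetical sketch: translate an MCP "chat" tool call into an
// OpenAI-compatible chat completion request body.

interface ChatToolArgs {
  content: string; // the user's message, forwarded from the MCP client
}

interface CompletionRequest {
  model: string;
  messages: { role: "user"; content: string }[];
  stream: boolean;
}

// Environment-driven configuration, as described above: the model name
// comes from an env var rather than from code. The variable name
// AI_CHAT_MODEL and the fallback model are assumptions for illustration.
function buildCompletionRequest(
  args: ChatToolArgs,
  env: { AI_CHAT_MODEL?: string }
): CompletionRequest {
  return {
    model: env.AI_CHAT_MODEL ?? "gpt-4o",
    messages: [{ role: "user", content: args.content }],
    stream: true, // stream tokens back so the client sees real-time output
  };
}
```

Because all provider-specific detail lives in the environment, the same function serves every backend: only the env block changes.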

Key features include:

  • Single‑tool simplicity: A single tool handles all chat interactions, making the MCP surface minimal and easy to understand.
  • Environment‑driven configuration: API keys, model names, base URLs and other parameters are supplied via environment variables, allowing secure, per‑instance customization.
  • Multi‑provider support: The same binary can be launched multiple times with different env settings, yielding separate tools in Claude’s UI for each LLM.
  • Streaming responses: Chat completions are streamed back to Claude, preserving the conversational feel and enabling real‑time interaction.
  • Cross‑platform compatibility: The server can be run on macOS, Windows or Linux, and is easily integrated into desktop clients like Claude Desktop or web interfaces such as LibreChat.
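The multi-provider pattern above can be sketched as two instances of the same server registered side by side, each with its own env block, yielding two separate tools in Claude's UI. Server names, the package invocation, and the variable names are illustrative assumptions:

```json
{
  "mcpServers": {
    "chat-perplexity": {
      "command": "npx",
      "args": ["-y", "any-chat-completions-mcp"],
      "env": {
        "AI_CHAT_KEY": "pplx-...",
        "AI_CHAT_NAME": "Perplexity",
        "AI_CHAT_MODEL": "sonar",
        "AI_CHAT_BASE_URL": "https://api.perplexity.ai"
      }
    },
    "chat-groq": {
      "command": "npx",
      "args": ["-y", "any-chat-completions-mcp"],
      "env": {
        "AI_CHAT_KEY": "gsk-...",
        "AI_CHAT_NAME": "Groq",
        "AI_CHAT_MODEL": "llama-3.1-70b-versatile",
        "AI_CHAT_BASE_URL": "https://api.groq.com/openai/v1"
      }
    }
  }
}
```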

Typical use cases span from rapid prototyping—where a developer wants to test a new model without rewriting code—to production deployments that require dynamic switching between providers for latency or cost optimization. For example, a customer support bot might use the cheaper Perplexity model for general queries while reserving GPT‑4o for complex troubleshooting. In research settings, the ability to toggle models on demand facilitates comparative studies of LLM performance across domains.

In short, the Any Chat Completions MCP Server offers a lightweight, configurable bridge that extends Claude’s reach to virtually any OpenAI‑compatible chat service. Its straightforward integration pattern and robust feature set make it a practical choice for developers seeking flexibility, speed, and scalability in AI‑driven applications.