About
A TypeScript MCP server that forwards chat requests to any OpenAI‑compatible provider (OpenAI, Perplexity, Groq, xAI, PyroPrompts, etc.), with all configuration supplied through environment variables. It exposes a single chat tool and can be added to Claude Desktop or any other MCP client.
Capabilities

The Any Chat Completions MCP Server bridges the gap between Claude and any LLM provider that offers an OpenAI‑compatible chat completion API. By exposing a single tool, it allows developers to plug in services such as OpenAI, Perplexity, Groq, xAI, PyroPrompts and more without modifying their existing Claude workflows. This server essentially translates the MCP tool invocation into a standard chat completion request, then streams the response back to Claude in real time.
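Conceptually, that forwarding step is small: read the provider's base URL, API key, and model from the environment, then relay the incoming message through an OpenAI‑compatible chat completion call. The sketch below illustrates the idea in TypeScript with the OpenAI Node SDK; the environment variable names are illustrative rather than the project's exact keys, and the real server wires this logic into its MCP tool handler (the streaming path is omitted here for brevity).

```ts
import OpenAI from "openai";

// Provider endpoint, key, and model come from the environment; the names used
// here (AI_CHAT_BASE_URL, AI_CHAT_KEY, AI_CHAT_MODEL) are illustrative.
const client = new OpenAI({
  baseURL: process.env.AI_CHAT_BASE_URL, // e.g. https://api.perplexity.ai
  apiKey: process.env.AI_CHAT_KEY,
});

// Forward a single chat request to the configured provider and return the
// reply text. Simplified to a non-streaming call for clarity.
export async function forwardChat(content: string): Promise<string> {
  const completion = await client.chat.completions.create({
    model: process.env.AI_CHAT_MODEL ?? "gpt-4o-mini",
    messages: [{ role: "user", content }],
  });
  return completion.choices[0]?.message?.content ?? "";
}
```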
For developers building AI‑augmented applications, this removes the need to write a custom adapter for each provider: a single MCP server instance can be reused across models simply by changing environment variables. That modularity makes it easy to experiment with new LLMs, switch models for cost or performance reasons, or run multiple providers side by side within the same Claude session. The TypeScript implementation keeps the codebase type‑safe and quick to iterate on, and because the server speaks the OpenAI chat completion API, teams already familiar with that ecosystem face little friction.
Key features include:
- Single‑tool simplicity: A single tool handles all chat interactions, making the MCP surface minimal and easy to understand.
- Environment‑driven configuration: API keys, model names, base URLs and other parameters are supplied via environment variables, allowing secure, per‑instance customization.
- Multi‑provider support: The same binary can be launched multiple times with different env settings, yielding a separate tool in Claude’s UI for each LLM (see the configuration sketch after this list).
- Streaming responses: Chat completions are streamed back to Claude, preserving the conversational feel and enabling real‑time interaction.
- Cross‑platform compatibility: The server can be run on macOS, Windows or Linux, and is easily integrated into desktop clients like Claude Desktop or web interfaces such as LibreChat.
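To make the environment‑driven, multi‑instance pattern concrete, a claude_desktop_config.json entry along these lines would register two copies of the server against different providers. The command path and env key names here are illustrative, so check the project’s README for the exact values it expects.

```json
{
  "mcpServers": {
    "chat-openai": {
      "command": "node",
      "args": ["/path/to/any-chat-completions-mcp/build/index.js"],
      "env": {
        "AI_CHAT_KEY": "sk-...",
        "AI_CHAT_NAME": "OpenAI",
        "AI_CHAT_MODEL": "gpt-4o",
        "AI_CHAT_BASE_URL": "https://api.openai.com/v1"
      }
    },
    "chat-perplexity": {
      "command": "node",
      "args": ["/path/to/any-chat-completions-mcp/build/index.js"],
      "env": {
        "AI_CHAT_KEY": "pplx-...",
        "AI_CHAT_NAME": "Perplexity",
        "AI_CHAT_MODEL": "sonar",
        "AI_CHAT_BASE_URL": "https://api.perplexity.ai"
      }
    }
  }
}
```

Each entry surfaces as its own tool in Claude’s UI, so a prompt can explicitly route a request to OpenAI or to Perplexity.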
Typical use cases range from rapid prototyping, where a developer wants to test a new model without rewriting code, to production deployments that switch providers dynamically for latency or cost. For example, a customer support bot might route general queries to a lower‑cost model via Perplexity while reserving GPT‑4o for complex troubleshooting. In research settings, toggling models on demand makes it easy to compare LLM performance across domains.
In short, the Any Chat Completions MCP Server offers a lightweight, configurable bridge that extends Claude’s reach to virtually any OpenAI‑compatible chat service. Its straightforward integration pattern and robust feature set make it a practical choice for developers seeking flexibility, speed, and scalability in AI‑driven applications.
Related Servers
- Netdata: Real‑time infrastructure monitoring for every metric, every second
- Awesome MCP Servers: Curated list of production-ready Model Context Protocol servers
- JumpServer: Browser‑based, open‑source privileged access management
- OpenTofu: Infrastructure as Code for secure, efficient cloud management
- FastAPI-MCP: Expose FastAPI endpoints as MCP tools with built‑in auth
- Pipedream MCP Server: Event‑driven integration platform for developers
Explore More Servers
- Gcore MCP Server: Interact with Gcore Cloud via LLM assistants
- Awesome MCP Server CN: Curated list of Chinese MCP servers for developers
- Openfort MCP Server: Plug‑and‑play AI interface for Openfort wallet infrastructure
- Useful Model Context Protocol Servers (MCPS): A collection of Python MCP servers for AI assistant utilities
- GitHub MCP Server Plus: Powerful GitHub API integration for file, repo, and issue management
- Site Cloner MCP Server: Clone entire websites with LLM-powered tools