About
mcp-origin is a lightweight MCP server that proxies tool calls to multiple connected MCP servers using a consistent naming scheme. It stores configurations in JSON, automatically discovers tools, and supports stdio and HTTP transports.
Capabilities
Overview
mcp‑origin is a lightweight, single‑point MCP server that consolidates access to multiple external MCP servers. It solves the common pain of juggling several tool providers by offering a unified proxy that forwards calls to the correct backend based on a consistent naming scheme. Developers can now register, discover, and invoke tools from disparate MCP services through one API surface, simplifying integration into AI assistant workflows.
The server exposes a set of administrative tools that manage the lifecycle of connected MCP servers. Using them, developers can programmatically add or remove servers through a simple JSON configuration, and a refresh operation keeps the tool registry up to date by automatically pulling new or removed tools from each backend. This dynamic discovery eliminates manual reconfiguration when a server's capabilities change.
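The exact names of these administrative tools are not shown on this page, so the following is only a sketch: it assumes hypothetical tool names (register_server, refresh_tools), hypothetical argument shapes, a binary named mcp-origin, and a --config flag, and drives them with the official MCP TypeScript SDK client.

```typescript
// Hypothetical sketch: driving mcp-origin's administrative tools from a client
// built on the MCP TypeScript SDK. The tool names (register_server, refresh_tools),
// their arguments, the binary name, and the --config flag are assumptions, not
// documented values.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

const client = new Client(
  { name: "origin-admin", version: "1.0.0" },
  { capabilities: {} }
);
await client.connect(
  new StdioClientTransport({
    command: "mcp-origin",
    args: ["--config", "./servers.json"],
  })
);

// Register a new backend MCP server under a unique ID (hypothetical tool call).
await client.callTool({
  name: "register_server",
  arguments: {
    id: "knowledge_base",
    transport: "stdio",
    command: "npx",
    args: ["-y", "@example/kb-mcp-server"],
  },
});

// Re-scan registered backends so newly exposed tools become callable (hypothetical).
await client.callTool({ name: "refresh_tools", arguments: {} });
```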
Each connected MCP server is identified by a unique ID, and its tools are namespaced with that ID as a prefix. This clear separation prevents name collisions and makes it obvious which backend a tool originates from: a tool provided by a given server becomes callable under a name that carries that server's ID. The naming convention is consistent across all supported transports, ensuring that the same API works whether the backend communicates over stdio, HTTP, or SSE.
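To make the prefixing concrete, here is a short continuation of the sketch above. The exact prefix format is not documented on this page, so the knowledge_base_search name and its arguments are purely illustrative.

```typescript
// Continuing the sketch above (`client` is the connected MCP client).
// The prefix format and tool names below are assumptions for illustration only.
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));
// e.g. ["knowledge_base_search", "analytics_run_query", ...]

// Calling a namespaced tool: mcp-origin forwards the request to the backend
// whose ID prefixes the tool name and returns that backend's result.
const result = await client.callTool({
  name: "knowledge_base_search",
  arguments: { query: "quarterly revenue" },
});
console.log(result);
```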
mcp‑origin’s configuration is intentionally simple: a single JSON file stores all server definitions, making it easy to version‑control or share across environments. The command‑line options expose the server’s listening address, enable optional transports, and allow custom configuration paths. By default it listens on port 8080 and supports stdio, with SSE support slated for future releases.
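The configuration schema itself is not reproduced on this page, so the sketch below only guesses at its general shape. Every field name here is an assumption, expressed as a TypeScript type plus an example object that could be serialized to the JSON file mcp-origin loads.

```typescript
// Hypothetical shape of mcp-origin's JSON configuration file.
// Field names and structure are assumptions, not the documented schema.
interface OriginServerEntry {
  id: string;                  // unique ID, also used as the tool-name prefix
  transport: "stdio" | "http"; // SSE is described on this page as planned
  command?: string;            // stdio backends: executable to spawn
  args?: string[];
  url?: string;                // HTTP backends: endpoint to proxy to
}

interface OriginConfig {
  listenAddress: string;       // the page notes a default of port 8080
  servers: OriginServerEntry[];
}

const exampleConfig: OriginConfig = {
  listenAddress: "0.0.0.0:8080",
  servers: [
    {
      id: "knowledge_base",
      transport: "stdio",
      command: "npx",
      args: ["-y", "@example/kb-mcp-server"],
    },
    { id: "analytics", transport: "http", url: "http://localhost:9000/mcp" },
  ],
};

// Serialize to the JSON file mcp-origin reads (the path is configurable).
console.log(JSON.stringify(exampleConfig, null, 2));
```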
Real‑world scenarios that benefit from mcp‑origin include building a multi‑tool AI assistant that pulls data from a knowledge base server, an analytics engine, and a proprietary service, all without exposing each service’s individual MCP endpoints. It also facilitates rapid prototyping: developers can spin up a new server, register it, and immediately start calling its tools without modifying the assistant’s code. In enterprise settings, mcp‑origin can act as a gateway that enforces authentication or rate limits across all downstream MCP services, providing a single point of policy enforcement.
Overall, mcp‑origin streamlines the integration of multiple MCP backends into a cohesive AI workflow. Its automated discovery, simple configuration, and clear namespacing give developers the flexibility to compose powerful assistants while keeping infrastructure overhead minimal.
Related Servers
MindsDB MCP Server
Unified AI-driven data query across all sources
Homebrew Legacy Server
Legacy Homebrew repository split into core formulae and package manager
Daytona
Secure, elastic sandbox infrastructure for AI code execution
SafeLine WAF Server
Secure your web apps with a self‑hosted reverse‑proxy firewall
mediar-ai/screenpipe
MCP Server: mediar-ai/screenpipe
Skyvern
MCP Server: Skyvern