About
Grpcmcp is a lightweight MCP server that forwards requests to a gRPC backend. It supports protobuf descriptor files or gRPC server reflection, and can expose an SSE endpoint or read MCP messages from STDIN.
Grpcmcp – A Lightweight MCP Proxy for gRPC Backends
Grpcmcp is a minimal, unopinionated Model Context Protocol (MCP) server that bridges an MCP client to any gRPC service. By consuming either a protobuf descriptor set or gRPC server reflection, it exposes the same resources, tools, prompts and sampling endpoints that a standard MCP server would provide. The problem it solves is the need for an easy, drop-in MCP layer when you already have a gRPC service and want to plug it into Claude or other AI assistants that communicate over MCP.
The server listens on a configurable host and port. When SSE mode is enabled, Grpcmcp serves a Server-Sent Events endpoint that forwards MCP messages to the underlying gRPC service. Otherwise, Grpcmcp reads MCP messages from stdin and writes responses to stdout, allowing it to be launched as a child process of an MCP client. In both cases the server transparently translates MCP messages into gRPC calls and streams responses back to the client, making it feel like a native MCP server.
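As a sketch of the stdio translation step, MCP clients speak JSON-RPC 2.0, one message at a time, over the child process's stdin/stdout. The mapping below is illustrative only: it assumes (hypothetically) that the proxy names each tool after the fully-qualified gRPC method, which is not necessarily Grpcmcp's actual scheme.

```python
def mcp_tool_call_to_grpc(request: dict) -> tuple[str, str, dict]:
    """Map an MCP JSON-RPC 'tools/call' request onto a gRPC target.

    Hypothetical convention: the tool name is the fully-qualified
    gRPC method, e.g. 'pkg.Greeter.SayHello'.
    """
    params = request["params"]
    service, _, method = params["name"].rpartition(".")
    return service, method, params.get("arguments", {})

# An MCP 'tools/call' message as it might arrive on stdin:
msg = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "pkg.Greeter.SayHello", "arguments": {"name": "Ada"}},
}
service, method, args = mcp_tool_call_to_grpc(msg)
print(service, method, args)  # pkg.Greeter SayHello {'name': 'Ada'}
```

A real proxy would then issue the gRPC call with those arguments and wrap the response in a JSON-RPC result written back to stdout.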
Key capabilities include:
- Dynamic discovery – When reflection is enabled, Grpcmcp queries the live gRPC server for its services, methods and message types, eliminating the need to maintain static descriptor files.
- Selective exposure – A service-filter option lets you choose which fully-qualified gRPC services are made available, useful for multi-service backends where only a subset should be exposed to the assistant.
- Secure integration – Bearer tokens can be injected via a command-line option or an environment variable, and arbitrary custom headers are supported, enabling authenticated calls to protected gRPC endpoints.
- Flexible transport – The SSE endpoint makes it trivial to embed the MCP server in web applications, while the stdin mode is ideal for containerized deployments or as a sidecar.
Typical use cases include:
- AI‑powered microservices – Wrap an existing gRPC service with MCP so that Claude can invoke it as a tool, enabling natural‑language control of business logic.
- Rapid prototyping – Quickly expose a local gRPC development server to an MCP client without writing additional glue code.
- Secure internal tooling – Use the bearer‑token and header features to connect to authenticated gRPC APIs behind corporate firewalls.
By acting as a thin, configurable proxy, Grpcmcp lets developers leverage the full power of MCP while reusing their existing gRPC infrastructure. Its simplicity, combined with reflection support and secure header handling, makes it a standout choice for integrating gRPC services into AI assistant workflows.