About
A lightweight Model Context Protocol (MCP) server that runs as a single AWS Lambda function and exposes an HTTP POST endpoint via API Gateway. It supports local development with serverless-offline and includes example JSON‑RPC tools.
Capabilities
Serverless MCP Server
The Serverless MCP Server is a lightweight, cloud‑native implementation of the Model Context Protocol (MCP) that runs entirely on AWS Lambda and is exposed through Amazon API Gateway. By packaging MCP into a single stateless function, it removes the need for dedicated infrastructure, allowing developers to focus on building AI‑powered tools rather than managing servers. This server solves the common pain point of integrating MCP into existing cloud workflows—providing a ready‑to‑deploy, cost‑efficient entry point for AI assistants that need to call external services or run custom logic.
At its core, the server listens for JSON‑RPC requests on a single HTTP POST endpoint exposed through API Gateway. It uses the official MCP SDK to parse incoming calls, route them to registered tools, and return responses in the MCP format. The implementation leverages Middy middleware for Lambda to handle HTTP errors, logging, and request validation, ensuring that the server behaves predictably in production. A simple “add” tool is bundled as an example, demonstrating how developers can expose arbitrary JavaScript functions over MCP with minimal boilerplate.
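The request flow can be sketched roughly as follows. This is a simplified illustration rather than the repository's actual handler: the in‑memory tool registry, handler names, and response shape are assumptions, and the real server delegates parsing and MCP response formatting to the official SDK instead of dispatching by hand.

```typescript
import middy from "@middy/core";
import httpJsonBodyParser from "@middy/http-json-body-parser";
import httpErrorHandler from "@middy/http-error-handler";

// Hypothetical tool registry: plain functions keyed by tool name.
const tools: Record<string, (args: any) => unknown> = {
  add: ({ a, b }: { a: number; b: number }) => a + b,
};

// Core handler: dispatch a parsed JSON-RPC request to a registered tool
// and wrap the result in a JSON-RPC response envelope.
const baseHandler = async (event: { body: any }) => {
  const { id, method, params } = event.body ?? {};

  if (method === "tools/call" && tools[params?.name]) {
    const result = tools[params.name](params.arguments ?? {});
    return {
      statusCode: 200,
      body: JSON.stringify({ jsonrpc: "2.0", id, result }),
    };
  }

  return {
    statusCode: 200,
    body: JSON.stringify({
      jsonrpc: "2.0",
      id,
      error: { code: -32601, message: `Method not found: ${method}` },
    }),
  };
};

// Middy layers JSON body parsing and HTTP error handling around the handler.
export const handler = middy(baseHandler)
  .use(httpJsonBodyParser())
  .use(httpErrorHandler());
```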
Key capabilities include:
- Zero‑configuration deployment: A single Lambda function, a minimal configuration file, and the Serverless Framework make it trivial to spin up or tear down an MCP endpoint.
- Local development support: The serverless-offline plugin mirrors the production API Gateway, allowing rapid iteration with local HTTP requests or Postman.
- HTTP vs REST: The repository supports both API Gateway V1 (REST) and V2 (HTTP APIs), giving teams the flexibility to choose the most suitable gateway type for latency or cost considerations.
- Extensibility: Tools are registered through the server's tool‑registration API, enabling developers to add new functionality—whether simple arithmetic, database queries, or calls to third‑party services—without touching the underlying Lambda handler (see the sketch after this list).
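As a rough sketch of that extensibility point, assuming the official MCP TypeScript SDK (`@modelcontextprotocol/sdk`) with zod for parameter validation, the bundled “add” example could be registered in a few lines; the exact registration call used in this repository may differ.

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { z } from "zod";

// Server identity; name and version here are illustrative.
const server = new McpServer({ name: "serverless-mcp", version: "1.0.0" });

// Register an "add" tool: the SDK validates the parameters against the
// zod shape and wraps the return value in an MCP tool-call response.
server.tool(
  "add",
  { a: z.number(), b: z.number() },
  async ({ a, b }) => ({
    content: [{ type: "text", text: String(a + b) }],
  })
);
```

New tools follow the same pattern: a name, a parameter schema, and an async callback, with no changes to the Lambda handler itself.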
Real‑world use cases abound: a customer support bot can query an internal knowledge base by invoking a “search” tool; a data‑science assistant can trigger ETL jobs on AWS Glue through a “runJob” tool; or a finance app can perform real‑time currency conversions by calling an external API wrapped as a tool. Because MCP is designed for composable AI workflows, the Serverless MCP Server becomes a central hub where multiple tools are orchestrated by an LLM or other AI orchestrator.
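For illustration, a client or orchestrator could invoke a deployed tool with a single JSON‑RPC POST. The endpoint URL and path below are placeholders, not values from this repository.

```typescript
// Placeholder for the URL that API Gateway assigns on deploy.
const ENDPOINT = "https://<api-id>.execute-api.<region>.amazonaws.com/mcp";

// Call the example "add" tool via a JSON-RPC "tools/call" request.
async function callAddTool(a: number, b: number) {
  const response = await fetch(ENDPOINT, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      jsonrpc: "2.0",
      id: 1,
      method: "tools/call",
      params: { name: "add", arguments: { a, b } },
    }),
  });
  return response.json();
}

callAddTool(2, 3).then(console.log);
```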
In summary, the Serverless MCP Server delivers a cost‑effective, scalable, and developer‑friendly way to expose custom tools to AI assistants. By abstracting away infrastructure concerns and providing a clean JSON‑RPC interface, it empowers teams to rapidly prototype and deploy AI‑enabled services that can be seamlessly integrated into larger machine learning pipelines.
Related Servers
n8n
Self‑hosted, code‑first workflow automation platform
FastMCP
TypeScript framework for rapid MCP server development
Activepieces
Open-source AI automation platform for building and deploying extensible workflows
MaxKB
Enterprise‑grade AI agent platform with RAG and workflow orchestration
Filestash
Web‑based file manager for any storage backend
MCP for Beginners
Learn Model Context Protocol with hands‑on examples
Explore More Servers
Grafana MCP Server
Real-time metrics integration for Grafana via MCP
Google Cloud MCP Server
MCP server for Google Cloud services in Go
Dameng MCP Server
MCP service for Dameng 8 databases
BrowserStack MCP Server
Run real-device tests with natural language from your IDE
WikiFunctions MCP Server
Bridging AI models to Wikimedia code library
SQL Server Express MCP Server
Seamless Microsoft SQL Server Express integration for MCP