About
The MCP App Demo is a minimal chat application that shows how to run a local Model Context Protocol (MCP) server protected by Pomerium. It lets OpenAI models or other LLMs securely query a demo SQLite database through authenticated MCP endpoints.
Capabilities
The Pomerium Chat MCP server is a lightweight, secure gateway that demonstrates how an AI assistant can interact with local data stores through the Model Context Protocol while remaining protected by enterprise‑grade identity and access management. It solves the problem of exposing internal services—such as a SQLite database or any custom data source—to external LLMs without opening the network to the public internet. By leveraging Pomerium’s OAuth 2.1 authentication, the server can automatically obtain fresh upstream tokens on behalf of the user and pass them along to downstream APIs, ensuring that only authorized users can query sensitive data.
At its core, the server exposes a single resource: a database connection that the LLM can query through an MCP tool. When an assistant receives a user prompt like “What were our sales by year?”, the LLM generates a structured query, sends it to the MCP server, and receives the result in real time. This tight integration eliminates the need for custom adapters or manual data-extraction pipelines, letting developers focus on building conversational experiences rather than plumbing. The server’s Docker Compose configuration bundles a demo SQLite instance, making it trivial to spin up a fully functional test environment.
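The query flow above can be sketched with a minimal tool handler backed by an in-memory SQLite database. This is an illustrative sketch, not the demo app’s actual code: the `sales` table, its columns, and the `query_tool` name are all assumptions standing in for the bundled demo schema.

```python
import sqlite3

# In-memory database standing in for the demo SQLite instance.
# Table and column names are illustrative, not the demo app's real schema.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (year INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [(2022, 1200.0), (2022, 800.0), (2023, 1500.0)],
)

def query_tool(sql: str) -> list[tuple]:
    """Minimal MCP-style 'run a query' handler: accept SQL, return rows.

    Restricting input to SELECT statements keeps the tool read-only,
    which is a sensible default when an LLM is generating the queries.
    """
    if not sql.lstrip().lower().startswith("select"):
        raise ValueError("only SELECT statements are allowed")
    return conn.execute(sql).fetchall()

# The kind of structured query an assistant might generate for
# "What were our sales by year?"
rows = query_tool(
    "SELECT year, SUM(amount) FROM sales GROUP BY year ORDER BY year"
)
print(rows)  # [(2022, 2000.0), (2023, 1500.0)]
```

In the real server this handler would be registered as an MCP tool, with the transport and authentication handled by the MCP SDK and Pomerium respectively.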
Key features include:
- Secure remote access: Pomerium protects the MCP endpoints with token‑based authentication and fine‑grained policy rules, so only users within a specified domain can reach the server.
- OAuth token proxying: The gateway fetches and forwards OAuth tokens from upstream providers (e.g., Google Drive, GitHub) to the MCP server, enabling seamless integration with services that require user‑delegated credentials.
- Zero‑configuration LLM calls: Once the external token is issued, any LLM client (OpenAI, Claude.ai, or a custom agent) can invoke the MCP endpoints directly without additional plumbing.
- Demo database integration: The included SQLite server demonstrates real‑world data querying, while the architecture supports swapping in any database or API by adding a new MCP resource.
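The domain-restricted access described in the first bullet is configured on the Pomerium route. The fragment below is a hedged sketch: the hostnames, upstream address, and `example.com` domain are placeholders, and the exact MCP-related route fields should be checked against the Pomerium documentation for your version.

```yaml
# Illustrative Pomerium route: hostnames and domain are placeholders.
routes:
  - from: https://mcp.example.com
    to: http://mcp-server:8000
    # Only users whose identity provider domain matches may reach the server.
    policy:
      allow:
        or:
          - domain:
              is: example.com
```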
Typical use cases range from internal business analytics, where analysts want an LLM to answer questions against company data, to secure knowledge bases that must remain behind corporate firewalls. Developers can embed the MCP server into existing microservices, expose it through Pomerium to a private LLM deployment, and let agents act as data-aware assistants that respect organizational access controls.
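Once Pomerium has issued a token, a client reaches the MCP endpoint with an ordinary authenticated HTTP request. The sketch below only builds the request; the URL, the `query` tool name, and the token are hypothetical placeholders, and the JSON-RPC shape follows the general MCP style rather than this server’s exact contract.

```python
import json

# Hypothetical values: substitute the route and token from your
# Pomerium deployment.
MCP_URL = "https://mcp.example.com/mcp"
TOKEN = "REPLACE_WITH_POMERIUM_ISSUED_TOKEN"

# A JSON-RPC call invoking an assumed 'query' tool on the MCP server.
payload = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "query", "arguments": {"sql": "SELECT 1"}},
}
headers = {
    "Authorization": f"Bearer {TOKEN}",
    "Content-Type": "application/json",
}
body = json.dumps(payload)
# To send, POST `body` with `headers` to MCP_URL using any HTTP client,
# e.g. urllib.request.Request(MCP_URL, data=body.encode(), headers=headers).
```

Because Pomerium terminates authentication at the edge, the client needs no MCP-specific auth logic beyond attaching the bearer token.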
By combining MCP’s expressive capability model with Pomerium’s robust identity layer, the Pomerium Chat server provides a ready‑made, secure foundation for building agentic applications that need to read and write to protected data sources without compromising security or requiring custom authentication logic.
Related Servers
- Netdata: Real‑time infrastructure monitoring for every metric, every second.
- Awesome MCP Servers: Curated list of production‑ready Model Context Protocol servers.
- JumpServer: Browser‑based, open‑source privileged access management.
- OpenTofu: Infrastructure as Code for secure, efficient cloud management.
- FastAPI-MCP: Expose FastAPI endpoints as MCP tools with built‑in auth.
- Pipedream MCP Server: Event‑driven integration platform for developers.
Explore More Servers
- Azure DevOps MCP Server: Integrate your IDE with Azure DevOps via MCP.
- MCP LLMS Txt: Embed LLM‑text docs directly into your conversation.
- LibreChat MCP Server: AI chat interface built on Next.js.
- VNDB MCP Server: Access Visual Novel data via Claude AI effortlessly.
- AGC Todo MCP Server: Intelligent task manager powered by Claude Desktop.
- Kusto MCP Server: Connect to Azure Data Explorer from any MCP client.