NapthaAI

HTTP + SSE OAuth MCP Server

MCP Server

OAuth‑secured MCP server for Streamable HTTP & SSE


About

A reference implementation of an OAuth‑authorized Model Context Protocol server that supports Streamable HTTP and Server‑Sent Events transports, built on Express.js and Bun.

Capabilities

Resources
Access data sources
Tools
Execute functions
Prompts
Pre-built templates
Sampling
AI model interactions

HTTP + SSE MCP Server with OAuth

The HTTP + SSE MCP Server is a reference implementation that bridges the Model Context Protocol (MCP) with OAuth‑based authentication, delivering both Streamable HTTP and Server‑Sent Events (SSE) transports. It tackles a gap in the MCP ecosystem: while the specification now mandates OAuth support, most SDKs lack end‑to‑end examples that combine dynamic client registration with the streamable HTTP transport. This server gives developers a ready‑made, plug‑and‑play foundation for building secure, scalable MCP services that can be consumed by agents and desktop clients such as Claude or Cursor.

At its core, the project consists of two loosely coupled components. First, an MCP server exposes the standard MCP capabilities (resources, tools, prompts, sampling) and can be swapped out for a custom implementation. Second, an Express.js application orchestrates the OAuth flows (authorization, token issuance, and RFC 8414 server metadata) while also managing the SSE and Streamable HTTP transports. The Express layer does not act as an OAuth authority; instead, it proxies authentication to a third‑party provider that supports Dynamic Client Registration (RFC 7591). This design lets developers point the server at any compliant OAuth provider (Auth0, Okta, etc.) without re‑implementing the protocol stack.
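
A minimal sketch of what this Express layer might look like. The provider URL, route paths, and environment variable name are illustrative assumptions, not the repository's actual code; the point is that the server only advertises and forwards to the upstream provider rather than issuing credentials itself.

```typescript
import express from "express";

// Hypothetical upstream provider that supports Dynamic Client Registration (RFC 7591).
const PROVIDER_BASE = process.env.OAUTH_PROVIDER_URL ?? "https://auth.example.com";

const app = express();
app.use(express.json());

// RFC 8414 metadata: tell clients where to authorize, fetch tokens, and register.
// The Express layer is not an OAuth authority; it only points at the real provider.
app.get("/.well-known/oauth-authorization-server", (_req, res) => {
  res.json({
    issuer: PROVIDER_BASE,
    authorization_endpoint: `${PROVIDER_BASE}/authorize`,
    token_endpoint: `${PROVIDER_BASE}/oauth/token`,
    registration_endpoint: `${PROVIDER_BASE}/register`,
    response_types_supported: ["code"],
    grant_types_supported: ["authorization_code", "refresh_token"],
    code_challenge_methods_supported: ["S256"],
  });
});

// Bearer-token gate in front of the MCP transport routes; actual validation
// (introspection or JWT verification) is delegated to the upstream provider.
app.use("/mcp", (req, res, next) => {
  const token = req.headers.authorization?.replace(/^Bearer /, "");
  if (!token) return res.status(401).json({ error: "missing bearer token" });
  next();
});

app.listen(3000);
```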

Why It Matters

For teams building AI assistants that must access protected data or internal services, the combination of OAuth and MCP ensures that every request carries verifiable credentials while remaining compatible with existing MCP‑based tooling. The Streamable HTTP transport allows large or continuous responses (e.g., streaming tool outputs) to be delivered efficiently, a feature not yet widely supported by mainstream MCP hosts. By providing a concrete example that ties these pieces together, the server reduces the friction of prototyping secure AI workflows and accelerates adoption of MCP in production environments.

Key Features

  • OAuth 2.0 Integration – Implements the required endpoints for authorization, token issuance, and provider metadata while delegating actual credential validation to a trusted OAuth server.
  • Dynamic Client Registration – Supports automatic client registration per MCP spec, enabling agents to onboard without manual configuration.
  • Dual Transport Support – Handles both SSE and Streamable HTTP, giving developers flexibility to choose the transport that best fits their latency or bandwidth needs (a transport‑mounting sketch follows this list).
  • Modular MCP Server – The core MCP logic is isolated, making it straightforward to replace or extend with custom resource handlers or tool definitions.
  • Developer‑Friendly Setup – Built on Bun/Node.js with minimal dependencies, and includes clear configuration guidelines for connecting to an OAuth provider.
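
To make the dual‑transport feature concrete, here is a rough sketch of mounting both transports on one Express app using the official MCP TypeScript SDK. Module paths, option names, and the buildServer helper are assumptions based on the SDK's documented patterns and may differ between SDK versions and from this project's actual code.

```typescript
import express from "express";
import { randomUUID } from "node:crypto";
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StreamableHTTPServerTransport } from "@modelcontextprotocol/sdk/server/streamableHttp.js";
import { SSEServerTransport } from "@modelcontextprotocol/sdk/server/sse.js";

// Build a fresh MCP server per connection; resources, tools, and prompts
// would be registered inside this factory.
function buildServer(): McpServer {
  return new McpServer({ name: "example-server", version: "1.0.0" });
}

const app = express();
app.use(express.json());

// Streamable HTTP: each POST to /mcp carries JSON-RPC messages.
app.post("/mcp", async (req, res) => {
  const transport = new StreamableHTTPServerTransport({
    sessionIdGenerator: () => randomUUID(),
  });
  await buildServer().connect(transport);
  await transport.handleRequest(req, res, req.body);
});

// SSE: GET /sse opens the event stream, POST /messages delivers client messages.
const sessions = new Map<string, SSEServerTransport>();
app.get("/sse", async (_req, res) => {
  const transport = new SSEServerTransport("/messages", res);
  sessions.set(transport.sessionId, transport);
  await buildServer().connect(transport);
});
app.post("/messages", async (req, res) => {
  const transport = sessions.get(String(req.query.sessionId));
  if (!transport) return res.status(400).send("unknown session");
  await transport.handlePostMessage(req, res, req.body);
});

app.listen(3000);
```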

Real‑World Use Cases

  • Enterprise AI Assistants – Securely query internal APIs (HR, finance, inventory) while ensuring each request is authenticated and auditable.
  • Multi‑Tenant SaaS Platforms – Allow customers to spin up agents that automatically register as OAuth clients, simplifying onboarding and billing.
  • Continuous Learning Pipelines – Stream large datasets or model checkpoints to agents in real time via SSE, all under OAuth protection.
  • Hybrid Cloud Deployments – Combine on‑premise MCP services with cloud identity providers, leveraging dynamic registration for seamless scaling.

Integration Workflow

  1. Configure OAuth Provider – Register a client with the chosen provider and enable dynamic registration.
  2. Set Environment Variables – Supply the Express app with client credentials, token URLs, and MCP endpoint details.
  3. Deploy – Run the Express server; it exposes the MCP transport routes (SSE and Streamable HTTP) together with the OAuth endpoints.
  4. Client Connection – An MCP client (e.g., Claude) requests an access token via the OAuth authorization flow, receives a bearer token, and then interacts with the MCP server over SSE or Streamable HTTP (a client‑side sketch follows this list).
  5. Operation – The client can now invoke tools, fetch resources, or submit prompts, all authenticated and streamed as needed.
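
The client side of steps 4 and 5 might look like the following, again assuming the MCP TypeScript SDK; the server URL, environment variable, and client name are placeholders, and the access token is assumed to have been obtained from the OAuth provider beforehand.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

// Bearer token previously issued by the OAuth provider (placeholder env var).
const accessToken = process.env.MCP_ACCESS_TOKEN!;

// Attach the token to every request made over the Streamable HTTP transport.
const transport = new StreamableHTTPClientTransport(
  new URL("https://mcp.example.com/mcp"),
  { requestInit: { headers: { Authorization: `Bearer ${accessToken}` } } },
);

const client = new Client({ name: "example-client", version: "1.0.0" });
await client.connect(transport);

// Once authenticated, the client can enumerate and invoke tools over the stream.
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));
```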

By unifying OAuth with MCP’s modern transport mechanisms in a single, extensible codebase, this server empowers developers to build secure, high‑performance AI assistants that can scale across diverse environments and data sources.