
Portkey MCP Server


Integrate Claude with Portkey for full AI platform control


About

The Portkey MCP Server connects Claude to the Portkey API, enabling comprehensive management of users, workspaces, analytics, and configuration settings for AI deployments.

Capabilities

  • Resources – Access data sources
  • Tools – Execute functions
  • Prompts – Pre‑built templates
  • Sampling – AI model interactions

Portkey Admin MCP Server

The Portkey Admin MCP Server turns a Claude assistant into a full‑featured Portkey platform controller. By exposing the Portkey API through MCP, developers can query usage statistics, manage workspaces and users, and fine‑tune API settings—all from within an AI conversation. This eliminates the need to switch between dashboards and code, allowing rapid iteration on policy changes or troubleshooting directly from the chat interface.

At its core, the server provides a unified set of capabilities that mirror the Portkey web console. Developers can ask questions like “What are my current API usage statistics across different models?” or “Show me the latency statistics for my API calls,” and the assistant will translate those natural‑language prompts into precise API calls, returning structured data that can be further processed or visualized. This tight integration is especially valuable for teams that rely on continuous monitoring and quick decision‑making; the assistant can surface alerts, generate reports, or even trigger rate‑limit adjustments without leaving the conversational context.

Key features include:

  • User & Access Management – list, invite, and role‑assign users at both organization and workspace levels.
  • Analytics & Reporting – retrieve detailed usage metrics, cost breakdowns, and token statistics, with filtering by status code or time window.
  • Workspace Management – view all workspaces, inspect configurations, and manage virtual keys (API keys with usage limits).
  • Configuration & API Settings – explore cache policies, retry strategies, and routing rules that govern request handling.

These capabilities enable a range of real‑world scenarios: a product manager can ask for a month‑long cost analysis, an ops engineer can set up custom headers or rate limits on the fly, and a data scientist can generate analytics reports to share with stakeholders—all without leaving Claude.

Integration into AI workflows is seamless: the MCP server registers its resources, tools, and prompts with Claude’s client configuration, so the assistant automatically offers relevant actions as suggestions. Because every request is authenticated with the API key held by the server, sensitive credentials remain protected while still granting full administrative power to the AI. The result is a powerful, developer‑centric interface that turns routine platform tasks into conversational commands, dramatically speeding up iteration cycles and reducing context switching.
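
One way to picture that credential handling: the key lives in the server process’s environment (for example a PORTKEY_API_KEY variable supplied by the client configuration that launches the server) and is injected into every outbound request by a helper like the sketch below, so it never appears in the conversation itself. The variable and helper names here are illustrative.

```typescript
// Sketch of server-side credential handling: the admin key stays in the
// server process and is attached to every Portkey request; Claude only
// ever sees the tool results, never the key itself.
const PORTKEY_API_KEY = process.env.PORTKEY_API_KEY ?? "";

async function portkeyFetch(
  path: string,
  options: { method?: string; body?: string; headers?: Record<string, string> } = {},
): Promise<unknown> {
  const res = await fetch(`https://api.portkey.ai/v1${path}`, {
    method: options.method ?? "GET",
    body: options.body,
    headers: {
      "x-portkey-api-key": PORTKEY_API_KEY,
      "content-type": "application/json",
      ...options.headers,
    },
  });
  if (!res.ok) throw new Error(`Portkey API error: ${res.status} ${res.statusText}`);
  return res.json();
}
```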