About
Whistle MCP Server lets AI assistants manage, configure, and debug local Whistle proxy servers via the Model Context Protocol, enabling rule, group, and value management, plus request interception and replay, through natural language.
Capabilities
Whistle MCP Server – Overview
Whistle MCP Server bridges the gap between AI assistants and local network debugging by exposing a rich set of Whistle proxy operations through the Model Context Protocol. Rather than opening a browser-based interface or manually editing configuration files, developers can issue natural‑language commands to an AI assistant and have it create rules, manage groups, adjust proxy settings, or replay traffic—all in real time. This eliminates the friction of context switching and lets teams iterate faster on API design, security testing, or performance tuning.
The server translates MCP tool calls into Whistle commands. For example, a simple request can add a new rewrite or block rule and toggle its activation without touching the UI. Group operations let you bundle rules logically, mirroring how developers organize environments or feature flags. Value management adds dynamic variables that can be referenced across rules, enabling parameterized configurations. Proxy control covers the full spectrum of Whistle features—enabling/disabling HTTP/HTTPS interception, toggling HTTP/2 support, or switching the entire proxy on or off—so an AI can reconfigure a debugging session with a single prompt.
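To make the translation concrete, here is a sketch of what an MCP `tools/call` request for creating a rule could look like. The tool name (`createRule`) and argument names are illustrative assumptions, not the server's actual schema; the rule content uses Whistle's `pattern operation` syntax, assuming the `statusCode` protocol is available:

```python
import json

# Hypothetical MCP "tools/call" request asking the server to create a
# Whistle rule. Tool and argument names here are illustrative only.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "createRule",  # hypothetical tool name
        "arguments": {
            "name": "block-analytics",
            # Whistle rule text: a URL pattern followed by an operation
            "content": "www.example.com/analytics statusCode://403",
        },
    },
}

print(json.dumps(request, indent=2))
```

An assistant would generate a payload like this from a prompt such as "block the analytics endpoint," and the server would forward the rule text to the local Whistle instance.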
Key capabilities include:
- Rule lifecycle management: create, update, rename, delete, enable/disable, and bulk disable rules.
- Group orchestration: create, rename, delete groups; move rules in and out of groups.
- Value handling: add, update, rename, and delete values, with optional groups for organizing related variables.
- Proxy toggles: start/stop the proxy, switch interception modes, and control protocol support.
- Request insight: list intercepted requests with URL filtering; replay captured traffic with custom parameters.
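The request-insight capability can be sketched in plain Python. The data shapes below are assumptions for illustration (the real server defines its own tool schemas): captured requests are filtered by a URL substring, and a replay is built by merging custom parameters into a captured entry.

```python
# Illustrative captured-request records; field names are assumptions.
captured = [
    {"id": 1, "url": "https://api.example.com/users", "method": "GET"},
    {"id": 2, "url": "https://cdn.example.com/app.js", "method": "GET"},
    {"id": 3, "url": "https://api.example.com/orders", "method": "POST"},
]

def filter_requests(requests, url_filter):
    """Return only the captured requests whose URL contains the filter."""
    return [r for r in requests if url_filter in r["url"]]

def build_replay(request, headers=None):
    """Merge custom headers into a captured request to prepare a replay."""
    return {**request, "headers": dict(headers or {})}

api_calls = filter_requests(captured, "api.example.com")
replay = build_replay(api_calls[1], headers={"X-Debug": "1"})
```

This mirrors the flow an assistant would drive through the MCP tools: narrow the capture list with a URL filter, then replay a chosen request with altered parameters.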
Real‑world scenarios benefit from this tight integration. QA teams can ask an AI to "enable the error‑mock rule for all POST endpoints" while a developer simultaneously tests downstream services. Security auditors might request "replay the last 50 requests with altered headers" to validate input sanitization. API designers can prototype new endpoints by asking the AI to "create a rule that redirects traffic to the staging server" and immediately see the effect in their browser, all without leaving the AI chat.
By exposing these operations over MCP, Whistle Server fits seamlessly into any AI‑driven workflow—whether on Claude Desktop, Raycast, or Cursor. Its declarative toolset empowers assistants to act as first‑class network engineers, reducing manual effort and accelerating the feedback loop in modern web development pipelines.