About
A lightweight Model Context Protocol server that stores and retrieves perfrunner test and cluster configurations using Couchbase with full-text search. It enables quick access to static test data for performance teams.
Capabilities
Perfrunner MCP Server Overview
The Perfrunner MCP Server bridges the gap between performance testing teams and AI assistants by exposing a rich, queryable interface to static configuration data. Performance engineers routinely generate large collections of test and cluster configuration files that describe workload scenarios, cluster topologies, and resource allocations. These artifacts are often stored in ad‑hoc file systems or spreadsheets, making it difficult for automated tooling to discover and reuse them. By loading these configurations into a Couchbase cluster and exposing them via MCP, the server turns raw files into first‑class resources that can be queried, filtered, and combined directly from an AI assistant.
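A minimal sketch of the loading step described above, assuming the configuration files are INI-style (the section and field names below are illustrative, not the actual perfrunner schema):

```python
import configparser
import json

# Hypothetical INI-style test configuration; real files would be read from disk.
SAMPLE_TEST = """
[test_case]
title = KV throughput, 4 nodes

[cluster]
mem_quota = 20480
initial_nodes = 4
"""

def normalize_config(raw: str, doc_id: str) -> dict:
    """Parse an INI-style config and flatten it into a JSON-ready document."""
    parser = configparser.ConfigParser()
    parser.read_string(raw)
    doc = {"id": doc_id, "type": "test_config"}
    for section in parser.sections():
        doc[section] = {
            # Coerce numeric-looking values so they index as numbers.
            key: int(value) if value.isdigit() else value
            for key, value in parser[section].items()
        }
    return doc

doc = normalize_config(SAMPLE_TEST, "kv_throughput_4node")
print(json.dumps(doc, indent=2))
```

In a real loader, each resulting document would then be written to Couchbase, for example via the Python SDK's `collection.upsert(doc_id, doc)`.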
At its core, the server provides a basic MCP implementation that supports the standard resource and tool endpoints. The integration with Couchbase gives it a robust, scalable backing store that can handle thousands of configuration documents while offering full‑text search (FTS) capabilities. This means an AI assistant can perform natural‑language queries such as “Show me all tests that target a 4‑node cluster with SSD storage” and receive precise results without any custom code. The FTS layer also enables fuzzy matching, which is invaluable when dealing with legacy naming conventions or incomplete metadata.
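The natural-language query above would be translated into a structured search against the FTS index. A sketch of the JSON query body that Couchbase FTS accepts, combining a text match with a numeric range; the field names (`storage`, `cluster.initial_nodes`) are assumptions, not the server's actual schema:

```python
import json

def build_fts_query(storage: str, num_nodes: int, limit: int = 10) -> dict:
    """Build a Couchbase FTS query body: a conjunction of a text match
    on the storage type and a numeric range pinned to an exact node count."""
    return {
        "query": {
            "conjuncts": [
                {"field": "storage", "match": storage},
                {
                    "field": "cluster.initial_nodes",
                    "min": num_nodes,
                    "max": num_nodes,
                    "inclusive_min": True,
                    "inclusive_max": True,
                },
            ]
        },
        "size": limit,
    }

# "Show me all tests that target a 4-node cluster with SSD storage"
body = build_fts_query("ssd", 4)
print(json.dumps(body, indent=2))
```

This body could be POSTed to the FTS service's query endpoint or expressed through the Couchbase SDK's search query classes; adding `"fuzziness"` to a match clause enables the fuzzy matching mentioned above.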
Developers benefit from the server’s modular architecture. A configuration loader script parses the test and cluster configuration files, normalizes them into JSON documents, and writes them to Couchbase. Once the data is in place, any MCP‑compatible client—Claude, LangChain, or a custom workflow—can discover the available resources through standard MCP discovery calls. Tools can then be created to trigger test runs, fetch results, or even modify configuration parameters on the fly. Because the server follows MCP best practices, it can be easily chained with other services such as continuous integration pipelines or monitoring dashboards.
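MCP discovery is a JSON-RPC 2.0 exchange; a client lists resources with a `resources/list` call. A sketch of that exchange, where the `perfrunner://` URI scheme and resource names are illustrative assumptions rather than the server's actual identifiers:

```python
import json

# JSON-RPC 2.0 request an MCP client sends to enumerate available resources.
request = {"jsonrpc": "2.0", "id": 1, "method": "resources/list"}

# A response the server might return for one stored configuration document
# (the URI scheme and name are hypothetical).
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "resources": [
            {
                "uri": "perfrunner://configs/kv_throughput_4node",
                "name": "KV throughput, 4 nodes",
                "mimeType": "application/json",
            }
        ]
    },
}

# Responses are matched to requests by the JSON-RPC id.
assert response["id"] == request["id"]
print(json.dumps(response["result"], indent=2))
```

The client would follow up with a `resources/read` call on a chosen URI to fetch the configuration document itself.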
Typical use cases include:
- Automated test selection: An AI assistant recommends the most relevant performance tests based on current cluster health or recent workloads.
- Dynamic test generation: Engineers can ask the assistant to produce a new file that balances load across nodes, and the server will store it for future runs.
- Historical analysis: By querying past test configurations and results, the assistant can identify regressions or confirm performance improvements.
- Documentation assistance: The server’s searchable index allows the assistant to pull configuration snippets into documentation or knowledge bases automatically.
What sets Perfrunner apart is its focus on static configuration data—a niche often overlooked in AI tooling. By providing a dedicated MCP surface for performance testing artifacts, it enables developers to treat configuration files as first‑class citizens in their AI workflows. This leads to faster onboarding, more consistent test execution, and a tighter feedback loop between performance teams and the rest of the organization.
Related Servers
n8n
Self‑hosted, code‑first workflow automation platform
FastMCP
TypeScript framework for rapid MCP server development
Activepieces
Open-source AI automation platform for building and deploying extensible workflows
MaxKB
Enterprise‑grade AI agent platform with RAG and workflow orchestration
Filestash
Web‑based file manager for any storage backend
MCP for Beginners
Learn Model Context Protocol with hands‑on examples