
Enemyrr MCP Server Pagespeed

MCP Server

Analyze webpage performance via Google PageSpeed Insights


About

A Model Context Protocol server that provides real‑time analysis of web page performance using the Google PageSpeed Insights API. It returns comprehensive metrics, loading experience data, and prioritized improvement suggestions for AI models.

Capabilities

  • Resources: access data sources
  • Tools: execute functions
  • Prompts: pre-built templates
  • Sampling: AI model interactions

The Pagespeed MCP server

The @enemyrr/mcp-server-pagespeed is a Model Context Protocol (MCP) server that exposes Google PageSpeed Insights as a callable tool for AI assistants. By turning the PageSpeed API into an MCP command, developers can embed real‑time web performance analysis directly into conversational or code‑generation workflows. This removes the need to manually query the API, parse JSON responses, and surface actionable insights—tasks that are repetitive and error‑prone when done outside an AI context.
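As a minimal sketch of that workflow, the snippet below connects an MCP client to the server over stdio and asks it which tools it exposes. It assumes the standard MCP TypeScript SDK and that the package can be launched with npx; neither detail is confirmed by the listing above.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Launch the PageSpeed MCP server as a child process over stdio.
// The npx invocation is an assumption based on the package name.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "@enemyrr/mcp-server-pagespeed"],
});

const client = new Client(
  { name: "pagespeed-demo", version: "1.0.0" },
  { capabilities: {} }
);

await client.connect(transport);

// Discover what the server exposes; tool names and input schemas
// come back from the server itself, so nothing needs hard-coding.
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));
```

Because discovery happens at runtime, an AI client only needs this one connection step before it can invoke whatever analysis tool the server advertises.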

When a model invokes the tool, it receives a concise yet comprehensive report: an overall performance score (0–100), key loading experience metrics such as First Contentful Paint and First Input Delay, and the top five improvement suggestions. Each suggestion is accompanied by a title, description, potential impact score, and the current metric value. This structure aligns with how developers think about performance tuning—scores for quick health checks, metrics for diagnostics, and ranked suggestions that guide the next steps. The server’s error handling is robust: it validates URLs, reports API failures, and flags malformed tool calls, ensuring that the AI client can gracefully handle edge cases without breaking the conversation.
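A report of that shape could be modeled in TypeScript roughly as follows; the field names are illustrative assumptions for the sketch, not the server's actual schema.

```typescript
// Illustrative shape of a PageSpeed analysis result as described above.
// Field names are assumptions, not the server's documented schema.
interface PageSpeedReport {
  /** Overall performance score, 0-100. */
  performanceScore: number;
  /** Real-user loading experience metrics. */
  loadingExperience: {
    firstContentfulPaintMs: number;
    firstInputDelayMs: number;
  };
  /** Top five improvement suggestions, ranked by potential impact. */
  suggestions: Array<{
    title: string;
    description: string;
    impactScore: number;
    currentValue: string;
  }>;
}
```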

For developers building AI‑powered web optimization tools, this MCP server offers several practical advantages. First, it eliminates boilerplate code for authentication and request orchestration; the server internally manages API keys and rate limits. Second, it delivers consistent, machine‑readable outputs that can be directly fed into other MCP tools or custom logic—such as automatically generating Lighthouse reports, inserting performance data into issue trackers, or triggering CI pipelines when thresholds are breached. Third, because the tool is type‑safe (thanks to TypeScript support), developers can rely on compile‑time checks when integrating the MCP client, reducing runtime errors.
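Continuing the earlier sketch (and reusing `client` and the illustrative `PageSpeedReport` type), the snippet below shows one way the machine-readable output could feed custom logic. The tool name `analyze_pagespeed` and the response layout are assumptions made for illustration.

```typescript
// Hypothetical downstream check: call the tool and act on the score.
// Tool name and response layout are assumptions for this sketch.
const result = await client.callTool({
  name: "analyze_pagespeed",
  arguments: { url: "https://example.com" },
});

// MCP tool results arrive as content blocks; assume the first is JSON text.
const first = (result.content as Array<{ type: string; text?: string }>)[0];
const report: PageSpeedReport = JSON.parse(first.text ?? "{}");

if (report.performanceScore < 80) {
  console.warn(
    `Performance score ${report.performanceScore} is below target; ` +
      `top suggestion: ${report.suggestions[0]?.title}`
  );
}
```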

Typical use cases include:

  • Chatbot assistants that answer questions about a site’s performance and suggest fixes.
  • CI/CD pipelines where an AI model reviews the latest build’s PageSpeed score before merging (see the sketch after this list).
  • Web analytics dashboards that surface real‑time performance metrics alongside other KPIs.
  • Developer education tools that explain performance concepts through concrete examples pulled from live sites.
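For the CI/CD case in particular, a minimal gate might look like the sketch below, again reusing `client` and `PageSpeedReport` from the earlier snippets. The threshold, environment variable name, and tool name are all assumptions.

```typescript
// Hypothetical CI gate: fail the pipeline step if the score regresses.
// PREVIEW_URL, MIN_SCORE, and the tool name are assumptions for the sketch.
const ciResult = await client.callTool({
  name: "analyze_pagespeed",
  arguments: { url: process.env.PREVIEW_URL ?? "https://staging.example.com" },
});

const ciBlock = (ciResult.content as Array<{ text?: string }>)[0];
const ciReport: PageSpeedReport = JSON.parse(ciBlock.text ?? "{}");

const MIN_SCORE = 90; // assumed team threshold
if (ciReport.performanceScore < MIN_SCORE) {
  console.error(
    `PageSpeed score ${ciReport.performanceScore} is below ${MIN_SCORE}; blocking merge.`
  );
  process.exit(1); // non-zero exit fails the pipeline step
}
```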

By providing a standardized interface, the MCP server enables seamless integration into any AI workflow that supports Model Context Protocol. Whether you’re building a productivity bot, an automated code review system, or a performance monitoring service, this server turns the complex PageSpeed Insights API into a simple, repeatable command that AI models can leverage to deliver actionable value.