About
A Model Context Protocol server that provides real‑time analysis of web page performance using the Google PageSpeed Insights API. It returns comprehensive metrics, loading experience data, and prioritized improvement suggestions for AI models.
Capabilities
@enemyrr/mcp-server-pagespeed is a Model Context Protocol (MCP) server that exposes the Google PageSpeed Insights API as a callable tool for AI assistants. By turning PageSpeed analysis into an MCP tool, developers can embed real‑time web performance analysis directly into conversational or code‑generation workflows. This removes the need to manually query the API, parse JSON responses, and surface actionable insights, tasks that are repetitive and error‑prone when done outside an AI context.
When a model invokes the tool, it receives a concise yet comprehensive report: an overall performance score (0–100), key loading experience metrics such as First Contentful Paint and First Input Delay, and the top five improvement suggestions. Each suggestion is accompanied by a title, description, potential impact score, and the current metric value. This structure aligns with how developers think about performance tuning—scores for quick health checks, metrics for diagnostics, and ranked suggestions that guide the next steps. The server’s error handling is robust: it validates URLs, reports API failures, and flags malformed tool calls, ensuring that the AI client can gracefully handle edge cases without breaking the conversation.
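In TypeScript terms, a client might model that report roughly as follows. This is a sketch only: the field names are assumptions inferred from the description above, not the server's published schema, and the `summarize` helper is illustrative.

```typescript
// Hypothetical shape of the tool's report; property names are assumptions
// based on the description above, not the server's actual output schema.
interface Suggestion {
  title: string;
  description: string;
  impact: number;       // potential impact score
  currentValue: string; // current metric value, e.g. "2.4 s"
}

interface PageSpeedReport {
  performanceScore: number; // overall score, 0–100
  metrics: {
    firstContentfulPaint: string;
    firstInputDelay: string;
  };
  suggestions: Suggestion[]; // top five, each with title/description/impact
}

// Quick health check in the spirit described above: score for triage,
// highest-impact suggestion as the next step.
function summarize(report: PageSpeedReport, threshold = 90): string {
  const status = report.performanceScore >= threshold ? "healthy" : "needs work";
  const top = [...report.suggestions].sort((a, b) => b.impact - a.impact)[0];
  const hint = top ? ` Top fix: ${top.title}` : "";
  return `Score ${report.performanceScore}/100 (${status}).${hint}`;
}
```

Because the report is plain structured data, a client can route it anywhere: chat replies, dashboards, or automated checks.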
For developers building AI‑powered web optimization tools, this MCP server offers several practical advantages. First, it eliminates boilerplate code for authentication and request orchestration; the server internally manages API keys and rate limits. Second, it delivers consistent, machine‑readable outputs that can be directly fed into other MCP tools or custom logic—such as automatically generating Lighthouse reports, inserting performance data into issue trackers, or triggering CI pipelines when thresholds are breached. Third, because the tool is type‑safe (thanks to TypeScript support), developers can rely on compile‑time checks when integrating the MCP client, reducing runtime errors.
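As an illustration of that low-friction setup, an MCP client such as Claude Desktop would typically register the server with a short config entry. The package name comes from this page; the `npx` command and the `pagespeed` key are assumptions that depend on your client and environment:

```json
{
  "mcpServers": {
    "pagespeed": {
      "command": "npx",
      "args": ["-y", "@enemyrr/mcp-server-pagespeed"]
    }
  }
}
```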
Typical use cases include:
- Chatbot assistants that answer questions about a site’s performance and suggest fixes.
- CI/CD pipelines where an AI model reviews the latest build’s PageSpeed score before merging.
- Web analytics dashboards that surface real‑time performance metrics alongside other KPIs.
- Developer education tools that explain performance concepts through concrete examples pulled from live sites.
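The CI/CD use case above reduces to a simple gate over the score the tool reports. A minimal sketch, assuming the score has already been extracted from the tool's response (the function name and default threshold are illustrative, not part of the server):

```typescript
// Illustrative CI gate: decide whether a build may merge based on the
// PageSpeed performance score. Threshold and naming are assumptions.
function gateBuild(
  score: number,
  threshold = 85
): { pass: boolean; message: string } {
  const pass = score >= threshold;
  const message = pass
    ? `PageSpeed score ${score} meets the ${threshold} threshold.`
    : `PageSpeed score ${score} is below the ${threshold} threshold; blocking merge.`;
  return { pass, message };
}
```

In practice the `message` would be posted to the pull request and `pass` would set the pipeline's exit status.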
By providing a standardized interface, the MCP server enables seamless integration into any AI workflow that supports Model Context Protocol. Whether you’re building a productivity bot, an automated code review system, or a performance monitoring service, this server turns the complex PageSpeed Insights API into a simple, repeatable command that AI models can leverage to deliver actionable value.