syucream

Lightdash MCP Server

MCP Server

MCP‑compatible bridge to Lightdash data

Active (70) · 17 stars · 3 views · Updated Jun 9, 2025

About

A lightweight Model Context Protocol server that exposes Lightdash’s API, enabling AI assistants to query projects, spaces, charts, dashboards and metrics through a standardized interface.

Capabilities

Resources: Access data sources
Tools: Execute functions
Prompts: Pre-built templates
Sampling: AI model interactions

Lightdash MCP Server

The Lightdash MCP server bridges the gap between AI assistants and a Lightdash analytics environment, allowing developers to query and retrieve business intelligence data directly from their conversational agents. By exposing Lightdash’s REST API through the Model Context Protocol, the server gives Claude, or any MCP‑compatible client, a consistent, tool‑based interface to explore dashboards, charts, and metrics without leaving the chat. This eliminates the need for manual API calls or custom integration code, making data insights instantly accessible to non‑technical users and automated workflows.

At its core, the server offers a rich set of tools that map to common Lightdash operations. Developers can list all projects, spaces, charts, or dashboards within an organization; fetch detailed metadata for a specific project; and retrieve custom metrics or catalog information. For teams that deploy data as code, the server also supports pulling charts and dashboards in a programmable format. These capabilities are designed to mirror the most frequently used Lightdash actions, enabling AI assistants to ask questions like “Show me all dashboards in Project X” or “What are the custom metrics for this space?” and receive structured, actionable responses.
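To make the tool-to-API mapping concrete, here is a minimal sketch of how an MCP tool call might translate into a Lightdash REST endpoint. The tool names, argument shapes, and exact paths below are illustrative assumptions, not the server's actual identifiers; consult the project's README for the real tool list.

```typescript
// Illustrative sketch: mapping hypothetical MCP tool calls to Lightdash-style
// REST paths. Tool names and paths are assumptions for demonstration only.

type ToolCall = { name: string; args: Record<string, string> };

function toLightdashPath(call: ToolCall): string {
  switch (call.name) {
    case "list_projects":
      // Organization-wide project listing
      return "/api/v1/org/projects";
    case "get_project":
      // Metadata for a single project
      return `/api/v1/projects/${call.args.projectUuid}`;
    case "list_spaces":
      // Spaces within a project
      return `/api/v1/projects/${call.args.projectUuid}/spaces`;
    default:
      throw new Error(`Unknown tool: ${call.name}`);
  }
}

console.log(
  toLightdashPath({ name: "list_spaces", args: { projectUuid: "abc-123" } })
);
// → /api/v1/projects/abc-123/spaces
```

The real server handles authentication, pagination, and response shaping on top of this kind of routing, so the AI client only ever sees structured tool results.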

The server’s dual transport modes—standard Stdio for local development and Streamable HTTP for networked or cloud deployments—provide flexibility in how it is consumed. In a desktop setting, an AI assistant can launch the server as a subprocess and communicate via stdin/stdout. In larger environments, the HTTP mode allows multiple clients or services to connect concurrently, making it ideal for integration into CI/CD pipelines, automated reporting tools, or web‑based dashboards that embed conversational AI. The ability to run the same MCP instance over both transports ensures consistent behavior regardless of deployment topology.
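For the local stdio case, a desktop MCP client is typically pointed at the server via a JSON configuration entry. The sketch below assumes the server is published as an npm package named `lightdash-mcp-server` and reads its credentials from environment variables; the actual package name and variable names may differ, so check the project's README.

```json
{
  "mcpServers": {
    "lightdash": {
      "command": "npx",
      "args": ["-y", "lightdash-mcp-server"],
      "env": {
        "LIGHTDASH_API_KEY": "<your-personal-access-token>",
        "LIGHTDASH_API_URL": "https://app.lightdash.cloud"
      }
    }
  }
}
```

With this entry in place, the client launches the server as a subprocess and exchanges MCP messages over stdin/stdout; the HTTP mode instead runs the server as a long-lived process that clients connect to over the network.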

Real‑world use cases abound. Business analysts can query Lightdash directly from a chat interface to get real‑time metrics, reducing the friction of switching between tools. Data engineers can automate the generation of chart code for version control, while product managers might use an AI assistant to pull up dashboards on demand during meetings. Because the server communicates through a standard protocol, it can be combined with other MCP servers—such as SQL or data lake connectors—to create end‑to‑end workflows that span data extraction, transformation, and visualization, all orchestrated by natural language.

Unique to this implementation is its tight integration with Lightdash’s “as code” features, enabling AI assistants not only to read but also to export visualizations in a reproducible format. This empowers developers to treat dashboards as first‑class artifacts in code repositories, facilitating collaboration and auditability. The combination of comprehensive tooling, flexible transport options, and native support for code‑centric analytics makes the Lightdash MCP server a powerful asset for any team looking to embed data intelligence into conversational AI workflows.
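One practical detail when committing exported visualizations to a repository is deterministic serialization, so that diffs reflect real changes rather than key-ordering noise. The payload shape below is a hypothetical simplification of a chart-as-code export, not Lightdash's actual format:

```typescript
// Sketch: persisting a chart-as-code payload for version control.
// The ChartAsCode shape is an assumption for illustration; Lightdash's
// actual "as code" export format may differ.

interface ChartAsCode {
  name: string;
  slug: string;
  metricQuery: Record<string, unknown>;
}

// Recursively sort object keys so serialization is deterministic.
function sortKeys(value: unknown): unknown {
  if (Array.isArray(value)) return value.map(sortKeys);
  if (value !== null && typeof value === "object") {
    const obj = value as Record<string, unknown>;
    return Object.fromEntries(
      Object.keys(obj)
        .sort()
        .map((k) => [k, sortKeys(obj[k])])
    );
  }
  return value;
}

// Stable, pretty-printed JSON with a trailing newline keeps git diffs clean.
function serializeChart(chart: ChartAsCode): string {
  return JSON.stringify(sortKeys(chart), null, 2) + "\n";
}
```

Writing the result of `serializeChart` to a file per chart gives reviewers readable diffs and makes dashboard changes auditable through ordinary pull requests.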