About
An MCP server that lets large language models store data tables and generate Vega‑Lite visualizations, returning either a full specification or a PNG image.
Capabilities

The Vega‑Lite MCP server turns a conversational AI into an interactive data‑visualization assistant. Rather than requiring developers to hand‑craft Vega‑Lite JSON or embed plotting libraries in their applications, the server exposes a minimal set of tools that let an LLM store raw tabular data and later generate fully‑rendered charts. This bridges the gap between natural language analysis and visual exploration, enabling AI agents to present insights in a format that is both machine‑readable and human‑digestible.
At its core, the server offers two tools. The first accepts a table name and an array of row objects, persisting the table for later use. This step decouples data ingestion from visualization, allowing the LLM to first curate and summarize information before committing it. The second takes a previously stored table name and a Vega‑Lite specification string. Depending on the requested output type, it returns either the enriched Vega‑Lite JSON (useful for further programmatic manipulation) or a base64‑encoded PNG image. The ability to switch between text and image outputs gives developers flexibility: they can embed charts directly in reports or continue processing the specification within another toolchain.
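The two-tool pattern can be sketched with a minimal in-memory mock. The tool names (`save_data`, `visualize_data`) and their exact signatures here are illustrative assumptions, not confirmed by this page; the sketch only demonstrates the decoupling of ingestion from rendering and the injection of stored data into a spec:

```python
# In-memory sketch of the server's two-tool pattern.
# Tool names and signatures are assumptions for illustration only.
import json

TABLES = {}  # table name -> list of row dicts

def save_data(name, data):
    """First tool: persist a table of row objects under a name."""
    TABLES[name] = data
    return f"Saved table '{name}' ({len(data)} rows)"

def visualize_data(data_name, vegalite_specification, output_type="text"):
    """Second tool: merge a stored table into a Vega-Lite spec.

    With output_type="text", return the enriched spec as JSON;
    a real server would also offer a PNG rendering path.
    """
    spec = json.loads(vegalite_specification)
    spec["data"] = {"values": TABLES[data_name]}  # inline the stored rows
    return json.dumps(spec)

# Ingest first, visualize later -- the two steps are independent.
rows = [{"month": "Jan", "sales": 120}, {"month": "Feb", "sales": 140}]
save_data("sales", rows)
spec = ('{"mark": "line", "encoding": {'
        '"x": {"field": "month"}, "y": {"field": "sales"}}}')
enriched = visualize_data("sales", spec)
print(json.loads(enriched)["data"]["values"][0]["month"])  # "Jan"
```

The key design point the sketch captures is that the specification supplied by the LLM never needs to embed the raw data; the server enriches it with the previously stored table at render time.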
The server’s design is intentionally lightweight, making it easy to integrate into existing MCP‑enabled workflows. Developers can add the server to their Claude Desktop configuration with a single JSON entry, specifying whether they want PNGs or text. Once registered, any AI assistant that understands MCP can call the data‑saving tool to stash a dataset and later invoke the visualization tool to produce charts on demand. This pattern is ideal for use cases such as automated business dashboards, data‑driven chatbots that answer queries with visual evidence, or research assistants that generate exploratory plots from statistical outputs.
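A registration entry might look like the following. The outer `mcpServers` shape is the standard Claude Desktop configuration format; the server key, launch command, package name, and output‑type flag below are illustrative assumptions, not taken from this page:

```json
{
  "mcpServers": {
    "vegalite": {
      "command": "uv",
      "args": ["run", "mcp-server-vegalite", "--output-type", "png"]
    }
  }
}
```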
Unique advantages stem from the server’s strict separation of concerns and its adherence to Vega‑Lite—a declarative, JSON‑based visualization grammar that is both expressive and lightweight. Because the LLM supplies only a Vega‑Lite spec, developers can rely on the server’s rendering engine to handle all intricacies of scaling, encoding, and layout. The PNG output eliminates the need for client‑side rendering libraries, simplifying deployment in environments where graphical toolkits are unavailable or undesirable.
In practice, a data analyst might ask an AI assistant to “show me the monthly sales trend for product X.” The LLM would first query a database, use the data‑saving tool to store the result, and then craft a Vega‑Lite specification for a line chart. The assistant would call the visualization tool, receive a PNG, and embed it directly in an email or chat. This seamless pipeline—from natural language to visual artifact—illustrates how the Vega‑Lite MCP server empowers developers to build richer, more interactive AI experiences without wrestling with visualization code.
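For the scenario above, the line‑chart specification the LLM crafts could look like the fragment below. The field names are assumptions for illustration, and the `data` block is deliberately absent because the server injects the stored table at render time:

```json
{
  "mark": "line",
  "encoding": {
    "x": {"field": "month", "type": "ordinal"},
    "y": {"field": "sales", "type": "quantitative"}
  }
}
```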
Related Servers
Data Exploration MCP Server
Turn CSVs into insights with AI-driven exploration
BloodHound-MCP
AI‑powered natural language queries for Active Directory analysis
Google Ads MCP
Chat with Claude to analyze and optimize Google Ads campaigns
Bazi MCP
AI‑powered Bazi calculator for accurate destiny insights
Smart Tree
Fast AI-friendly directory visualization with spicy terminal UI
Google Search Console MCP Server for SEOs
Chat‑powered SEO insights from Google Search Console
Explore More Servers
BoldSign MCP Server
Enable LLMs to manage e‑signatures via BoldSign API
Memgraph MCP Server
Connect Memgraph to LLMs via Model Context Protocol
TripGo MCP Server
Remote API wrapper for public transport data
MCP Learning Project
Simple arithmetic MCP server with SSE and stdio support
Azure DevOps MCP Server
FastAPI-based MCP server for Azure DevOps integration
Whisper King MCP Server
A lightweight MCP server for whispering data