
ThoughtSpot MCP Server

Secure OAuth-powered analytics with ThoughtSpot via Cloudflare


About

The ThoughtSpot MCP Server provides OAuth-based authentication and a set of tools for querying and retrieving data from your ThoughtSpot instance. Hosted on Cloudflare, it enables LLMs to seamlessly access and analyze ThoughtSpot data.

Capabilities

  • Resources: Access data sources
  • Tools: Execute functions
  • Prompts: Pre-built templates
  • Sampling: AI model interactions

ThoughtSpot MCP Server in Action

The ThoughtSpot MCP Server is a Cloudflare‑hosted gateway that bridges large language models (LLMs) with ThoughtSpot’s analytics platform. By exposing ThoughtSpot resources, tools, and prompt templates through the Model Context Protocol (MCP), it enables developers to turn conversational AI into a data‑driven assistant that can authenticate, query, and return actionable insights directly from their ThoughtSpot instance. This removes the need for custom API wrappers or manual OAuth flows, streamlining integration across a variety of AI clients such as Claude, OpenAI’s Deep Research, and Gemini.

At its core, the server provides secure OAuth‑based authentication. Clients supply a ThoughtSpot access token and host identifier in the request headers, and the MCP server forwards these credentials to ThoughtSpot’s REST endpoints. The result is a seamless, single‑step authentication that keeps tokens out of the client’s codebase while still granting fine‑grained access to the user’s data. Once authenticated, the server lists all available ThoughtSpot datasources as MCP resources, allowing a user to set context by selecting the appropriate datasource. From there, the LLM can issue analytical queries—such as “Show me sales trends for Q3”—and the server will translate that intent into a ThoughtSpot query, execute it, and return the results in a format the model can interpret.
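Because MCP messages are JSON-RPC 2.0, the authentication flow described above can be sketched as a client assembling headers and a request body. This is a hypothetical illustration: the exact header names (`Authorization`, `x-ts-host`) are assumptions, not the server's documented API.

```python
import json

def build_mcp_request(method: str, params: dict, access_token: str, ts_host: str):
    """Assemble headers and a JSON-RPC 2.0 payload for one MCP call.

    Header names are illustrative placeholders, not confirmed by the
    ThoughtSpot MCP Server documentation.
    """
    headers = {
        "Authorization": f"Bearer {access_token}",  # ThoughtSpot access token (assumed header)
        "x-ts-host": ts_host,                       # ThoughtSpot instance host (assumed header)
        "Content-Type": "application/json",
    }
    payload = {
        "jsonrpc": "2.0",
        "id": 1,
        "method": method,
        "params": params,
    }
    return headers, json.dumps(payload)

# List available datasources as MCP resources
headers, body = build_mcp_request(
    "resources/list", {},
    access_token="TS_TOKEN",
    ts_host="myorg.thoughtspot.cloud",
)
```

The `resources/list` method name follows the MCP specification; only the credential headers are guesses at how this particular server expects them.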

Key capabilities include:

  • Resource discovery: Automatic enumeration of ThoughtSpot tables and dashboards, presented as selectable resources in the MCP client.
  • Tool execution: Dedicated tools that translate natural language into ThoughtSpot query language, execute queries, and stream results back to the LLM.
  • Prompt orchestration: Built‑in prompts that guide the model’s reasoning about which tool to invoke and how to format responses.
  • Transport flexibility: Support for HTTP, WebSocket, and Cloudflare Workers environments, ensuring low latency and high reliability.
  • Fallback to stdio: For developers without a native MCP client, the server can be invoked from the command line via a helper, providing a quick test harness.
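To make the tool-execution capability concrete, here is a sketch of the `tools/call` message an MCP client would send for an analytical question. The `tools/call` envelope and its `name`/`arguments` parameters come from the MCP specification; the tool name `query_datasource` is a hypothetical placeholder, not a confirmed tool exposed by this server.

```python
import json

def build_tool_call(tool_name: str, arguments: dict, request_id: int = 2) -> str:
    """Build an MCP tools/call JSON-RPC message.

    The tool name passed in is illustrative; consult the server's
    tools/list response for the real names it exposes.
    """
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })

msg = build_tool_call(
    "query_datasource",  # hypothetical tool name
    {"question": "Show me sales trends for Q3"},
)
```

In practice the LLM client would first call `tools/list` to discover the server's actual tools and their input schemas, then emit a message of this shape.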

Real‑world scenarios that benefit from this integration include sales forecasting dashboards that auto‑populate with the latest data, operational analytics bots that answer ad‑hoc questions in Slack or Teams, and data science pipelines where an LLM can iteratively refine queries based on user feedback. Because the MCP server handles authentication and query translation, developers can focus on designing conversational flows rather than plumbing data access layers.

In practice, integrating ThoughtSpot into an AI workflow is as simple as adding the MCP server URL to your client’s configuration. Once connected, users can select a datasource, pose analytical questions, and let the LLM orchestrate ThoughtSpot queries—all within the same conversational interface. This tight coupling between AI and analytics unlocks powerful, context‑aware insights that would otherwise require extensive custom development.
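A minimal sketch of that client configuration, following the `mcpServers` convention used by Claude Desktop-style clients. The URL is a placeholder for your deployed Cloudflare endpoint, not the server's published address:

```json
{
  "mcpServers": {
    "thoughtspot": {
      "url": "https://your-deployment.workers.dev/sse"
    }
  }
}
```

Once the client reads this configuration, the server's datasources, tools, and prompts appear automatically in the conversational interface.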