
Microsoft Fabric Real-Time Intelligence MCP Server

MCP Server

Bridge AI agents to live Fabric RTI data with KQL

Active (72)
62 stars
0 views
Updated 18 days ago

About

This MCP server exposes Microsoft Fabric Real-Time Intelligence services to AI agents, translating natural language into Kusto queries and providing secure, real‑time access to Eventhouse and Eventstreams for analytics and management.

Capabilities

  • Resources: Access data sources
  • Tools: Execute functions
  • Prompts: Pre-built templates
  • Sampling: AI model interactions

Overview

Microsoft Fabric Real‑Time Intelligence (RTI) is a cloud‑native platform that unifies streaming, analytics, and AI at scale. The Fabric Real‑Time Intelligence MCP server turns the rich capabilities of Fabric RTI into a first‑class tool set that can be consumed by any Model Context Protocol (MCP) client, such as Claude or other AI assistants. By exposing RTI services through the MCP interface, developers can embed live data querying and stream management directly into conversational workflows without writing custom connectors or managing authentication tokens manually.

The server acts as a bridge between an AI agent and Fabric RTI services. When an assistant receives a natural‑language request, the MCP server translates that intent into Kusto Query Language (KQL) or RTI‑specific API calls. It then authenticates the request using Azure Identity, ensuring that only authorized users can access the underlying Eventhouse (Kusto) or Eventstreams resources. The response is returned as a structured tool output that the assistant can incorporate into its reply, enabling dynamic, data‑driven conversations.
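A minimal sketch of that bridging pattern, assuming the MCP Python SDK (FastMCP) and the azure-kusto-data client; the server name, tool name, cluster URI, and database below are hypothetical placeholders, not the project's actual identifiers:

```python
from azure.identity import DefaultAzureCredential
from azure.kusto.data import KustoClient, KustoConnectionStringBuilder
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("fabric-rti")  # hypothetical server name

# Azure Identity resolves whatever credential is available
# (az login, managed identity, environment variables, ...).
CLUSTER_URI = "https://<your-eventhouse>.kusto.fabric.microsoft.com"  # placeholder
kcsb = KustoConnectionStringBuilder.with_azure_token_credential(
    CLUSTER_URI, DefaultAzureCredential()
)
kusto = KustoClient(kcsb)

@mcp.tool()
def kql_query(database: str, query: str) -> list[dict]:
    """Run a KQL query against an Eventhouse database and return rows as dicts."""
    response = kusto.execute(database, query)
    table = response.primary_results[0]
    columns = [col.column_name for col in table.columns]
    return [{name: row[name] for name in columns} for row in table]

if __name__ == "__main__":
    mcp.run()  # serve the tool to any MCP client over stdio
```

Because the credential is resolved on the server side, the assistant never handles tokens directly; it only sees the structured rows the tool returns.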

Key capabilities include the following (a short sketch of the underlying Kusto calls follows the list):

  • KQL execution: Run arbitrary queries against Eventhouse databases and retrieve results, schema information, or random samples.
  • Management operations: List databases, tables, functions, and even ingest inline CSV data into a table.
  • Eventstream control: Enumerate all Eventstreams in a workspace, fetch detailed metadata, or pull the full JSON definition of a stream.
  • AI‑enhanced suggestions: Retrieve semantically similar query examples (“shots”) to help users craft better queries.
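As a rough illustration (not the server's actual implementation), the management-style operations map onto standard Kusto control commands, which the azure-kusto-data client can issue with execute_mgmt; the cluster URI, database, and Alerts table are placeholders:

```python
from azure.identity import DefaultAzureCredential
from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

CLUSTER_URI = "https://<your-eventhouse>.kusto.fabric.microsoft.com"  # placeholder
DATABASE = "MyEventhouseDB"                                           # placeholder
kcsb = KustoConnectionStringBuilder.with_azure_token_credential(
    CLUSTER_URI, DefaultAzureCredential()
)
kusto = KustoClient(kcsb)

# Enumeration commands behind the "management operations" bullet:
databases = kusto.execute_mgmt(DATABASE, ".show databases")
tables = kusto.execute_mgmt(DATABASE, ".show tables")
functions = kusto.execute_mgmt(DATABASE, ".show functions")

# Inline CSV ingestion into an existing table (intended for small payloads):
kusto.execute_mgmt(
    DATABASE,
    '.ingest inline into table Alerts <|\n'
    '2024-05-01T12:00:00Z,Critical,"Disk full on node-7"',
)
```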

Real‑world use cases span from operational monitoring—where an assistant can answer “Show me the last 10 alerts in Eventhouse” and immediately surface live data—to compliance auditing, where it can pull historical command execution logs and classify risk levels. In data‑science pipelines, a researcher can ask the assistant to “Analyze the StormEvents table for trend analysis over the past decade,” and receive a concise, data‑rich response that includes visual or statistical summaries.
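As an illustration only, the two requests above might translate into KQL along these lines; StormEvents is the standard Kusto sample table, while the Alerts table and its columns are hypothetical:

```python
# Hypothetical alerts table: "Show me the last 10 alerts in Eventhouse"
LAST_ALERTS_KQL = """
Alerts
| order by Timestamp desc
| take 10
"""

# StormEvents trend analysis: yearly event counts and property damage
STORM_TREND_KQL = """
StormEvents
| summarize Events = count(), TotalDamage = sum(DamageProperty)
    by Year = getyear(StartTime)
| order by Year asc
"""
```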

Integrating the Fabric RTI MCP server into an AI workflow is straightforward: a developer registers the server with the assistant, defines the desired tool set (for example, KQL execution and Eventstream management), and then writes prompts that trigger those tools. The assistant’s reasoning engine orchestrates the conversation, invokes the appropriate tool via MCP, and stitches the output back into a natural‑language answer. This pattern removes boilerplate code for authentication, query translation, and error handling, allowing developers to focus on higher‑level business logic.
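A sketch of that client-side flow using the MCP Python SDK, assuming the server from the earlier sketch runs over stdio; the script name, tool name, and database are placeholders:

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Register the server: tell the client how to launch it.
    params = StdioServerParameters(command="python", args=["fabric_rti_server.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover the tool set the server exposes.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            # Invoke a tool on the agent's behalf and use the structured result.
            result = await session.call_tool(
                "kql_query",
                {"database": "MyEventhouseDB", "query": "Alerts | take 10"},
            )
            print(result.content)

asyncio.run(main())
```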