MCPSERV.CLUB
alexneyler

Natural Language to Kusto Query MCP Server

MCP Server

Convert natural language to Kusto queries and execute instantly.

Updated Jun 4, 2025

About

This MCP server transforms natural language prompts into executable Kusto queries, supporting multiple databases and outputting results as JSON or CSV. It is fully configurable via a YAML file, integrating Azure OpenAI for query generation.

Capabilities

Resources
Access data sources
Tools
Execute functions
Prompts
Pre-built templates
Sampling
AI model interactions

Natural Language to Kusto Query MCP Server

The Natural Language to Kusto Query MCP server bridges the gap between conversational AI and Azure Data Explorer (Kusto). By accepting plain‑English prompts, it automatically translates them into valid Kusto Query Language (KQL) statements and can execute those queries against any configured Kusto database. This capability is invaluable for developers who want to let AI assistants perform data analysis without exposing raw query syntax or requiring users to learn KQL.

What Problem Does It Solve?

In many data‑centric organizations, analysts and developers spend a significant portion of their time writing or debugging KQL. When integrating AI assistants, this friction hampers productivity and increases the risk of errors. The MCP server eliminates that barrier by providing a natural‑language interface: users can ask questions like “Show me the number of login failures in the last 24 hours” and receive a structured result. This approach enables broader adoption of AI tools across teams that are not Kusto experts, while still allowing seasoned users to refine or inspect the generated queries.
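To make the translation concrete, a prompt like the one above might be rendered into a KQL statement along these lines. The table and column names here (SigninLogs, TimeGenerated, ResultType) are hypothetical and depend entirely on your cluster's schema:

```kql
SigninLogs
| where TimeGenerated > ago(24h)   // last 24 hours
| where ResultType != "0"          // non-zero result = failed sign-in
| summarize FailureCount = count()
```

The generated query remains inspectable, so seasoned users can refine it before execution.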

Core Functionality and Value

  • Natural‑Language Parsing: The server leverages an Azure OpenAI model, configured in the YAML file, to interpret user prompts and produce accurate KQL snippets.
  • Query Execution: Beyond generating queries, the server can execute them directly against the target Kusto database and return results in JSON or CSV format, making the output immediately usable in downstream applications.
  • Multi‑Database Support: A single configuration file can list multiple databases, tables, and categories, allowing the same AI assistant to serve diverse data sources without redeploying.
  • Toolset for Developers: The MCP server exposes a set of tools that AI assistants can invoke to fetch metadata, craft queries, or run them on demand. This modularity lets developers compose sophisticated AI workflows that interact with Kusto in a controlled manner.
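Under MCP conventions, an assistant invokes these tools via JSON-RPC 2.0 `tools/call` requests. A minimal sketch of the message a client would send; the tool name `execute_query` and its arguments are hypothetical (the server's actual tool names come from its `tools/list` response):

```python
import json

def make_tool_call(request_id: int, tool_name: str, arguments: dict) -> str:
    """Build the JSON-RPC 2.0 message an MCP client sends to invoke a tool."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })

# Hypothetical tool name and arguments for illustration only.
msg = make_tool_call(1, "execute_query", {
    "prompt": "Show me the number of login failures in the last 24 hours",
    "format": "json",
})
print(msg)
```

In practice a client library handles this framing; the point is that any MCP-aware assistant can drive the server with no Kusto-specific client code.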

Key Features Explained

  • Fully Configurable via YAML: The configuration file contains model credentials, Kusto endpoints, access tokens, and table metadata. Environment variables can be interpolated, enabling secure deployment in CI/CD pipelines or containerized environments.
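A configuration might look roughly like the following. This is an illustrative layout only; the key names shown here are assumptions, not the server's documented schema:

```yaml
# Illustrative layout -- actual key names may differ.
azure_openai:
  endpoint: ${AZURE_OPENAI_ENDPOINT}   # env vars interpolated at load time
  api_key: ${AZURE_OPENAI_API_KEY}
databases:
  - name: telemetry
    cluster: https://mycluster.kusto.windows.net
    tables:
      - name: SigninLogs
        description: "Authentication events, one row per sign-in attempt"
```

Keeping credentials in environment variables rather than the file itself keeps the YAML safe to commit.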
  • Prompt Templates: For each table, you can define prompt templates that guide the model on how to formulate queries. This reduces ambiguity and improves consistency across generated KQL statements.
  • Output Flexibility: By selecting JSON or CSV output, developers can choose the format that best fits their downstream processing or reporting pipeline.
  • Security‑Aware Access: Optional access tokens are supported; otherwise, the server falls back to Azure Default Credentials. This design balances convenience with security best practices.
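The token-or-fallback decision can be sketched as a small selection function. This is a simplified stand-in: a real implementation would return an `azure.identity.DefaultAzureCredential` object in the fallback branch, and the function name here is hypothetical:

```python
from typing import Optional, Tuple

def resolve_auth(access_token: Optional[str]) -> Tuple[str, Optional[str]]:
    """Pick an auth strategy: use a static token from config when present,
    otherwise fall back to Azure Default Credentials (in a real server,
    azure.identity.DefaultAzureCredential)."""
    if access_token:
        return ("static_token", access_token)
    return ("default_credential", None)

print(resolve_auth("abc123"))  # configured token wins
print(resolve_auth(None))      # falls back to default credentials
```

Default Credentials let the same binary work with managed identities in Azure and developer logins locally, without config changes.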

Real‑World Use Cases

  • Operational Dashboards: An AI assistant can answer ad‑hoc questions about application performance, alert rates, or resource utilization without manual query writing.
  • Incident Investigation: When a security incident occurs, analysts can request specific event logs or anomaly metrics in natural language and receive ready‑to‑consume data.
  • Data Exploration: New team members can experiment with Kusto data through conversational prompts, accelerating onboarding and reducing the learning curve.
  • Automated Reporting: Scheduled AI agents can generate periodic reports by querying Kusto and exporting results to CSV for ingestion into BI tools.

Integration with AI Workflows

Because the server adheres to MCP conventions, any AI assistant that understands MCP can interact with it seamlessly. Developers can embed the server as a backend service, exposing its tools to the assistant’s prompt engine. The assistant can then orchestrate complex sequences—first listing available tables, then generating a query for the chosen table, and finally executing it to fetch results—all while maintaining context across turns. This tight integration enables rich, conversational data experiences that feel natural to end users while preserving the power and precision of Kusto.
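The list-then-generate-then-execute sequence described above can be sketched as a three-step workflow. The tool names (`list_tables`, `generate_query`, `execute_query`) are hypothetical, and a stub stands in for a live MCP session:

```python
# Sketch of the orchestration sequence; tool names are assumptions.
def run_analysis(call_tool, question: str):
    """call_tool(name, arguments) -> result, as supplied by an MCP client."""
    tables = call_tool("list_tables", {"database": "telemetry"})
    query = call_tool("generate_query", {"prompt": question, "tables": tables})
    return call_tool("execute_query", {"query": query, "format": "csv"})

# A fake client stands in for a live MCP session:
def fake_call_tool(name, arguments):
    return {"list_tables": ["SigninLogs"],
            "generate_query": "SigninLogs | count",
            "execute_query": "Count\n42"}[name]

result = run_analysis(fake_call_tool, "How many sign-ins today?")
print(result)
```

Because each step is a separate tool call, the assistant can show the generated KQL to the user, or ask for confirmation, before the execute step runs.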