About
This server connects an Azure OpenAI-powered AI agent to a Microsoft Fabric data warehouse using the Model Context Protocol and GraphQL, enabling dynamic discovery of tools and data resources along with bidirectional query/mutation access.
Overview
The Aifoundry Mcpconnector Fabricgraphql server is a Model Context Protocol (MCP) implementation that bridges Azure OpenAI‑powered agents with Microsoft Fabric data warehouses through GraphQL. By exposing a unified MCP interface, it allows an AI assistant to discover and invoke data‑centric tools without hardcoding API calls, thereby transforming static data queries into dynamic, agent‑driven interactions. For developers building conversational AI or data‑analysis workflows, this connector eliminates the need to write custom adapters for each database schema and instead leverages GraphQL’s declarative query language as a common abstraction layer.
Solving the Data‑Access Bottleneck
Traditional AI agents often struggle to reach enterprise data because each source requires a distinct SDK, authentication flow, or query language. The MCP connector solves this by translating generic MCP requests into GraphQL queries that the Fabric backend can execute. Developers no longer need to manage separate connectors for each table or data model; the server automatically maps GraphQL endpoints to MCP resources, making enterprise data discoverable and actionable by the AI agent.
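To make that translation step concrete, here is a minimal sketch of how a GraphQL passthrough could be exposed as an MCP tool in Python. The server name, environment variable names, and authentication details are assumptions for illustration and may differ from the connector's actual code.

```python
# Minimal sketch (not the repo's actual code): an MCP tool that forwards
# GraphQL operations to a Microsoft Fabric GraphQL endpoint.
import os

import requests
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("fabric-graphql")  # hypothetical server name

# Assumed environment variables; the real connector may use different names.
FABRIC_GRAPHQL_ENDPOINT = os.environ["FABRIC_GRAPHQL_ENDPOINT"]
FABRIC_ACCESS_TOKEN = os.environ["FABRIC_ACCESS_TOKEN"]


@mcp.tool()
def run_graphql(query: str, variables: dict | None = None) -> dict:
    """Execute a GraphQL query or mutation against the Fabric endpoint."""
    response = requests.post(
        FABRIC_GRAPHQL_ENDPOINT,
        json={"query": query, "variables": variables or {}},
        headers={"Authorization": f"Bearer {FABRIC_ACCESS_TOKEN}"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    mcp.run()
```

Because the tool accepts an arbitrary GraphQL document, a single registration covers both reads (queries) and writes (mutations) against the warehouse.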
What the Server Provides
- Dynamic Tool Discovery: The MCP server advertises GraphQL queries and mutations as tools that the AI agent can invoke, complete with parameter schemas derived from the Fabric data model (see the client sketch after this list).
- Bidirectional Data Flow: Agents can read from and write to the warehouse using standard GraphQL operations, enabling real‑time data updates directly from conversational prompts.
- Security & Context: The connector respects Azure’s authentication mechanisms, ensuring that only authorized agents can access sensitive data.
- Extensibility: While the current implementation focuses on Microsoft Fabric, the same MCP‑GraphQL pattern can be extended to other GraphQL‑enabled data stores with minimal changes.
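The agent side of this flow could look like the following sketch: an MCP client connects to the server, lists the advertised tools, and invokes one of them. The server command, tool name, and query text are illustrative assumptions rather than the connector's documented interface.

```python
# Illustrative MCP client flow (assumed tool name and server command).
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server_params = StdioServerParameters(command="python", args=["server.py"])


async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Dynamic tool discovery: the agent learns what it can call at runtime.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            # Read path: a GraphQL query executed through the advertised tool.
            result = await session.call_tool(
                "run_graphql",
                {"query": "{ sales { region total } }"},  # hypothetical schema
            )
            print(result.content)


asyncio.run(main())
```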
Key Features Explained
- Unified API Surface: GraphQL abstracts diverse tables into a single endpoint, so the MCP server exposes only one resource per warehouse.
- Schema‑Driven Tool Generation: The server introspects the GraphQL schema to auto‑generate tool definitions, reducing manual effort (a rough sketch follows this list).
- Sample Warehouse Integration: The README demonstrates how to create a sample Fabric warehouse and expose it via GraphQL, making the connector immediately usable for prototyping.
- Gradio UI for Interaction: A lightweight Gradio interface allows developers to initialize the MCP server, view available tools, and test queries without writing additional code.
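The schema-driven generation could work roughly as sketched below: introspect the GraphQL schema and derive one lightweight tool definition per top-level query field. The introspection document is standard GraphQL; the endpoint, token handling, and mapping logic are assumptions, not the connector's actual implementation.

```python
# Rough sketch: derive MCP-style tool definitions from GraphQL introspection.
import os

import requests

INTROSPECTION_QUERY = """
{
  __schema {
    queryType {
      fields {
        name
        description
        args { name }
      }
    }
  }
}
"""


def discover_tools(endpoint: str, token: str) -> list[dict]:
    """Return one lightweight tool definition per top-level query field."""
    response = requests.post(
        endpoint,
        json={"query": INTROSPECTION_QUERY},
        headers={"Authorization": f"Bearer {token}"},
        timeout=30,
    )
    response.raise_for_status()
    fields = response.json()["data"]["__schema"]["queryType"]["fields"]
    return [
        {
            "name": field["name"],
            "description": field["description"] or f"Query the {field['name']} field",
            "parameters": [arg["name"] for arg in field["args"]],
        }
        for field in fields
    ]


if __name__ == "__main__":
    tools = discover_tools(
        os.environ["FABRIC_GRAPHQL_ENDPOINT"],  # assumed env var names
        os.environ["FABRIC_ACCESS_TOKEN"],
    )
    print(tools)
```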
Real‑World Use Cases
- Business Intelligence Chatbots: An AI assistant can answer questions like “What were our sales figures last quarter?” by invoking a GraphQL query exposed through MCP (example operations follow this list).
- Data‑Driven Decision Support: Agents can suggest data updates (e.g., inserting new transaction records) by executing GraphQL mutations, all while maintaining audit trails.
- Rapid Prototyping: Data scientists can spin up a sample Fabric warehouse, expose it via the connector, and begin testing AI interactions within minutes.
- Compliance‑Aware Data Access: By centralizing authentication through Azure and GraphQL, the connector ensures that data access policies are enforced consistently across all AI‑driven queries.
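To make the first two use cases concrete, the snippet below shows what the underlying GraphQL operations might look like when an agent answers a sales question or records a new transaction. The schema (types and field names) is entirely hypothetical.

```python
# Hypothetical GraphQL operations an agent might issue through the MCP tool.
# The schema shown here is invented for illustration only; either string
# would be passed as the `query` argument of a generic GraphQL tool.

SALES_LAST_QUARTER = """
query SalesLastQuarter($start: Date!, $end: Date!) {
  sales(filter: { date: { gte: $start, lte: $end } }) {
    region
    total
  }
}
"""

INSERT_TRANSACTION = """
mutation InsertTransaction($input: TransactionInput!) {
  createTransaction(item: $input) {
    id
  }
}
"""
```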
Integration into AI Workflows
Developers embed the MCP connector within their existing AI stack by configuring environment variables for Azure OpenAI and the Fabric GraphQL endpoint. Once started, the MCP server registers its tools with the AI agent, allowing the agent to discover and call them using natural language prompts. The Gradio UI offers a quick way to validate tool functionality, but the same endpoints can be consumed programmatically by any MCP‑compliant client. This seamless integration means that adding new data sources or updating schemas requires only a refresh of the GraphQL endpoint, after which the MCP server automatically reflects the changes to the AI assistant.
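A hedged configuration sketch follows, assuming typical environment variable names for the Azure OpenAI client and the Fabric endpoint; the variable names and API version are illustrative, and the repository's README remains the authoritative source.

```python
# Illustrative wiring of the Azure OpenAI client and Fabric endpoint from
# environment variables; the variable names here are assumptions.
import os

from openai import AzureOpenAI

azure_client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version=os.environ.get("AZURE_OPENAI_API_VERSION", "2024-06-01"),
)

# The MCP server itself only needs to know where the Fabric GraphQL endpoint lives.
fabric_endpoint = os.environ["FABRIC_GRAPHQL_ENDPOINT"]
```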
Standout Advantages
- Zero‑Code Data Access: Developers need not write custom connectors for each Fabric table; the GraphQL abstraction handles schema mapping.
- Immediate Discoverability: MCP’s dynamic discovery feature means the AI agent learns about new data tools on‑the‑fly, enabling adaptive conversations.
- Enterprise‑Ready Security: Built atop Azure’s authentication stack, the connector ensures that data access remains governed by existing IAM policies.
- Extensibility Beyond Fabric: The same MCP‑GraphQL pattern can be ported to other GraphQL services, making this approach broadly applicable across data platforms.
In summary, the Aifoundry Mcpconnector Fabricgraphql server empowers AI assistants to interact with Microsoft Fabric warehouses in a declarative, secure, and discoverable manner. By leveraging GraphQL as a unifying layer and MCP as the discovery protocol, it turns enterprise data into a resource that agents can query, update, and reason over directly from conversation.