
Databricks Permissions MCP Server

LLM‑powered Databricks permission & credential manager

About

A Model Context Protocol server that exposes Databricks REST APIs for managing permissions, service principals, and Git credentials, enabling LLMs to automate Databricks security tasks efficiently.

Capabilities

Resources: Access data sources
Tools: Execute functions
Prompts: Pre-built templates
Sampling: AI model interactions

Databricks Permissions MCP Server in Action

The Databricks Permissions MCP Server bridges the gap between large language models (LLMs) and Databricks’ fine‑grained access control mechanisms. By exposing a full set of permission‑management tools over the Model Context Protocol, it lets AI assistants such as Claude perform real‑world administrative tasks—listing, assigning, and revoking access to notebooks, clusters, jobs, Unity Catalog objects, and even service principals—without leaving the conversational interface. This capability is particularly valuable for data‑engineering teams that rely on automated workflows to enforce least‑privilege principles across dynamic, multi‑tenant environments.

At its core, the server implements the MCP specification to provide a secure, authenticated channel for LLMs to call Databricks REST APIs. It handles token exchange, request formatting, and response parsing so that the assistant can issue high‑level commands like “set cluster permissions for user X to read/write” and receive confirmation in natural language. The asyncio‑based architecture ensures that multiple permission queries can be processed concurrently, keeping latency low even when managing dozens of resources.
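
To make that concrete, here is a minimal sketch of how such a tool could be exposed, assuming the official mcp Python SDK (FastMCP) and httpx; the tool name, environment variables, and structure are illustrative rather than the server’s actual code.

```python
# Illustrative sketch only: one MCP tool wrapping the Databricks Permissions
# REST API. Names and layout are hypothetical, not the server's implementation.
import os
import httpx
from mcp.server.fastmcp import FastMCP

DATABRICKS_HOST = os.environ["DATABRICKS_HOST"]    # e.g. https://adb-123.azuredatabricks.net
DATABRICKS_TOKEN = os.environ["DATABRICKS_TOKEN"]  # personal access token

mcp = FastMCP("databricks-permissions")


@mcp.tool()
async def get_permissions(object_type: str, object_id: str) -> dict:
    """Return the access-control list for a Databricks object.

    object_type: e.g. 'clusters', 'jobs', 'notebooks', 'sql/warehouses'
    object_id:   the Databricks ID of that object
    """
    url = f"{DATABRICKS_HOST}/api/2.0/permissions/{object_type}/{object_id}"
    headers = {"Authorization": f"Bearer {DATABRICKS_TOKEN}"}
    async with httpx.AsyncClient() as client:
        resp = await client.get(url, headers=headers)
        resp.raise_for_status()
        return resp.json()


if __name__ == "__main__":
    mcp.run()  # serves the tool over stdio for MCP clients such as Claude Desktop
```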

Key features include a comprehensive toolkit for permissions management across all Databricks resource types: clusters, jobs, SQL warehouses, notebooks, and Unity Catalog entities. Developers can retrieve current permission sets, modify them atomically, or query the available permission levels for a given object type. The server also exposes service‑principal operations—listing, creating, updating, and deleting principals—enabling automated identity provisioning for CI/CD pipelines. Git credential management is another standout, allowing the assistant to maintain repository access tokens directly within Databricks, streamlining code‑delivery workflows.
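
As a rough illustration of what those operations map to at the REST level, the following sketch assumes the standard Databricks Permissions API; the helper names and environment variables are hypothetical.

```python
# Hypothetical helpers showing an atomic permission update and a
# permission-level query against the Databricks Permissions API.
import os
import httpx

HOST = os.environ["DATABRICKS_HOST"]
HEADERS = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}


async def set_cluster_permissions(cluster_id: str, user: str, level: str) -> dict:
    """Replace the cluster's direct ACL so that `user` holds `level` (e.g. CAN_RESTART)."""
    url = f"{HOST}/api/2.0/permissions/clusters/{cluster_id}"
    body = {"access_control_list": [{"user_name": user, "permission_level": level}]}
    async with httpx.AsyncClient() as client:
        resp = await client.put(url, headers=HEADERS, json=body)
        resp.raise_for_status()
        return resp.json()


async def list_permission_levels(object_type: str, object_id: str) -> dict:
    """Ask Databricks which permission levels are valid for a given object."""
    url = f"{HOST}/api/2.0/permissions/{object_type}/{object_id}/permissionLevels"
    async with httpx.AsyncClient() as client:
        resp = await client.get(url, headers=HEADERS)
        resp.raise_for_status()
        return resp.json()
```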

Real‑world scenarios range from onboarding new data scientists—where an AI assistant can automatically grant read/write access to relevant notebooks and clusters—to audit compliance checks, where the model enumerates all users with admin rights across the workspace. In a continuous‑integration pipeline, a Claude-powered bot could trigger permission updates after a merge or rollback, ensuring that only the intended users retain access. By integrating seamlessly with existing MCP‑compatible clients like Claude Desktop, teams can embed these operations into everyday conversations, reducing context switching and accelerating governance tasks.
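
For the audit scenario above, the underlying calls might look like the following sketch, which walks the public Clusters and Permissions endpoints to report who holds CAN_MANAGE on each cluster; the helper itself is illustrative, not part of the server.

```python
# Illustrative audit sketch: list the principals holding CAN_MANAGE on each
# cluster in the workspace. Helper names and structure are hypothetical.
import asyncio
import os
import httpx

HOST = os.environ["DATABRICKS_HOST"]
HEADERS = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}


async def cluster_admins() -> dict[str, list[str]]:
    async with httpx.AsyncClient(headers=HEADERS) as client:
        clusters = (await client.get(f"{HOST}/api/2.0/clusters/list")).json().get("clusters", [])
        report: dict[str, list[str]] = {}
        for c in clusters:
            perms = (await client.get(
                f"{HOST}/api/2.0/permissions/clusters/{c['cluster_id']}"
            )).json()
            managers = [
                entry.get("user_name") or entry.get("group_name") or entry.get("service_principal_name")
                for entry in perms.get("access_control_list", [])
                if any(p.get("permission_level") == "CAN_MANAGE" for p in entry.get("all_permissions", []))
            ]
            report[c.get("cluster_name", c["cluster_id"])] = managers
        return report


if __name__ == "__main__":
    print(asyncio.run(cluster_admins()))
```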