About
A Model Context Protocol server that manages OpenStack resources—compute, networking, storage, and load balancing—within a single tenant. It enforces safety‑gated writes, provides bulk actions, and delivers real‑time monitoring for the active project.
Capabilities

MCP‑OpenStack‑Ops is a purpose‑built Model Context Protocol server that gives AI assistants direct, safe, and scoped access to OpenStack environments. By exposing a rich set of tools for compute, networking, storage, identity, and load‑balancing operations, it lets developers query state and orchestrate changes without leaving the conversational flow of an AI client. The server enforces a single‑project boundary and requires an explicit flag before write actions are permitted, ensuring that every operation is both tenant‑aware and auditable.
The core value lies in its project‑scoped safety gates. Every tool validates that the target resource belongs to the configured project, preventing accidental cross‑tenant modifications. Write operations are gated behind an explicit opt‑in flag, keeping default deployments read‑only while still allowing developers to perform controlled updates when needed. This dual safety model protects production workloads and satisfies compliance requirements, making the server suitable for both development sandboxes and live environments.
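The dual gate can be sketched in a few lines of Python. This is only an illustration of the pattern described above; the flag name, environment variables, and decorator are hypothetical and do not reflect the server's actual implementation.

```python
import os
from functools import wraps

# Illustrative sketch of the dual safety gate. ALLOW_MODIFY_OPERATIONS is a
# hypothetical flag name; OS_PROJECT_ID is the standard OpenStack credential
# variable, used here to stand in for the configured project scope.
WRITES_ENABLED = os.getenv("ALLOW_MODIFY_OPERATIONS", "false").lower() == "true"
SCOPED_PROJECT = os.getenv("OS_PROJECT_ID", "")

def safety_gated(write: bool = False):
    """Reject calls that target another project, or that mutate state
    while the server is running in its default read-only mode."""
    def decorator(func):
        @wraps(func)
        def wrapper(resource, *args, **kwargs):
            if getattr(resource, "project_id", None) != SCOPED_PROJECT:
                raise PermissionError("resource is outside the configured project")
            if write and not WRITES_ENABLED:
                raise PermissionError("write operations are disabled by default")
            return func(resource, *args, **kwargs)
        return wrapper
    return decorator

@safety_gated(write=True)
def resize_volume(volume, new_size_gb: int):
    ...  # the real tool would call the OpenStack SDK here
```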
Key capabilities include more than 90 specialized tools that cover the full OpenStack stack: compute instances, volumes, networking (subnets, ports), images, snapshots, keypairs, Heat stacks, and Octavia load balancers. Tools accept comma‑delimited lists or filter criteria for bulk actions, and all mutating commands return enriched feedback with emoji status checks, asynchronous timing hints, and follow‑up verification prompts. Monitoring tools expose service health, usage statistics, quota limits, and hypervisor details, giving developers a holistic view of resource consumption and capacity.
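As an illustration of how a bulk action might look from an MCP client, the sketch below stops several instances with one comma‑delimited argument. The launch command, tool name, and argument schema are assumptions made for the example; consult the server's tool listing for the real interface.

```python
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def stop_web_tier() -> None:
    # Hypothetical entry point; the actual command depends on how the server is installed.
    params = StdioServerParameters(command="mcp-openstack-ops")
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Tool name and arguments are illustrative; the comma-delimited
            # "names" value selects several instances in one bulk call.
            result = await session.call_tool(
                "manage_instances",
                arguments={"action": "stop", "names": "web-01,web-02,worker-03"},
            )
            print(result)

asyncio.run(stop_web_tier())
```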
Typical use cases involve AI‑driven DevOps workflows: a chatbot can report the status of all instances, spin up a new node on demand, or resize a volume after a conversation about performance. Security teams can query quota usage and audit resource ownership, while operations staff can trigger load‑balancer reconfigurations or snapshot backups with a single prompt. Because the server supports multiple transports, it can be deployed behind proxies or bastions, and the Docker image simplifies integration into CI/CD pipelines.
Unique advantages of MCP‑OpenStack‑Ops include tight alignment with the latest OpenStack release (Epoxy) while maintaining backward compatibility, dynamic multi‑version support currently in development, and a unified result handler that standardizes output across all tools. These features give AI assistants a consistent, reliable interface to OpenStack, enabling smarter automation and faster troubleshooting in cloud environments.