About
Provides AI agents with tools to query provider info, resource usage, module metadata, and Terraform Cloud resources via the Model Context Protocol.
Capabilities
The Terraform Registry MCP Server is a lightweight bridge that lets AI assistants query the Terraform ecosystem directly from their conversational context. By exposing the Terraform Registry API through MCP, developers can ask an AI for provider details, resource examples, or module recommendations without leaving their IDE or chat interface. This eliminates the need to manually browse documentation sites, run commands locally, or write custom integration scripts.
At its core, the server offers a suite of registry tools that mirror common Terraform tasks: one tool fetches metadata about any provider, while another supplies example code snippets and related resources. Module search and module detail tools let the assistant surface relevant modules and their documentation, making it easier to discover reusable components. The server also supports data‑source introspection and argument detail retrieval, giving developers granular insight into provider capabilities. For advanced use, additional tools provide deeper context around provider functions, guides, and policy libraries.
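Under the hood, an MCP client invokes these registry tools with JSON‑RPC 2.0 `tools/call` messages. The sketch below builds such a request envelope; the tool name `providerLookup` and its argument keys are illustrative assumptions, not confirmed identifiers from this server:

```python
import json

def make_tool_call(tool_name, arguments, request_id=1):
    """Build a minimal MCP tools/call request envelope (JSON-RPC 2.0)."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

# Hypothetical lookup of the hashicorp/aws provider's metadata.
request = make_tool_call(
    "providerLookup",  # assumed tool name, for illustration only
    {"namespace": "hashicorp", "provider": "aws"},
)
print(json.dumps(request, indent=2))
```

The client serializes this object onto the server's transport (stdio or HTTP), and the server replies with the tool's result in a matching JSON‑RPC response.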
Beyond the public registry, the server integrates with Terraform Cloud through a separate set of tools that require an API token. These cover organization and workspace management, run orchestration, and resource inspection. This tight coupling allows an AI not only to suggest code but also to trigger infrastructure changes, lock or unlock workspaces, and audit run history, all within a single conversational flow.
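In practice, the token is supplied through the MCP client's server configuration. The fragment below is a sketch in the common `mcpServers` config shape; the environment‑variable name `TFC_API_TOKEN`, the package name, and the launch command are assumptions for illustration:

```json
{
  "mcpServers": {
    "terraform-registry": {
      "command": "npx",
      "args": ["-y", "terraform-mcp-server"],
      "env": {
        "TFC_API_TOKEN": "<your Terraform Cloud API token>"
      }
    }
  }
}
```

Without the token, the public registry tools still work; only the Terraform Cloud tools require authentication.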
Real‑world scenarios benefit from this integration: a developer can ask the AI to "show me how to configure an AWS S3 bucket with encryption" and receive a ready‑to‑paste Terraform snippet; a DevOps engineer can request "list all private modules in my organization" and get an up‑to‑date inventory; or a CI/CD pipeline can be triggered by the assistant after reviewing a module’s metadata. By embedding Terraform knowledge directly into AI workflows, teams reduce context switching, avoid documentation drift, and accelerate infrastructure provisioning.
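The first scenario above might yield a snippet along these lines (a hedged sketch using the hashicorp/aws provider's split bucket/encryption resources from v4+; the bucket name is a placeholder):

```hcl
resource "aws_s3_bucket" "example" {
  bucket = "my-encrypted-bucket" # placeholder name
}

# Apply default server-side encryption (SSE-S3) to the bucket
resource "aws_s3_bucket_server_side_encryption_configuration" "example" {
  bucket = aws_s3_bucket.example.id

  rule {
    apply_server_side_encryption_by_default {
      sse_algorithm = "AES256"
    }
  }
}
```

Because the assistant pulls examples from the registry rather than from stale training data, snippets like this track the provider's current resource schema.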
Unique to this server is its dual‑mode operation—public registry access combined with authenticated Terraform Cloud operations—within a single MCP endpoint. This unified interface means that developers can write one set of prompts for both discovery and execution, while the underlying server handles authentication, rate‑limiting, and data formatting. The result is a seamless, AI‑driven Terraform experience that streamlines both code generation and infrastructure lifecycle management.
Related Servers
MarkItDown MCP Server
Convert documents to Markdown for LLMs quickly and accurately
Context7 MCP
Real‑time, version‑specific code docs for LLMs
Playwright MCP
Browser automation via structured accessibility trees
BlenderMCP
Claude AI meets Blender for instant 3D creation
Pydantic AI
Build GenAI agents with Pydantic validation and observability
Chrome DevTools MCP
AI-powered Chrome automation and debugging
Explore More Servers
LIFX API MCP Server
Control LIFX lights with natural language via MCP
On Running MCP Server
FastAPI powered product data access for On Running
SBB MCP Server
MCP server for interacting with SBB.ch services
Nova Act MCP Server
Zero‑install browser automation for AI agents
CLI MCP Server
Secure command-line execution for LLMs
Discovery Local MCP Servers
Standardized local MCP server registration for LLM tools