hashicorp

Terraform MCP Server

Seamless Terraform Registry integration for AI-powered IaC

Active (80) · 990 stars · 5 views · Updated 12 days ago

About

The Terraform MCP Server bridges Model Context Protocol clients with Terraform Registry APIs, enabling automated workspace management, provider and module discovery, and HCP/TFE operations directly from LLMs or other MCP-enabled tools.

Capabilities

  • Resources: Access data sources
  • Tools: Execute functions
  • Prompts: Pre-built templates
  • Sampling: AI model interactions
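In MCP terms, a client learns which of these capabilities a server offers during the initial handshake. As an illustrative sketch of that exchange (the message shape follows the Model Context Protocol's JSON-RPC 2.0 framing; the client name and version are placeholders, not anything shipped with this server):

```python
import json

# Sketch of the JSON-RPC 2.0 "initialize" request an MCP client sends to
# negotiate capabilities. The server's reply advertises which features
# (resources, tools, prompts) it supports; "sampling" is offered by the
# client side. "example-client" is a made-up placeholder name.
initialize_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {
            "sampling": {},  # client offers LLM sampling to the server
        },
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

print(json.dumps(initialize_request))
```

After a successful `initialize`/`initialized` exchange, the client can enumerate what the server exposes with `resources/list`, `tools/list`, and `prompts/list`.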

Terraform MCP Server in Action

Overview

The Terraform Model Context Protocol (MCP) Server bridges the gap between AI assistants and Terraform’s rich ecosystem, allowing language models to query, manipulate, and orchestrate infrastructure resources directly through a standardized protocol. By exposing Terraform Registry APIs—providers, modules, and policies—as well‑defined MCP resources, the server turns IaC operations into conversational commands that can be understood and executed by an LLM. This eliminates the need for developers to manually run Terraform CLI commands, reducing friction and accelerating prototype cycles.

For developers who already rely on AI assistants for code generation or troubleshooting, the server offers a single point of integration that handles authentication, workspace management, and policy enforcement. It supports both local (stdio) and HTTP transports, giving teams the flexibility to embed the server in containerized environments or expose it as a microservice. Security is baked into the design: CORS policies, TLS configuration, and optional rate limiting ensure that only trusted origins can invoke Terraform actions, mitigating DNS rebinding and other cross-origin attacks. The server also integrates with HCP Terraform and Terraform Enterprise, enabling full workspace CRUD operations, variable handling, tag assignment, and run lifecycle management.
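Over the stdio transport, each JSON-RPC message travels as a single newline-terminated line of UTF-8 JSON on the server process's stdin/stdout. A minimal framing sketch (the `docker run` invocation in the comment is illustrative; consult the project's README for the actual command and image name):

```python
import json


def frame_stdio_message(payload: dict) -> bytes:
    """Serialize a JSON-RPC message for MCP's stdio transport:
    one compact UTF-8 JSON object per line, newline-terminated."""
    return (json.dumps(payload, separators=(",", ":")) + "\n").encode("utf-8")


# A client writes frames like this to the server process's stdin, e.g. a
# process launched with something like `docker run -i <terraform-mcp-image>`
# (shown only as illustration, not a documented invocation).
request = {"jsonrpc": "2.0", "id": 2, "method": "tools/list"}
framed = frame_stdio_message(request)

assert framed.endswith(b"\n")
```

The HTTP transport carries the same JSON-RPC payloads over POST requests instead, which is what makes the microservice deployment mode possible without changing message semantics.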

Key capabilities include:

  • Provider & Module Discovery: Retrieve metadata, documentation, and source code for any public registry item.
  • Workspace Lifecycle Management: Create, update, delete workspaces and manage associated variables or tags.
  • Run Orchestration: Trigger plan, apply, and destroy operations with real‑time status streaming.
  • Policy Retrieval: Access Sentinel policies for compliance checks directly from the assistant.
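Each of these capabilities surfaces to the assistant as an MCP tool invoked via `tools/call`. A sketch of what a provider-discovery call might look like on the wire; note that the tool name `search_providers` and its argument keys are illustrative placeholders, and a real client should use whatever names the server's `tools/list` response actually returns:

```python
import json

# Sketch of a JSON-RPC "tools/call" request asking the server to look up
# a provider in the Terraform Registry. Tool name and argument keys are
# placeholders for illustration, not the server's documented schema.
call_request = {
    "jsonrpc": "2.0",
    "id": 3,
    "method": "tools/call",
    "params": {
        "name": "search_providers",  # placeholder tool name
        "arguments": {
            "namespace": "hashicorp",  # placeholder argument keys
            "provider": "aws",
        },
    },
}

print(json.dumps(call_request, indent=2))
```

The server's response arrives as a JSON-RPC result containing `content` blocks (text, resource links, and so on), which the MCP client hands back to the model as tool output.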

Typical use cases span rapid IaC prototyping, continuous integration pipelines where an LLM drafts Terraform modules on demand, and DevOps support bots that can explain or modify existing infrastructure in natural language. By integrating with AI workflows, teams can iterate faster: a developer asks the assistant to "deploy a new S3 bucket with versioning," and the MCP server translates that into authenticated API calls, returning status updates and logs as part of the conversation.

What sets this server apart is its dual transport support combined with comprehensive Terraform Enterprise integration. While many MCP servers focus on generic API calls, this implementation speaks the native language of infrastructure—Terraform—providing developers with a powerful, AI‑driven tool that respects enterprise security models and delivers consistent, auditable changes to production environments.