MCPSERV.CLUB · Excoriate

Terraform AWS Provider MCP Server

MCP Server

AI-powered context for Terraform AWS resources

Stale (55) · 1 star · 2 views · Updated May 1, 2025

About

A Model Context Protocol server built with Deno/TypeScript that supplies up-to-date documentation, configuration details, GitHub issue data, and example code for the Terraform AWS Provider, enabling AI agents to query AWS IaC information directly.

Capabilities

  • Resources – Access data sources
  • Tools – Execute functions
  • Prompts – Pre-built templates
  • Sampling – AI model interactions

Overview

The MCP server for Terraform AWS Provider Docs is a lightweight, Deno‑based service that exposes the latest documentation, configuration details, and real‑world examples of the HashiCorp AWS provider to AI assistants. By acting as a single, well‑defined endpoint, it allows LLMs such as Claude to fetch authoritative information about AWS resources, provider settings, and GitHub issues without needing to scrape or parse external websites. This means developers can ask their AI for up‑to‑date guidance on resource attributes, best‑practice patterns, or current bug reports and receive consistent, machine‑readable responses.

The core problem this server solves is the context gap that often plagues infrastructure‑as‑code (IaC) development. When building Terraform modules or writing complex AWS configurations, developers must repeatedly consult the provider’s documentation, search GitHub issues for known limitations, and reference example code snippets. Manually switching between a browser, IDE, and command line is time‑consuming and error‑prone. By integrating the documentation source directly into an AI workflow, the MCP server eliminates context switching: a single prompt can trigger the AI to pull in the exact resource reference, flag deprecated arguments, or surface recent issue discussions.

Key capabilities include:

  • Resource Documentation – Retrieve the full description, arguments, and examples for any AWS resource exposed by the provider.
  • Provider Configuration – Access details about global provider settings, authentication methods, and version constraints.
  • GitHub Issue Indexing – Query open, closed, or all issues from the provider’s GitHub repository to surface recent bugs or feature requests.
  • Example Snippets – Fetch ready‑to‑use Terraform code examples that demonstrate typical usage patterns for a given resource.
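A client exercises these capabilities through standard MCP `tools/call` requests over JSON-RPC. The sketch below builds such a request for resource documentation; the tool name `get_resource_docs` and its argument keys are assumptions for illustration, not this server's actual schema — consult its tool listing for the real names.

```typescript
// Sketch of an MCP tools/call request an AI client might send.
// The tool name "get_resource_docs" and its arguments are hypothetical.
interface ToolCallRequest {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments: Record<string, unknown> };
}

function buildDocsRequest(resource: string): ToolCallRequest {
  return {
    jsonrpc: "2.0",
    id: 1,
    method: "tools/call",
    params: {
      name: "get_resource_docs", // hypothetical tool name
      arguments: { resource },   // e.g. "aws_s3_bucket"
    },
  };
}

const req = buildDocsRequest("aws_s3_bucket");
// A real client would now send `req` to the server over stdio or HTTP.
```

The request/response envelope is defined by the MCP specification, so any compliant client can issue it without server-specific glue code.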

Typical use cases span the entire IaC lifecycle. A developer drafting a new module can ask the AI for the best-practice attributes of a given resource and instantly receive a concise summary. An operations engineer troubleshooting an error can request the current status of related GitHub issues, and a CI pipeline might invoke the server to validate that the Terraform code adheres to the latest provider schema. Because the server serves JSON‑structured data, downstream tooling—such as an IDE extension or a custom chatbot—can parse and render the information in context, providing inline help or automated code generation.
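To make the "downstream tooling" point concrete, here is a minimal sketch of how an IDE extension might filter such a JSON payload. The payload shape — per-argument entries with an optional `deprecated` flag — is an assumption for illustration; the server's real response format may differ.

```typescript
// Hypothetical shape of one argument entry in a resource-docs payload.
interface ArgDoc {
  name: string;
  description: string;
  deprecated?: boolean;
}

// Pull out deprecated arguments so an editor could flag them inline.
function deprecatedArgs(args: ArgDoc[]): string[] {
  return args.filter((a) => a.deprecated).map((a) => a.name);
}

const sample: ArgDoc[] = [
  { name: "acl", description: "Canned ACL to apply", deprecated: true },
  { name: "bucket", description: "Name of the bucket" },
];

const flagged = deprecatedArgs(sample);
// flagged → ["acl"]
```

Because the data is machine-readable rather than scraped HTML, this kind of filtering stays a few lines of code.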

Integration with AI workflows is straightforward: the MCP server exposes a standard set of tools that any MCP‑compliant client can call. A Claude Desktop session, for example, can trigger the “get resource docs” tool and receive the resource’s documentation as structured data. The server’s design emphasizes up‑to‑date data by pulling directly from the provider registry and GitHub, ensuring that AI assistants always work with the most recent information. This real‑time connectivity gives developers a significant productivity boost and reduces the risk of deploying outdated or misconfigured AWS resources.
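Wiring the server into a client is a matter of configuration. A hypothetical Claude Desktop entry might look like the following — the entry name, entry-point file, and Deno permission flags are assumptions; check the project’s README for the actual invocation:

```json
{
  "mcpServers": {
    "terraform-aws-provider-docs": {
      "command": "deno",
      "args": ["run", "--allow-net", "main.ts"]
    }
  }
}
```

Once registered, the client discovers the server’s tools automatically via the standard MCP handshake.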