About
A Model Context Protocol server that provides standardized, multi‑API access to AWS Trusted Advisor checks and recommendations, supporting the legacy Support API, the modern TrustedAdvisor API, and organization‑wide resource management.
Overview
The MCP Trusted Advisor Server bridges the gap between AI assistants and AWS’s Trusted Advisor service by exposing a unified, protocol‑compliant interface. It aggregates functionality from both the legacy Support API and the newer TrustedAdvisor API, allowing clients to retrieve checks, recommendations, and resource details across single or multiple AWS accounts. This consolidation removes the need for developers to write separate adapters for each API version, streamlining integration into AI workflows that rely on standardized data sources.
For developers building intelligent assistants, the server offers a consistent set of resources that mirror Trusted Advisor’s core capabilities: listing checks, fetching detailed recommendation data, filtering by service or pillar, and managing recommendation lifecycles. By presenting these operations as MCP resources and tools, the server enables AI models to query and manipulate Trusted Advisor data without handling raw AWS SDK calls or authentication logic. This abstraction is especially valuable when building cross‑account monitoring tools, compliance checkers, or cost‑optimization assistants that must aggregate insights from an entire organization.
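As a minimal client‑side sketch, the snippet below connects to the server over stdio with the official MCP TypeScript SDK, discovers its tools, and calls one of them. The launch command, the list_recommendations tool name, and its argument names are illustrative assumptions, not the server’s documented schema.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Launch the Trusted Advisor MCP server as a child process over stdio.
// The package name is a placeholder; use the launch command from the server's README.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "trusted-advisor-mcp-server"],
});

const client = new Client({ name: "ta-example-client", version: "1.0.0" });
await client.connect(transport);

// Discover what the server actually exposes rather than hard-coding names.
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));

// Hypothetical tool call: cost-optimization recommendations for EC2, in English.
const result = await client.callTool({
  name: "list_recommendations",
  arguments: { pillar: "cost_optimizing", awsService: "ec2", language: "en" },
});
console.log(result.content);

await client.close();
```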
Key features include:
- Dual‑API support: Seamlessly access both legacy and modern Trusted Advisor endpoints, ensuring backward compatibility while leveraging newer performance improvements.
- Multi‑language and region handling: Parameters for language (en, ja, fr, zh) and region allow localized recommendations and global coverage.
- Organization‑wide operations: Enterprise support plans unlock organization‑level aggregation, multi‑account resource visibility, and lifecycle management across accounts.
- Fine‑grained filtering: Clients can filter checks by service, pillar, source, status, and more, reducing payload size and improving response times.
- Lifecycle management: Update recommendation states (in progress, dismissed, resolved) directly through MCP, enabling AI agents to automate remediation workflows (see the sketch after this list).
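The filtering and lifecycle features above map closely onto operations in the modern TrustedAdvisor API. The sketch below shows, with the AWS SDK for JavaScript v3, roughly the calls such tools would wrap; exact field names and constraints (for instance, lifecycle updates generally apply to prioritized recommendations) are best verified against the AWS documentation.

```typescript
import {
  TrustedAdvisorClient,
  ListRecommendationsCommand,
  UpdateRecommendationLifecycleCommand,
} from "@aws-sdk/client-trustedadvisor";

// The TrustedAdvisor API is served from a limited set of regions, e.g. us-east-1.
const ta = new TrustedAdvisorClient({ region: "us-east-1" });

// Server-side filtering by pillar, service, and status keeps payloads small.
const { recommendationSummaries } = await ta.send(
  new ListRecommendationsCommand({
    pillar: "cost_optimizing",
    awsService: "ec2",
    status: "warning",
    maxResults: 50,
  })
);

// Move a recommendation into the "resolved" lifecycle stage -- the same
// transition an MCP lifecycle-management tool would perform. Note that
// lifecycle updates are expected to apply to prioritized recommendations.
if (recommendationSummaries?.length) {
  await ta.send(
    new UpdateRecommendationLifecycleCommand({
      recommendationIdentifier: recommendationSummaries[0].arn,
      lifecycleStage: "resolved",
      updateReason: "Instance rightsizing completed",
    })
  );
}
```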
Typical use cases include:
- AI‑driven cost optimization: A conversational assistant can request cost‑optimization recommendations for high‑cost resources, receive filtered data via MCP, and guide users toward corrective actions.
- Compliance auditing: Bots can aggregate security‑related Trusted Advisor checks across an organization, present findings to auditors, and mark issues as resolved (see the sketch after this list).
- Operational monitoring: Integrate the server into DevOps pipelines to surface performance or availability recommendations in real time, triggering alerts or automated remediation scripts.
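As a rough illustration of the compliance‑auditing flow, the snippet below pages through organization‑level recommendations with the AWS SDK for JavaScript v3; this is the kind of call the server’s organization‑wide tools would issue on a client’s behalf (Enterprise Support and a management or delegated‑administrator account are required), and the field names are assumptions to be checked against the SDK.

```typescript
import {
  TrustedAdvisorClient,
  ListOrganizationRecommendationsCommand,
} from "@aws-sdk/client-trustedadvisor";

const ta = new TrustedAdvisorClient({ region: "us-east-1" });

// Count every failing security recommendation across the organization,
// paging through results so an auditing bot sees the complete picture.
let securityFindings = 0;
let nextToken: string | undefined;
do {
  const page = await ta.send(
    new ListOrganizationRecommendationsCommand({
      pillar: "security",
      status: "error",
      nextToken,
    })
  );
  securityFindings += page.organizationRecommendationSummaries?.length ?? 0;
  nextToken = page.nextToken;
} while (nextToken);

console.log(`Organization-wide security findings: ${securityFindings}`);
```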
By exposing Trusted Advisor through the Model Context Protocol, the MCP Trusted Advisor Server offers developers a powerful, standardized gateway to AWS security, cost, and performance insights—enabling richer AI experiences without the overhead of managing multiple APIs.
Related Servers
n8n
Self‑hosted, code‑first workflow automation platform
FastMCP
TypeScript framework for rapid MCP server development
Activepieces
Open-source AI automation platform for building and deploying extensible workflows
MaxKB
Enterprise‑grade AI agent platform with RAG and workflow orchestration.
Filestash
Web‑based file manager for any storage backend
MCP for Beginners
Learn Model Context Protocol with hands‑on examples
Explore More Servers
Prometeo MCP Server
Connect your LLMs to Mexican banking and identity data
AgentNull
AI Threat Catalog and PoC Repository for Red Teaming
ROS MCP Server
Bidirectional AI integration for ROS robots
Coinmarket MCP Server
Real‑time crypto data via a custom URI scheme
eMCP Server
Extendable MCP server with auth and middleware support
MCP GitHub Server
GitHub-powered MCP server for repository data integration