
TrueRag MCP Server


Fast GraphQL policy access via Model Context Protocol

0 stars · 0 views
Updated Dec 28, 2024

About

A Python‑based MCP server that connects to a TrueRag GraphQL API, enabling quick policy data retrieval for AI agents such as Claude Desktop.

Capabilities

  • Resources: access data sources
  • Tools: execute functions
  • Prompts: pre-built templates
  • Sampling: AI model interactions

Overview

The Ad Veritas MCP Server for TrueRag bridges the gap between AI assistants and a specialized GraphQL policy service. By exposing the policies API through the Model Context Protocol, it lets Claude and other MCP‑compatible assistants retrieve, query, and manipulate policy data without leaving the conversation. This eliminates the need for developers to build custom adapters or write raw GraphQL queries, streamlining integration into existing AI workflows.

The server is built on the official Python MCP SDK, which it uses to communicate with the TrueRag GraphQL endpoint. Once configured with an API key and endpoint, the server exposes a single resource, shipping-policies, that can be invoked via standard MCP commands. Integration is as simple as adding a short JSON snippet to the assistant's configuration file, after which any command carrying the configured prefix triggers a request to the GraphQL API. The assistant can then return policy details, calculate shipping costs, or validate restrictions directly within the chat.
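As a rough illustration of what that JSON snippet might look like, here is a minimal Claude Desktop `mcpServers` entry. The module name, environment variable names, and endpoint URL below are assumptions for the sketch, not values taken from the project:

```json
{
  "mcpServers": {
    "truerag": {
      "command": "python",
      "args": ["-m", "truerag_mcp_server"],
      "env": {
        "TRUERAG_API_KEY": "<your-api-key>",
        "TRUERAG_ENDPOINT": "https://example.com/graphql"
      }
    }
  }
}
```

Restarting the assistant after editing its configuration file is typically required for a new MCP server entry to be picked up.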

Key capabilities include:

  • Secure authentication through environment‑based API keys, keeping credentials out of source code.
  • Automatic query generation, allowing the assistant to send complex GraphQL queries without exposing the underlying schema.
  • Context‑aware responses, where the server can incorporate conversation history into query parameters, enabling dynamic policy lookups.
  • Extensible resource model, making it straightforward to add new GraphQL endpoints or additional tools later.
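The environment-based authentication mentioned above might be sketched as follows. The variable name `TRUERAG_API_KEY` and the bearer-token header format are assumptions for illustration, not details confirmed by the project:

```python
import os


def build_headers() -> dict:
    """Build HTTP headers for a GraphQL request, reading the API key
    from the environment so credentials never appear in source code.
    The env var name and auth scheme are hypothetical."""
    api_key = os.environ.get("TRUERAG_API_KEY")
    if not api_key:
        raise RuntimeError("TRUERAG_API_KEY is not set")
    return {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
```

Failing fast when the key is missing surfaces configuration mistakes at startup rather than as opaque authorization errors mid-conversation.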

Typical use cases span e‑commerce platforms, customer support bots, and logistics planners. For example, a customer might ask for the shipping policy of a specific product; the assistant can fetch the relevant policy in real time, compute estimated delivery times, and present the result—all powered by a single MCP call. Similarly, internal tools can query policy compliance for batch shipments or audit purposes without manual API integration.
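A single MCP call like the one described could translate into a GraphQL request body along these lines. The query shape and field names below are invented for the sketch and do not reflect the real TrueRag schema:

```python
import json


def shipping_policy_request(product_id: str) -> str:
    """Build a JSON-encoded GraphQL request body for a shipping-policy
    lookup. The operation and field names are hypothetical."""
    query = """
    query ShippingPolicy($productId: ID!) {
      shippingPolicy(productId: $productId) {
        name
        deliveryDays
        restrictions
      }
    }
    """
    return json.dumps({"query": query, "variables": {"productId": product_id}})
```

Using GraphQL variables rather than interpolating the product ID into the query string keeps the request safe against injection and lets the same query text be reused across lookups.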

What sets this MCP server apart is its zero‑code configuration for end users. Developers need only supply environment variables and a tiny JSON tweak to their assistant’s config; the heavy lifting—GraphQL communication, error handling, and response formatting—is handled by the server. This approach promotes rapid prototyping, reduces boilerplate, and ensures that policy data remains up‑to‑date across all AI interactions.