About
A lightweight MCP implementation providing tool discovery, execution, and data retrieval for IOG personas and products. Designed to integrate seamlessly with Windsurf and extensible via a tool registry.
Capabilities

The IOG Model Context Protocol (MCP) Server is a lightweight, standalone implementation designed to bridge AI assistants with external data and functionality. By exposing a standardized set of MCP endpoints, it allows tools such as calculators, web search, and domain‑specific data retrieval to be discovered and invoked by AI clients like Claude or other MCP‑compatible assistants. This eliminates the need for custom integrations and gives developers a single, consistent interface to add new capabilities.
At its core, the server hosts a tool registry that includes ready‑made utilities for performing arithmetic calculations and querying web search results. Beyond these generic tools, it also offers data retrieval functions that tap into locally stored JSON files containing persona and product information. Each tool is described with a comprehensive JSON schema, ensuring that clients can validate parameters before execution and receive predictable responses. The standard MCP endpoints for discovery and invocation make it trivial to list available tools or run a specific one with the required arguments.
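To give a sense of what a schema-driven registry entry can look like, here is a minimal sketch in TypeScript. The `ToolDefinition` shape, the `calculate` tool name, and the `execute` handler are illustrative assumptions, not the server's actual registry format:

```typescript
// Hypothetical shape of a registry entry; the real server's field names may differ.
interface ToolDefinition {
  name: string;
  description: string;
  inputSchema: object;                                   // JSON Schema describing the arguments
  execute: (args: Record<string, unknown>) => Promise<unknown>;
}

// Example: a simple arithmetic tool similar to the calculator utility described above.
const calculateTool: ToolDefinition = {
  name: "calculate",
  description: "Evaluate a basic arithmetic expression",
  inputSchema: {
    type: "object",
    properties: {
      expression: { type: "string", description: "e.g. '2 + 2 * 10'" },
    },
    required: ["expression"],
  },
  async execute(args) {
    // A real implementation would use a safe expression parser rather than evaluating raw input.
    const expression = String(args.expression);
    return { result: Function(`"use strict"; return (${expression});`)() };
  },
};
```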
Developers benefit from this architecture in several ways. First, the server can be embedded into existing applications—such as Windsurf—without requiring a full AI stack. The provided integration example demonstrates how to register the MCP server with Windsurf, enabling downstream services to call tools through a familiar API. Second, because the tool definitions are JSON‑schema driven, adding new capabilities is as simple as extending a JavaScript object; the server automatically updates the discovery endpoint. This modularity encourages rapid prototyping and experimentation.
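Building on that, the sketch below shows how such a registry object could back the discovery and invocation routes, so that adding an entry to the object is immediately reflected in discovery. The use of Express, the `/tools` paths, and the port are assumptions made for illustration; the project's own integration example and endpoint layout may differ:

```typescript
import express from "express";

// ToolDefinition and calculateTool come from the registry-entry sketch above.
// Adding a new entry here is all that is needed for it to appear in discovery.
const tools: Record<string, ToolDefinition> = {
  calculate: calculateTool,
  // web_search: webSearchTool, ...
};

const app = express();
app.use(express.json());

// Discovery: list every registered tool with its JSON schema (illustrative path).
app.get("/tools", (_req, res) => {
  res.json(
    Object.values(tools).map(({ name, description, inputSchema }) => ({
      name,
      description,
      inputSchema,
    }))
  );
});

// Invocation: check that the tool exists, then run it with the supplied arguments.
app.post("/tools/:name", async (req, res) => {
  const tool = tools[req.params.name];
  if (!tool) {
    res.status(404).json({ error: "unknown tool" });
    return;
  }
  res.json(await tool.execute(req.body ?? {}));
});

app.listen(3000);
```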
Typical use cases include building conversational agents that need to answer product‑related queries, fetch persona details for role‑playing scenarios, or perform quick calculations on the fly. In an e‑commerce setting, a customer support chatbot could retrieve product specifications from the local product JSON file while simultaneously calculating discount totals. In a research environment, an AI assistant could pull persona attributes to tailor responses or execute web searches for up‑to‑date information. Because the MCP server exposes a clean REST interface, these scenarios can be integrated into existing CI/CD pipelines or microservice architectures with minimal friction.
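As a rough picture of the e‑commerce example, a client could call the server over HTTP along these lines; the base URL, endpoint paths, and the `calculate` tool name are hypothetical, since the listing above does not spell them out:

```typescript
// Hypothetical client-side usage; endpoint paths and tool names are assumptions.
const BASE_URL = "http://localhost:3000";

async function runExample(): Promise<void> {
  // 1. Discover what the server can do.
  const tools = await fetch(`${BASE_URL}/tools`).then((r) => r.json());
  console.log("available tools:", tools.map((t: { name: string }) => t.name));

  // 2. Invoke the calculator to work out a discounted total (e.g. 15% off $120).
  const discounted = await fetch(`${BASE_URL}/tools/calculate`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ expression: "120 * (1 - 0.15)" }),
  }).then((r) => r.json());
  console.log("discounted total:", discounted);
}

runExample().catch(console.error);
```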
What sets the IOG MCP Server apart is its focus on simplicity and extensibility. It ships with a fully functional example for Windsurf, demonstrates clear JSON schema usage, and follows the MCP standard rigorously. Developers who are already familiar with MCP concepts will find that this server reduces boilerplate, accelerates tool integration, and provides a reliable foundation for building sophisticated AI‑powered workflows.
Related Servers
MarkItDown MCP Server
Convert documents to Markdown for LLMs quickly and accurately
Context7 MCP
Real‑time, version‑specific code docs for LLMs
Playwright MCP
Browser automation via structured accessibility trees
BlenderMCP
Claude AI meets Blender for instant 3D creation
Pydantic AI
Build GenAI agents with Pydantic validation and observability
Chrome DevTools MCP
AI-powered Chrome automation and debugging
Explore More Servers
MindMesh MCP Server
Quantum‑inspired swarm of Claude LLMs for coherent reasoning
Marshal MCP Vulnerability Scan Server
Automated vulnerability scanning via MCP and Marshal integration
MCP FHIR Integration Server
Seamless MCP to FHIR resource management
Mix Server
Fast, lightweight local time and browser opener service
Unix Manual Server
Instant Unix command docs in Claude chat
Mcp Calculator Server
Simple MCP-powered calculator with tool‑calling exploration