MCPSERV.CLUB
dylibso

Mcpx Py

MCP Server

Python client for MCP-run LLMs

Stale (55)
24 stars
1 view
Updated Sep 6, 2025

About

Mcpx Py is a Python library and CLI tool that lets developers interact with LLMs via the MCP‑run platform, supporting multiple providers (Claude, OpenAI, Gemini, Ollama) and structured responses.

Capabilities

Resources
Access data sources
Tools
Execute functions
Prompts
Pre-built templates
Sampling
AI model interactions

Overview

Mcpx‑Py is a Python client that bridges the Model Context Protocol (MCP) with popular large language model providers. It solves a common pain point for developers: the need to write custom adapters for each LLM service. By exposing a unified MCP interface, Mcpx‑Py lets an AI assistant—such as Claude, GPT‑4o, Gemini, or a locally hosted Ollama model—interact with external tools and data sources without having to handle provider‑specific authentication, request formatting, or response parsing.

The server works by wrapping any model supported through PydanticAI behind an MCP‑compliant endpoint. Developers instantiate a chat client with the desired model name, and the library automatically negotiates authentication using environment variables or a session ID generated through the MCP‑run platform. Once connected, the assistant can invoke arbitrary tools (JavaScript evaluation, REST calls, or custom scripts) and receive structured responses as Pydantic models. This abstraction lets developers focus on business logic instead of plumbing details.
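The structured-response idea can be sketched with the standard library alone. The class and function names below are illustrative, not Mcpx‑Py's actual API: a raw JSON payload (as a tool or model might return) is validated into a typed record, much as the library validates responses into Pydantic models.

```python
import json
from dataclasses import dataclass

# Hypothetical typed result: stands in for a Pydantic model that the
# library would populate from an LLM or tool response.
@dataclass
class CitySummary:
    name: str
    population: int

def parse_response(raw: str) -> CitySummary:
    """Validate a raw JSON payload into a typed, field-checked result."""
    data = json.loads(raw)
    return CitySummary(name=str(data["name"]), population=int(data["population"]))

result = parse_response('{"name": "Lisbon", "population": 545000}')
print(result.population)  # 545000
```

The payoff is the same as in the real library: downstream code works with attributes of a known type rather than poking at untyped dictionaries.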

Key capabilities include:

  • Provider Agnosticism – Switch between Anthropic, OpenAI, Gemini, Ollama, or any custom endpoint with a single line of code.
  • Structured Output – Specify a Pydantic model to receive typed, validated data from the LLM.
  • Tool Execution – The MCP server exposes a registry of tools that can be called directly from the assistant, enabling dynamic data retrieval or computation.
  • Command‑line Utility – A lightweight CLI allows quick experimentation: chat with a model, list available tools, or evaluate JavaScript snippets.
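The tool-execution capability boils down to a registry of named callables that the assistant can invoke by name. Here is a minimal sketch of that pattern; the registry, decorator, and tool names are illustrative, since Mcpx‑Py's actual registry is supplied by the MCP‑run platform.

```python
from typing import Any, Callable

# Illustrative tool registry: maps a tool name to the function that
# implements it, the way an MCP server routes tool calls.
TOOLS: dict[str, Callable[..., Any]] = {}

def tool(name: str) -> Callable:
    """Decorator that registers a function under a tool name."""
    def register(fn: Callable) -> Callable:
        TOOLS[name] = fn
        return fn
    return register

@tool("add")
def add(a: float, b: float) -> float:
    return a + b

def invoke(name: str, **kwargs: Any) -> Any:
    """Dispatch a named tool call, as an assistant would via MCP."""
    if name not in TOOLS:
        raise KeyError(f"unknown tool: {name}")
    return TOOLS[name](**kwargs)

print(invoke("add", a=2, b=3))  # 5
```

Because dispatch goes through the registry rather than direct calls, new tools can be added without the caller (here, the LLM) knowing anything beyond a name and its arguments.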

Real‑world use cases abound. A data analyst can query a database through an MCP tool, pass the results to Claude for natural‑language summarization, and receive a typed summary model ready for downstream reporting. A DevOps engineer can trigger infrastructure scripts via the tool registry, while an AI assistant writes and tests code snippets in real time. Because Mcpx‑Py handles session management, API keys, and local model deployment (Ollama or Llamafile) behind the scenes, teams can rapidly prototype hybrid workflows that combine cloud‑based and on‑premises LLMs.
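The analyst workflow above can be sketched end to end with the LLM stubbed out: one function stands in for an MCP tool that queries a database, another for a model call that returns a typed summary. All names here are hypothetical, not part of Mcpx‑Py.

```python
from dataclasses import dataclass

def query_sales() -> list[dict]:
    """Stand-in for an MCP tool that queries a database."""
    return [{"region": "EU", "total": 1200}, {"region": "US", "total": 3400}]

# Typed summary the pipeline hands to downstream reporting,
# playing the role of a Pydantic result model.
@dataclass
class SalesSummary:
    regions: int
    grand_total: int

def summarize(rows: list[dict]) -> SalesSummary:
    """Stand-in for an LLM call that returns a structured summary."""
    return SalesSummary(regions=len(rows),
                        grand_total=sum(r["total"] for r in rows))

summary = summarize(query_sales())
print(summary.grand_total)  # 4600
```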

In short, Mcpx‑Py turns the MCP server into a versatile bridge that unifies model access, tool invocation, and structured output. It empowers developers to build sophisticated AI‑augmented applications without wrestling with provider quirks, making the integration of LLMs into production pipelines faster and more reliable.