hileamlakB

PRIMS – Python Runtime Interpreter MCP Server


Secure, isolated Python execution via a single MCP tool

18 stars · Updated 25 days ago

About

PRIMS is an open‑source MCP server that lets LLM agents execute arbitrary Python code in a throw‑away sandbox. Each request spawns a fresh virtual environment, installs requested pip packages, mounts read‑only files, and streams stdout/stderr back to the client.

Capabilities

  • Resources – access data sources
  • Tools – execute functions
  • Prompts – pre-built templates
  • Sampling – AI model interactions


PRIMS is a lightweight, open‑source Model Context Protocol (MCP) server that gives AI assistants the ability to execute arbitrary Python code in a sandboxed, disposable environment. By exposing a single, well‑defined code‑execution tool, PRIMS removes the complexity of managing runtimes, dependencies, and security for developers who want to embed code execution into conversational agents or automated workflows.

The core problem PRIMS solves is the friction of safely running user‑supplied code within an LLM‑driven application. Traditional approaches require maintaining long‑lived virtual environments, handling dependency resolution, and ensuring isolation from the host system. PRIMS eliminates these concerns by spinning up a fresh virtual environment for every invocation, installing requested pip packages on demand, mounting optional read‑only files, and then tearing the workspace down. This guarantees reproducibility, prevents state leakage between calls, and protects the host from malicious code.

Key capabilities include:

  • Secure sandboxing: Each execution runs in a freshly created environment that is destroyed afterward, ensuring no residual data or side effects.
  • Dynamic dependency management: Callers can specify pip packages to install for that run, allowing the assistant to adapt to new libraries without manual updates.
  • File handling tools: a download tool fetches a remote file once per session, while companion utilities provide convenient ways to inspect, preview, or persist results.
  • Zero configuration: The server can be launched via MCP/stdio for local testing or packaged in Docker for production deployments, requiring no special setup beyond the standard MCP interface.
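Taken together, the capabilities above suggest what a single tool invocation might carry: the code to run, packages to install for that run, and files to mount read‑only. The tool and argument names below are illustrative assumptions, not the server's documented schema.

```python
# Hypothetical payload for one PRIMS code-execution call.
# Field names are assumptions for illustration only.
request = {
    "tool": "run_code",  # assumed name of the single exposed tool
    "arguments": {
        # The snippet the assistant wants executed in the sandbox.
        "code": "import pandas as pd\nprint(pd.__version__)",
        # Installed into the fresh virtual environment for this run only.
        "packages": ["pandas"],
        # Mounted read-only inside the disposable workspace.
        "files": [{"url": "https://example.com/data.csv", "path": "data.csv"}],
    },
}

# A response would plausibly echo back captured streams:
response = {"stdout": "2.2.0\n", "stderr": ""}
```

Since the environment is rebuilt per call, the `packages` list must name everything the snippet imports; nothing carries over from previous invocations.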

Real‑world use cases span data science pipelines where an assistant fetches a dataset, runs analysis code, and returns visualizations; automated testing frameworks that need to execute user scripts in isolation; or educational platforms where learners can submit code snippets and see immediate, sandboxed output. By integrating PRIMS into an AI workflow, developers can expose code execution as a first‑class tool in the MCP ecosystem, enabling richer interactions and seamless orchestration of external computations.