PRIMS – Python Runtime Interpreter MCP Server

PRIMS is a lightweight, open‑source Model Context Protocol (MCP) server that gives AI assistants the ability to execute arbitrary Python code in a sandboxed, disposable environment. By exposing a single, well‑defined code‑execution tool, PRIMS removes the complexity of managing runtimes, dependencies, and security for developers who want to embed code execution in conversational agents or automated workflows.

The core problem PRIMS solves is the friction of safely running user‑supplied code within an LLM‑driven application. Traditional approaches require maintaining long‑lived virtual environments, handling dependency resolution, and ensuring isolation from the host system. PRIMS eliminates these concerns: each request spawns a fresh virtual environment, installs requested pip packages on demand, mounts optional read‑only files, streams stdout/stderr back to the client, and then tears the workspace down. This guarantees reproducibility, prevents state leakage between calls, and protects the host from malicious code.
Key capabilities include:
- Secure sandboxing: Each execution runs in a freshly created environment that is destroyed afterward, ensuring no residual data or side effects.
- Dynamic dependency management: Callers can specify pip packages to install for that run, allowing the assistant to adapt to new libraries without manual updates.
- File handling tools: a dedicated download tool fetches a remote file once per session, and companion tools let callers inspect, preview, or persist results from the workspace.
- Zero configuration: The server can be launched via MCP/stdio for local testing or packaged in Docker for production deployments, requiring no special setup beyond the standard MCP interface.
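Since PRIMS speaks standard MCP over stdio, invoking it looks like any other MCP tool call. The JSON‑RPC request below is a hedged illustration: the tool name `run_code` and the argument keys are assumptions, not confirmed by this page — consult the PRIMS documentation for the exact schema.

```python
import json

# Hypothetical tools/call request an MCP client would send over stdio.
# "run_code" and the argument names are illustrative placeholders.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "run_code",  # hypothetical tool name
        "arguments": {
            "code": "import sys; print(sys.version_info[:2])",
            "packages": ["requests"],  # installed fresh for this run
        },
    },
}
print(json.dumps(request, indent=2))
```

The `method` and envelope fields (`jsonrpc`, `id`, `params.name`, `params.arguments`) follow the MCP tools/call convention; only the tool‑specific argument shape is guessed here.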
Real‑world use cases include data science pipelines where an assistant fetches a dataset, runs analysis code, and returns visualizations; automated testing frameworks that need to execute user scripts in isolation; and educational platforms where learners submit code snippets and see immediate, sandboxed output. By integrating PRIMS into an AI workflow, developers can expose code execution as a first‑class tool in the MCP ecosystem, enabling richer interactions and seamless orchestration of external computations.
Related Servers
Netdata
Real‑time infrastructure monitoring for every metric, every second.
Awesome MCP Servers
Curated list of production-ready Model Context Protocol servers
JumpServer
Browser‑based, open‑source privileged access management
OpenTofu
Infrastructure as Code for secure, efficient cloud management
FastAPI-MCP
Expose FastAPI endpoints as MCP tools with built‑in auth
Pipedream MCP Server
Event‑driven integration platform for developers
Explore More Servers
Mcp Digitalocean Server
MCP Server: Mcp Digitalocean Server
Hot Update MCP Server
Dynamically update tools without restarting the server
Elfa MCP Server
Multi‑language implementation of the MCP protocol
Gridscale MCP Server
AI-driven infrastructure provisioning via Gridscale API
SQLite Explorer MCP Server
Safe, read‑only SQLite exploration via Model Context Protocol
Mocxykit
Developer-friendly proxy and mock middleware for frontend projects