About
A secure, Docker‑based MCP server that executes Python code and streams results via SSE. Ideal for LLM clients needing isolated, scalable code execution.
Overview
The MCP-WASMPython-Runner is a lightweight, Docker‑based MCP server that enables AI assistants to execute arbitrary Python code securely through the Model Context Protocol. By exposing a Server‑Sent Events (SSE) endpoint, it allows LLM clients to stream results in real time, giving developers a responsive interface for code evaluation and data manipulation. The server is built around WebAssembly (WASM) to sandbox execution, preventing malicious or accidental system access while still providing full Python functionality.
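To illustrate what "streaming results in real time" looks like on the wire, here is a minimal sketch of parsing a raw SSE stream into events. The event names (`output`, `done`) and payload layout are assumptions for illustration, not the server's documented format:

```python
def parse_sse(stream_text: str):
    """Split a raw Server-Sent Events stream into (event, data) pairs.

    SSE frames are separated by blank lines; each frame carries optional
    `event:` and one or more `data:` fields. Event names here are
    hypothetical -- check the server's actual stream format.
    """
    events = []
    for block in stream_text.strip().split("\n\n"):
        event, data_lines = "message", []  # "message" is the SSE default event type
        for line in block.splitlines():
            if line.startswith("event:"):
                event = line[len("event:"):].strip()
            elif line.startswith("data:"):
                data_lines.append(line[len("data:"):].strip())
        events.append((event, "\n".join(data_lines)))
    return events
```

A client would feed each chunk of the HTTP response body through a parser like this, surfacing partial output as it arrives rather than waiting for the run to finish.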
Problem Solved
Many AI assistants lack the ability to run user‑provided code in a controlled environment. Traditional approaches either rely on unsafe shell execution or require complex deployment pipelines. The MCP-WASMPython-Runner addresses these gaps by offering a ready‑to‑deploy container that isolates execution, mitigates security risks, and integrates natively with MCP clients. This eliminates the need for custom runtime setups or manual sandbox configuration.
Core Value to Developers
For developers building AI‑augmented workflows, the server delivers a single point of contact for running Python scripts. Whether it’s data preprocessing, model inference, or simple arithmetic, the MCP client can send a request and receive streamed output without leaving its own context. This tight coupling means developers can prototype logic, debug directly in the assistant, and iterate quickly—all while keeping code execution contained.
Key Features
- WASM Sandbox: Runs Python in a WebAssembly environment, ensuring isolation from the host system and limiting resource usage.
- Streaming Output: Uses Server‑Sent Events (SSE) to push results incrementally, improving interactivity for long‑running tasks.
- Dockerized Deployment: A single containerized setup brings the server online with minimal friction, making it suitable for both local development and production.
- Hot Reload in Development: The development target watches for file changes, allowing rapid iteration during development cycles.
- Extensible MCP Interface: Supports the standard MCP resource, tool, prompt, and sampling schemas, enabling seamless integration with existing AI assistants.
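Since the server speaks standard MCP, tool invocations travel as JSON-RPC 2.0 `tools/call` requests. The sketch below builds one such request; the tool name `run_python` and its `code` argument are assumptions for illustration, as the project's actual tool schema isn't reproduced here:

```python
import json
from itertools import count

# Monotonic request IDs, as required by JSON-RPC 2.0.
_ids = count(1)

def make_tool_call(tool_name: str, arguments: dict) -> str:
    """Serialize an MCP `tools/call` request (JSON-RPC 2.0 envelope)."""
    request = {
        "jsonrpc": "2.0",
        "id": next(_ids),
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }
    return json.dumps(request)

# Hypothetical tool name and argument shape:
message = make_tool_call("run_python", {"code": "print(2 + 2)"})
```

In practice an MCP client library handles this envelope for you; the point is that nothing server‑specific is needed beyond the tool name and arguments.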
Use Cases
- Interactive Coding Assistants: Users can write Python snippets in chat, have them executed instantly, and receive live feedback.
- Data Pipeline Orchestration: AI assistants can trigger data transformations, export results, and feed them into downstream services.
- Educational Tools: Students can experiment with code within an AI tutor, seeing immediate results without risking their local machine.
- Rapid Prototyping: Developers can test algorithmic ideas on the fly, using the assistant to generate code and then run it in a safe sandbox.
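The use cases above all hinge on running untrusted snippets safely. As a toy illustration of the idea (the real server's isolation comes from its WASM runtime, not from this technique), here is a sketch that evaluates a snippet with a restricted set of builtins and captures its output:

```python
import io
import contextlib

def run_sandboxed(code: str) -> str:
    """Run a snippet with a whitelisted builtins dict and capture stdout.

    Conceptual illustration only: restricting builtins is NOT a real
    security boundary; the server relies on WASM for actual isolation.
    """
    safe_globals = {"__builtins__": {"print": print, "range": range, "len": len}}
    buf = io.StringIO()
    with contextlib.redirect_stdout(buf):
        exec(code, safe_globals)
    return buf.getvalue()
```

The captured string is what a server like this would stream back to the client chunk by chunk.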
Integration with AI Workflows
The server exposes standard MCP endpoints, so any LLM client that supports MCP can invoke it by simply adding the URL to its tool list. Once registered, the assistant can pass code blocks as tool calls, receive streamed execution logs, and handle errors gracefully—all without leaving the conversational context. This tight integration streamlines the development cycle and enhances user experience by keeping everything in a single, coherent interface.
Unique Advantages
Unlike generic code execution services, the MCP-WASMPython-Runner is specifically tailored for MCP clients, ensuring compatibility out of the box. Its WASM sandbox offers a higher security posture than traditional Docker‑only approaches, while still delivering full Python capabilities. The combination of Docker deployment and hot reload support makes it an ideal choice for both production deployments and rapid prototyping environments.