About
Onyx is an MCP server that runs arbitrary code inside isolated Docker containers, supporting Python, Java, C/C++, Node.js, Rust and more. It is ideal for AI workflows and safe code execution in Claude Desktop or similar clients.
Capabilities
Onyx MCP Sandbox is a purpose‑built Model Context Protocol server that lets AI assistants such as Claude execute arbitrary code in a tightly controlled, containerised environment. By running each request inside a fresh Docker sandbox with network disabled, read‑only file systems, and strict resource limits, Onyx sharply reduces the security risks that normally accompany arbitrary code execution while still offering the full flexibility of a multi‑language runtime. This makes it ideal for power users who want to prototype algorithms, run data transformations, or validate logic directly from a chat interface without exposing their host system to potentially malicious input.
At its core, Onyx exposes a single tool that accepts source code and a language identifier. The server selects the appropriate executor—Python, Java, C/C++, Node.js, Rust, or any future language added by the community—and spins up a Docker container using the corresponding official image. Inside the sandbox, the code is compiled or interpreted, executed with a limited CPU and memory budget, and its standard output, errors, and exit status are returned to the AI client. Because all logs stream to standard output and standard error, clients can capture and display them safely, providing developers with immediate feedback on their code’s behaviour.
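A request to that tool could take the shape of a standard MCP `tools/call` message; the tool name `run_code` and the argument keys below are illustrative assumptions, not Onyx's documented schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "run_code",
    "arguments": {
      "language": "python",
      "code": "print(2 + 2)"
    }
  }
}
```

The response would carry the sandbox's captured output, errors, and exit status back to the assistant.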
Key capabilities include:
- Multi‑language support out of the box, with a clear path to add new runtimes.
- Zero‑trust sandboxing: no network, non‑root users, and temporary file systems prevent data leakage.
- Resource isolation: configurable CPU/memory caps and process limits protect the host from runaway processes.
- CI‑ready architecture: automated tests validate each executor, ensuring reliability across updates.
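To make the zero‑trust and resource‑isolation points concrete, here is a minimal sketch of how an executor might assemble a `docker run` command line with those flags. The function name, image map, and specific limit values are hypothetical assumptions for illustration, not Onyx's actual implementation; the Docker flags themselves are standard.

```python
# Hypothetical sketch: build a `docker run` invocation with
# Onyx-style isolation flags. Image map and limits are illustrative.

IMAGES = {
    "python": "python:3.12-slim",
    "node": "node:20-slim",
    "rust": "rust:1-slim",
}

def build_docker_cmd(language: str, host_src: str) -> list[str]:
    """Return a docker command with networking off, a read-only
    root file system, a non-root user, and CPU/memory/process caps."""
    image = IMAGES[language]
    return [
        "docker", "run", "--rm",
        "--network", "none",         # no network access
        "--read-only",               # read-only root file system
        "--tmpfs", "/tmp:size=64m",  # scratch space only
        "--user", "65534:65534",     # run as nobody, not root
        "--memory", "256m",          # memory cap
        "--cpus", "0.5",             # CPU budget
        "--pids-limit", "64",        # cap process count
        "-v", f"{host_src}:/work:ro",
        image,
    ]

cmd = build_docker_cmd("python", "/tmp/snippet")
```

Each flag maps directly onto one of the guarantees above: `--network none` prevents exfiltration, `--read-only` plus a tmpfs keeps the file system ephemeral, and the memory/CPU/PID caps contain runaway processes.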
Real‑world scenarios that benefit from Onyx are plentiful. A data scientist can run a quick Python script to sample a dataset directly within a chat, a software engineer can test a Java snippet for API usage, or a DevOps practitioner might prototype a Rust utility to process logs—all without leaving the AI interface. In continuous integration pipelines, Onyx can serve as a lightweight execution engine for language‑agnostic tests or code quality checks.
Integration is straightforward: once the server binary is available, any MCP‑compliant client—Claude Desktop being a primary example—can register it via a simple configuration entry. The client then exposes the tool, and developers can invoke it from prompts or tool calls. The seamless flow from chat to sandboxed execution and back to the assistant’s response empowers developers to build richer, code‑aware AI workflows that are both powerful and secure.
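For Claude Desktop, such a configuration entry goes in the `mcpServers` section of `claude_desktop_config.json`; the binary name and path shown here are illustrative assumptions:

```json
{
  "mcpServers": {
    "onyx": {
      "command": "/usr/local/bin/onyx-mcp",
      "args": []
    }
  }
}
```

After a restart, the client lists the server's tool and the assistant can invoke it directly from a conversation.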
Related Servers
MarkItDown MCP Server
Convert documents to Markdown for LLMs quickly and accurately
Context7 MCP
Real‑time, version‑specific code docs for LLMs
Playwright MCP
Browser automation via structured accessibility trees
BlenderMCP
Claude AI meets Blender for instant 3D creation
Pydantic AI
Build GenAI agents with Pydantic validation and observability
Chrome DevTools MCP
AI-powered Chrome automation and debugging
Explore More Servers
Niklauslee Frame0 Mcp Server
MCP MSSQL Server
Seamless SQL Server integration via Model Context Protocol
Chatwork MCP Server
Control Chatwork via AI with Model Context Protocol
Code Sandbox MCP
Secure Docker-based code execution for AI apps
CivicNet MCP Tools
Modular utilities for managing and extending local MCP servers
Gemini gRPC Chat Assistant
AI chatbot backend using Gemini over gRPC