About
Radare2 MCP Server enables seamless binary analysis through AI agents by exposing radare2 functionality via the Model Context Protocol. It supports CLI, plugin, and MCP modes with fine‑grained tool configuration, sandboxing, and optional raw command access.
Capabilities
Radare2 MCP Server bridges the powerful reverse‑engineering capabilities of radare2 with modern AI assistants. By exposing radare2’s functionality through the Model Context Protocol, developers can let language models query binaries, generate analysis reports, or even execute custom scripts without leaving their preferred IDE or chat interface. This removes the need for manual command‑line interaction and turns binary analysis into a conversational, programmable experience.
The server is written entirely in C and leverages radare2’s native libraries for maximum performance. It can run as a standalone CLI tool, as an r2 plugin, or, most importantly, as an MCP server that communicates over the standard MCP stdio transport. This flexibility means it can be invoked from any LLM‑powered workflow—Claude Desktop, VS Code Copilot Chat, Zed AI, or any custom client that supports MCP. The server supports both local and remote radare2 sessions via r2pipe, allowing analysis of binaries on a developer’s workstation or in isolated sandboxes.
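As an illustration, MCP clients that use the common `mcpServers` configuration format (such as Claude Desktop) can typically register a stdio server with an entry like the one below. This is a sketch: the binary name `r2mcp` and the empty argument list are assumptions and may differ for your installation.

```json
{
  "mcpServers": {
    "radare2": {
      "command": "r2mcp",
      "args": []
    }
  }
}
```

Once registered, the client spawns the server process and exchanges MCP messages with it over stdin/stdout.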
Key capabilities include:
- Read‑only mode, sandbox lock, and tool restrictions to keep the analysis environment safe when exposed over a network or to untrusted agents.
- Fine‑grained tool configuration so users can expose only the commands or scripts they need.
- Direct stdin/stdout communication, giving agents a low‑latency, bidirectional channel to send radare2 commands or receive structured JSON results.
- Optional raw access for advanced users to run arbitrary r2 commands or JavaScript scripts when the higher‑level abstractions are insufficient.
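The stdio channel carries newline‑delimited JSON‑RPC 2.0 messages, so an agent (or a quick script) can drive the server directly. The sketch below shows only the message framing; the tool name `runCommand` and the `afl` (list functions) argument are illustrative assumptions, not necessarily the tool names this server exposes.

```python
import json

def make_request(req_id, method, params=None):
    """Build one newline-delimited JSON-RPC 2.0 message,
    as used by the MCP stdio transport."""
    msg = {"jsonrpc": "2.0", "id": req_id, "method": method}
    if params is not None:
        msg["params"] = params
    return json.dumps(msg) + "\n"

# Hypothetical tools/call request asking a raw-command tool
# to run radare2's `afl` (list analyzed functions).
req = make_request(1, "tools/call",
                   {"name": "runCommand", "arguments": {"command": "afl"}})
```

A real client would write `req` to the server’s stdin and read the matching response line from its stdout, keying responses to requests by `id`.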
Typical use cases include security researchers automating vulnerability discovery, software developers asking an AI to explain the control flow of a compiled library, and educators building interactive reverse‑engineering tutorials. In an IDE, the assistant can fetch function signatures, disassemble code on demand, or even suggest patch snippets—all without leaving the chat pane. In a CI pipeline, an LLM could run radare2 analyses on new releases and summarize findings in natural language.
By integrating radare2 into the AI ecosystem, Radare2 MCP Server turns static binary analysis into an interactive, context‑aware service. Developers gain a single point of entry for all reverse‑engineering tasks, while AI assistants can deliver instant, actionable insights directly within their native workflows.