About
Mcpterm is an MCP server that offers persistent, TUI-compatible terminal sessions. It lets AI agents run commands in a shared shell context and interact with text‑based interfaces like Vim or Python REPLs.
Overview
mcpterm is a lightweight Model Context Protocol (MCP) server that delivers fully stateful, terminal‑user‑interface (TUI) sessions to AI assistants. By exposing two core tools, one for executing commands and one for reading the rendered screen, it lets an assistant like Claude launch commands, maintain working directories, and interact with full‑screen applications such as Vim or a Python REPL. This capability bridges the gap between an AI’s high‑level instructions and low‑level shell interactions, enabling developers to build sophisticated, end‑to‑end workflows that combine natural language reasoning with hands‑on terminal manipulation.
The server solves a common pain point for developers who wish to automate or assist with command‑line tasks: most MCP tools are stateless, meaning each invocation starts a fresh shell. mcpterm keeps the session alive across calls, preserving environment variables, current working directories, and any open processes. This statefulness is crucial for realistic development scenarios—think of a Docker workflow that requires multiple build and run steps, or an interactive debugging session where the assistant must preserve context between queries.
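The sketch below shows what that statefulness looks like from an MCP client's point of view, using the official MCP Python SDK. The launch command and the tool name `run_command` are assumptions made for illustration; mcpterm's actual command and tool names may differ, so list the tools first and adjust accordingly.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Assumed launch command for illustration; check the mcpterm README for the real one.
server = StdioServerParameters(command="mcpterm", args=[])

async def main():
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover the tools the server actually exposes.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            # "run_command" is a hypothetical tool name used for illustration.
            # Both calls hit the same persistent shell, so the cd carries over.
            await session.call_tool("run_command", {"command": "cd /tmp"})
            result = await session.call_tool("run_command", {"command": "pwd"})
            print(result.content)  # expected to reflect /tmp if state persists

asyncio.run(main())
```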
Key features of mcpterm include:
- Persistent terminal sessions: Commands run in the same environment, allowing sequential operations without re‑initializing the shell each time.
- Dual toolset: one tool executes arbitrary commands, while the other captures the full‑screen output of TUI programs, making it possible to control editors or interactive scripts.
- Customizable key bindings: The server ships a comprehensive mapping of control sequences (e.g., Ctrl+C, Ctrl+X) that the assistant can inject into TUI sessions, ensuring smooth interaction with programs that rely on keyboard shortcuts.
- Seamless integration: By registering mcpterm in the Claude Desktop configuration, developers can instantly expose its tools to the assistant without additional middleware (a sample configuration sketch follows this list).
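As a rough sketch of that registration, a Claude Desktop entry for mcpterm might look like the following. The surrounding `mcpServers` structure is the standard claude_desktop_config.json format; the `command` and `args` values are assumptions, since the real launch command depends on how mcpterm is installed.

```json
{
  "mcpServers": {
    "mcpterm": {
      "command": "mcpterm",
      "args": []
    }
  }
}
```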
In real‑world use cases, mcpterm shines when an AI needs to write code, test it, and iterate—all within a single conversational thread. For example, an assistant can draft a Dockerfile in Vim, build the image, launch a container, and then open a Python REPL inside that container to execute tests—all while maintaining the same terminal context. This eliminates the friction of manually switching between windows or re‑entering commands, leading to a more fluid and productive developer experience.
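One hop of that workflow could look like the sketch below, reusing the `session` object from the earlier example. The tool names `run_command`, `send_keys`, and `get_screen` are purely illustrative placeholders for whichever tools mcpterm actually exposes for command execution, keystroke injection, and screen capture.

```python
from mcp import ClientSession

# All tool names below are hypothetical placeholders, not confirmed mcpterm APIs.
async def draft_dockerfile(session: ClientSession) -> None:
    # Open the Dockerfile in Vim inside the persistent terminal session.
    await session.call_tool("run_command", {"command": "vim Dockerfile"})

    # Enter insert mode, type the first line, then escape, write, and quit.
    await session.call_tool("send_keys", {"keys": "iFROM python:3.12-slim"})
    await session.call_tool("send_keys", {"keys": "<Esc>:wq\n"})

    # Capture the rendered screen so the assistant can verify what Vim displayed.
    screen = await session.call_tool("get_screen", {})
    print(screen.content)
```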
Overall, mcpterm stands out as a proof‑of‑concept that demonstrates how stateful terminal interactions can be harnessed within the MCP ecosystem. Its straightforward tool set, combined with robust key‑binding support and easy integration into existing AI workflows, makes it an invaluable asset for developers looking to embed deep shell capabilities into conversational agents.
Related Servers
n8n
Self‑hosted, code‑first workflow automation platform
FastMCP
TypeScript framework for rapid MCP server development
Activepieces
Open-source AI automation platform for building and deploying extensible workflows
MaxKB
Enterprise‑grade AI agent platform with RAG and workflow orchestration
Filestash
Web‑based file manager for any storage backend
MCP for Beginners
Learn Model Context Protocol with hands‑on examples
Explore More Servers
Exa Web Search MCP Server
Real-time web search and content extraction for Zed
Custom GitLab MCP Server
Seamless GitLab integration for AI assistants
Bash MCP Server
Minimalistic shell-based Model Context Protocol server
Agents MCP Usage Demo & Benchmarking Platform
LLM Agent framework integration and evaluation with MCP servers
LibSQL MCP Server
MCP interface for LibSQL databases
Homebrew Legacy Server
Legacy Homebrew repository split into core formulae and package manager