About
Lisply is a Model Context Protocol middleware that connects LLMs to Lisp‑based REPL environments, enabling AI assistants such as Claude Desktop to generate, evaluate, and manage Lisp code across multiple runtimes through a simple Docker‑enabled setup.
Capabilities

Lisply is a lightweight Model Context Protocol (MCP) middleware that bridges large language models with Lisp‑based and Lisp‑like development environments. By exposing a simple REPL‑style interface over HTTP, the server allows an AI assistant such as Claude Desktop to generate, evaluate, and manipulate arbitrary Lisp code on the fly. This capability turns an LLM into a powerful neuro‑symbolic programming partner, capable of creating functions, compiling files, loading libraries, and running tests—all within the same conversational session.
The primary problem Lisply solves is the disconnect between natural‑language reasoning and symbolic execution. Developers who work in Common Lisp, Emacs Lisp, or other dialects often need to prototype logic that is more easily expressed as code than as prose. Conversely, AI assistants excel at generating high‑level ideas but lack direct access to the runtime state of a Lisp system. Lisply fills this gap by translating MCP messages into REPL commands and returning results in a structured format that the assistant can interpret and display. This tight integration reduces context switching, speeds up iterative development, and enables AI‑driven debugging.
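As a rough illustration of the exchange the middleware brokers, the forms below are ordinary Common Lisp that an assistant might send one at a time; the actual wire format and endpoint names depend on the Lisply setup and are not shown here.

```lisp
;; Forms an assistant might submit for evaluation, one at a time.
;; Each value (or error report) is relayed back into the conversation.

(defun mean (numbers)
  "Arithmetic mean of a list of numbers."
  (/ (reduce #'+ numbers) (length numbers)))
;; => MEAN

(mean '(1 2 3 4))
;; => 5/2

(mean '())
;; => signals DIVISION-BY-ZERO, which the assistant can see and correct
```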
Key features of Lisply include:
- Multi‑dialect support – any Lisp environment that can implement the Lisply protocol becomes a first‑class MCP server, from Common Lisp to Emacs Lisp and beyond.
- Full REPL access – the server forwards evaluation requests, allowing the AI to run arbitrary expressions and receive back values, errors, or compiled bytecode.
- Project‑level manipulation – beyond single expressions, the server can load entire files, compile modules, and invoke test suites, giving the assistant a true development workflow (see the sketch after this list).
- Extensible via MCP – additional capabilities such as resource management, prompt handling, or custom sampling can be added without changing the core middleware.
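To make the project‑level workflow concrete, the following are the kinds of standard Common Lisp and ASDF forms an assistant could push through the same evaluation channel; the file paths and the "my-project" system name are placeholders, not anything defined by Lisply.

```lisp
;; Hypothetical project workflow driven entirely over the REPL channel.
;; Paths and the system name are illustrative placeholders.

(load "src/utilities.lisp")          ; load one source file into the running image
(compile-file "src/geometry.lisp")   ; compile a module
(asdf:load-system "my-project")      ; load a whole ASDF system
(asdf:test-system "my-project")      ; run its test suite; results flow back to the AI
```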
Real‑world use cases span a wide spectrum. A mechanical engineer can ask an AI to generate CAD‑generation scripts in Lisp, compile them on the server, and immediately see visual feedback. A Lisp enthusiast might use Lisply to experiment with new language features, letting the assistant suggest optimizations or refactorings. In educational settings, students can interact with an AI tutor that evaluates their code snippets in real time, providing instant hints and corrections.
Integrating Lisply into AI workflows is straightforward: an MCP‑capable client configures a server entry pointing to the Lisply executable. Once connected, every prompt that includes Lisp code is automatically routed to the server’s REPL, and responses are fed back into the conversation. This seamless loop turns a static language model into an interactive development partner, enabling rapid prototyping, automated testing, and neuro‑symbolic reasoning across diverse Lisp ecosystems.
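The exact server entry varies by client; as a hedged sketch, a Claude Desktop‑style mcpServers block might look roughly like the following, where the command path is a placeholder for wherever the Lisply wrapper or Docker launcher lives on a given machine.

```json
{
  "mcpServers": {
    "lisply": {
      "command": "/path/to/lisply-mcp",
      "args": []
    }
  }
}
```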
Related Servers
n8n
Self‑hosted, code‑first workflow automation platform
FastMCP
TypeScript framework for rapid MCP server development
Activepieces
Open-source AI automation platform for building and deploying extensible workflows
MaxKB
Enterprise‑grade AI agent platform with RAG and workflow orchestration
Filestash
Web‑based file manager for any storage backend
MCP for Beginners
Learn Model Context Protocol with hands‑on examples