ahujasid

BlenderMCP

MCP Server

Claude AI meets Blender for instant 3D creation

Active (71) · 13.8k stars · 6 views · Updated 11 days ago

About

BlenderMCP enables Claude AI to control Blender via the Model Context Protocol, allowing prompt‑assisted 3D modeling, scene manipulation, and real‑time object and material editing through a socket server.

Capabilities

  • Resources: Access data sources
  • Tools: Execute functions
  • Prompts: Pre-built templates
  • Sampling: AI model interactions

BlenderMCP – Bringing AI‑Assisted 3D Creation to Your Workflow

BlenderMCP solves a common bottleneck for developers and designers who want to harness the power of large language models in the realm of 3D graphics. Traditional Blender scripting requires manual code editing and a deep understanding of the Python API, while AI assistants like Claude can generate high‑level design ideas but lack direct control over the 3D scene. BlenderMCP bridges this gap by exposing a Model Context Protocol (MCP) server that lets Claude issue natural‑language commands, query scene state, and even run arbitrary Python code inside Blender. The result is a fluid, conversational workflow where an AI can draft models, tweak materials, and iterate on layouts without the user leaving their chat interface.

At its core, BlenderMCP consists of two tightly coupled components: a lightweight socket server embedded in Blender via an addon, and a Python MCP implementation that translates the assistant’s requests into Blender API calls. This architecture enables two‑way communication: Claude can both send commands (e.g., “create a cube at coordinates 1,2,3”) and receive rich responses (e.g., the current list of objects, camera settings, or a rendered image). Because the addon runs inside Blender’s own process space, commands are executed instantly and with full access to the scene, ensuring that AI‑generated changes are fully integrated into the native workflow.
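
To make that two‑way flow concrete, the sketch below shows how the MCP side might push a single command to the addon’s socket server and read back the reply. The port number, the "type"/"params" message shape, and the command names here are illustrative assumptions, not BlenderMCP’s actual wire protocol.

    import json
    import socket

    # Assumed defaults for this sketch; the real addon may use a different
    # port and message format.
    HOST, PORT = "localhost", 9876

    def send_command(command_type: str, params: dict) -> dict:
        """Send one JSON command to the Blender addon and return its JSON reply."""
        payload = json.dumps({"type": command_type, "params": params}).encode("utf-8")
        with socket.create_connection((HOST, PORT)) as sock:
            sock.sendall(payload)
            # Read the full response; this assumes the addon sends one JSON
            # object and stops sending when it is done.
            chunks = []
            while True:
                chunk = sock.recv(4096)
                if not chunk:
                    break
                chunks.append(chunk)
        return json.loads(b"".join(chunks))

    # Example round trip: create a cube, then ask for the current scene state.
    print(send_command("create_object", {"type": "CUBE", "location": [1, 2, 3]}))
    print(send_command("get_scene_info", {}))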

Key capabilities include:

  • Object manipulation: Create, duplicate, delete, and transform meshes or other Blender objects on demand.
  • Material control: Apply predefined shaders, adjust colors and textures, or generate custom material nodes through scripted instructions.
  • Scene inspection: Retrieve detailed metadata about objects, lights, cameras, and scene hierarchy, enabling the assistant to make context‑aware suggestions.
  • Code execution: Run arbitrary Python snippets directly within Blender, giving developers the ultimate flexibility to prototype complex logic or automate repetitive tasks (a short example of the underlying bpy calls follows this list).
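
As a rough illustration of what these capabilities translate to inside Blender, the snippet below shows the kind of bpy calls the addon could run for a request such as “create a red cube at (1, 2, 3)”. The object and material names are made up for the example, and the command‑dispatch layer around these calls is omitted.

    import bpy

    # Object manipulation: add a cube at the requested location.
    bpy.ops.mesh.primitive_cube_add(size=2.0, location=(1.0, 2.0, 3.0))
    cube = bpy.context.active_object
    cube.name = "AI_Cube"  # illustrative name

    # Material control: create a red node-based material and assign it.
    mat = bpy.data.materials.new(name="AI_Red")
    mat.use_nodes = True
    bsdf = mat.node_tree.nodes["Principled BSDF"]
    bsdf.inputs["Base Color"].default_value = (1.0, 0.0, 0.0, 1.0)  # RGBA
    cube.data.materials.append(mat)

    # Scene inspection: gather metadata the assistant could use for context.
    scene_info = {
        "objects": [obj.name for obj in bpy.context.scene.objects],
        "active_camera": bpy.context.scene.camera.name if bpy.context.scene.camera else None,
    }
    print(scene_info)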

Real‑world use cases span from rapid prototyping—where an AI drafts a character rig or environmental layout based on textual prompts—to iterative design reviews, where developers ask Claude to highlight potential issues in lighting or topology. In game development pipelines, BlenderMCP can automate asset generation from Sketchfab models or fetch free Poly Haven textures, dramatically reducing manual labor. For educators and researchers, the ability to query scene data or execute custom scripts from an AI assistant opens new avenues for interactive learning and experimentation.

BlenderMCP’s integration with existing MCP‑compatible workflows is seamless. Once the server is registered in Claude’s configuration, any assistant that supports MCP can interact with Blender without additional plugins. The server’s socket‑based interface is platform‑agnostic, working on macOS, Linux, and Windows, making it a versatile tool for cross‑platform teams. Its standout advantage lies in the combination of low latency, full API access, and natural‑language control, empowering developers to let an AI act as a true co‑designer rather than just a code generator.
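
For reference, registering the server with an MCP‑aware client such as Claude Desktop typically amounts to a single entry in its claude_desktop_config.json. The snippet below is a sketch of what that entry might look like, assuming the server is published as blender-mcp and launched via uvx; the exact command and arguments depend on how the package is installed.

    {
      "mcpServers": {
        "blender": {
          "command": "uvx",
          "args": ["blender-mcp"]
        }
      }
    }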