About
A Model Context Protocol server that connects LLMs to the Linear API, enabling them to create, update, search, and comment on issues within Linear’s project management platform.
Capabilities

The Linear MCP Server turns the popular project‑management platform Linear into a first‑class data source for AI assistants that speak Model Context Protocol (MCP). Because it exposes Linear’s API as a set of MCP tools and resources, developers can let Claude (or any other MCP‑compatible assistant) create, query, and manage issues directly from the AI interface. This bridges the gap between conversational AI workflows and real‑world task management, enabling smarter productivity pipelines without writing custom integrations.
At its core, the server implements three primary tool families. create‑issue allows an assistant to construct a new Linear ticket by supplying a title, team ID, and optional metadata such as priority, state, assignee, estimate, or labels. search‑issues provides a flexible query language that mirrors Linear’s own search syntax, letting the assistant filter by assignee, priority, state, team, or label, and even perform free‑text searches across titles and descriptions. Finally, read‑resource offers generic read access to any Linear entity via a simple URI scheme, making it trivial to pull organization, team, or issue details into the AI context.
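To make these calls concrete, the sketch below writes out the argument objects an assistant might pass to create-issue and search-issues as TypeScript types. The field names are inferred from the description above rather than taken from the server’s published schemas, so treat them as illustrative.

```typescript
// Hypothetical argument shapes for the create-issue and search-issues tools,
// inferred from the capability description above; the server's actual JSON
// schemas may use different field names.
interface CreateIssueArgs {
  title: string;         // required issue title
  teamId: string;        // required Linear team identifier
  description?: string;  // optional body text
  priority?: number;     // optional priority (Linear uses 0 = none .. 4 = low)
  status?: string;       // optional workflow state name
  assigneeId?: string;   // optional assignee
  estimate?: number;     // optional point estimate
  labels?: string[];     // optional label names
}

interface SearchIssuesArgs {
  query?: string;        // free-text search across titles and descriptions
  teamId?: string;       // restrict results to one team
  assigneeId?: string;   // filter by assignee
  priority?: number;     // filter by priority
  status?: string;       // filter by workflow state
  labels?: string[];     // filter by labels
  limit?: number;        // cap on the number of returned issues
}

// Example payloads an assistant might construct from a natural-language prompt:
const newIssue: CreateIssueArgs = {
  title: "Fix login redirect loop",
  teamId: "TEAM_ID",
  priority: 2,
  labels: ["bug"],
};

const backlogQuery: SearchIssuesArgs = {
  query: "redirect loop",
  status: "Backlog",
  limit: 10,
};
```

Whatever the exact schema, the pattern holds: a couple of required identifiers plus optional filters or metadata.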
The server is designed for developers who already use MCP‑based tools like Cursor. Once the Linear API key is set in an environment variable, a single shell script can launch the Node.js server. The MCP client (e.g., Cursor) then registers the command, and the assistant can invoke any of the exposed tools with natural language prompts. The rate‑limiting logic (1,000 requests per hour) protects both the Linear API and the user’s quota, while detailed error messages help diagnose authentication failures or malformed requests.
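The configuration step can also be sketched in code. The snippet below uses the MCP TypeScript SDK to do by hand what a client like Cursor does from its settings: launch the server over stdio with the Linear API key in its environment, list the exposed tools, and invoke one of them. The package name, tool name, and argument fields here are assumptions for illustration and should be checked against the server’s documentation.

```typescript
// A minimal sketch of wiring up the server by hand with the MCP TypeScript SDK.
// The package name ("linear-mcp-server") and tool name ("create-issue") are
// assumed for illustration; an MCP client such as Cursor normally performs this
// step from its own configuration.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import {
  StdioClientTransport,
  getDefaultEnvironment,
} from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // The server reads the Linear API key from its environment, so pass it
  // through alongside the default safe environment variables.
  const transport = new StdioClientTransport({
    command: "npx",
    args: ["-y", "linear-mcp-server"],
    env: {
      ...getDefaultEnvironment(),
      LINEAR_API_KEY: process.env.LINEAR_API_KEY ?? "",
    },
  });

  const client = new Client(
    { name: "linear-mcp-example", version: "0.1.0" },
    { capabilities: {} }
  );
  await client.connect(transport);

  // Discover what the server exposes, then create a ticket.
  const { tools } = await client.listTools();
  console.log(tools.map((t) => t.name));

  const result = await client.callTool({
    name: "create-issue",
    arguments: { title: "Investigate flaky CI job", teamId: "TEAM_ID" },
  });
  console.log(JSON.stringify(result, null, 2));

  await client.close();
}

main().catch(console.error);
```

Equivalently, the shell script mentioned above can export LINEAR_API_KEY and start the Node.js server itself, with the MCP client pointed at that script as its command.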
Typical use cases include:
- Automated issue triage – an assistant can read a backlog, suggest priorities, and create new tickets on the fly (a sketch of this flow follows the list).
- Context‑aware reporting – pull team or project metrics into a conversation to generate status updates.
- Rapid prototyping – developers can experiment with Linear workflows directly from the AI interface without leaving their IDE or terminal.
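As a rough sketch of the triage use case, the snippet below asks the server for high‑priority backlog items and prints whatever the tool returns; deciding what to reprioritize, and calling create-issue for follow-ups, is left to the assistant. It relies on the same assumed package, tool, and field names as the earlier sketches.

```typescript
// Triage sketch: fetch candidate issues for the assistant to review.
// Package, tool, and argument names are the same assumptions as above.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import {
  StdioClientTransport,
  getDefaultEnvironment,
} from "@modelcontextprotocol/sdk/client/stdio.js";

async function triageBacklog() {
  const transport = new StdioClientTransport({
    command: "npx",
    args: ["-y", "linear-mcp-server"],
    env: {
      ...getDefaultEnvironment(),
      LINEAR_API_KEY: process.env.LINEAR_API_KEY ?? "",
    },
  });
  const client = new Client(
    { name: "triage-example", version: "0.1.0" },
    { capabilities: {} }
  );
  await client.connect(transport);

  // Pull urgent backlog issues; the assistant reads the result and suggests
  // priorities or files follow-up tickets with create-issue.
  const result = await client.callTool({
    name: "search-issues",
    arguments: { status: "Backlog", priority: 1, limit: 20 },
  });
  console.log(JSON.stringify(result, null, 2));

  await client.close();
}

triageBacklog().catch(console.error);
```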
Because the server follows MCP conventions, it can be swapped out or extended with minimal friction. The straightforward URI scheme and query syntax mean that developers familiar with Linear’s UI can translate their knowledge into MCP calls, while the error handling and metrics give confidence in production deployments. In short, the Linear MCP Server turns a traditional issue tracker into an interactive, AI‑powered knowledge base that fits naturally into modern development pipelines.
Related Servers
Netdata
Real‑time infrastructure monitoring for every metric, every second.
Awesome MCP Servers
Curated list of production-ready Model Context Protocol servers
JumpServer
Browser‑based, open‑source privileged access management
OpenTofu
Infrastructure as Code for secure, efficient cloud management
FastAPI-MCP
Expose FastAPI endpoints as MCP tools with built‑in auth
Pipedream MCP Server
Event‑driven integration platform for developers
Explore More Servers
Kafka Schema Registry MCP Server
MCP-powered Kafka schema management for Claude Desktop
Model Context Protocol Server
Powering AI with Claude’s MCP SDK
Prometheus MCP Server
LLM‑powered Prometheus metric querying and analysis
Universal Database MCP Server
Read‑only database insight via Model Context Protocol
Aip MCP Server
Local and SSE-based Model Context Protocol server samples for quick prototyping
Omeka S MCP Sample
Integrate Omeka S with Claude Desktop via MCP