dvcrn

Siri Shortcuts MCP Server


Control macOS Shortcuts directly from an LLM

Updated May 7, 2025

About

This server exposes the macOS Shortcuts app via MCP, allowing listing, opening, and running any shortcut programmatically. It generates dynamic tools for each shortcut, enabling seamless automation in conversational AI workflows.

Capabilities

- Resources: access data sources
- Tools: execute functions
- Prompts: pre-built templates
- Sampling: AI model interactions


Overview

The Siri Shortcuts MCP Server bridges the gap between macOS automation and AI assistants by exposing every shortcut stored in the Shortcuts app through the Model Context Protocol (MCP). Instead of manually invoking shortcuts via the graphical interface, an LLM such as Claude can query, open, or execute any shortcut directly from its natural‑language prompt. This capability eliminates the need for developers to write custom scripts or use command‑line utilities, enabling seamless integration of personal workflows into conversational AI.

At its core, the server provides three foundational tools: list_shortcuts, open_shortcut, and run_shortcut. These allow an assistant to enumerate all available shortcuts, open a specific shortcut in the Shortcuts editor for inspection or editing, and trigger execution with optional text or file input. Each tool's input schema is explicitly defined, so the LLM can discover and interact with shortcuts in a type-safe manner. Beyond these base tools, the server auto-generates a dedicated tool for every shortcut on the system; this dynamic generation means a user's entire automation library becomes immediately accessible without additional configuration, giving granular control over individual workflows.
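To make the tool layer concrete, here is a minimal sketch of how a run_shortcut tool could be declared with the official MCP TypeScript SDK and delegated to Apple's `shortcuts` command-line utility. This is an illustration, not the project's actual source: the SDK usage, the temp-file handling for text input, and the version string are assumptions.

```typescript
// Illustrative sketch only: expose a run_shortcut tool over MCP (stdio) and
// map it onto the macOS `shortcuts` CLI.
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";
import { execFile } from "node:child_process";
import { writeFile } from "node:fs/promises";
import { tmpdir } from "node:os";
import { join } from "node:path";
import { promisify } from "node:util";

const sh = promisify(execFile);
const server = new McpServer({ name: "siri-shortcuts", version: "0.1.0" });

// Base tool: run any shortcut by name, optionally passing text input.
server.tool(
  "run_shortcut",
  {
    name: z.string().describe("Exact name of the shortcut to run"),
    input: z.string().optional().describe("Optional text handed to the shortcut"),
  },
  async ({ name, input }) => {
    const args = ["run", name];
    if (input !== undefined) {
      // `shortcuts run` takes file input, so spill the text to a temp file first.
      const path = join(tmpdir(), `mcp-shortcut-input-${Date.now()}.txt`);
      await writeFile(path, input, "utf8");
      args.push("--input-path", path);
    }
    const { stdout } = await sh("shortcuts", args);
    // Return whatever the shortcut printed so the LLM can relay it to the user.
    return { content: [{ type: "text", text: stdout || `Ran shortcut "${name}"` }] };
  }
);

await server.connect(new StdioServerTransport());
```

The per-shortcut tools the server generates would follow the same pattern, with the shortcut name baked into each tool definition instead of passed as a parameter.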

Developers benefit from the server’s ability to treat shortcuts as first‑class actions within AI pipelines. For example, a user can ask the assistant to “schedule my next meeting” and the LLM will automatically call a pre‑configured shortcut that creates an event in Calendar, sends a reminder email, and updates the user’s task list—all with a single utterance. Similarly, data‑processing shortcuts can be invoked to transform CSV files or generate reports, allowing AI agents to orchestrate complex, multi‑step tasks that rely on macOS automation.
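On the wire, such an invocation is an ordinary MCP tools/call request. The example below is hypothetical (the shortcut name and arguments are invented), but it shows the shape of the message an assistant would send to run a data-processing shortcut with text input:

```json
{
  "jsonrpc": "2.0",
  "id": 7,
  "method": "tools/call",
  "params": {
    "name": "run_shortcut",
    "arguments": {
      "name": "Convert CSV to Report",
      "input": "/Users/me/Downloads/sales.csv"
    }
  }
}
```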

Integration with Claude or other MCP-compatible assistants is straightforward: a single entry in the client's configuration file registers the server, after which the assistant automatically discovers all exposed tools. The assistant can then invoke these tools mid-conversation, creating a loop in which the user states high-level intent and the assistant translates it into precise shortcut executions. Because the server returns a shortcut's output when one is produced, the LLM can surface results directly back to the user, closing the feedback loop.
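For Claude Desktop, that single entry lives in claude_desktop_config.json under the mcpServers key. The snippet below shows a typical registration; the package name is an assumption here and should be taken from the project's own installation instructions:

```json
{
  "mcpServers": {
    "siri-shortcuts": {
      "command": "npx",
      "args": ["-y", "mcp-server-siri-shortcuts"]
    }
  }
}
```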

In summary, this MCP server turns every Siri shortcut into an AI‑executable action. It offers a unified, discoverable interface for macOS automation, dramatically expands the functional scope of AI assistants, and empowers developers to weave personal or organizational workflows into natural‑language interactions with minimal effort.