About
A Go‑based MCP server that exposes Splunk operations (saved searches, alerts, indexes, macros) over STDIO or SSE, enabling LLMs to query Splunk directly through standardized RPC calls.
Capabilities

The Splunk MCP server bridges the gap between AI assistants and one of the most widely used SIEM platforms. By exposing Splunk’s core data structures—saved searches, alerts, fired alerts, indexes, and macros—as JSON‑RPC tools, the server lets language models query real‑time security telemetry without writing custom API calls. For developers building diagnostic or investigative workflows, this eliminates the need to embed Splunk SDKs directly into application code; instead, the AI can request a list of active alerts or recent fired alerts and receive structured results instantly.
At its core, the server implements a small but powerful set of tools. Each tool accepts pagination parameters and optional filters (e.g., searching for alerts containing a keyword or limiting results to the last 24 hours). Pattern matching on search names is also supported, giving analysts fine‑grained control over which incidents to surface. Because the server speaks the MCP protocol, any client that understands JSON‑RPC—whether it’s a custom script, a web UI, or an AI platform like Claude—can invoke these tools seamlessly.
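As a rough illustration of what such an invocation looks like on the wire, the Go sketch below builds a JSON‑RPC `tools/call` request with pagination and keyword‑filter arguments. The tool name (`list_fired_alerts`) and argument names (`count`, `offset`, `search`) are assumptions for illustration only; the actual names and schemas are defined by the server’s tool registrations.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// jsonRPCRequest models a generic MCP tools/call request envelope.
type jsonRPCRequest struct {
	JSONRPC string         `json:"jsonrpc"`
	ID      int            `json:"id"`
	Method  string         `json:"method"`
	Params  map[string]any `json:"params"`
}

func main() {
	// Hypothetical tool and argument names; check the server's tool list
	// (tools/list) for the real schema.
	req := jsonRPCRequest{
		JSONRPC: "2.0",
		ID:      1,
		Method:  "tools/call",
		Params: map[string]any{
			"name": "list_fired_alerts",
			"arguments": map[string]any{
				"count":  10,       // pagination: page size
				"offset": 0,        // pagination: starting entry
				"search": "GitHub", // optional keyword filter
			},
		},
	}

	out, err := json.MarshalIndent(req, "", "  ")
	if err != nil {
		panic(err)
	}
	fmt.Println(string(out))
}
```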
Real‑world use cases abound. Security analysts can ask an AI assistant to “show me the top 10 recent fired alerts that mention ‘GitHub’,” and the assistant will call the fired‑alerts tool, filter by keyword, and return a concise table. Incident responders can query the indexes tool to confirm data retention settings before launching a forensic search. Even non‑security developers can use the macros tool to audit reusable search logic across environments. The included MCP prompts further streamline common queries by chaining multiple tools, ensuring the assistant gathers all necessary context before delivering an answer.
Integration is straightforward. The server supports both STDIO and SSE transports, allowing it to run as a lightweight local process or as a long‑running HTTP endpoint behind Smithery. Once configured in Cursor, the assistant automatically discovers available tools and can embed their outputs directly into the conversation context. This tight coupling means that AI‑driven investigations become more accurate, faster, and less error‑prone—an essential advantage in today’s fast‑paced threat landscape.
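For a sense of the STDIO transport, the hedged Go sketch below launches the server as a child process, performs the MCP initialize handshake, and lists the available tools. The binary name and the Splunk environment variable names shown are assumptions; substitute whatever your build and deployment actually use.

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"os/exec"
)

func main() {
	// Hypothetical binary name; adjust to however the server is built or installed.
	cmd := exec.Command("./splunk-mcp-server")
	// Assumed environment variable names for the Splunk endpoint and credentials.
	cmd.Env = append(os.Environ(),
		"SPLUNK_URL=https://splunk.example.com:8089",
		"SPLUNK_TOKEN=<redacted>",
	)

	stdin, err := cmd.StdinPipe()
	if err != nil {
		panic(err)
	}
	stdout, err := cmd.StdoutPipe()
	if err != nil {
		panic(err)
	}
	if err := cmd.Start(); err != nil {
		panic(err)
	}

	// MCP's STDIO transport exchanges newline-delimited JSON-RPC messages.
	send := func(msg string) { fmt.Fprintln(stdin, msg) }
	send(`{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"demo-client","version":"0.1.0"}}}`)
	send(`{"jsonrpc":"2.0","method":"notifications/initialized"}`)
	send(`{"jsonrpc":"2.0","id":2,"method":"tools/list","params":{}}`)

	// Print the initialize result and the tool list, then shut down.
	scanner := bufio.NewScanner(stdout)
	scanner.Buffer(make([]byte, 1024*1024), 1024*1024)
	for i := 0; i < 2 && scanner.Scan(); i++ {
		fmt.Println(scanner.Text())
	}
	_ = cmd.Process.Kill()
}
```

When the server runs as a long‑lived HTTP endpoint (for example behind Smithery), the same JSON‑RPC messages flow over SSE instead of the child process’s stdin and stdout; clients such as Cursor handle that transport choice through their server configuration.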
Related Servers
n8n
Self‑hosted, code‑first workflow automation platform
FastMCP
TypeScript framework for rapid MCP server development
Activepieces
Open-source AI automation platform for building and deploying extensible workflows
MaxKB
Enterprise‑grade AI agent platform with RAG and workflow orchestration
Filestash
Web‑based file manager for any storage backend
MCP for Beginners
Learn Model Context Protocol with hands‑on examples
Explore More Servers
Binance Alpha MCP Server
Track Binance Alpha trades in real‑time for AI agent optimization
MCP Server
Build Model Context Protocol servers in .NET
MaxMSP MCP Server
Lets LLMs understand and create Max patches in real time
DB MCP Server
Unified multi-database access for AI assistants
UI Builder MCP Server
Generate UI components from structured definitions
Bifrost VSCode Dev Tools MCP Server
Expose VSCode's dev tools to AI assistants via MCP