About
A TypeScript‑based Model Context Protocol server that lets large language models interact with an Obsidian vault via the Local REST API. It offers atomic file operations, full‑text search, YAML frontmatter handling, and secure authenticated access.
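A quick illustration of what the frontmatter handling involves: the sketch below parses YAML frontmatter out of a note with the gray-matter library, which is an assumption here rather than the parser this server is documented to use.

```typescript
// Illustrative only: gray-matter is a common Node.js frontmatter parser,
// not necessarily the one this server ships with.
import matter from "gray-matter";

const note = `---
title: Weekly Sync
tags: [meetings, roadmap]
aliases: [sync-notes]
---
Discussed the Q3 roadmap.`;

// Split the note into structured metadata and the markdown body.
const { data, content } = matter(note);
console.log(data.tags);      // [ 'meetings', 'roadmap' ]
console.log(content.trim()); // "Discussed the Q3 roadmap."
```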
Overview
The Obsidian MCP Server bridges the gap between local knowledge bases and conversational AI assistants by exposing a read‑only view of an Obsidian vault through the Model Context Protocol. It connects directly to a user’s LiveSync CouchDB instance, translating database documents into MCP resources that AI models can query in real time. This keeps an assistant’s context in step with a user’s evolving notes without requiring the model to store or re‑index that data locally.
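As a rough sketch of that translation step, a note document might be mapped to an MCP resource descriptor along these lines; the field names and URI scheme below are simplified assumptions for illustration, not the actual LiveSync schema, which stores notes as chunked documents.

```typescript
// Sketch only: the fields below are simplified assumptions, not the real
// LiveSync document shape.
interface LiveSyncDoc {
  _id: string;
  path: string;  // vault-relative path, e.g. "Meetings/2024-06-03.md"
  mtime: number; // last-modified timestamp in milliseconds
}

// Resource descriptor of the kind returned in a resources/list response.
interface McpResource {
  uri: string;
  name: string;
  mimeType: string;
}

function toResource(doc: LiveSyncDoc): McpResource {
  return {
    uri: `obsidian://${encodeURIComponent(doc.path)}`, // hypothetical URI scheme
    name: doc.path,
    mimeType: "text/markdown",
  };
}
```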
At its core, the server implements MCP version 2025‑03‑26 and is aimed at developers who need low‑latency access to markdown content. Resource listings are kept deliberately small, returning the ten most recent notes, and the search tools respect Obsidian’s metadata schema, including frontmatter tags, aliases and internal links. For small result sets (three notes or fewer) the server automatically expands the note content inline, reducing unnecessary back‑and‑forth exchanges and improving conversational flow.
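The expansion rule could be implemented along these lines; the three‑note threshold comes from the description above, while the types and the content loader are illustrative placeholders rather than the server’s actual code.

```typescript
interface SearchHit {
  path: string;
  tags: string[];
  content?: string; // populated only when results are expanded
}

// Hypothetical content loader; the real server reassembles chunked notes.
declare function loadNoteContent(path: string): Promise<string>;

// For small result sets (three notes or fewer), inline the full content so
// the model does not need a second round-trip per note.
async function expandIfSmall(hits: SearchHit[]): Promise<SearchHit[]> {
  if (hits.length > 3) return hits;
  return Promise.all(
    hits.map(async (hit) => ({ ...hit, content: await loadNoteContent(hit.path) })),
  );
}
```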
Setup requires only a running LiveSync instance and CouchDB credentials. The server exposes its API over both stdio and HTTP Server‑Sent Events transports, so it fits naturally into existing AI workflows. Whether a model is driven from Claude, Gemini or any other MCP‑compliant client, it sees a consistent interface: resources are identified by unique paths, metadata is extracted automatically, and content is reassembled from chunked notes when necessary.
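From the client side, connecting over stdio with the official TypeScript SDK might look like the following; the launch command, package name and environment variable are assumptions for illustration, so check the server’s own documentation for the real values.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// The command, args and env var below are placeholders, not documented values.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["obsidian-mcp-server"],
  env: { COUCHDB_URL: "http://localhost:5984" }, // hypothetical variable name
});

const client = new Client(
  { name: "example-client", version: "1.0.0" },
  { capabilities: {} },
);
await client.connect(transport);

// Ask the server for its resource listing (e.g. the ten most recent notes).
const { resources } = await client.listResources();
for (const resource of resources) {
  console.log(resource.uri, resource.name);
}

await client.close();
```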
Key capabilities include handling encrypted vaults (when a passphrase is supplied), Docker‑friendly deployment, and extensive configuration through environment variables. The server’s design ensures that developers can integrate it into CI/CD pipelines or containerised services with minimal friction, while still providing the fine‑grained control needed for production environments.
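A configuration loader in that spirit might look like this; the variable names are hypothetical placeholders, since the page above does not list the ones the server actually reads.

```typescript
// All variable names here are hypothetical; consult the server's README for
// the environment variables it actually supports.
interface ServerConfig {
  couchDbUrl: string;
  couchDbUser: string;
  couchDbPassword: string;
  vaultPassphrase?: string; // only needed for encrypted vaults
  transport: "stdio" | "sse";
}

function loadConfig(env: NodeJS.ProcessEnv = process.env): ServerConfig {
  const required = (key: string): string => {
    const value = env[key];
    if (!value) throw new Error(`Missing required environment variable: ${key}`);
    return value;
  };

  return {
    couchDbUrl: required("COUCHDB_URL"),
    couchDbUser: required("COUCHDB_USER"),
    couchDbPassword: required("COUCHDB_PASSWORD"),
    vaultPassphrase: env.VAULT_PASSPHRASE,
    transport: env.MCP_TRANSPORT === "sse" ? "sse" : "stdio",
  };
}
```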
In practice, the Obsidian MCP Server is well suited to knowledge‑intensive applications such as personal assistants that need to surface recent meeting notes, code documentation, or research articles on demand. It also serves as a foundation for domain‑specific AI tools, such as a customer support bot that pulls the latest product docs from an internal vault or a research assistant that surfaces relevant literature snippets, without compromising on security, performance or developer ergonomics.
Related Servers
MarkItDown MCP Server
Convert documents to Markdown for LLMs quickly and accurately
Context7 MCP
Real‑time, version‑specific code docs for LLMs
Playwright MCP
Browser automation via structured accessibility trees
BlenderMCP
Claude AI meets Blender for instant 3D creation
Pydantic AI
Build GenAI agents with Pydantic validation and observability
Chrome DevTools MCP
AI-powered Chrome automation and debugging