Kiseki Labs Readwise MCP Server

About

A Model Context Protocol server that lets language models query, list, and retrieve Readwise documents and highlights via the Readwise API.

Capabilities
The Kiseki‑Labs‑Readwise‑MCP server bridges the gap between conversational AI assistants and the rich knowledge base stored in Readwise. By exposing a set of well‑defined tools, it allows language models to search, retrieve, and manipulate documents and highlights directly from Readwise, turning passive text into an active resource for AI‑driven workflows. This capability is especially valuable for developers building knowledge‑management, summarization, or recommendation systems that rely on user‑generated annotations.
The server implements four primary tools that mirror common Readwise operations. The find tool locates a document by its exact title, enabling precise queries such as “What is the summary of The Lean Startup?” The list tool enumerates documents filtered by category or date range, giving the model a dynamic view of recent reading activity. The two get tools retrieve highlights either by specific document IDs or via broader filters such as tags and dates, allowing the assistant to pull contextually relevant excerpts for summarization or Q&A. Each tool returns structured, consistently shaped data that the client can further process or display.
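As a sketch of how the filtered get tool might map onto Readwise's v2 API, the helper below assembles a query string for the highlights endpoint. The parameter names (`book_id`, `updated__gt`, `page_size`) follow the public Readwise API documentation, but the helper itself is illustrative and not part of this server's actual code:

```python
from urllib.parse import urlencode

READWISE_HIGHLIGHTS_URL = "https://readwise.io/api/v2/highlights/"

def build_highlights_query(book_id=None, updated_after=None, page_size=100):
    """Assemble a filtered highlight-fetch URL.

    Parameter names mirror the Readwise v2 API; this function is an
    illustrative sketch, not the server's implementation.
    """
    params = {"page_size": page_size}
    if book_id is not None:
        params["book_id"] = book_id
    if updated_after is not None:
        params["updated__gt"] = updated_after  # ISO 8601 timestamp
    return f"{READWISE_HIGHLIGHTS_URL}?{urlencode(params)}"
```

A call like `build_highlights_query(book_id=123, updated_after="2024-01-01")` yields a URL that fetches only the highlights matching those filters, which is exactly the "no extraneous data" behavior the tools expose.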
Developers integrate this MCP into their AI pipelines by registering it in the assistant’s configuration. Once available, the model can invoke these tools as part of its reasoning process, for example by first searching for a relevant document and then extracting highlights to answer a user’s question. Because the server runs locally, latency is minimal, and sensitive reading data never leaves the user’s environment. This design aligns with privacy‑first principles while still offering powerful content retrieval.
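Registration typically means adding an entry to the client's MCP configuration file. The exact file and launch command vary by client, so the snippet below, modelled on the common `mcpServers` layout with a hypothetical launch command and token variable, is only a sketch:

```json
{
  "mcpServers": {
    "readwise": {
      "command": "uv",
      "args": ["run", "readwise-mcp"],
      "env": {
        "READWISE_API_KEY": "<your-readwise-token>"
      }
    }
  }
}
```

The `env` block keeps the Readwise token out of the command line while still making it available to the locally running server process.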
Real‑world use cases include:
- Personal knowledge bases where a model pulls recent highlights to generate daily digests.
- Educational assistants that fetch textbook excerpts to explain concepts in context.
- Content creators who want instant access to source material for research or citation generation.
- Productivity tools that surface recent reading activity to remind users of unfinished notes.
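The daily-digest use case above amounts to grouping highlights by the calendar day on which they were made. A minimal sketch, assuming each highlight dict carries an ISO 8601 `highlighted_at` timestamp and a `text` field (field names are assumptions about the payload, not confirmed by this server's schema):

```python
from collections import defaultdict

def daily_digest(highlights):
    """Group highlight texts by calendar day for a digest.

    Assumes each highlight dict has an ISO 8601 `highlighted_at`
    timestamp and a `text` field (hypothetical field names).
    """
    by_day = defaultdict(list)
    for h in highlights:
        day = h["highlighted_at"][:10]  # keep the "YYYY-MM-DD" prefix
        by_day[day].append(h["text"])
    return dict(by_day)
```

A model could call the filtered get tool for the last 24 hours, feed the results through a grouping step like this, and summarize each day's bucket into a digest.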
What sets this MCP apart is its tight coupling with Readwise’s API and the thoughtful filtering options. The ability to query by tags, dates, or categories means developers can craft highly targeted requests without fetching extraneous data. Additionally, the server’s lightweight implementation in Python and its compatibility with fastmcp make it easy to deploy on a variety of platforms, from local development machines to cloud‑based assistants. Overall, the Kiseki Labs Readwise MCP provides a streamlined, privacy‑respecting bridge between AI models and users’ curated reading insights.
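Under the hood, an MCP client invokes one of these tools with a JSON-RPC 2.0 `tools/call` request over the server's transport. The sketch below constructs such a message; the tool name and arguments are hypothetical, chosen to match the tag filtering described above:

```python
import json

def make_tool_call(request_id, tool_name, arguments):
    """Build an MCP tools/call request as a JSON-RPC 2.0 message."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })

# Hypothetical invocation asking for highlights tagged "startups":
msg = make_tool_call(1, "get_highlights_by_filters", {"tags": ["startups"]})
```

Libraries like fastmcp handle this framing automatically; the sketch only shows the wire-level shape a client produces when the model decides to call a tool.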