About
A lightweight MCP server that exposes your WeChat Reading bookshelf, notes, highlights, and reviews to LLM clients such as Cursor or Claude Desktop.
Overview
WeRead MCP Server is a dedicated Model Context Protocol service that bridges the popular Chinese reading platform WeChat Reading (微信读书) with large-language-model clients such as Claude Desktop and Cursor. By exposing a set of intuitive tools for bookshelf retrieval, book search, notes and highlights, and best reviews, the server turns a user's personal library into structured, queryable data that an AI assistant can consume directly. This eliminates the need for manual export or screen scraping, allowing developers to build intelligent reading assistants, recommendation engines, and study aids that respond in real time to a user's library state.
What Problem Does It Solve?
Many LLM‑powered assistants lack access to a user’s private content. Reading apps typically lock data behind authentication, and developers must either store the data locally or rely on unofficial APIs. WeRead MCP Server removes these friction points by handling authentication (via cookies or CookieCloud), maintaining session validity, and providing a clean MCP interface. Developers can now ask an AI to “show me my most recent notes on The Great Gatsby” or “list all books by a specific author,” and the assistant will fetch the information directly from WeChat Reading without exposing credentials in code.
Key Features & Capabilities
- Bookshelf Retrieval – returns a comprehensive list of all books in the user’s library, including titles, authors, translators, and categories.
- Search Functionality – supports fuzzy and exact matching, optional detail flags, and configurable result limits.
- Notes & Highlights – organizes annotations by chapter, filters highlight styles, and outputs structured JSON for easy parsing.
- Best Reviews – fetches popular reviews with pagination support and metadata such as ratings and likes.
- Cookie Management – integrates with CookieCloud to automate cookie refresh, reducing downtime caused by session expiration.
- MCP Compatibility – can be launched on demand or installed globally, making it straightforward to add to any MCP-enabled client (see the configuration sketch below).
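As a concrete illustration, here is a minimal sketch of an MCP client settings entry for this server, written as a TypeScript literal for readability. Clients such as Claude Desktop and Cursor read the same structure from their JSON MCP settings files; the launch command, package name, and environment variable below are assumptions for illustration, not the server's documented values.

```typescript
// Sketch of an MCP client settings entry for the WeRead server.
// The package name and WEREAD_COOKIE variable are hypothetical placeholders;
// check the server's own documentation for the real command and env keys.
const mcpSettings = {
  mcpServers: {
    weread: {
      command: "npx",                    // or a path to a global install
      args: ["-y", "weread-mcp-server"], // hypothetical package name
      env: {
        WEREAD_COOKIE: "<WeChat Reading cookie>", // or CookieCloud connection details
      },
    },
  },
};

export default mcpSettings;
```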
Real‑World Use Cases
- Personal Knowledge Base – An AI assistant can pull a user’s notes and generate summaries or study flashcards on demand.
- Reading Recommendation System – By querying the library and reviews, the assistant can suggest new titles that match a user’s interests.
- Academic Research Aid – Researchers can retrieve annotated passages from multiple sources and merge them into a single research document.
- Productivity Integration – Teams can share notes and highlights through the assistant, facilitating collaborative reading sessions.
Integration with AI Workflows
Adding WeRead MCP Server to an LLM workflow is as simple as configuring the server in the client’s MCP settings. Once configured, the assistant automatically presents the available tools as options during a conversation. When a user requests library information, the assistant calls the relevant tool, receives structured JSON, and formats it into a natural‑language response. Because the server handles authentication behind the scenes, developers can focus on building higher‑level logic—such as context‑aware summarization or dynamic question answering—without worrying about credential management.
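Under the hood, the tool-call flow the assistant performs can be approximated with the official TypeScript MCP SDK. The sketch below assumes a stdio launch and a bookshelf tool named `get_bookshelf`; the package name, environment variable, and tool name are illustrative assumptions rather than the server's confirmed identifiers.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch the WeRead MCP server over stdio; command, package name, and
  // env var are placeholders for whatever the server actually documents.
  const transport = new StdioClientTransport({
    command: "npx",
    args: ["-y", "weread-mcp-server"],
    env: { WEREAD_COOKIE: process.env.WEREAD_COOKIE ?? "" },
  });

  const client = new Client({ name: "weread-demo", version: "1.0.0" }, { capabilities: {} });
  await client.connect(transport);

  // Discover the tools the server advertises (bookshelf, search, notes, reviews).
  const { tools } = await client.listTools();
  console.log(tools.map((t) => t.name));

  // Call a bookshelf tool; "get_bookshelf" is an assumed name for illustration.
  const result = await client.callTool({ name: "get_bookshelf", arguments: {} });
  console.log(JSON.stringify(result.content, null, 2));

  await client.close();
}

main().catch(console.error);
```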
Unique Advantages
- Zero‑Code Authentication – By leveraging CookieCloud, the server keeps user cookies secure and refreshed without manual intervention.
- Native WeChat Reading Integration – The server is tuned specifically for the platform’s data structures, ensuring accurate and complete information retrieval.
- Open‑Source & Extensible – Built on the MCP framework, developers can extend or customize tools to suit niche requirements, such as fetching audio notes or exporting data to external formats.
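Because the server sits on the standard MCP framework, adding a niche capability typically means registering one more tool. The sketch below uses the TypeScript MCP SDK to define a hypothetical `export_notes_markdown` tool; the tool name, parameter, and note-fetching step are assumptions meant only to show the extension pattern.

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const server = new McpServer({ name: "weread-extension", version: "0.1.0" });

// Hypothetical extra tool: export a book's highlights as Markdown.
server.tool(
  "export_notes_markdown",
  "Export highlights for a single book as a Markdown document",
  { bookId: z.string().describe("WeChat Reading book identifier") },
  async ({ bookId }) => {
    // Placeholder: a real implementation would reuse the server's existing
    // notes/highlights retrieval and render the result as Markdown.
    const markdown = `# Highlights for book ${bookId}\n\n(no notes fetched in this sketch)`;
    return { content: [{ type: "text", text: markdown }] };
  },
);

// Expose the extended server over stdio, like the original.
await server.connect(new StdioServerTransport());
```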
WeRead MCP Server turns a private reading library into a live, AI‑friendly data source, enabling developers to create rich, contextually aware reading experiences that feel natural and responsive.
Related Servers
n8n
Self‑hosted, code‑first workflow automation platform
FastMCP
TypeScript framework for rapid MCP server development
Activepieces
Open-source AI automation platform for building and deploying extensible workflows
MaxKB
Enterprise‑grade AI agent platform with RAG and workflow orchestration.
Filestash
Web‑based file manager for any storage backend
MCP for Beginners
Learn Model Context Protocol with hands‑on examples
Explore More Servers
Databricks MCP Server
MCP-powered bridge to Databricks APIs
Seta MCP
Local Docs, Offline AI Context
Tradermcp
Fast, lightweight MCP server built with Bun for trading applications.
AgentOps MCP Server
Observability and tracing for AI agent debugging
Fireflies MCP Server
Unlock meeting insights with Fireflies transcript tools
DeepSeek MCP Server
Proxy your DeepSeek API for MCP-compatible apps