About
A minimal Model Context Protocol server that enables creating, updating, listing, diffing, reviewing, and commenting on GitHub pull requests with dual PAT and GitHub CLI authentication support.
Overview
The mcp-gh-pr-mini server is a lightweight Model Context Protocol (MCP) gateway that bridges AI assistants with GitHub pull‑request workflows. It solves the common pain point of manually switching between code editors, command lines, and GitHub's web UI by exposing a concise set of pull‑request operations as MCP tools. Developers can therefore embed GitHub interactions directly into AI‑driven development pipelines, enabling code reviews, issue triage, and collaboration without leaving the assistant's context.
At its core, the server implements a suite of pull‑request tools that cover the full lifecycle: creating new PRs, updating titles or states, listing open requests, and fetching diffs. It also supports fine‑grained review mechanics such as adding reviewers, posting general comments, and inserting inline code‑review annotations. Each tool returns structured JSON that the AI can parse and act upon, allowing for seamless chaining of actions—e.g., an assistant could generate a diff‑based summary and automatically open a PR with relevant reviewers.
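To illustrate how an assistant might consume such a response, the sketch below parses a JSON payload from a hypothetical `list_pull_requests` tool. The field names (`pull_requests`, `number`, `state`) are assumptions for illustration, not the server's documented schema:

```python
import json

# Hypothetical JSON payload a tool such as list_pull_requests might
# return; the exact schema here is an assumption for illustration.
raw = json.dumps({
    "pull_requests": [
        {"number": 42, "title": "Add retry logic", "state": "open"},
        {"number": 40, "title": "Fix flaky test", "state": "closed"},
    ]
})

def open_pr_titles(payload: str) -> list:
    """Parse a tool response and keep only the titles of open PRs."""
    data = json.loads(payload)
    return [pr["title"] for pr in data["pull_requests"]
            if pr["state"] == "open"]

print(open_pr_titles(raw))  # → ['Add retry logic']
```

Because each tool emits machine-readable output like this, the assistant can feed one tool's result directly into the next call in a chain.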
A standout feature is its dual authentication strategy. The server can authenticate with a Personal Access Token (PAT) or by reusing the user's existing GitHub CLI credentials. An auto‑detection layer chooses the available method on startup, and if one fails it falls back to the other without interrupting the workflow. This keeps authentication reliable across varied environments, from local development machines to CI runners where only one form of credential may be available.
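The fallback logic described above can be sketched roughly as follows. The function names, the `GITHUB_TOKEN` variable, and the ordering are illustrative assumptions rather than the server's actual implementation; `gh auth token` is the real GitHub CLI subcommand for printing the stored credential:

```python
import os
import shutil
import subprocess
from typing import Mapping, Optional

def gh_cli_token() -> Optional[str]:
    """Ask the GitHub CLI for its stored token, if `gh` is installed."""
    if shutil.which("gh") is None:
        return None
    try:
        result = subprocess.run(
            ["gh", "auth", "token"],
            capture_output=True, text=True, check=True,
        )
    except subprocess.CalledProcessError:
        return None
    return result.stdout.strip() or None

def resolve_token(env: Optional[Mapping[str, str]] = None) -> str:
    """Prefer an explicit PAT, then fall back to GitHub CLI credentials."""
    env = os.environ if env is None else env
    token = env.get("GITHUB_TOKEN")  # PAT supplied via the environment
    if token:
        return token
    token = gh_cli_token()  # fall back to `gh auth token`
    if token:
        return token
    raise RuntimeError("No GitHub credentials found (PAT or GitHub CLI)")
```

In a CI runner without `gh` installed, the PAT path succeeds; on a developer machine with no PAT exported, the CLI path takes over.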
Real‑world use cases include automated code generation and review: an AI assistant writes a new feature, creates a PR, assigns senior reviewers, and leaves explanatory comments—all in one turn. Another scenario is continuous deployment pipelines where the assistant monitors open PRs, triggers automated tests on diffs, and closes stale requests. Because the server exposes a consistent MCP interface, these patterns can be replicated across teams and projects with minimal configuration.
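The one-turn workflow above (open a PR, assign reviewers, leave a comment) can be sketched as a chain of tool calls. The tool names and argument shapes below are assumptions for illustration, and `fake_call_tool` merely stands in for a real MCP client session:

```python
from typing import Any, Callable, Dict

# `call_tool` stands in for an MCP client's tool-invocation method;
# tool names and argument shapes are illustrative assumptions.
def open_reviewed_pr(call_tool: Callable[[str, Dict[str, Any]], Dict[str, Any]],
                     title: str, branch: str, reviewers: list) -> int:
    """Chain three hypothetical PR tools in a single assistant turn."""
    pr = call_tool("create_pull_request",
                   {"title": title, "head": branch, "base": "main"})
    number = pr["number"]
    call_tool("add_reviewers", {"pr_number": number, "reviewers": reviewers})
    call_tool("add_comment", {"pr_number": number,
                              "body": "Opened automatically; summary to follow."})
    return number

# Minimal stub in place of a real MCP session, for demonstration only.
def fake_call_tool(name: str, args: Dict[str, Any]) -> Dict[str, Any]:
    return {"number": 7} if name == "create_pull_request" else {}

print(open_reviewed_pr(fake_call_tool, "Add caching", "feat/cache", ["alice"]))  # → 7
```

Because the interface is uniform, the same chain works unchanged whether the underlying client authenticated with a PAT or the GitHub CLI.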
By integrating mcp-gh-pr-mini into an AI development stack, teams gain a single point of control for pull‑request management that is both developer‑friendly and highly extensible. The server's minimal footprint, comprehensive feature set, and resilient authentication make it a practical choice for anyone looking to fuse AI tooling with GitHub's collaborative workflow.