Website Downloader MCP Server
by pskill9

Download entire sites locally with wget

About

A tool that recursively downloads a website using wget, preserving structure and converting links for offline use.

Capabilities

  • Resources: access data sources
  • Tools: execute functions
  • Prompts: pre-built templates
  • Sampling: AI model interactions

Overview

The Website Downloader MCP Server is a lightweight tool that enables AI assistants to fetch complete, offline copies of web pages and sites. By wrapping the wget command-line utility, it preserves the original site structure while rewriting internal links so that the downloaded content can be browsed locally without internet access. This capability is invaluable for developers who need to analyze, document, or test web pages in a controlled environment, especially when the target site is dynamic, behind authentication, or prone to change.

The server exposes a single download tool. The assistant supplies the target URL, an optional output directory, and an optional recursion-depth limit that caps how many levels of linked pages are fetched. By default the tool downloads everything reachable from the root URL, including CSS files, images, scripts, and other assets, and automatically appends appropriate file extensions. It also restricts downloads to the same domain to avoid unintentionally harvesting external resources.
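
In MCP terms, such a request arrives as a standard tools/call message. The tool and argument names below (download_website, url, outputPath, depth) are illustrative guesses based on the description above rather than a verbatim copy of the server's schema; omitting the depth argument would correspond to the default of downloading everything reachable from the root URL:

    {
      "jsonrpc": "2.0",
      "id": 1,
      "method": "tools/call",
      "params": {
        "name": "download_website",
        "arguments": {
          "url": "https://example.com",
          "outputPath": "./example-mirror",
          "depth": 2
        }
      }
    }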

Key features include (each maps onto a standard wget option, as sketched after this list):

  • Recursive downloading, unlimited in depth by default, for a faithful replica of the site.
  • Local link conversion, rewriting internal URLs as relative file paths that work offline.
  • Domain restriction, keeping the download scoped to the target site rather than pulling in external resources.
  • Automatic file extension handling, so pages and assets are saved with names browsers can interpret.
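
Assuming the server simply shells out to wget, a representative invocation covering these features might look like the following. This is a sketch of the mapping, not the server's literal command line; the output directory and URL are placeholders:

    # Recursive mirror: unlimited depth, offline-friendly links, fixed extensions,
    # page assets included, crawl kept below the starting URL.
    wget --recursive --level=inf --convert-links --adjust-extension \
         --page-requisites --no-parent --directory-prefix=./site-mirror \
         https://example.com/

Note that recursive wget stays on the starting host unless --span-hosts is given, which is where the domain restriction described above comes from.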

Developers can integrate this server into AI workflows in several ways. For example, a Claude-powered assistant could ask the user to provide a URL and then use the tool to generate a static backup for documentation or compliance purposes. In testing scenarios, the assistant can download a staging site, run static analysis tools on the local copy, and report findings—all without requiring direct network access during evaluation. Moreover, the server’s simplicity allows it to be bundled into larger MCP ecosystems where multiple tools collaborate: a crawler might first gather URLs, the downloader fetches them, and subsequent analysis tools process the resulting files.
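
Wiring the server into a client is correspondingly small. For Claude Desktop, for example, registration is a short entry in claude_desktop_config.json; the server name and filesystem path below are placeholders, and the node command assumes a Node.js build, so adjust both to however the server is actually packaged on your machine:

    {
      "mcpServers": {
        "website-downloader": {
          "command": "node",
          "args": ["/path/to/website-downloader/build/index.js"]
        }
      }
    }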

What sets this MCP server apart is its seamless coupling of a mature, battle-tested command-line downloader (wget) with the declarative MCP interface. This combination delivers robust, repeatable downloads while keeping the integration lightweight and secure. For any project that needs reliable offline access to web content, whether for archival, testing, or data extraction, the Website Downloader MCP Server provides a quick, dependable solution that plugs directly into AI-driven development pipelines.