LeetCode Interview Question Crawler

MCP Server by louisfghbvc

Harvest Google interview questions from LeetCode discussions

Stale · 3 stars · 3 views · Updated Jul 15, 2025

About

A command‑line tool that crawls LeetCode discussion forums for interview questions tagged with a company (default Google), organizes results by month, and exports them to CSV or Google Sheets for analysis.

Capabilities

  • Resources – access data sources
  • Tools – execute functions
  • Prompts – pre-built templates
  • Sampling – AI model interactions

Overview

The MCP LeetCode Crawler is an MCP server that provides a ready‑made data ingestion pipeline for developers building AI assistants that need up‑to‑date interview content from LeetCode. By automatically crawling the discussion forums, filtering by company tag, and exporting structured data to CSV or Google Sheets, it eliminates the manual effort of gathering interview questions and lets AI models retrieve fresh, well‑organized information on demand.

What problem does it solve? Interview preparation often relies on community‑generated question lists that are scattered across web pages and forums. Manually compiling these lists is time‑consuming, error‑prone, and quickly becomes outdated. The crawler automates this entire process: it visits the relevant discussion threads, extracts key details such as problem titles, links, and posting dates, groups them by month, and writes the results to a format that can be ingested directly into an AI knowledge base or analytics dashboard. This means developers can keep their assistant’s database of interview questions current without writing custom web‑scraping code.
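To make that pipeline concrete, here is a minimal sketch of the grouping‑and‑export step in Python. The crawling itself is omitted; fetch_discussion_posts and the title/link/date field names are illustrative assumptions, not the tool's actual internals.

    import csv
    from collections import defaultdict
    from datetime import datetime
    from pathlib import Path


    def group_and_export(posts: list[dict], out_dir: str = "output") -> None:
        """Group crawled posts by posting month and write one CSV per month.

        Each post is assumed to be a dict with 'title', 'link', and 'date'
        (ISO 8601) keys; these names are placeholders for illustration.
        """
        by_month: dict[str, list[dict]] = defaultdict(list)
        for post in posts:
            month = datetime.fromisoformat(post["date"]).strftime("%Y-%m")
            by_month[month].append(post)

        Path(out_dir).mkdir(exist_ok=True)
        for month, rows in sorted(by_month.items()):
            with open(Path(out_dir) / f"{month}.csv", "w", newline="") as f:
                writer = csv.DictWriter(f, fieldnames=["title", "link", "date"])
                writer.writeheader()
                writer.writerows(rows)


    # posts = fetch_discussion_posts(company="google", pages=5)  # hypothetical crawl step
    # group_and_export(posts)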

Key features:

  • Company filtering – defaults to Google, but any company tag can be specified.
  • Pagination control – choose how many pages of discussions to crawl, balancing completeness with speed.
  • Structured output – single CSV for all data or monthly files that preserve temporal context.
  • Google Sheets export – a convenient way to share results with teams or feed them into other tools that consume Google Sheets.
  • Command‑line interface – all options are exposed via flags, making the tool scriptable and easy to integrate into CI/CD or scheduled jobs.
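For illustration, here is a minimal argparse sketch of the kind of command‑line interface described above. The flag names are assumptions for this example; the tool's actual options may differ.

    import argparse

    # Hypothetical flag names for illustration; the real CLI may differ.
    parser = argparse.ArgumentParser(
        description="Crawl LeetCode discussions for company-tagged interview questions."
    )
    parser.add_argument("--company", default="google",
                        help="company tag to filter by (default: google)")
    parser.add_argument("--pages", type=int, default=5,
                        help="number of discussion pages to crawl")
    parser.add_argument("--monthly", action="store_true",
                        help="write one CSV per month instead of a single file")
    parser.add_argument("--sheets", metavar="SPREADSHEET_ID",
                        help="also export the results to a Google Sheet")
    args = parser.parse_args()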

Real‑world use cases include:

  • Building a knowledge graph of interview questions for an AI tutor.
  • Generating analytics dashboards that track question trends over time.
  • Feeding a chatbot with the latest company‑specific questions so it can answer user queries instantly.
  • Syncing data to a shared Google Sheet that multiple recruiters or hiring managers can access.

Integration into AI workflows is straightforward. Once the CSV or Google Sheet is produced, an MCP client can request the data via a resource call, and prompts can be crafted to ask the assistant for summaries or practice exercises based on that data. The server's sampling capability lets it request model completions from the connected client, for example to summarize the most recent or most relevant questions. The crawler's modular design also makes it easy to extend; future versions may add support for multiple tags, scheduling, or visualizations, so the tool can grow with the needs of its users.
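As a minimal sketch of such a resource call, the snippet below uses the official Python MCP SDK. The launch command and the questions://google/latest.csv URI are assumptions for illustration; the server's real resource URIs should be discovered with list_resources.

    import asyncio

    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client
    from pydantic import AnyUrl

    # Assumption: the crawler server runs as a local stdio process; the command
    # and arguments below are placeholders, not documented values.
    server_params = StdioServerParameters(
        command="python",
        args=["-m", "mcp_leetcode_crawler"],
    )


    async def fetch_questions() -> None:
        async with stdio_client(server_params) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()

                # Discover what the server actually exposes before reading.
                listing = await session.list_resources()
                for resource in listing.resources:
                    print(resource.uri, resource.name)

                # Hypothetical URI for the exported CSV of Google-tagged questions.
                result = await session.read_resource(AnyUrl("questions://google/latest.csv"))
                for content in result.contents:
                    print(getattr(content, "text", "")[:500])


    asyncio.run(fetch_questions())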