MCPSERV.CLUB
gkenios

Mcpbot

MCP Server

FastAPI MCP client and server for local or Azure deployment

Updated Aug 11, 2025

About

Mcpbot implements an MCP client and server using FastAPI, supporting both local and Azure-hosted deployments. It provides vector search via ChromaDB and can be configured with OAuth2 for secure authentication.

Capabilities

  • Resources – Access data sources
  • Tools – Execute functions
  • Prompts – Pre‑built templates
  • Sampling – AI model interactions

Overview

MCPBot is a ready‑to‑use Model Context Protocol (MCP) server built on FastAPI, designed to bridge AI assistants with external data stores and tools in a flexible, local or cloud‑based deployment. By exposing MCP endpoints for resources, tools, prompts and sampling, it enables Claude or other AI clients to query structured data, run custom logic, and retrieve contextual information without leaving the chat flow. This solves the common developer pain point of stitching together disparate services—such as vector databases, authentication mechanisms and custom APIs—into a single, well‑defined interface that AI assistants can consume seamlessly.
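
MCPBot's own routes are not included in this listing, but the pattern described above can be sketched with the official MCP Python SDK's FastMCP helper, which registers resources, tools and prompts as decorated functions. The server name, resource URI and tool below are illustrative assumptions, not MCPBot's actual definitions.

    # Minimal sketch of an MCP server exposing a resource, a tool and a prompt.
    # Uses the official MCP Python SDK (pip install mcp); all names here are
    # hypothetical examples, not taken from the MCPBot codebase.
    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("Mcpbot-demo")

    @mcp.resource("docs://readme")
    def readme() -> str:
        """Expose a data source the AI client can read."""
        return "MCPBot demo resource content"

    @mcp.tool()
    def add(a: int, b: int) -> int:
        """A function the AI client can execute from the conversation."""
        return a + b

    @mcp.prompt()
    def summarize(text: str) -> str:
        """A pre-built prompt template served to the client."""
        return f"Summarize the following text:\n\n{text}"

    if __name__ == "__main__":
        mcp.run()  # defaults to stdio; mcp.run(transport="sse") serves over HTTP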

The server’s core value lies in its lightweight, pluggable architecture. Developers can spin up a local instance for rapid prototyping or scale it to Azure, leveraging the same code base. MCPBot bundles a Chroma vector store integration and provides an OAuth2 flow (with a planned upgrade to the new MCP standard) so that sensitive data can be accessed securely. The ability to replace the vector store with a custom one or add new tools via FastAPI routes means that teams can tailor the server to their own knowledge bases, code repositories or internal APIs without rewriting client logic.
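
The listing does not show how the OAuth2 flow is wired up; as a rough illustration of the idea, a FastAPI route can be guarded with the framework's built-in OAuth2PasswordBearer dependency. The /documents route and the token check below are placeholders, not MCPBot's actual endpoints or validation logic.

    # Sketch of OAuth2 bearer-token protection on a FastAPI route.
    # Route path and validation logic are hypothetical.
    from fastapi import Depends, FastAPI, HTTPException, status
    from fastapi.security import OAuth2PasswordBearer

    app = FastAPI()
    oauth2_scheme = OAuth2PasswordBearer(tokenUrl="token")

    def get_current_user(token: str = Depends(oauth2_scheme)) -> str:
        # Replace with real token introspection or JWT validation.
        if token != "expected-token":
            raise HTTPException(
                status_code=status.HTTP_401_UNAUTHORIZED,
                detail="Invalid or expired token",
            )
        return "authenticated-user"

    @app.get("/documents")
    def list_documents(user: str = Depends(get_current_user)) -> dict:
        # Only reachable with a valid bearer token.
        return {"user": user, "documents": []}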

Key capabilities include:

  • Resource discovery – The MCP endpoint lists available datasets and tools, allowing AI clients to dynamically adapt their queries.
  • Tool execution – Custom FastAPI endpoints can be exposed as MCP tools, enabling the assistant to trigger business logic or external services directly from the conversation.
  • Prompt and sampling management – Pre‑defined prompts can be served, and text generation parameters are exposed for fine‑tuning responses on the fly.
  • Vector search integration – By embedding documents into a Chroma store, the server can return highly relevant snippets or full documents in response to natural language queries (see the sketch after this list).
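
As a concrete illustration of the vector search integration, the chromadb client can embed and query documents in a few lines. The collection name and documents below are made up for the example and do not reflect MCPBot's ingestion pipeline.

    # Sketch of document embedding and retrieval with Chroma (pip install chromadb).
    # Collection name and documents are illustrative only.
    import chromadb

    client = chromadb.Client()  # in-memory; chromadb.PersistentClient(path=...) persists to disk
    collection = client.get_or_create_collection("knowledge-base")

    collection.add(
        ids=["policy-1", "policy-2"],
        documents=[
            "Employees may work remotely up to three days per week.",
            "Travel expenses must be submitted within 30 days.",
        ],
    )

    # A natural-language query returns the most relevant snippet.
    results = collection.query(query_texts=["How do I expense a flight?"], n_results=1)
    print(results["documents"][0])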

Typical use cases span from internal knowledge base assistants that pull up policy documents or code snippets, to customer support bots that need to retrieve product specifications from a vector index. In research settings, MCPBot can serve as a local testbed for experimenting with new prompt designs or tool integrations before deploying to production. Because it follows the MCP specification, any compliant AI client—Claude, Gemini, or others—can interact with it using the same protocol, ensuring portability across platforms.
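
On the client side, any MCP-compliant consumer can connect over one of the standard transports. The sketch below uses the official MCP Python SDK's SSE client; the URL and port assume a locally running instance with an SSE endpoint, which may differ from MCPBot's actual configuration.

    # Sketch of an MCP client session against a locally running server.
    # The URL is an assumption, not MCPBot's documented address.
    import asyncio

    from mcp import ClientSession
    from mcp.client.sse import sse_client

    async def main() -> None:
        async with sse_client("http://localhost:8000/sse") as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()
                tools = await session.list_tools()
                print("Available tools:", [t.name for t in tools.tools])

    asyncio.run(main())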

What sets MCPBot apart is its minimal footprint coupled with a clear separation of concerns. Developers can focus on building domain‑specific tools or data ingestion pipelines while relying on MCPBot to handle protocol compliance, authentication and routing. The optional Azure deployment path further extends its reach, allowing enterprises to keep data on-premises or in a regulated cloud environment without changing the client code. This combination of simplicity, extensibility and compliance makes MCPBot a practical choice for teams looking to embed AI assistants into their existing workflows with confidence.