redbuilding

Ollama Chat with MCP

MCP Server

Local LLMs, web search, and SQL via MCP

Stale (55) · 4 stars · 2 views · Updated Sep 19, 2025

About

A FastAPI/React application that extends locally run Ollama models with real‑time web search and MySQL querying using the Model Context Protocol, while persisting conversations in MongoDB.

Capabilities

  • Resources: Access data sources
  • Tools: Execute functions
  • Prompts: Pre-built templates
  • Sampling: AI model interactions

Overview

Ollama Chat with MCP demonstrates how a locally hosted language model can be turned into a versatile AI assistant that reaches beyond its training data. By integrating web search and database querying through the Model Context Protocol, the server gives the model real‑time access to fresh information and structured data. This solves a common problem for developers: keeping a local LLM up‑to‑date without sacrificing privacy or the performance benefits of running everything on their own hardware.

The server is built around a FastAPI backend that orchestrates three core services: the local Ollama model, an MCP‑enabled web search service powered by Serper.dev, and an optional MCP SQL server that can query a MySQL database. A React frontend provides a clean, responsive chat interface where users can type questions, view search results formatted as structured JSON, and even issue SQL queries. Conversation history is persisted in MongoDB, enabling users to resume long‑running discussions or audit past interactions.
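The orchestration described above can be pictured as a small dispatcher that decides whether a user message should go to the local Ollama model, the web search MCP service, or the SQL MCP service, with each turn persisted for the conversation history. The sketch below is purely illustrative: the names (`route_message`, `ChatTurn`, the `sql:`/`search:` prefixes) are hypothetical and not taken from the project's source, which may route requests quite differently (for example, by letting the model itself choose a tool).

```python
# Hypothetical sketch of the backend's routing logic. All names and the
# prefix-based routing convention are illustrative assumptions, not the
# project's actual code.
from dataclasses import dataclass, field


@dataclass
class ChatTurn:
    role: str      # "user" or "assistant"
    content: str


@dataclass
class Conversation:
    # Shape of a conversation record as it might be persisted in MongoDB.
    id: str
    title: str
    turns: list[ChatTurn] = field(default_factory=list)


def route_message(message: str) -> str:
    """Decide which backend service should handle a user message."""
    lowered = message.strip().lower()
    if lowered.startswith("sql:"):
        return "mcp_sql"          # forward to the MySQL MCP server
    if lowered.startswith("search:"):
        return "mcp_web_search"   # forward to the Serper.dev-backed MCP server
    return "ollama"               # plain local-model inference
```

In a real FastAPI handler, the chosen service would be invoked asynchronously and both the user message and the reply appended to the conversation's `turns` before writing it back to MongoDB.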

Key capabilities include:

  • Real‑time web search: The model can request up‑to‑date facts, news, or statistics via the web search MCP tool, ensuring answers reflect current knowledge.
  • Structured data access: The SQL MCP tool allows the model to retrieve and manipulate records from a MySQL database, opening doors for business intelligence or internal tooling.
  • Local execution: All model inference runs on the user’s machine through Ollama, preserving data sovereignty and eliminating latency associated with cloud calls.
  • Persistent, searchable conversations: MongoDB stores every message pair and conversation metadata, supporting features like renaming, deleting, or listing threads.
  • Extensible architecture: The backend can launch and manage any MCP service, making it straightforward to add new tools (e.g., file system access or API calls) without changing the core logic.
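The extensibility point in the last bullet can be sketched as a registry of launchable MCP services: each entry records how the backend would start one MCP server as a subprocess, so adding a new tool is a new entry rather than a change to the core chat logic. Everything here (the `MCPService` type, the script names, the `register` helper) is a hypothetical illustration of the pattern, not the project's actual API.

```python
# Hypothetical sketch of an extensible MCP service registry. Script names
# and field names are illustrative assumptions.
from dataclasses import dataclass


@dataclass(frozen=True)
class MCPService:
    name: str
    command: str            # executable used to launch the MCP server
    args: tuple[str, ...]   # arguments, e.g. the server script to run


# Services the backend launches at startup.
REGISTRY: dict[str, MCPService] = {
    "web_search": MCPService("web_search", "python", ("search_server.py",)),
    "mysql": MCPService("mysql", "python", ("sql_server.py",)),
}


def register(service: MCPService) -> None:
    """Make a new MCP tool available without touching the chat logic."""
    REGISTRY[service.name] = service


# Adding a hypothetical filesystem tool later is one call:
register(MCPService("filesystem", "python", ("fs_server.py",)))
```

With a registry like this, the backend only needs generic "launch over stdio, list tools, call tool" plumbing per entry, which is what makes swapping in file system access or arbitrary API wrappers straightforward.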

In practice, this server is ideal for developers building internal assistants that need to answer policy questions with up‑to‑date references, pull reports from a corporate database on demand, or prototype new tool integrations before deploying them at scale. By combining local LLMs with MCP‑managed external services, it delivers a powerful, privacy‑preserving AI experience that can be adapted to a wide range of real‑world scenarios.