MCPSERV.CLUB
Ginga1402

MCP Demo With LangChain and Ollama

MCP Server

Demo server integrating MCP, LangChain, and Ollama for LLM tool access

Stale (50) · 2 stars · 1 view · Updated Jun 19, 2025

About

A lightweight demo server that exposes Math and Weather services via the Model Context Protocol, enabling LangChain-powered LLMs to call external tools through Ollama's model interface. It showcases how MCP translates tool calls into LLM-friendly messages.

Capabilities

  • Resources – Access data sources
  • Tools – Execute functions
  • Prompts – Pre-built templates
  • Sampling – AI model interactions

MCP Server Workflow

The Model Context Protocol (MCP) Demo with LangChain MCP Adapters and Ollama server demonstrates how an MCP‑enabled service can bridge the gap between a language model and external data or tools. In practice, an AI assistant that merely generates text is limited: it cannot query a database, call an API, or execute code. This demo removes that limitation by exposing a set of tool endpoints, such as simple math operations or weather data, that the model can invoke through a standardized, language‑agnostic protocol. Developers benefit from this abstraction because they no longer need to write bespoke integration code for each external resource; instead, the MCP server translates the model’s intent into concrete API calls and returns structured results.
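To make the server side concrete, here is a minimal sketch of how such a tool server could be written with the FastMCP helper from the official MCP Python SDK. The server name, tool names, and canned weather data are illustrative assumptions, not the demo's actual code:

```python
# Hypothetical sketch of an MCP tool server built with FastMCP from the
# official MCP Python SDK (pip install mcp); names and data are illustrative.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("demo-tools")

@mcp.tool()
def add(a: float, b: float) -> float:
    """Add two numbers and return the sum."""
    return a + b

@mcp.tool()
def get_weather(city: str) -> str:
    """Return a simulated weather snapshot for a city."""
    # A real server would call a weather API here; the demo only needs a stub.
    snapshots = {"London": "12°C, cloudy", "Tokyo": "22°C, clear"}
    return snapshots.get(city, f"No data for {city}")

if __name__ == "__main__":
    # Serve over stdio so any MCP-compliant client can launch and talk to it.
    mcp.run(transport="stdio")
```

Because the tools are declared through the protocol rather than hard-wired into the chat application, the same server works unchanged with any MCP client.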

At its core, the demo server is built with Python, leveraging the LangChain framework to orchestrate interactions between the model and the MCP infrastructure. The server itself is a lightweight implementation of the MCP specification, exposing tools that perform arithmetic and simulate weather queries. When an LLM receives a user prompt that requires external data, the MCP client in the chat application identifies the need for a tool, queries the server’s capability list, and forwards the request. The server then performs the operation (e.g., adding two numbers or fetching a weather snapshot) and streams the result back to the model, which incorporates it into its final response. This flow is fully MCP-compliant, ensuring secure, structured communication and enabling future extensions without breaking existing integrations.
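A matching client-side sketch of that flow, assuming a recent version of the langchain-mcp-adapters package together with langchain-ollama and LangGraph's prebuilt ReAct agent; the server script path (server.py) and the model name (llama3.1) are placeholders:

```python
# Hypothetical client sketch: wire an Ollama model to the MCP server's tools
# through langchain-mcp-adapters and a LangGraph ReAct agent.
import asyncio

from langchain_mcp_adapters.client import MultiServerMCPClient
from langchain_ollama import ChatOllama
from langgraph.prebuilt import create_react_agent

async def main() -> None:
    # Launch the server script over stdio and discover its declared tools.
    client = MultiServerMCPClient(
        {
            "demo": {
                "command": "python",
                "args": ["server.py"],  # placeholder path to the MCP server
                "transport": "stdio",
            }
        }
    )
    tools = await client.get_tools()

    # Any tool-calling model served by Ollama; llama3.1 is a placeholder.
    model = ChatOllama(model="llama3.1")

    # The agent decides per turn whether to answer directly or call a tool.
    agent = create_react_agent(model, tools)
    result = await agent.ainvoke(
        {"messages": [("user", "What is 4 + 38, and what's the weather in London?")]}
    )
    print(result["messages"][-1].content)

asyncio.run(main())
```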

Key capabilities of this demo include:

  • Standardized Tool Exposure – The server declares its available tools via a well‑defined schema, allowing any MCP‑compliant client to discover and invoke them without custom adapters (see the discovery sketch after this list).
  • LangChain Compatibility – By using the langchain-mcp-adapters library, developers can plug the MCP server directly into LangChain pipelines or LangGraph workflows, simplifying agent construction and state management.
  • Extensibility – New tools can be added by implementing additional endpoints; the MCP protocol guarantees backward compatibility, so existing clients continue to function seamlessly.
  • Scalable Architecture – The separation of concerns (client, server, service) means the server can be deployed behind load balancers or within container orchestration platforms, supporting high‑throughput workloads.
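
As referenced in the first bullet, tool discovery works with any MCP-compliant client. A minimal sketch using the low-level client from the MCP Python SDK, with no LangChain involvement at all (server.py is again a placeholder):

```python
# Hypothetical sketch: enumerate a server's declared tools with the low-level
# MCP Python SDK client.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Launch the server script (placeholder path) and open a stdio session.
    params = StdioServerParameters(command="python", args=["server.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            listing = await session.list_tools()
            for tool in listing.tools:
                print(f"{tool.name}: {tool.description}")

asyncio.run(main())
```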

Real‑world scenarios that benefit from this setup include: building a customer support chatbot that can query ticket databases, creating an educational tutor that performs live calculations or fetches up‑to‑date weather forecasts, and developing internal tooling assistants that retrieve system metrics or trigger CI/CD pipelines. In each case, the MCP server abstracts away API heterogeneity and authentication concerns, letting developers focus on crafting intelligent conversational flows rather than plumbing the model to external services.