elbruno

Aspire MCP Sample Server

MCP Server

Demo MCP server and Blazor chat client with Azure AI integration


About

The Aspire MCP Sample Server demonstrates a Model Context Protocol server and client built with .NET 9 and Aspire. It showcases MCP communication, function calling, and a Blazor chat interface that can use Azure AI Foundry, GitHub Models, or Ollama LLMs.

Capabilities

  • Resources – Access data sources
  • Tools – Execute functions
  • Prompts – Pre-built templates
  • Sampling – AI model interactions
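In MCP, these capability categories are negotiated during the initialize handshake: the server advertises which of them it supports, while sampling is declared by the client, which owns access to the LLM. A minimal, stdlib-only Python sketch of the shapes involved (field names follow the MCP specification; the server name and version are illustrative):

```python
import json

# Illustrative MCP "initialize" result: the server advertises which
# capability categories (tools, resources, prompts) it supports.
initialize_result = {
    "protocolVersion": "2024-11-05",  # an MCP spec revision date
    "capabilities": {
        "tools": {},      # server can expose callable functions
        "resources": {},  # server can expose readable data sources
        "prompts": {},    # server can expose pre-built prompt templates
    },
    "serverInfo": {"name": "aspire-mcp-sample", "version": "1.0.0"},
}

# Sampling is a *client* capability: the client services the server's
# requests to run an LLM completion on its behalf.
client_capabilities = {"sampling": {}}

print(json.dumps(initialize_result["capabilities"], indent=2))
```

The empty objects are valid: their presence alone signals support for a category, and optional sub-flags can be added inside them.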

Chat Demo

Overview

Aspire.MCP.Sample is a reference implementation that brings the Model Context Protocol (MCP) to life within an Aspire-orchestrated environment. It demonstrates how a lightweight, container‑managed MCP server can be paired with an interactive Blazor chat client to enable function calling and tool execution from large language models (LLMs). By exposing MCP resources, tools, prompts, and sampling options over HTTP, the sample shows how developers can treat any LLM that supports function calling as a first‑class tool in their AI workflows.

The core problem this server solves is the friction that typically exists when integrating external services into an AI assistant. Traditional approaches require custom adapters, manual serialization of function calls, and bespoke error handling. MCP abstracts these concerns into a standardized protocol: the client sends a structured request, and the server validates the call, executes the function, and returns the result in a predictable format. This allows developers to focus on business logic rather than protocol plumbing, while still enabling fine‑grained control over model behavior and tool usage.
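Concretely, MCP rides on JSON-RPC 2.0: a tool invocation is a `tools/call` request naming the function and its arguments, and the server replies with a result in a standard content envelope. A stdlib-only Python sketch of that exchange (the `get_weather` tool and its arguments are invented for illustration; the message shapes follow the MCP specification):

```python
import json

# JSON-RPC 2.0 request the client sends to invoke an MCP tool.
# "get_weather" and its arguments are hypothetical examples.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "get_weather",
        "arguments": {"city": "Toronto"},
    },
}

def handle_tools_call(req: dict) -> dict:
    """Toy server-side dispatch: validate the request, run the tool,
    and wrap the output in the standard MCP content envelope."""
    assert req["jsonrpc"] == "2.0" and req["method"] == "tools/call"
    name = req["params"]["name"]
    args = req["params"].get("arguments", {})
    if name != "get_weather":
        # JSON-RPC error for an unknown tool name.
        return {"jsonrpc": "2.0", "id": req["id"],
                "error": {"code": -32602, "message": f"Unknown tool: {name}"}}
    text = f"Sunny in {args['city']}"  # stand-in for real business logic
    return {
        "jsonrpc": "2.0",
        "id": req["id"],
        "result": {
            "content": [{"type": "text", "text": text}],
            "isError": False,
        },
    }

# Round-trip through JSON, as it would travel over HTTP.
response = handle_tools_call(json.loads(json.dumps(request)))
print(response["result"]["content"][0]["text"])  # Sunny in Toronto
```

Because every tool call and result uses this one envelope, the client can render any tool's output without per-tool glue code.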

Key features of the sample include:

  • Aspire Integration – The server and client are bundled into a single Aspire AppHost, giving developers instant container orchestration, health monitoring, and zero‑configuration networking.
  • Modular MCP Server – Exposes a set of pre‑defined functions (e.g., data retrieval, calculation) that can be invoked by any compliant LLM. The server handles authentication, logging, and telemetry automatically.
  • Blazor Chat Client – A web UI that lets users select an LLM from Azure AI Foundry, GitHub Models, or local Ollama. The client streams chat messages, triggers function calls, and renders tool results inline.
  • Function Calling & Tool Result Display – When the model calls a function, the server returns JSON that is displayed in the chat as a “Tool Result” section, providing immediate feedback to the user.
  • Azure Deployment – With a single command, the entire stack can be provisioned to Azure App Service and Container Apps, enabling production‑ready scaling without manual configuration.

Real‑world scenarios that benefit from this approach include:

  • Customer support bots that need to query internal knowledge bases or ticketing systems via MCP functions.
  • Data analysis assistants that invoke statistical tools or database queries directly from the chat interface.
  • DevOps automation where a model can trigger deployment pipelines or infrastructure changes through safe, audited MCP endpoints.

By integrating seamlessly with existing LLM providers and leveraging Aspire’s robust service lifecycle management, the sample offers a clear blueprint for developers looking to embed programmable tools into AI assistants. It showcases how MCP can reduce boilerplate, enforce consistent communication contracts, and accelerate the delivery of intelligent, function‑aware applications.