Quarkus MCP Agentic

MCP Server by jamesfalkner

Java agentic assistant powered by Quarkus and MCP

Updated May 13, 2025

About

A lightweight Java application built with Quarkus that demonstrates agentic AI workflows using the Model Context Protocol. It orchestrates multiple MCP services, such as web search, maps, and Slack, together with an LLM to answer queries and perform tasks.

Capabilities

  • Resources: Access data sources
  • Tools: Execute functions
  • Prompts: Pre-built templates
  • Sampling: AI model interactions

Quarkus MCP Agentic Demo

The Quarkus + MCP = Agentic project turns a conventional Java application into an intelligent, agent‑driven assistant by combining Quarkus, the Model Context Protocol (MCP), and LangChain4j. At its core, it solves a common pain point for developers: wiring together disparate external services—search engines, mapping APIs, messaging platforms, and LLMs—into a single, coherent workflow that can be queried by an AI assistant. Instead of writing custom integration code for each API, the MCP server exposes a uniform resource‑based interface that lets an LLM reason about tool availability and invoke them in sequence, enabling true agentic behavior.
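
To make that concrete, here is a minimal sketch of the wiring pattern using the plain LangChain4j MCP client API. The Assistant interface, the choice of Brave search as the example server, and the OpenAI model are illustrative assumptions, not code taken from this project:

    import dev.langchain4j.mcp.McpToolProvider;
    import dev.langchain4j.mcp.client.DefaultMcpClient;
    import dev.langchain4j.mcp.client.McpClient;
    import dev.langchain4j.mcp.client.transport.McpTransport;
    import dev.langchain4j.mcp.client.transport.stdio.StdioMcpTransport;
    import dev.langchain4j.model.openai.OpenAiChatModel;
    import dev.langchain4j.service.AiServices;
    import dev.langchain4j.service.tool.ToolProvider;

    import java.util.List;

    public class AgentWiring {

        // Hypothetical assistant contract; LangChain4j generates the implementation.
        interface Assistant {
            String chat(String userMessage);
        }

        public static void main(String[] args) {
            // Launch an MCP server (here: Brave web search) as a subprocess, speaking stdio.
            McpTransport transport = new StdioMcpTransport.Builder()
                    .command(List.of("npx", "-y", "@modelcontextprotocol/server-brave-search"))
                    .logEvents(true)
                    .build();

            McpClient braveSearch = new DefaultMcpClient.Builder()
                    .transport(transport)
                    .build();

            // Every MCP server is surfaced through the same ToolProvider contract, so adding
            // maps or Slack means adding another client to this list, not writing new glue code.
            ToolProvider tools = McpToolProvider.builder()
                    .mcpClients(List.of(braveSearch))
                    .build();

            Assistant assistant = AiServices.builder(Assistant.class)
                    .chatLanguageModel(OpenAiChatModel.builder()
                            .apiKey(System.getenv("OPENAI_API_KEY"))
                            .build())
                    .toolProvider(tools)
                    .build();

            System.out.println(assistant.chat("What is the tallest building in Raleigh?"));
        }
    }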

Developers benefit from a plug‑and‑play architecture. The server bundles several pre‑configured MCP services: Brave web search, Google Maps for geocoding and directions, Slack for team communication, and a filesystem provider for local data access. Each service is exposed as an MCP resource with well‑defined prompts and sampling strategies, allowing the LLM to decide when to call which tool. The integration with LangChain4j gives developers a familiar Java‑centric SDK to compose chains, manage context, and inject environment variables securely. The result is a highly modular system where adding or swapping out services requires minimal code changes.
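
In a Quarkus application the same idea can be expressed declaratively. The sketch below assumes the quarkus-langchain4j MCP extension; the PlannerAssistant name and the system prompt are made up for illustration:

    import io.quarkiverse.langchain4j.RegisterAiService;
    import io.quarkiverse.langchain4j.mcp.runtime.McpToolBox;
    import dev.langchain4j.service.SystemMessage;

    // Registered as a CDI bean; Quarkus generates the implementation at build time.
    @RegisterAiService
    public interface PlannerAssistant {

        @SystemMessage("You are an assistant that plans team events using the tools available to you.")
        @McpToolBox // make the configured MCP clients available as tools for this call
        String plan(String request);
    }

In this kind of setup the MCP servers themselves are declared in application.properties (a transport type plus the command that launches each server), so swapping one search or messaging provider for another is a configuration change rather than new integration code.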

Key capabilities include:

  • Agentic reasoning: The LLM can plan multi‑step tasks, invoking tools in the right order based on its internal state.
  • Unified prompt management: Each MCP resource carries a default system message and sampling configuration, ensuring consistent behavior across tools.
  • Secure credential handling: API keys are injected via environment variables, encouraging best practices for secrets management (see the sketch after this list).
  • Telemetry and observability: When a container runtime is available, built‑in metrics expose per‑tool call counts and latency profiles.
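
For the credential point above, a typical pattern is to forward secrets from the host environment into the subprocess that runs the MCP server, so keys never appear in source or config files. This sketch uses the LangChain4j stdio transport; the Brave server and variable names are illustrative:

    import dev.langchain4j.mcp.client.transport.McpTransport;
    import dev.langchain4j.mcp.client.transport.stdio.StdioMcpTransport;

    import java.util.List;
    import java.util.Map;

    public class SecureTransportFactory {

        // Build a transport for the Brave search server, passing the API key
        // through from the host environment instead of hard-coding it.
        public static McpTransport braveSearchTransport() {
            return new StdioMcpTransport.Builder()
                    .command(List.of("npx", "-y", "@modelcontextprotocol/server-brave-search"))
                    .environment(Map.of("BRAVE_API_KEY", System.getenv("BRAVE_API_KEY")))
                    .logEvents(true)
                    .build();
        }
    }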

Real‑world scenarios where this stack shines include:

  • Team coordination: As shown in the demo prompt, an assistant can search for a restaurant that satisfies dietary constraints, generate a Slack invitation, and produce calendar invites, all from a single user query (see the snippet after this list).
  • Dynamic data retrieval: A chatbot can pull the latest web content, parse it, and present summarized insights without manual data pipelines.
  • Cross‑platform workflows: Combining mapping, scheduling, and messaging services lets developers build assistants that operate across devices and ecosystems.
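
Taking the team-coordination scenario above, a single REST endpoint could hand the whole request to the agent. This snippet reuses the hypothetical PlannerAssistant from the earlier sketch; the path and prompt text are invented for illustration:

    import jakarta.inject.Inject;
    import jakarta.ws.rs.GET;
    import jakarta.ws.rs.Path;

    @Path("/plan")
    public class PlanResource {

        @Inject
        PlannerAssistant planner; // hypothetical AI service from the earlier sketch

        @GET
        public String plan() {
            // One natural-language query; the LLM decides which MCP tools
            // to call and in what order.
            return planner.plan("Find a vegetarian-friendly restaurant near the office, "
                    + "draft a Slack invitation for the team, and suggest two meeting times.");
        }
    }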

Integration into AI workflows is straightforward: the MCP server acts as a backend endpoint that any LLM‑powered client can query. Developers embed the server’s URL into their LangChain4j configuration or directly call its REST endpoints, allowing existing chat interfaces—web UIs, mobile apps, or voice assistants—to leverage the same agentic logic. The result is a scalable, maintainable assistant that can grow with new tools simply by adding more MCP services.
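
For remote clients, LangChain4j also ships an HTTP transport using Server‑Sent Events, so the same tools can be reached over the network instead of stdio. A minimal sketch; the endpoint URL is a placeholder, not this project's actual address:

    import dev.langchain4j.mcp.client.DefaultMcpClient;
    import dev.langchain4j.mcp.client.McpClient;
    import dev.langchain4j.mcp.client.transport.http.HttpMcpTransport;

    public class RemoteMcpClientFactory {

        // Connect to an MCP server exposed over HTTP/SSE rather than a local subprocess.
        public static McpClient remoteClient() {
            return new DefaultMcpClient.Builder()
                    .transport(new HttpMcpTransport.Builder()
                            .sseUrl("http://localhost:8080/mcp/sse") // placeholder URL
                            .logRequests(true)
                            .logResponses(true)
                            .build())
                    .build();
        }
    }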