MCPSERV.CLUB
rainu

Ask Mai MCP Server

MCP Server

Scriptable LLM assistant as a Model Context Protocol server

Stale (60)
6 stars
1 view
Updated Aug 17, 2025

About

Ask Mai provides a command‑line chat interface for multiple LLM providers and exposes its built‑in and custom tools via MCP, enabling other applications to invoke the assistant programmatically.

Capabilities

Resources
Access data sources
Tools
Execute functions
Prompts
Pre-built templates
Sampling
AI model interactions

Ask m' AI Demo

Ask m' AI is a lightweight, script‑friendly desktop chat client that turns any large language model (LLM) into an interactive assistant you can invoke from the command line. By exposing its functionality as a Model Context Protocol (MCP) server, it bridges the gap between traditional terminal workflows and modern AI services. Instead of typing prompts into a web interface, developers can embed Ask m' AI directly into scripts, CI pipelines, or custom tooling, allowing the LLM to read logs, manipulate files, and run commands without leaving their shell.

The server’s core value lies in its tool‑centric design. Each LLM provider—OpenAI, Anthropic, Google Gemini, LocalAI, Ollama, Mistral, DeepSeek, and others—can be selected with a single flag. Once connected, the assistant can call built‑in system tools (for example, to run commands or manipulate files) as well as user‑defined tools, enabling the LLM to perform real actions on the host machine. This capability turns the assistant from a passive chatbot into an active agent that can, for example, parse build logs and automatically fix configuration errors, or generate documentation from code comments.
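Under MCP, such a tool invocation travels as a JSON‑RPC 2.0 `tools/call` request, as defined by the MCP specification. A minimal sketch of building one (the tool name `run_command` and its argument are hypothetical illustrations, not taken from Ask Mai's actual tool set):

```python
import json

def build_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Serialize an MCP `tools/call` request as a JSON-RPC 2.0 message."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Hypothetical tool name and argument; the real tool set depends on configuration.
msg = build_tool_call(1, "run_command", {"command": "tail -n 20 build.log"})
print(msg)
```

The client writes this message to the server's transport (typically stdio for a locally spawned MCP server) and reads the matching response by `id`.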

Key features include:

  • Multi‑provider support: Switch between cloud or local models with minimal configuration.
  • Custom tool creation: Define any JSON‑serialisable operation and expose it to the LLM.
  • MCP server mode: Other MCP‑compatible applications can discover and use Ask m' AI’s tool set, creating a composable ecosystem of AI agents.
  • Scriptability: All options are configurable via YAML, environment variables, or command‑line flags, and the conversation is streamed to stdout for easy parsing.
  • Theming & localization: Choose light or dark themes and switch the interface between English and German.
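In MCP server mode, a client discovers the assistant's tool set before calling anything by issuing the standard `tools/list` request from the MCP specification. A sketch of parsing such a response (the tool names shown are illustrative, not Ask Mai's actual built‑ins):

```python
import json

def parse_tool_names(response_text: str) -> list[str]:
    """Extract tool names from a JSON-RPC `tools/list` response."""
    response = json.loads(response_text)
    return [tool["name"] for tool in response["result"]["tools"]]

# Illustrative response; actual tool names depend on the server's configuration.
sample = json.dumps({
    "jsonrpc": "2.0",
    "id": 2,
    "result": {"tools": [
        {"name": "read_file", "description": "Read a file from disk"},
        {"name": "run_command", "description": "Execute a shell command"},
    ]},
})
print(parse_tool_names(sample))  # → ['read_file', 'run_command']
```

Because every MCP server answers `tools/list` in this shape, a client written once can compose Ask m' AI's tools with those of any other MCP server it connects to.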

Typical use cases include automated code reviews (the assistant reads diffs and suggests fixes), system monitoring (it can query metrics and trigger alerts), and data‑driven workflows where the LLM fetches, processes, and stores information directly on disk. By integrating Ask m' AI into existing command‑line toolchains, developers can harness the power of large language models without abandoning their familiar workflows, thereby accelerating productivity and reducing context switches.