About
Yokai is a lightweight, modular Go framework that streamlines building production-grade backend applications by providing built-in logging, tracing, metrics, and dependency injection with extensible modules for HTTP, gRPC, workers, and more.
Capabilities

The Yokai MCP server wraps Yokai, a lightweight, modular framework that turns a Go application into a fully observable, extensible backend service. It tackles the common pain points of building production-grade systems (boilerplate configuration, dependency wiring, and monitoring) by providing a unified core that automatically injects logging, tracing, metrics, and health-check endpoints. Developers can then focus on business logic while Yokai handles the plumbing.
At its heart, Yokai exposes a private HTTP server for infrastructure and debugging. This server hosts built‑in health checks, OpenTelemetry metrics, and structured logs, giving operators instant visibility into application state without any custom instrumentation. The core is built on proven libraries such as Echo for HTTP, gRPC‑go for RPC services, Viper for configuration, and Uber FX for dependency injection. By integrating these components under a single umbrella, Yokai removes the need to manage each library independently.
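The private-server idea can be approximated with Go's standard library alone: infrastructure endpoints such as a health check live on their own mux, kept off the public API surface. This is a minimal sketch, not Yokai's actual API; the `/healthz` path and the JSON payload shape are illustrative assumptions.

```go
package main

import (
	"encoding/json"
	"fmt"
	"io"
	"net/http"
	"net/http/httptest"
)

// healthStatus models a minimal health-check payload; a real framework
// aggregates results from registered probes (database, cache, etc.).
func healthStatus() map[string]string {
	return map[string]string{"status": "up"}
}

// newPrivateMux wires infrastructure endpoints onto a dedicated mux,
// keeping them separate from the public API.
func newPrivateMux() *http.ServeMux {
	mux := http.NewServeMux()
	mux.HandleFunc("/healthz", func(w http.ResponseWriter, r *http.Request) {
		w.Header().Set("Content-Type", "application/json")
		json.NewEncoder(w).Encode(healthStatus())
	})
	return mux
}

func main() {
	// In production this mux would listen on its own port; here an
	// in-process test server keeps the example self-contained.
	srv := httptest.NewServer(newPrivateMux())
	defer srv.Close()

	resp, err := http.Get(srv.URL + "/healthz")
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	body, _ := io.ReadAll(resp.Body)
	fmt.Print(string(body)) // prints {"status":"up"}
}
```

In a Yokai application this wiring is done for you; the sketch only shows why a dedicated diagnostics mux gives orchestrators a stable probe target without touching business handlers.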
Extensibility is achieved through a plugin‑style extension system. The framework ships with a set of “built‑in” modules—public HTTP or gRPC servers, background workers, and ORM adapters—that can be dropped into a project with minimal ceremony. For advanced scenarios, developers can pull in community‑maintained contrib modules or write their own extensions that plug seamlessly into the dependency graph. This modularity means a Yokai‑powered service can evolve from a simple API endpoint to a complex microservice ecosystem without refactoring core logic.
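The plugin-style registration described above can be sketched generically: each module contributes components to a shared application during startup. Yokai actually resolves dependencies through Uber FX's graph, so the `Module` interface and flat component map below are simplifying assumptions, not Yokai's real types.

```go
package main

import "fmt"

// Module is a hypothetical stand-in for a framework extension: it
// registers the components it provides into the application.
type Module interface {
	Name() string
	Register(app *App)
}

// App holds the wired components. Real frameworks (Yokai uses Uber FX)
// resolve these lazily from a dependency graph rather than a flat map.
type App struct {
	components map[string]any
}

// NewApp builds an application by letting each module register itself,
// mirroring how built-in and contrib modules are dropped into a project.
func NewApp(modules ...Module) *App {
	app := &App{components: map[string]any{}}
	for _, m := range modules {
		m.Register(app)
	}
	return app
}

// HTTPServerModule is an illustrative built-in module.
type HTTPServerModule struct{}

func (HTTPServerModule) Name() string { return "httpserver" }
func (HTTPServerModule) Register(app *App) {
	app.components["httpserver"] = "echo-based public server"
}

func main() {
	app := NewApp(HTTPServerModule{})
	fmt.Println(app.components["httpserver"]) // prints echo-based public server
}
```

The point of the pattern is that adding a worker or gRPC module is one more entry in the module list, leaving existing registrations untouched.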
Real‑world use cases include microservices that need rapid iteration, monitoring‑first deployments, and services that must expose both internal diagnostics and external APIs. For example, a data‑processing pipeline can be wrapped in Yokai to automatically emit metrics on job throughput and expose a health endpoint for orchestrators. Similarly, an e‑commerce backend can use Yokai’s gRPC extension to expose a performant API while still benefiting from the same observability stack used by its internal services.
Integrating Yokai into an AI workflow is straightforward: the MCP client can invoke Yokai’s exposed HTTP or gRPC endpoints, and the framework will automatically surface request traces and logs to OpenTelemetry collectors. This allows AI assistants to gather telemetry data in real time, providing insights into latency, error rates, and usage patterns. The built‑in dependency injection also makes it easy to inject custom AI models or services as modules, enabling hybrid applications that combine traditional backend logic with machine‑learning inference.
Related Servers
MarkItDown MCP Server
Convert documents to Markdown for LLMs quickly and accurately
Context7 MCP
Real‑time, version‑specific code docs for LLMs
Playwright MCP
Browser automation via structured accessibility trees
BlenderMCP
Claude AI meets Blender for instant 3D creation
Pydantic AI
Build GenAI agents with Pydantic validation and observability
Chrome DevTools MCP
AI-powered Chrome automation and debugging
Explore More Servers
Claude Desktop API MCP Server
Bridge Claude Desktop to the Anthropic API
Mcp Runner
Efficiently run and manage MCP servers with reuse and cleanup
Ezmcp
FastAPI‑style MCP server for SSE
Finmap MCP Server
Global stock exchange data for analysis and visualization
Agents MCP Usage Demo & Benchmarking Platform
LLM Agent framework integration and evaluation with MCP servers
Zed Brave Search Context Server
Brave Search integration for Zed AI assistants