MCPSERV.CLUB
ankorstore

Yokai

MCP Server

Modular Go framework for backend observability

Active (72)
765 stars
1 view
Updated 11 days ago

About

Yokai is a lightweight, modular Go framework that streamlines building production-grade backend applications by providing built-in logging, tracing, metrics, and dependency injection with extensible modules for HTTP, gRPC, workers, and more.

Capabilities

Resources
Access data sources
Tools
Execute functions
Prompts
Pre-built templates
Sampling
AI model interactions

Yokai Server in Action

The Yokai MCP server is a lightweight, modular framework that turns any Go application into a fully observable, extensible backend service. It tackles the common pain points of building production‑grade systems—boilerplate configuration, dependency wiring, and monitoring—by providing a unified core that automatically injects logging, tracing, metrics, and health‑check endpoints. Developers can then focus on business logic while Yokai handles the plumbing.

At its heart, Yokai exposes a private HTTP server for infrastructure and debugging. This server hosts built‑in health checks, OpenTelemetry metrics, and structured logs, giving operators instant visibility into application state without any custom instrumentation. The core is built on proven libraries such as Echo for HTTP, grpc-go for RPC services, Viper for configuration, and Uber Fx for dependency injection. By integrating these components under a single umbrella, Yokai removes the need to manage each library independently.

Extensibility is achieved through a plugin‑style extension system. The framework ships with a set of “built‑in” modules—public HTTP or gRPC servers, background workers, and ORM adapters—that can be dropped into a project with minimal ceremony. For advanced scenarios, developers can pull in community‑maintained contrib modules or write their own extensions that plug seamlessly into the dependency graph. This modularity means a Yokai‑powered service can evolve from a simple API endpoint to a complex microservice ecosystem without refactoring core logic.
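The plugin-style extension idea can be illustrated with a toy dependency container. Everything here (`Module`, `App`, `httpModule`) is hypothetical scaffolding to show the shape of the pattern; Yokai itself delegates this wiring to Uber Fx.

```go
package main

import "fmt"

// Module is a hypothetical stand-in for an extension: each module
// contributes constructors to the application's dependency graph.
type Module interface {
	Name() string
	Register(app *App)
}

// App collects the providers contributed by each module.
type App struct {
	providers map[string]func() any
}

// NewApp assembles an application from a list of modules.
func NewApp(modules ...Module) *App {
	app := &App{providers: map[string]func() any{}}
	for _, m := range modules {
		m.Register(app)
	}
	return app
}

func (a *App) Provide(name string, ctor func() any) { a.providers[name] = ctor }
func (a *App) Resolve(name string) any              { return a.providers[name]() }

// httpModule mimics a built-in module that wires an HTTP router.
type httpModule struct{}

func (httpModule) Name() string { return "http" }
func (httpModule) Register(app *App) {
	app.Provide("router", func() any { return "echo router" })
}

func main() {
	// Adding a gRPC server or a worker pool would mean appending
	// one more module here, with no change to existing code.
	app := NewApp(httpModule{})
	fmt.Println(app.Resolve("router"))
}
```

The design point is that growth happens by composition: a service evolves by adding modules to the constructor list rather than by editing core logic.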

Real‑world use cases include building microservices that need rapid iteration, monitoring‑first deployments, or services that must expose both internal diagnostics and external APIs. For example, a data‑processing pipeline can be wrapped in Yokai to automatically emit metrics on job throughput and expose a health endpoint for orchestrators. Similarly, an e‑commerce backend can use Yokai’s gRPC extension to expose a performant API while still benefiting from the same observability stack used by its internal services.
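The pipeline example above can be sketched as a throughput counter. The `jobThroughput` type and metric name are invented for illustration; in a Yokai-wrapped pipeline the equivalent counter would be exported through the framework's metrics endpoint rather than printed.

```go
package main

import (
	"fmt"
	"sync"
	"sync/atomic"
)

// jobThroughput holds a hypothetical counter that a wrapped
// pipeline would expose on its metrics endpoint.
type jobThroughput struct{ processed atomic.Int64 }

// process runs each job concurrently and counts completions.
func (t *jobThroughput) process(jobs []string) {
	var wg sync.WaitGroup
	for range jobs {
		wg.Add(1)
		go func() {
			defer wg.Done()
			t.processed.Add(1) // metric incremented per completed job
		}()
	}
	wg.Wait()
}

func main() {
	t := &jobThroughput{}
	t.process([]string{"a", "b", "c"})
	fmt.Println("jobs_processed_total:", t.processed.Load())
}
```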

Integrating Yokai into an AI workflow is straightforward: the MCP client can invoke Yokai’s exposed HTTP or gRPC endpoints, and the framework will automatically surface request traces and logs to OpenTelemetry collectors. This allows AI assistants to gather telemetry data in real time, providing insights into latency, error rates, and usage patterns. The built‑in dependency injection also makes it easy to inject custom AI models or services as modules, enabling hybrid applications that combine traditional backend logic with machine‑learning inference.