
Eyevinn OSC MCP Server

Open‑source cloud services for databases, storage, and VOD pipelines

Updated Aug 29, 2025

About

The Eyevinn OSC MCP Server exposes the Eyevinn Open Source Cloud API, allowing users to create SQL/NoSQL/memory databases, S3‑compatible storage buckets, and VOD transcoding pipelines via simple MCP commands.

Capabilities

  • Resources: access data sources
  • Tools: execute functions
  • Prompts: pre-built templates
  • Sampling: AI model interactions

Eyevinn Open Source Cloud MCP Server

The Eyevinn Open Source Cloud (OSC) MCP server bridges the gap between AI assistants and cloud‑based storage services that require local machine access. Traditional remote MCP endpoints expose high‑level tooling but often lack the ability to perform actions that depend on a local environment, such as uploading large media files directly from a developer’s workstation to an OSC bucket. This server solves that limitation by running locally on the same machine as the AI client, enabling secure and efficient file operations without exposing credentials to external networks.

At its core, the server offers a focused set of storage‑centric tools: creating MinIO buckets, uploading files to those buckets, and listing bucket contents. These capabilities are exposed through the MCP framework, allowing Claude or other AI assistants to invoke them as if they were native functions. For developers building media pipelines, this means an assistant can automatically spin up a storage bucket for a new project, transfer video assets from the local machine, and retrieve inventory listings—all without manual CLI commands or web console interactions.

Key features include:

  • Local execution: Commands run on the client’s machine, avoiding remote round trips and keeping sensitive API tokens off external networks.
  • Secure token handling: The server reads the OSC access token from an environment variable, ensuring credentials are never hard‑coded in configuration files.
  • Simple tool set: A narrow focus on storage operations keeps the interface lightweight and reduces cognitive load for users.
  • Dual server support: A complementary remote MCP client can be configured alongside the local server, giving developers flexibility to choose the appropriate endpoint for each task.
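The environment-variable pattern for token handling can be sketched in a few lines. The variable name `OSC_ACCESS_TOKEN` is an assumption for this example; check the server's README for the name it actually reads.

```python
import os


def load_osc_token() -> str:
    """Return the OSC personal access token from the environment.

    Failing fast with a clear message avoids silently issuing
    unauthenticated API calls later on.
    """
    token = os.environ.get("OSC_ACCESS_TOKEN", "").strip()
    if not token:
        raise RuntimeError(
            "OSC_ACCESS_TOKEN is not set; export your OSC personal "
            "access token before starting the MCP server."
        )
    return token
```

Because the token never appears in a configuration file or chat transcript, rotating it is a matter of updating one environment variable.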

Real‑world scenarios that benefit from this server are plentiful. A video editor could ask an AI assistant to “create a new bucket and upload my latest render,” receiving confirmation and a direct link to the uploaded file. A data scientist might ask the assistant to “list all files in the project bucket,” quickly obtaining a catalog of available data without leaving the chat interface. In continuous‑integration pipelines, automated tests can trigger the server to clean up buckets or verify upload integrity before deployment.

Integrating the OSC MCP server into AI workflows is straightforward: developers add a single configuration entry to their Claude Desktop settings, pointing the assistant to the local MCP executable. Once configured, every tool call that matches one of the server’s capabilities is routed through the local process, ensuring fast, secure interactions. The server’s design emphasizes minimalism and reliability, making it an attractive choice for teams that need tight control over local resources while leveraging the conversational power of AI assistants.
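A Claude Desktop entry for a local MCP server typically looks like the fragment below. The server key, launch command, and package name here are placeholders, so substitute the ones from the project's own installation instructions.

```json
{
  "mcpServers": {
    "eyevinn-osc": {
      "command": "npx",
      "args": ["-y", "@eyevinn/osc-mcp-server"],
      "env": {
        "OSC_ACCESS_TOKEN": "<your personal access token>"
      }
    }
  }
}
```

With this in place, Claude Desktop launches the server process locally and routes matching tool calls to it; the token stays in the local environment block rather than traveling to a remote endpoint.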