MCPSERV.CLUB
andybrandt

MCP Simple OpenAI Assistant

MCP Server

Manage and chat with OpenAI assistants via MCP

37 stars · 1 view
Updated 15 days ago

About

A lightweight MCP server that lets clients such as Claude Desktop create, list, update, and converse with OpenAI assistants. It offers real-time streaming responses and local thread persistence for seamless multi-session conversations.

Capabilities

Resources
Access data sources
Tools
Execute functions
Prompts
Pre-built templates
Sampling
AI model interactions

Overview

The MCP Simple OpenAI Assistant server bridges MCP clients such as Claude with the OpenAI Assistants API, turning a standard AI client into a full-featured assistant manager. It solves the common pain point of having to manually create, track, and reuse OpenAI assistant threads from within an AI workflow. Because it exposes a set of well-defined tools over the Model Context Protocol, developers can orchestrate assistant creation, maintenance, and real-time conversation without writing custom API wrappers.
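Under MCP, each of these operations is invoked as a standard `tools/call` request. A sketch of what such a request might look like on the wire follows; the tool name `create_assistant` and its argument names are illustrative assumptions, not confirmed identifiers from this server:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "create_assistant",
    "arguments": {
      "name": "Project Helper",
      "instructions": "You help plan software projects.",
      "model": "gpt-4o"
    }
  }
}
```

The client never talks to OpenAI directly; the MCP server translates calls like this into Assistants API requests.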

At its core, the server offers a toolset that mirrors every operation available in OpenAI's Assistants API: creating assistants, listing and retrieving them, updating their properties, and managing conversation threads. The standout feature is the streaming conversation tool, which forwards user messages to a chosen assistant and streams the response back to the client as it is generated. This eliminates long blocking calls, prevents timeouts in chat-like interactions, and gives users immediate visual feedback.
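The streaming pattern can be sketched with a minimal, stdlib-only example (no OpenAI SDK; the function names and chunk contents are illustrative stand-ins): a generator yields response chunks as they are produced, and the caller forwards each chunk immediately instead of blocking until the full reply exists.

```python
from typing import Iterator


def fake_assistant_reply(prompt: str) -> Iterator[str]:
    """Stand-in for an assistant run that produces text incrementally."""
    for word in ("Streaming", "keeps", "the", "client", "responsive."):
        yield word + " "


def stream_to_client(prompt: str) -> str:
    """Forward chunks as they arrive; return the assembled text."""
    parts = []
    for chunk in fake_assistant_reply(prompt):
        # In a real MCP server, each chunk would be pushed to the client
        # here, so the UI can render partial output immediately.
        parts.append(chunk)
    return "".join(parts)
```

The key property is that the consumer sees output per chunk; the assembled string is only a convenience for the caller.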

Because the OpenAI API does not provide a way to list existing conversation threads, the server implements local persistence using SQLite. Each thread’s ID, name, description, and last‑used timestamp are stored locally, enabling developers to easily locate, resume, or delete threads across sessions. This persistence layer is especially valuable for applications that require context continuity—such as project management bots or customer support assistants—where conversations must be revisited later.
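A minimal sketch of such a persistence layer is shown below, using Python's built-in `sqlite3`. The table and column names are assumptions derived from the description above (thread ID, name, description, last-used timestamp), not the server's actual schema:

```python
import sqlite3
import time


def init_db(path: str = ":memory:") -> sqlite3.Connection:
    """Create the local thread registry if it does not exist yet."""
    conn = sqlite3.connect(path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS threads (
               thread_id   TEXT PRIMARY KEY,
               name        TEXT,
               description TEXT,
               last_used   REAL
           )"""
    )
    return conn


def save_thread(conn, thread_id, name, description=""):
    """Record a thread (or refresh its last-used timestamp on reuse)."""
    conn.execute(
        "INSERT OR REPLACE INTO threads VALUES (?, ?, ?, ?)",
        (thread_id, name, description, time.time()),
    )
    conn.commit()


def list_threads(conn):
    """Return known threads, most recently used first, for easy resumption."""
    return conn.execute(
        "SELECT thread_id, name, description FROM threads "
        "ORDER BY last_used DESC"
    ).fetchall()
```

Because OpenAI itself cannot enumerate threads, this local registry is the only record of which thread IDs exist and what they were used for.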

The server’s integration with AI workflows is straightforward: a client like Claude Desktop can declare it as an MCP server, configure the OpenAI API key in its environment, and then call any of the exposed tools via the MCP command interface. Developers can chain these tools to build sophisticated assistant pipelines—for example, automatically creating a new assistant for each project, spawning a thread with a descriptive name, and then engaging in real‑time dialogue while the assistant’s responses are streamed back to the user interface.
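A typical Claude Desktop registration might look like the fragment below in `claude_desktop_config.json`. The launcher command and package name are assumptions that depend on how the server is installed; the API key placeholder must be replaced with a real value:

```json
{
  "mcpServers": {
    "openai-assistant": {
      "command": "uvx",
      "args": ["mcp-simple-openai-assistant"],
      "env": {
        "OPENAI_API_KEY": "sk-..."
      }
    }
  }
}
```

Once registered, the server's tools appear in the client alongside any other MCP tools and can be chained freely.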

Unique advantages include zero-code configuration for most use cases, a clean separation of concerns (assistant management vs. conversation), and robust streaming that keeps the UI responsive even when the assistant takes several seconds to generate a reply. For teams building AI-augmented workflows, this server provides a reliable, standards-based gateway to OpenAI's evolving assistant capabilities without the overhead of maintaining custom integration code.