gpt2099
by cablehead

MCP Server

Scriptable AI assistant in Nushell with persistent threads

About

gpt2099 is a Nushell‑based MCP client that lets you run large language models through a single, provider‑agnostic API. It stores editable conversation threads, supports document uploads, and can be extended with custom MCP servers for flexible tooling.

Capabilities

  • Resources: access data sources
  • Tools: execute functions
  • Prompts: pre-built templates
  • Sampling: AI model interactions

Overview

gpt2099 is a lightweight, scriptable MCP client built on Nushell that unifies access to multiple large-language-model providers (Anthropic, Cerebras, Cohere, Gemini, and OpenAI) through a single, consistent API. By hiding the idiosyncrasies of each provider behind a common interface, it lets developers prototype and deploy AI-powered workflows without being locked into a particular vendor. The client stores conversation threads persistently, keeping editable context that survives across sessions and can be inspected or modified directly in the terminal.
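To make the provider-agnostic idea concrete, here is a rough sketch of how such a dispatch layer can look in plain Nushell. It is not gpt2099's code: the command name, the provider table, and the single unified request shape are illustrative assumptions.

    # Illustrative sketch only; gpt2099's real commands and internals may differ.
    def call-llm [provider: string, prompt: string] {
      # Minimal provider table (assumes both API keys are set in the environment).
      # Real providers differ in auth headers and body shape, e.g. Anthropic
      # expects an x-api-key header and a max_tokens field.
      let providers = {
        openai: {url: "https://api.openai.com/v1/chat/completions", key: $env.OPENAI_API_KEY, model: "gpt-4o-mini"}
        anthropic: {url: "https://api.anthropic.com/v1/messages", key: $env.ANTHROPIC_API_KEY, model: "claude-sonnet-4-20250514"}
      }
      let cfg = ($providers | get $provider)
      let body = {model: $cfg.model, messages: [{role: "user", content: $prompt}]}
      # One request shape regardless of which provider was selected.
      http post $cfg.url ($body | to json) --content-type "application/json" --headers [Authorization $"Bearer ($cfg.key)"]
    }

    # Callers never touch provider-specific details; swapping vendors means
    # editing one record, not every call site.
    call-llm openai "Explain Nushell pipelines in one sentence"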

gpt2099 solves a common pain point for developers: the difficulty of managing context windows when working with different models. It keeps a clear, editable history of messages and tool calls, so users can review, edit, or prune the conversation before sending it to a model. This transparency removes the “black-box” history that many hosted APIs impose, giving developers fine-grained control over exactly what the model sees. It also supports document ingestion: PDFs, images, and plain text can be uploaded once and referenced in subsequent prompts, with automatic content-type detection and optional caching, which makes it easy to fold knowledge bases or reference material into a conversation.
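Because the history lives where Nushell can reach it, reviewing and pruning can be done with ordinary pipelines. A minimal sketch of that workflow, using a hypothetical thread shape rather than gpt2099's actual storage format:

    # Hypothetical thread layout, used only to illustrate editable history.
    let thread = [
      {role: "user", content: "Summarize the attached design doc"}
      {role: "assistant", content: "The doc proposes ..."}
      {role: "tool", content: "<50 KB of raw tool output>"}
    ]

    # Prune bulky tool output before the next model call; everything else stays intact.
    $thread | where role != "tool"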

Key capabilities include:

  • Provider-agnostic calls: A single command works with any configured model, and new providers can be added via the provider API.
  • Threaded conversations: Each thread is a first-class object; users can bookmark, continue, or delete threads at will.
  • Tool integration: The client itself exposes MCP endpoints for local file editing, and it can consume other MCP servers, effectively turning the terminal into a programmable AI assistant.
  • Document handling: Uploaded files are stored in a structured format, and the client can embed relevant excerpts into prompts automatically.

Real-world use cases span rapid prototyping of code-generation tools, building command-line chatbots that remember user preferences across sessions, and creating research assistants that pull in academic PDFs or internal documentation. Because all interactions happen in the Nushell environment, developers can chain commands with other shell utilities for filtering, parsing, or transforming data without leaving the terminal. This tight integration makes it an attractive choice for DevOps engineers, data scientists, and anyone who wants a fully scriptable, inspectable AI workflow that remains independent of cloud-provider lock-in.
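As an illustration of that chaining, assume a top-level `gpt` command that reads piped input and takes a prompt argument; the command name and behavior here are assumptions for the sketch, not documented gpt2099 usage.

    # Assumed command name and behavior; shown only to illustrate shell composition.
    open release-notes.md | gpt "summarize this as five bullet points" | save summary.md

    # Structured Nushell values can be filtered or transformed before and after the call.
    ls logs/*.log | get name | each {|f| open $f | gpt "list any error messages" }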