MCPSERV.CLUB
cablehead

gpt2099

MCP Server

Scriptable AI in Nushell with persistent, editable conversations

Active (80) · 16 stars · 1 view · Updated 14 days ago

About

gpt2099 is a Nushell-based MCP client that connects to multiple LLM providers through a unified interface, stores and edits conversation threads locally via cross.stream, and supports document upload and tool integration for a flexible terminal AI workflow.

Capabilities

  • Resources – access data sources
  • Tools – execute functions
  • Prompts – pre-built templates
  • Sampling – AI model interactions

gpt2099 in action

gpt2099.nu – A Scriptable MCP Server for Nushell

The gpt2099 server addresses a common pain point in AI‑assisted development: the lack of a lightweight, fully scriptable bridge between local shell workflows and multiple large‑language‑model (LLM) providers. Traditional LLM integrations often lock developers into a single vendor or require cumbersome SDKs, making it difficult to swap models or incorporate custom tooling. gpt2099 solves this by exposing a unified MCP interface, so a single workflow can talk to any supported provider (OpenAI, Anthropic, Cohere, Gemini, etc.) through one consistent API. This abstraction lets developers focus on their workflow rather than vendor quirks.

At its core, gpt2099 runs as a cross.stream‑powered MCP server. It stores conversation threads in editable, persistent context windows that survive restarts and can be inspected or modified directly from the terminal. This feature turns the shell into a living knowledge base: you can review past exchanges, prune irrelevant context, or inject new information on the fly—eliminating opaque “black‑box” histories that plague many LLM services. The server also natively supports document ingestion, automatically detecting file types and optionally caching content so that PDFs, images, or plain text can be referenced in a prompt without manual preprocessing.
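As a rough sketch of what that terminal-level inspection could look like (the `.cat` command is assumed from cross.stream's Nushell integration; the `gpt.turn` topic and the `gpt thread` subcommand are illustrative assumptions, not gpt2099's documented API):

```nu
# Browse recent conversation frames in the cross.stream event store
# (.cat and the "gpt.turn" topic name are assumptions for illustration)
.cat | where topic == "gpt.turn" | last 5

# Hypothetical: drop a stale exchange from the thread before the next prompt
gpt thread edit --remove $stale_frame_id
```

Because the history lives in an ordinary event store rather than behind a provider's API, pruning or injecting context becomes just another pipeline step.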

Key capabilities include:

  • Provider agnosticism – Add or swap LLM providers through a simple configuration API, keeping your codebase independent of any single vendor.
  • Tool extensibility – Connect additional MCP servers to enrich functionality, mirroring Claude Code’s local‑file editing while retaining full provider flexibility.
  • Event‑driven processing – Built on cross.stream, gpt2099 leverages an event pipeline that lets you compose complex conversational flows (e.g., chaining prompts, logging, or conditional logic) entirely within Nushell scripts.
  • Rich document handling – Upload and reference external files directly in the conversation, with optional caching to speed up repeated accesses.
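To make the event‑driven idea concrete, a chained flow might be composed directly in Nushell (here `gpt` stands in for gpt2099's prompt entry point; its actual name and flags may differ):

```nu
# Hypothetical chain: summarize each changed file, then request one combined review
git diff --name-only
| lines
| each { |file| open $file | gpt $"Summarize the changes in ($file)" }
| str join "\n\n"
| gpt "Write a short code review based on these per-file summaries"
```

Each stage is an ordinary Nushell pipeline element, so logging, filtering, or conditional branching slots in with `where`, `tee`, or `if` like any other shell data.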

Real‑world scenarios where gpt2099 shines include:

  • Code review and generation – Pull a repository, generate structured context, and ask the model to refactor or document code while keeping the conversation thread intact.
  • Automated documentation – Feed PDF specs or markdown files into the context and let the model produce summaries or Q&A pairs that can be reused across projects.
  • Rapid prototyping – Swap between a lightweight “milli” model for quick drafts and a larger model for final polishing without changing the underlying workflow.
  • Custom tool integration – Attach external MCP servers (e.g., a database query engine or a custom CLI) to the conversation, enabling the model to invoke real‑world actions through the same prompt interface.
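The rapid‑prototyping scenario above might look like this in practice (the `--model` flag and the "max" model name are assumptions for illustration; only "milli" is named by the project):

```nu
# Draft quickly with the lightweight "milli" tier, then polish with a larger model
# (--model and "max" are hypothetical; substitute the tool's real flags)
let draft = (open spec.md | gpt --model milli "Draft release notes from this spec")
$draft | gpt --model max "Tighten the wording and fix any inconsistencies"
```

The surrounding pipeline stays identical; only the model selection changes, which is the point of keeping the workflow provider‑agnostic.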

By combining provider neutrality, persistent context, and native document support, gpt2099 gives developers a powerful, terminal‑centric AI companion that scales from quick experimentation to production‑grade automation.