MCPSERV.CLUB
nodetec

Nostr Code Snippet MCP

MCP Server

Generate and share code snippets via Nostr in seconds

Stale (50) · 2 stars · 3 views · Updated Sep 1, 2025

About

The server lets Claude create, store, and retrieve short code snippets directly on Nostr using a private key and specified relays. Ideal for quick sharing of reusable code blocks within the Nostr ecosystem.

Capabilities

Resources
Access data sources
Tools
Execute functions
Prompts
Pre-built templates
Sampling
AI model interactions

Nostr Code Snippet MCP in Action

The Nostr Code Snippet MCP is a lightweight server that bridges Claude (or any MCP‑compatible AI assistant) with the Nostr decentralized network, enabling developers to retrieve and post code snippets directly through the AI interface. By exposing a set of tools that query Nostr relays for events tagged as code, the server turns the AI into a first‑class citizen of the Nostr ecosystem. This solves a common pain point for developers who want to keep code examples, libraries, or documentation snippets on a censorship‑resistant platform while still being able to fetch them with natural language queries.

When a user asks the assistant for “a JavaScript example that sorts an array,” the MCP server forwards a request to one or more Nostr relays, searching for events that match the specified tags and content type. The assistant then receives a concise snippet, which it can present or further manipulate. Conversely, developers can use the server to publish new code snippets by sending an event with the appropriate tags. This bidirectional flow means that the AI can act as both consumer and curator of code in a decentralized, immutable store—an invaluable capability for open‑source projects, collaborative learning, and distributed documentation.
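The fetch side of this flow follows the standard Nostr subscription model: the server sends a `REQ` message with a filter, and relays stream back matching events. The sketch below is a minimal illustration, assuming the NIP-C0 code-snippet event kind (1337) and an `l` tag for the language; the exact kinds and tags this server queries may differ.

```python
import json

def build_snippet_request(sub_id: str, language: str, limit: int = 5) -> str:
    """Build a Nostr REQ message asking relays for code-snippet events.

    Assumes NIP-C0 semantics: kind 1337 for code snippets, with an
    "l" tag carrying the language identifier.
    """
    filt = {
        "kinds": [1337],   # code-snippet kind per NIP-C0 (assumption)
        "#l": [language],  # tag filter: only snippets in this language
        "limit": limit,    # cap how many events each relay returns
    }
    # A REQ message is a JSON array: ["REQ", <subscription id>, <filter>, ...]
    return json.dumps(["REQ", sub_id, filt])

# Example: ask for up to five JavaScript snippets.
msg = build_snippet_request("snip-1", "javascript")
print(msg)
```

The same filter object, sent to each configured relay over a WebSocket, is what lets the server aggregate results from multiple backends.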

Key capabilities of the server include:

  • Multi‑relay support: The server accepts a comma‑separated list of relay URLs, allowing it to connect to multiple Nostr backends and aggregate results.
  • Secure authentication: A single NSEC key (the Nostr private key) signs all outgoing events, so only the key holder can publish snippets under that identity, while anyone on the network can verify authorship.
  • Tag‑based filtering: By specifying tags (such as language identifiers), the assistant can narrow searches to relevant snippets.
  • Tool abstraction: The MCP interface presents these operations as tools, so developers can invoke them in prompts without needing to understand Nostr’s event format.
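On the publishing side, each snippet is wrapped in a standard NIP-01 event whose `id` is the SHA-256 of a canonical serialization, and whose signature comes from the NSEC key. A minimal sketch of the unsigned event (the kind and tag names follow NIP-C0 and are assumptions; signing is omitted):

```python
import hashlib
import json
import time

def build_snippet_event(pubkey_hex: str, code: str, language: str) -> dict:
    """Assemble an unsigned Nostr event carrying a code snippet.

    Per NIP-01, the event id is the SHA-256 hash of the JSON array
    [0, pubkey, created_at, kind, tags, content]. Signing with the
    NSEC key (not shown) would add the "sig" field.
    """
    created_at = int(time.time())
    kind = 1337                 # code-snippet kind per NIP-C0 (assumption)
    tags = [["l", language]]    # illustrative language tag
    serialized = json.dumps(
        [0, pubkey_hex, created_at, kind, tags, code],
        separators=(",", ":"),  # canonical form: no extra whitespace
        ensure_ascii=False,
    )
    event_id = hashlib.sha256(serialized.encode()).hexdigest()
    return {
        "id": event_id,
        "pubkey": pubkey_hex,
        "created_at": created_at,
        "kind": kind,
        "tags": tags,
        "content": code,
    }
```

Because the id is a hash of the content, any tampering with a published snippet is immediately detectable, which is what underpins the "tamper‑proof" sharing described below.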

Typical use cases include:

  • Rapid prototyping: Quickly fetch reusable code blocks while coding, reducing context switching.
  • Decentralized documentation: Host library examples on Nostr so they survive platform shutdowns or censorship.
  • Collaborative learning: Share small code snippets in a peer‑reviewed, tamper‑proof way.
  • Research reproducibility: Store experimental code with immutable hashes, enabling reproducible results.

Integrating the server into an AI workflow is straightforward: add its configuration to Claude's MCP settings, and the assistant will automatically expose the new tools. Developers can then craft prompts that explicitly invoke these tools, and the assistant will return a matching snippet. Because all data flows through MCP, the integration remains consistent with other AI tools and does not require custom SDKs.
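Such a configuration might look like the following. This is a hypothetical sketch: the server name, launch command, and environment variable names (`NSEC`, `RELAYS`) are assumptions, so consult the project's README for the actual values.

```json
{
  "mcpServers": {
    "nostr-code-snippet": {
      "command": "npx",
      "args": ["-y", "nostr-code-snippet-mcp"],
      "env": {
        "NSEC": "nsec1...",
        "RELAYS": "wss://relay.damus.io,wss://nos.lol"
      }
    }
  }
}
```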

What sets this MCP apart is its focus on a decentralized code repository. While many AI assistants rely on centralized APIs or local caches, the Nostr Code Snippet MCP leverages a censorship‑resistant network that guarantees persistence and authenticity. For teams building resilient, community‑driven code libraries, this server offers a seamless bridge between human intent and distributed data—making the AI not just a helper but an active participant in the open‑source ecosystem.