belokolek

n8n AI Agent for DVM MCP

MCP Server

Discover and use MCP tools over Nostr with n8n

Updated Sep 1, 2025

About

An n8n-based AI agent that queries the Nostr network for Data Vending Machine MCP servers, sends requests, waits for responses, and replies to users—enabling LLMs to access tools without local installation.

Capabilities

  • Resources – access data sources
  • Tools – execute functions
  • Prompts – pre-built templates
  • Sampling – AI model interactions

DVMCP Agent v1.0 Workflow

The n8n AI Agent for DVM MCP is an end‑to‑end solution that lets a language model discover, invoke, and consume Model Context Protocol (MCP) tools without having those tools pre‑installed locally. By turning every MCP server into a Data Vending Machine (DVM) on the Nostr network, the agent can query a global catalogue of available tools, send requests to any remote server that advertises them, and incorporate the returned data into its responses. This removes the traditional dependency on a tightly coupled local infrastructure, enabling truly distributed AI workflows that can scale across multiple nodes and services.
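The discovery step boils down to subscribing to Nostr relays with a filter that matches tool announcements. The sketch below shows what such a subscription message could look like; the event kind (31990) and the `#t` capability tag are illustrative assumptions, not the actual DVMCP wire format, so consult the DVMCP specification for the real values.

```python
import json

def build_discovery_filter(capability: str, limit: int = 20) -> str:
    """Build a Nostr REQ message asking relays for MCP tool announcements.

    The kind number and tag name below are placeholders for illustration.
    """
    sub_id = "dvmcp-discovery"
    filt = {
        "kinds": [31990],       # assumed announcement kind
        "#t": [capability],     # assumed capability tag
        "limit": limit,
    }
    # Nostr subscriptions are JSON arrays: ["REQ", <sub_id>, <filter>]
    return json.dumps(["REQ", sub_id, filt])

message = build_discovery_filter("weather")
```

Each relay that holds matching announcement events streams them back on the same subscription, giving the agent its catalogue of candidate servers.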

At its core, the agent is a low‑code workflow built in n8n. When an LLM receives a prompt that requires external data or computation, the agent automatically searches the Nostr network for MCP servers advertising the needed capability. It then posts a structured request to those servers, waits asynchronously for their replies, and merges the results back into the assistant’s answer. Because all communication happens over Nostr, the agent can tap into any publicly or privately available MCP tool—whether it’s a weather API, a database query engine, or a custom business logic service—without modifying the LLM itself.
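Because requests and responses are separate Nostr events, the workflow has to correlate them itself: a response event references the originating request through an `e` tag carrying the request's event id. A minimal sketch of that correlation, assuming a hypothetical request kind (25910) and a simplified id computation modelled on Nostr's SHA-256 serialization:

```python
import hashlib
import json
import time

def make_request_event(pubkey: str, tool: str, params: dict) -> dict:
    """Assemble a hypothetical MCP tool-request event.

    The kind number is an assumption; the id is the SHA-256 of a
    canonical JSON serialization, as in Nostr's NIP-01 scheme.
    """
    created_at = int(time.time())
    content = json.dumps({"tool": tool, "params": params})
    serial = json.dumps([0, pubkey, created_at, 25910, [], content],
                        separators=(",", ":"))
    event_id = hashlib.sha256(serial.encode()).hexdigest()
    return {"id": event_id, "pubkey": pubkey, "created_at": created_at,
            "kind": 25910, "tags": [], "content": content}

def is_reply_to(request_id: str, event: dict) -> bool:
    """True if the event references the request via an 'e' tag."""
    return any(t[0] == "e" and t[1] == request_id
               for t in event.get("tags", []))
```

With this pairing in place, the "wait for response" step simply filters incoming events with `is_reply_to` until a match arrives.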

Key capabilities include:

  • Dynamic tool discovery – The agent queries the Nostr network for MCP servers, eliminating hard‑coded tool lists.
  • Asynchronous execution – Requests are posted and the agent waits for responses, allowing long‑running or rate‑limited services to be used seamlessly.
  • Modular sub‑workflows – Separate n8n nodes handle finding servers, posting queries, waiting, and reading responses, making the system highly extensible.
  • Secure authentication – Credentials for OpenAI, SerpAPI, Nostr private keys, and database connections are managed centrally within the workflow.
  • Low‑code integration – By leveraging n8n’s visual interface and community nodes for Nostr, developers can assemble or extend the agent without writing code.
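The asynchronous-execution and modular sub-workflow bullets above can be sketched as a small polling loop: post the request, then repeatedly check incoming events until one references the request id or a timeout expires. `fetch_events` here is a hypothetical stand-in for however the workflow reads relay messages, not a real n8n or Nostr API.

```python
import time

def wait_for_response(request_id, fetch_events, timeout=30.0, interval=0.5):
    """Poll for a response event tagged with the request's id.

    fetch_events is a caller-supplied callable returning the events
    received so far. Returns the matching event, or None on timeout,
    which the caller can treat as a failed or rate-limited service.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        for event in fetch_events():
            tags = event.get("tags", [])
            if any(t[0] == "e" and t[1] == request_id for t in tags):
                return event
        time.sleep(interval)
    return None
```

Keeping this wait step as its own unit mirrors the workflow's separation of finding servers, posting queries, waiting, and reading responses, so any stage can be swapped out without touching the others.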

Typical use cases include:

  • Multi‑service data aggregation – An assistant can pull real‑time market data, weather forecasts, and internal business metrics from distinct MCP servers to answer a complex query.
  • Remote execution of custom logic – A company can expose proprietary algorithms as MCP tools on a private Nostr relay, letting the assistant invoke them securely from anywhere.
  • Scalable AI pipelines – Multiple agents can share a pool of MCP tools across different regions, balancing load and reducing latency.
  • Rapid prototyping – Developers can expose new tools as MCP servers, immediately making them available to any agent that discovers the Nostr relay.

By bridging n8n’s workflow engine with the decentralized DVM ecosystem, this MCP server provides a powerful, plug‑and‑play infrastructure that empowers AI assistants to become truly autonomous agents—capable of finding and using any external capability on demand, regardless of where that capability resides.