About
An n8n-based AI agent that automatically discovers Data Vending Machine MCP servers on the Nostr network, queries them for tools, and integrates responses into conversational workflows.
Capabilities

The n8n AI Agent for DVM MCP is a ready‑made workflow that turns an LLM into a network‑aware tool user. It solves the long‑standing problem of how an AI assistant can discover, request, and consume tools that are not locally installed. By leveraging the Nostr network’s Data Vending Machine (DVM) protocol, the agent can search for any MCP‑compliant tool published on the network, send a query to it, wait asynchronously for a response, and then relay that answer back to the user. This capability removes the need for each LLM host to bundle every possible tool, enabling a truly modular and scalable AI ecosystem.
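The request half of this flow can be pictured as a plain Nostr-style event addressed to a discovered server. The event kind and tag names below are illustrative placeholders, not values taken from the DVMCP specification, and the builder itself is a hypothetical sketch rather than part of the workflow:

```typescript
// Minimal shape of an unsigned Nostr event used for a tool request.
// The kind number and tag layout are illustrative placeholders; consult
// the DVMCP specification for the actual values.
interface UnsignedEvent {
  kind: number;
  created_at: number;
  tags: string[][];
  content: string;
}

const TOOL_REQUEST_KIND = 25910; // assumed placeholder kind

// Build a request event addressed to a discovered DVMCP server.
function buildToolRequest(
  serverPubkey: string,
  toolName: string,
  params: Record<string, unknown>,
): UnsignedEvent {
  return {
    kind: TOOL_REQUEST_KIND,
    created_at: Math.floor(Date.now() / 1000),
    tags: [
      ["p", serverPubkey], // route the request to the server's pubkey
      ["tool", toolName],  // which tool to invoke (illustrative tag name)
    ],
    content: JSON.stringify(params),
  };
}
```

Before publishing to relays, the agent would sign an event like this with the Nostr private key configured in its credentials.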
At its core, the agent is built on n8n, a low‑code automation platform that makes it straightforward to compose LangChain‑style tool agents. The workflow chains four sub‑workflows: one that discovers DVMCP servers, one that posts a query to a chosen server, a waiting step that handles the asynchronous nature of Nostr messages, and a final reader that extracts the tool’s output. Together they form a seamless request‑response cycle that can be triggered by any user prompt requiring external data or computation.
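The discover/post/wait/read cycle can be sketched in code. The `Transport` interface below stands in for the n8n sub‑workflows; none of these names come from the actual workflow, and the polling loop is one plausible way to model the asynchronous wait step:

```typescript
// Hypothetical stand-in for the n8n sub-workflows.
interface Transport {
  discover(query: string): Promise<string[]>;             // server pubkeys
  post(server: string, payload: string): Promise<string>; // request id
  read(requestId: string): Promise<string | null>;        // response, or null if not yet arrived
}

const sleep = (ms: number) => new Promise((r) => setTimeout(r, ms));

// Discover a server, post the query, then poll until a response arrives
// or the timeout elapses -- mirroring the discover/post/wait/read steps.
async function runToolQuery(
  t: Transport,
  query: string,
  payload: string,
  timeoutMs = 5000,
  pollMs = 100,
): Promise<string> {
  const [server] = await t.discover(query);
  if (!server) throw new Error("no DVMCP server found");
  const id = await t.post(server, payload);
  const deadline = Date.now() + timeoutMs;
  while (Date.now() < deadline) {
    const res = await t.read(id); // read step: non-blocking check
    if (res !== null) return res; // relay the answer back to the user
    await sleep(pollMs);          // wait step: yields instead of blocking
  }
  throw new Error("timed out waiting for DVM response");
}
```

Because every step is `await`ed, the loop never blocks the host process while a Nostr response is in flight, which is the property the workflow's wait and read nodes provide.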
Key features include:
- Network‑wide tool discovery – The agent queries Nostr relays for available MCP tools, turning the public DVM network into a searchable marketplace.
- Asynchronous handling – Because Nostr messages are delivered asynchronously, the agent’s wait and read steps ensure reliable retrieval of responses without blocking the LLM.
- Extensibility – Developers can add new tool sub‑workflows or replace existing ones without touching the main agent logic, thanks to n8n’s modular node architecture.
- Secure credentials management – The workflow integrates OpenAI, SerpAPI, Nostr private keys, and database credentials in a single place, simplifying secure deployment.
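Network‑wide discovery, the first feature above, amounts to subscribing to relays with a filter over announcement events. NIP‑89 handler announcements use kind 31990; whether DVMCP reuses that kind, and the `#t` tag shown here, are assumptions for illustration:

```typescript
// Illustrative relay subscription filter for discovering tool
// announcements. Kind 31990 is the NIP-89 handler-information kind;
// treating it as the DVMCP announcement kind is an assumption.
const ANNOUNCEMENT_KIND = 31990;

function discoveryFilter(capability: string) {
  return {
    kinds: [ANNOUNCEMENT_KIND],
    "#t": [capability], // topical tag narrowing results to one capability (assumed)
    limit: 20,          // cap results returned by each relay
  };
}
```

An agent would send this filter in a `REQ` message to each configured relay and collect the matching announcements into its candidate server list.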
In real‑world scenarios this agent shines wherever an AI assistant needs to pull in external data on demand: fetching live weather, querying a financial API, or invoking custom business logic exposed as an MCP tool. Because the tools are discovered over Nostr, any organization can publish its own DVMCP server and have all agents automatically see it, creating a distributed marketplace of AI‑usable services.
For developers familiar with MCP concepts, the n8n AI Agent for DVM MCP demonstrates a powerful pattern: decouple tool discovery from the LLM, let the network surface available capabilities, and orchestrate interactions through a low‑code workflow. This approach not only reduces duplication of tool implementations but also opens the door to collaborative, cross‑organization AI ecosystems where tools can be shared and reused at scale.
Related Servers
MarkItDown MCP Server
Convert documents to Markdown for LLMs quickly and accurately
Context7 MCP
Real‑time, version‑specific code docs for LLMs
Playwright MCP
Browser automation via structured accessibility trees
BlenderMCP
Claude AI meets Blender for instant 3D creation
Pydantic AI
Build GenAI agents with Pydantic validation and observability
Chrome DevTools MCP
AI-powered Chrome automation and debugging
Explore More Servers
Agentic Developer MCP
Codex CLI wrapped as an MCP server for seamless AI development
MediaWiki MCP Adapter
Programmatic access to MediaWiki via MCP
Glue MCP Server
MCP server for AWS Glue Data Catalog
MLX Whisper MCP Server
Apple Silicon Whisper transcription on demand
MCP for Beginners
Learn Model Context Protocol with hands‑on examples
Forgejo MCP Server
Integrate Forgejo with Model Context Protocol chat interfaces