Horizon-Digital-Engineering

FPE Demo MCP

MCP Server

Secure Format‑Preserving Encryption for LLMs

Active (74) · 0 stars · 2 views · Updated Sep 25, 2025

About

A lightweight MCP server that demonstrates authentication and FF3 format‑preserving encryption over digit strings, offering both stdio and HTTP transports for easy integration with LLM tools.

Capabilities

- Resources: Access data sources
- Tools: Execute functions
- Prompts: Pre-built templates
- Sampling: AI model interactions

Overview

The FPE Demo MCP is a lightweight, production‑ready server that demonstrates how to expose format‑preserving encryption (FPE) via the Model Context Protocol (MCP). By combining MCP’s JSON‑RPC transport with a simple FF3 FPE implementation, it allows large language models (LLMs) such as Claude or ChatGPT to securely encrypt and decrypt sensitive numeric data—like Social Security numbers or credit card digits—while preserving the original format. This is especially valuable for developers building AI‑powered applications that must comply with data privacy regulations or internal security policies, yet still need to manipulate user‑supplied identifiers.
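For a concrete picture of that transport, the snippet below shows roughly what a tool invocation looks like on the wire. MCP uses JSON-RPC 2.0 with a tools/call method; the tool name and argument shown here are hypothetical placeholders, not the demo's documented schema.

```python
# Illustrative only: an MCP tools/call request as a JSON-RPC 2.0 message.
# "fpe_encrypt" and the "digits" argument are assumed names for this sketch.
import json

request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "fpe_encrypt",                    # hypothetical tool name
        "arguments": {"digits": "4111111111111111"},
    },
}
print(json.dumps(request, indent=2))
```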

At its core, the server offers two tools: one for encryption and one for decryption. The encryption tool accepts a digit‑only string, applies the FF3 algorithm in radix‑10 mode, and returns a prefixed ciphertext that is unmistakably marked as encrypted. The decryption tool reverses the process, taking a prefixed payload and restoring the original plaintext. Because the ciphertext keeps the same length and character set, downstream systems can continue to handle it as if it were unencrypted, eliminating the need for additional parsing logic.
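A minimal sketch of how such a pair of tools could be wired up, assuming the Python MCP SDK's FastMCP helper and the mysto ff3 package; the tool names, key material, and the ENC: prefix are illustrative stand-ins rather than the demo's actual identifiers.

```python
from ff3 import FF3Cipher
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("fpe-demo")

# FF3 in radix-10: hex key and tweak; digit strings stay digit strings of the same length.
cipher = FF3Cipher("2DE79D232DF5585D68CE47882AE256D6", "CBD09280979564", radix=10)

ENC_PREFIX = "ENC:"  # hypothetical marker so callers can tell ciphertext from plaintext

@mcp.tool()
def fpe_encrypt(digits: str) -> str:
    """Encrypt a digit-only string, preserving its length and character set."""
    if not digits.isdigit():
        raise ValueError("input must contain digits only")
    return ENC_PREFIX + cipher.encrypt(digits)

@mcp.tool()
def fpe_decrypt(payload: str) -> str:
    """Decrypt a previously encrypted payload back to the original digits."""
    if not payload.startswith(ENC_PREFIX):
        raise ValueError("payload is not marked as encrypted")
    return cipher.decrypt(payload[len(ENC_PREFIX):])

if __name__ == "__main__":
    mcp.run()  # stdio transport by default
```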

Security is a first‑class concern. The server supports several authentication modes, ranging from permissive local testing to locked‑down production settings. In test mode, either a shared secret or a JWT presented in the request header is accepted; production mode enforces JWT‑only authentication, ensuring that only authenticated clients can invoke encryption operations. This layered approach lets developers experiment locally with minimal friction while still enforcing strict access controls in staging or production environments.
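A hedged sketch of that layered check, assuming bearer tokens and HS256‑signed JWTs verified with PyJWT; the mode names and environment variables are invented for illustration and are not the demo's documented configuration.

```python
import os
import jwt  # PyJWT

AUTH_MODE = os.getenv("AUTH_MODE", "test")      # e.g. "test" or "production" (assumed names)
SHARED_SECRET = os.getenv("SHARED_SECRET", "")
JWT_SECRET = os.getenv("JWT_SECRET", "")

def is_authorized(bearer_token: str) -> bool:
    """Accept a shared secret or a valid JWT in test mode; require a valid JWT in production."""
    if AUTH_MODE == "test" and SHARED_SECRET and bearer_token == SHARED_SECRET:
        return True
    try:
        jwt.decode(bearer_token, JWT_SECRET, algorithms=["HS256"])
        return True
    except jwt.InvalidTokenError:
        return False
```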

The MCP server can be run over stdio for local, desktop‑based LLM clients or as an HTTP service for web‑connected assistants. The HTTP variant follows MCP's Streamable HTTP transport, allowing browsers and cloud LLM connectors to establish persistent sessions, list available tools, and invoke them in real time. With optional CORS support, the server can be integrated into browser playgrounds or custom front‑ends without additional configuration.
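One way the HTTP variant could be mounted, assuming the Python SDK's FastMCP exposes a streamable_http_app() ASGI application (exact helper names may differ) and using Starlette's CORS middleware with uvicorn:

```python
import uvicorn
from mcp.server.fastmcp import FastMCP
from starlette.middleware.cors import CORSMiddleware

mcp = FastMCP("fpe-demo")  # same server instance that registers the FPE tools

# Wrap the Streamable HTTP ASGI app with permissive CORS for browser playgrounds;
# tighten allow_origins for anything beyond local experimentation.
app = CORSMiddleware(
    mcp.streamable_http_app(),  # assumed helper returning the Streamable HTTP app
    allow_origins=["*"],
    allow_methods=["*"],
    allow_headers=["*"],
)

if __name__ == "__main__":
    uvicorn.run(app, host="127.0.0.1", port=8000)
```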

Real‑world use cases abound: a fintech app can encrypt customer account numbers before sending them to an LLM for natural‑language queries; a healthcare platform can mask patient identifiers while still enabling data analysis by AI models; or a compliance team can audit encryption usage through the server’s clear log prefixes. By abstracting FPE behind a simple, well‑documented MCP interface, the FPE Demo MCP empowers developers to prototype secure AI workflows quickly and confidently.