MCPSERV.CLUB
r3-yamauchi

Kintone OAuth MCP Server on Cloudflare Workers

MCP Server

Secure, serverless kintone access via OAuth for AI tools


About

A Cloudflare Workers‑based Model Context Protocol server that authenticates to kintone using OAuth, enabling AI applications like Claude or ChatGPT to read and write records without storing secrets locally.

Capabilities

  • Resources: access data sources
  • Tools: execute functions
  • Prompts: pre‑built templates
  • Sampling: AI model interactions


The Kintone OAuth MCP Server CFW is a Cloudflare Workers‑based implementation of the Model Context Protocol (MCP) that bridges AI assistants such as Claude or ChatGPT with a kintone instance using OAuth authentication. By exposing an MCP endpoint that can be called from any AI client, the server eliminates the need to host a local kintone connector or store sensitive API keys on client machines. Instead, users grant permission through the standard OAuth flow, and the MCP server securely stores the resulting tokens in Cloudflare’s KV store. This design keeps credentials out of the client side while allowing a single, centrally deployed server to serve all users within a given cybozu.com domain.
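The flow described above, exchanging an authorization code and persisting the resulting token in Workers KV, can be sketched as a Worker route. This is a minimal illustration under stated assumptions, not the project's actual code: the `TOKENS` binding name, the KV key scheme, and the endpoint URLs are all hypothetical.

```typescript
// Minimal sketch of the OAuth callback leg of the flow.
// Assumptions: a KV binding named TOKENS, an example cybozu.com domain,
// and a token endpoint URL shaped like standard OAuth 2.0.

interface TokenResponse {
  access_token: string;
  refresh_token?: string;
  expires_in: number;
}

// Local stand-in for the Workers KV binding type, to keep this self-contained.
interface KVNamespace {
  put(key: string, value: string, opts?: { expirationTtl?: number }): Promise<void>;
}

// Build the KV key under which a user's token is stored (key scheme is an assumption).
export function tokenKey(domain: string, userId: string): string {
  return `token:${domain}:${userId}`;
}

export default {
  async fetch(request: Request, env: { TOKENS: KVNamespace }): Promise<Response> {
    const url = new URL(request.url);
    if (url.pathname === "/callback") {
      const code = url.searchParams.get("code");
      if (!code) return new Response("missing code", { status: 400 });

      // Exchange the authorization code for tokens (endpoint URL is illustrative).
      const res = await fetch("https://example.cybozu.com/oauth2/token", {
        method: "POST",
        headers: { "Content-Type": "application/x-www-form-urlencoded" },
        body: new URLSearchParams({
          grant_type: "authorization_code",
          code,
          redirect_uri: "https://worker.example.com/callback",
        }),
      });
      const token = (await res.json()) as TokenResponse;

      // Persist the token in KV, expiring it alongside the access token.
      await env.TOKENS.put(
        tokenKey("example.cybozu.com", "user-1"),
        JSON.stringify(token),
        { expirationTtl: token.expires_in },
      );
      return new Response("authorized");
    }
    return new Response("not found", { status: 404 });
  },
};
```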

The core value for developers lies in the “one‑click, no‑local‑setup” experience. After deploying the Worker once, every user of the same kintone domain can add the MCP server as an integration in their Claude or ChatGPT web interface. The server handles all kintone API interactions—record CRUD, file operations, and app‑setting management—through the MCP protocol’s resource, tool, and prompt abstractions. This means AI agents can request data or perform actions in kintone without exposing credentials, and developers can focus on building higher‑level prompts rather than plumbing authentication.
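As a sketch of what a record‑read tool might do under the hood, the helpers below call kintone's documented `/k/v1/records.json` record‑retrieval endpoint with a Bearer token. The function names are illustrative assumptions, not identifiers from the project.

```typescript
// Build the record-retrieval URL for a kintone app.
// The /k/v1/records.json endpoint is kintone's documented REST API;
// the helper names here are assumptions for illustration.
export function recordsUrl(subdomain: string, app: number, query = ""): string {
  const u = new URL(`https://${subdomain}.cybozu.com/k/v1/records.json`);
  u.searchParams.set("app", String(app));
  if (query) u.searchParams.set("query", query);
  return u.toString();
}

// Fetch records using an OAuth access token retrieved from KV.
export async function getRecords(
  subdomain: string,
  app: number,
  accessToken: string,
): Promise<unknown> {
  const res = await fetch(recordsUrl(subdomain, app), {
    headers: { Authorization: `Bearer ${accessToken}` },
  });
  if (!res.ok) throw new Error(`kintone API error: ${res.status}`);
  return res.json(); // shape: { records: [...], totalCount: ... }
}
```

An MCP tool handler would wrap a call like this and return the records as the tool result, so the AI client never sees the token itself.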

Key capabilities include:

  • OAuth 2.0 integration with kintone, supporting scopes for record read/write, app settings, and file operations.
  • Cloudflare KV persistence of access tokens, ensuring secure storage and easy revocation if needed.
  • MCP resource exposure for kintone APIs, allowing AI agents to discover available actions via the MCP metadata endpoint.
  • SSE (Server‑Sent Events) support for real‑time updates, useful in interactive workflows where AI agents need to react to kintone events.
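The SSE transport in the last bullet can be illustrated with a short Worker‑style sketch. The frame format follows the standard `text/event-stream` convention; the helper names are assumptions, not the project's code.

```typescript
// Format one Server-Sent Events frame: an event name plus a JSON payload,
// terminated by a blank line per the text/event-stream convention.
export function sseFrame(event: string, data: unknown): string {
  return `event: ${event}\ndata: ${JSON.stringify(data)}\n\n`;
}

// Stream a sequence of pre-built frames as an SSE response from a Worker.
export function sseResponse(frames: string[]): Response {
  const body = new ReadableStream<Uint8Array>({
    start(controller) {
      const enc = new TextEncoder();
      for (const f of frames) controller.enqueue(enc.encode(f));
      controller.close();
    },
  });
  return new Response(body, {
    headers: {
      "Content-Type": "text/event-stream",
      "Cache-Control": "no-cache",
    },
  });
}
```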

Typical use cases are plentiful. A product manager can ask an AI assistant to pull the latest sales records from a kintone app, update status fields, or attach supporting documents—all through natural language prompts. A support team can automate ticket creation in kintone from chat logs, while a developer can prototype data‑driven prompts that query or mutate kintone data without writing custom code. The server’s ability to share across an entire domain makes it ideal for organizations that want a single, auditable integration point rather than per‑user credentials.

Integrating the MCP server into an AI workflow is straightforward: configure the deployed Worker's integration URL in the AI client, grant OAuth consent once, and then use MCP tool calls or resource references within prompts. The server renews tokens automatically, so developers can rely on a stable connection to kintone throughout an AI session. Its distinctive advantage is Cloudflare Workers' edge deployment, which provides low‑latency responses and built‑in scalability for teams growing from a handful of users to thousands.
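The automatic token renewal mentioned above can be sketched as a standard OAuth 2.0 refresh‑token exchange. The parameter names follow the OAuth 2.0 spec; the kintone token endpoint URL shown here is an assumption for illustration.

```typescript
// Build the form body for a standard OAuth 2.0 refresh_token grant.
export function refreshBody(
  refreshToken: string,
  clientId: string,
  clientSecret: string,
): URLSearchParams {
  return new URLSearchParams({
    grant_type: "refresh_token",
    refresh_token: refreshToken,
    client_id: clientId,
    client_secret: clientSecret,
  });
}

// Exchange a refresh token for a new access token before expiry.
// The /oauth2/token path on the cybozu.com domain is an assumption.
export async function refreshAccessToken(
  subdomain: string,
  refreshToken: string,
  clientId: string,
  clientSecret: string,
): Promise<unknown> {
  const res = await fetch(`https://${subdomain}.cybozu.com/oauth2/token`, {
    method: "POST",
    headers: { "Content-Type": "application/x-www-form-urlencoded" },
    body: refreshBody(refreshToken, clientId, clientSecret),
  });
  if (!res.ok) throw new Error(`refresh failed: ${res.status}`);
  return res.json(); // new access_token and expires_in
}
```

A server would run this exchange when the KV entry is close to expiry, then overwrite the stored token, so AI sessions never see an expired credential.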