MCP Server
Expose ClicheFactory extraction and conversion tools to AI coding assistants via the Model Context Protocol.
What is MCP
The Model Context Protocol (MCP) lets AI assistants call external tools. The ClicheFactory MCP server exposes extract, to_markdown, and doctor tools — your assistant can extract data from documents and convert them to markdown as part of your conversation.
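Under the hood, assistants invoke these tools over JSON-RPC using the MCP tools/call method. As a rough sketch of the wire format (the file and schema paths here are hypothetical), an extract call looks like:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "extract",
    "arguments": {
      "file": "/absolute/path/to/invoice.pdf",
      "schema": "/absolute/path/to/schema.json"
    }
  }
}
```

Your assistant constructs these requests for you; you never write them by hand.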
Setup — Cursor
Add to your .cursor/mcp.json (project-level) or Cursor settings (global):
Local Mode (BYOK)
```json
{
  "mcpServers": {
    "clichefactory": {
      "command": "uvx",
      "args": ["clichefactory-mcp"],
      "env": {
        "LLM_MODEL_NAME": "gemini/gemini-3-flash-preview",
        "LLM_API_KEY": "your-gemini-key"
      }
    }
  }
}
```
Service Mode
```json
{
  "mcpServers": {
    "clichefactory": {
      "command": "uvx",
      "args": ["clichefactory-mcp"],
      "env": {
        "CLICHEFACTORY_MODE": "service",
        "CLICHEFACTORY_API_KEY": "cliche-your-key"
      }
    }
  }
}
```
After saving, restart Cursor. The tools should appear in your MCP tools list.
Setup — Claude Desktop
Edit your Claude Desktop config at ~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows):
Local Mode (BYOK)
```json
{
  "mcpServers": {
    "clichefactory": {
      "command": "uvx",
      "args": ["clichefactory-mcp"],
      "env": {
        "LLM_MODEL_NAME": "gemini/gemini-3-flash-preview",
        "LLM_API_KEY": "your-gemini-key"
      }
    }
  }
}
```
Service Mode
```json
{
  "mcpServers": {
    "clichefactory": {
      "command": "uvx",
      "args": ["clichefactory-mcp"],
      "env": {
        "CLICHEFACTORY_MODE": "service",
        "CLICHEFACTORY_API_KEY": "cliche-your-key"
      }
    }
  }
}
```
Restart Claude Desktop after saving. The ClicheFactory tools will be available in your conversations.
Setup — OpenClaw
For OpenClaw, install the ClicheFactory skill — see the OpenClaw page for full setup. The MCP server config is the same as the Cursor/Claude Desktop examples above.
Tools Reference
The MCP server exposes three tools:
extract
Extract structured data from a document using a JSON schema.
| Parameter | Type | Required | Description |
|---|---|---|---|
| file | string | Yes | Absolute path to the document file. |
| schema | string or object | Yes | JSON Schema, as a file path or an inline object. |
| mode | string | No | Execution mode: local or service. |
| extraction_mode | string | No | fast or robust; the trained mode is selected automatically when artifact_id is provided. |
| artifact_id | string | No | Trained pipeline artifact ID. |
| model | string | No | LLM model override. |
| model_api_key | string | No | API key for the model override. |
| ocr_model | string | No | Separate model for OCR/VLM tasks. |
| ocr_api_key | string | No | API key for the OCR model. |
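As an illustration (all values below are hypothetical), a minimal extract call with an inline schema might pass arguments like:

```json
{
  "file": "/absolute/path/to/invoice.pdf",
  "schema": {
    "type": "object",
    "properties": {
      "invoice_number": { "type": "string" },
      "total": { "type": "number" }
    }
  },
  "extraction_mode": "fast"
}
```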
to_markdown
Convert a document to markdown text.
| Parameter | Type | Required | Description |
|---|---|---|---|
| file | string | Yes | Absolute path to the document file. |
| mode | string | No | Execution mode: local or service. |
| conversion_mode | string | No | default (full OCR pipeline) or fast (VLM-only, no OCR). Service mode only. |
| model | string | No | LLM model override. |
| model_api_key | string | No | API key for the model override. |
| ocr_model | string | No | Separate model for OCR/VLM tasks. |
| ocr_api_key | string | No | API key for the OCR model. |
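For example (with a hypothetical path), a service-mode conversion that skips the OCR pipeline might pass arguments like:

```json
{
  "file": "/absolute/path/to/report.pdf",
  "mode": "service",
  "conversion_mode": "fast"
}
```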
doctor
Check ClicheFactory configuration, dependencies, and system binaries. No parameters. Call this when extraction fails or to verify setup.
Execution Modes
The MCP server supports both local and service modes, controlled by the environment variables in the config above. In local mode, LLM calls are made with your own API key (BYOK). In service mode, ClicheFactory handles everything. See Core Concepts → Execution Modes for details.