MCP Server

Expose ClicheFactory extraction and conversion tools to AI coding assistants via the Model Context Protocol.

What is MCP?

The Model Context Protocol (MCP) lets AI assistants call external tools. The ClicheFactory MCP server exposes extract, to_markdown, and doctor tools — your assistant can extract data from documents and convert them to markdown as part of your conversation.
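Under the hood, each tool invocation is a JSON-RPC 2.0 `tools/call` request, per the MCP specification. A minimal sketch of the request an assistant sends when it calls extract (the file path and schema values are illustrative, not real ClicheFactory output):

```python
import json

# Hedged sketch of an MCP tools/call request (JSON-RPC 2.0 framing per the
# MCP spec). The argument names mirror the extract tool documented below;
# the file path and schema here are purely illustrative.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "extract",
        "arguments": {
            "file": "/docs/invoice.pdf",
            "schema": {"type": "object", "properties": {"total": {"type": "number"}}},
        },
    },
}
print(json.dumps(request, indent=2))
```

Your assistant builds and sends these requests for you; you never write them by hand.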

Setup — Cursor

Add to your .cursor/mcp.json (project-level) or Cursor settings (global):

Local Mode (BYOK — bring your own key)
{
  "mcpServers": {
    "clichefactory": {
      "command": "uvx",
      "args": ["clichefactory-mcp"],
      "env": {
        "LLM_MODEL_NAME": "gemini/gemini-3-flash-preview",
        "LLM_API_KEY": "your-gemini-key"
      }
    }
  }
}
Service Mode
{
  "mcpServers": {
    "clichefactory": {
      "command": "uvx",
      "args": ["clichefactory-mcp"],
      "env": {
        "CLICHEFACTORY_MODE": "service",
        "CLICHEFACTORY_API_KEY": "cliche-your-key"
      }
    }
  }
}

After saving, restart Cursor. The tools should appear in your MCP tools list.
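If you prefer to script the setup, the project-level config is just a JSON file. A minimal sketch that writes the local-mode config shown above (the API key value is a placeholder — substitute your real key):

```python
import json
from pathlib import Path

# Same structure as the Local Mode (BYOK) config above.
config = {
    "mcpServers": {
        "clichefactory": {
            "command": "uvx",
            "args": ["clichefactory-mcp"],
            "env": {
                "LLM_MODEL_NAME": "gemini/gemini-3-flash-preview",
                "LLM_API_KEY": "your-gemini-key",  # placeholder, not a real key
            },
        }
    }
}

# Write the project-level Cursor config.
path = Path(".cursor/mcp.json")
path.parent.mkdir(parents=True, exist_ok=True)
path.write_text(json.dumps(config, indent=2))
```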

Setup — Claude Desktop

Edit your Claude Desktop config at ~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows):

Local Mode (BYOK)
{
  "mcpServers": {
    "clichefactory": {
      "command": "uvx",
      "args": ["clichefactory-mcp"],
      "env": {
        "LLM_MODEL_NAME": "gemini/gemini-3-flash-preview",
        "LLM_API_KEY": "your-gemini-key"
      }
    }
  }
}
Service Mode
{
  "mcpServers": {
    "clichefactory": {
      "command": "uvx",
      "args": ["clichefactory-mcp"],
      "env": {
        "CLICHEFACTORY_MODE": "service",
        "CLICHEFACTORY_API_KEY": "cliche-your-key"
      }
    }
  }
}

Restart Claude Desktop after saving. The ClicheFactory tools will be available in your conversations.
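The config lives in a different place per OS. A small sketch that resolves the documented path for the current platform (helper name is illustrative; only the macOS and Windows paths listed above are handled):

```python
import os
import sys
from pathlib import Path

def claude_desktop_config_path() -> Path:
    """Return the Claude Desktop config path documented for this OS."""
    if sys.platform == "darwin":  # macOS
        return Path.home() / "Library/Application Support/Claude/claude_desktop_config.json"
    if sys.platform == "win32":  # Windows
        return Path(os.environ["APPDATA"]) / "Claude" / "claude_desktop_config.json"
    raise RuntimeError("no documented Claude Desktop config path for this platform")
```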

Setup — OpenClaw

For OpenClaw, install the ClicheFactory skill — see the OpenClaw page for full setup. The MCP server config is the same as the Cursor/Claude Desktop examples above.

Tools Reference

The MCP server exposes three tools:

extract

Extract structured data from a document using a JSON schema.

Parameter | Type | Required | Description
file | string | Yes | Absolute path to the document file.
schema | string or object | Yes | JSON Schema (file path or inline object).
mode | string | No | Execution mode: local or service.
extraction_mode | string | No | fast or robust; trained mode is resolved automatically when artifact_id is provided.
artifact_id | string | No | Trained pipeline artifact ID.
model | string | No | LLM model override.
model_api_key | string | No | API key for the model override.
ocr_model | string | No | Separate model for OCR/VLM tasks.
ocr_api_key | string | No | API key for the OCR model.

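A sketch of assembling an arguments payload for an extract call, checking the two required parameters. The helper function is hypothetical (not part of the ClicheFactory package); the file path and schema are illustrative:

```python
def build_extract_args(file: str, schema, **optional) -> dict:
    """Hypothetical helper: assemble an extract-tool arguments dict.

    file and schema are required; everything else is optional and must be
    one of the documented parameter names.
    """
    allowed = {"mode", "extraction_mode", "artifact_id", "model",
               "model_api_key", "ocr_model", "ocr_api_key"}
    unknown = set(optional) - allowed
    if unknown:
        raise ValueError(f"unknown extract parameters: {sorted(unknown)}")
    return {"file": file, "schema": schema, **optional}

# Illustrative call: extract an invoice total using the fast mode.
args = build_extract_args(
    "/docs/invoice.pdf",
    {"type": "object", "properties": {"total": {"type": "number"}}},
    extraction_mode="fast",
)
```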
to_markdown

Convert a document to markdown text.

Parameter | Type | Required | Description
file | string | Yes | Absolute path to the document file.
mode | string | No | Execution mode: local or service.
conversion_mode | string | No | default (full OCR pipeline) or fast (VLM-only, no OCR). Service mode only.
model | string | No | LLM model override.
model_api_key | string | No | API key for the model override.
ocr_model | string | No | Separate model for OCR/VLM tasks.
ocr_api_key | string | No | API key for the OCR model.

doctor

Check ClicheFactory configuration, dependencies, and system binaries. No parameters. Call this when extraction fails or to verify setup.

Execution Modes

The MCP server supports both local and service modes, controlled by environment variables in the config. In local mode, LLM calls run with your own API key (BYOK). In service mode, requests go to the ClicheFactory service, authenticated with your ClicheFactory API key, and the service handles the LLM calls for you. See Core Concepts → Execution Modes for details.
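The two configs above differ only in their env block. A sketch of choosing it programmatically (the helper name is illustrative; the env variable names come from the configs in this page):

```python
def mcp_env(mode: str, key: str, model: str = "gemini/gemini-3-flash-preview") -> dict:
    """Hypothetical helper: build the env block for the clichefactory entry.

    'local' uses your own LLM key (BYOK); 'service' authenticates
    against the ClicheFactory service with a ClicheFactory API key.
    """
    if mode == "local":
        return {"LLM_MODEL_NAME": model, "LLM_API_KEY": key}
    if mode == "service":
        return {"CLICHEFACTORY_MODE": "service", "CLICHEFACTORY_API_KEY": key}
    raise ValueError("mode must be 'local' or 'service'")
```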