# Directory Structure
```
├── client.py
├── comfyui_client.py
├── comfyui_workflow_test.py
├── LICENSE
├── README.md
├── server.py
└── workflows
├── basic_api_test.json
└── basic.json
```
# Files
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
```markdown
1 | # ComfyUI MCP Server
2 |
3 | A lightweight Python-based MCP (Model Context Protocol) server that interfaces with a local [ComfyUI](https://github.com/comfyanonymous/ComfyUI) instance to generate images programmatically via AI agent requests.
4 |
5 | ## Overview
6 |
7 | This project enables AI agents to send image generation requests to ComfyUI using the MCP protocol over WebSocket. It supports:
8 | - Flexible workflow selection (e.g., `basic_api_test.json`).
9 | - Dynamic parameters: `prompt`, `width`, `height`, and `model`.
10 | - Image URLs for generated results, served by ComfyUI.
11 |
12 | ## Prerequisites
13 |
14 | - **Python 3.10+**
15 | - **ComfyUI**: Installed and running locally (e.g., on `localhost:8188`).
16 | - **Dependencies**: `requests`, `websockets`, `mcp` (install via pip).
17 |
18 | ## Setup
19 |
20 | 1. **Clone the Repository**:
21 |    `git clone <your-repo-url>`
22 |    `cd comfyui-mcp-server`
23 |
24 | 2. **Install Dependencies**:
25 |
26 |    `pip install requests websockets mcp`
27 |
28 |
29 | 3. **Start ComfyUI**:
30 | - Install ComfyUI (see [ComfyUI docs](https://github.com/comfyanonymous/ComfyUI)).
31 | - Run it on port 8188:
32 | ```
33 | cd <ComfyUI_dir>
34 | python main.py --port 8188
35 | ```
36 |
37 | 4. **Prepare Workflows**:
38 | - Place API-format workflow files (e.g., `basic_api_test.json`) in the `workflows/` directory.
39 | - Export workflows from ComfyUI’s UI with “Save (API Format)” (enable dev mode in settings).
40 |
41 | ## Usage
42 |
43 | 1. **Run the MCP Server**:
44 |    `python server.py`
45 |
46 | - Listens on `ws://localhost:9000`.
47 |
48 | 2. **Test with the Client**:
49 |    `python client.py`
50 |
51 | - Sends a sample request: `"an english mastiff dog sitting on a large boulder, bright shiny day"` at `512x512` using `v1-5-pruned-emaonly.ckpt`.
52 | - Output example:
53 | ```
54 | Response from server:
55 | {
56 | "image_url": "http://localhost:8188/view?filename=ComfyUI_00001_.png&subfolder=&type=output"
57 | }
58 | ```
59 |
60 | 3. **Custom Requests**:
61 | - Modify `client.py`’s `payload` to change `prompt`, `width`, `height`, `workflow_id`, or `model`.
62 | - Example:
63 | ```
64 | "params": json.dumps({
65 | "prompt": "a cat in space",
66 | "width": 768,
67 | "height": 768,
68 | "workflow_id": "basic_api_test",
69 | "model": "v1-5-pruned-emaonly.ckpt"
70 | })
71 | ```
72 |
73 | ## Project Structure
74 |
75 | - `server.py`: MCP server with WebSocket transport and lifecycle support.
76 | - `comfyui_client.py`: Interfaces with ComfyUI’s API, handles workflow queuing.
77 | - `client.py`: Test client for sending MCP requests.
78 | - `workflows/`: Directory for API-format workflow JSON files.
79 |
80 | ## Notes
81 |
82 | - Ensure your chosen `model` (e.g., `v1-5-pruned-emaonly.ckpt`) exists in `<ComfyUI_dir>/models/checkpoints/`.
83 | - The MCP SDK lacks native WebSocket transport, so this project ships its own WebSocket layer in `server.py`.
84 | - For custom workflows, adjust node IDs in `comfyui_client.py`’s `DEFAULT_MAPPING` if needed.
85 |
86 | ## Contributing
87 |
88 | Feel free to submit issues or PRs to enhance flexibility (e.g., dynamic node mapping, progress streaming).
89 |
90 | ## License
91 |
92 | Apache License (see the `LICENSE` file).
```
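The node-ID remapping mentioned in the README's Notes can be sketched as follows. `CUSTOM_MAPPING`, `apply_params`, and the node IDs `"10"`/`"12"` are hypothetical and must be adjusted to match the IDs in your own exported API-format workflow:

```python
# Hypothetical mapping for a custom workflow whose positive CLIPTextEncode
# node is "10" and whose EmptyLatentImage node is "12".
CUSTOM_MAPPING = {
    "prompt": ("10", "text"),
    "width": ("12", "width"),
    "height": ("12", "height"),
}

def apply_params(workflow: dict, params: dict, mapping: dict) -> dict:
    """Patch workflow inputs in place, mirroring the loop in comfyui_client.py."""
    for key, value in params.items():
        if key in mapping:
            node_id, input_key = mapping[key]
            if node_id not in workflow:
                raise KeyError(f"Node {node_id} not found in workflow")
            workflow[node_id]["inputs"][input_key] = value
    return workflow

# Minimal stand-in for a loaded workflow JSON.
workflow = {
    "10": {"inputs": {"text": ""}, "class_type": "CLIPTextEncode"},
    "12": {"inputs": {"width": 512, "height": 512}, "class_type": "EmptyLatentImage"},
}
apply_params(workflow, {"prompt": "a cat in space", "width": 768}, CUSTOM_MAPPING)
print(workflow["10"]["inputs"]["text"])   # a cat in space
print(workflow["12"]["inputs"]["width"])  # 768
```

Swapping `DEFAULT_MAPPING` for a per-workflow mapping like this is one way to support workflows whose node IDs differ from `basic_api_test.json`.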
--------------------------------------------------------------------------------
/client.py:
--------------------------------------------------------------------------------
```python
1 | import asyncio
2 | import websockets
3 | import json
4 |
5 | payload = {
6 | "tool": "generate_image",
7 | "params": json.dumps({
8 | "prompt": "an english mastiff dog sitting on a large boulder, bright shiny day",
9 | "width": 512,
10 | "height": 512,
11 | "workflow_id": "basic_api_test",
12 |         "model": "v1-5-pruned-emaonly.ckpt"  # Must match a file in <ComfyUI_dir>/models/checkpoints/
13 | })
14 | }
15 |
16 | async def test_mcp_server():
17 | uri = "ws://localhost:9000"
18 | try:
19 | async with websockets.connect(uri) as ws:
20 | print("Connected to MCP server")
21 | await ws.send(json.dumps(payload))
22 | response = await ws.recv()
23 | print("Response from server:")
24 | print(json.dumps(json.loads(response), indent=2))
25 | except Exception as e:
26 | print(f"WebSocket error: {e}")
27 |
28 | if __name__ == "__main__":
29 | print("Testing MCP server with WebSocket...")
30 | asyncio.run(test_mcp_server())
```
--------------------------------------------------------------------------------
/comfyui_workflow_test.py:
--------------------------------------------------------------------------------
```python
1 | import json
2 | from urllib import request
3 |
4 | # This script submits a prompt to ComfyUI using its API prompt format.
5 |
6 | # To export a specific workflow in API format, enable "dev mode options"
7 | # in the UI settings (the gear icon beside "Queue Size:"); this adds a
8 | # button to the UI that saves workflows in API format.
9 |
10 | # Keep in mind ComfyUI is pre-alpha software, so this format may change.
11 |
12 | # This is the prompt for the default workflow.
13 |
14 | def queue_prompt(prompt):
15 | p = {"prompt": prompt}
16 | data = json.dumps(p).encode('utf-8')
17 | req = request.Request("http://localhost:8188/prompt", data=data)
18 | request.urlopen(req)
19 |
20 |
21 | with open("workflows/basic_api_test.json", "r") as f:
22 | prompt = json.load(f)
23 | #set the text prompt for our positive CLIPTextEncode
24 | prompt["6"]["inputs"]["text"] = "beautiful scenery nature glass bottle landscape, purple galaxy bottle"
25 |
26 | #set the seed for our KSampler node
27 | prompt["3"]["inputs"]["seed"] = 5
28 |
29 |
30 | queue_prompt(prompt)
```
--------------------------------------------------------------------------------
/workflows/basic_api_test.json:
--------------------------------------------------------------------------------
```json
1 | {
2 | "3": {
3 | "inputs": {
4 | "seed": 156680208700286,
5 | "steps": 20,
6 | "cfg": 8,
7 | "sampler_name": "euler",
8 | "scheduler": "normal",
9 | "denoise": 1,
10 | "model": [
11 | "4",
12 | 0
13 | ],
14 | "positive": [
15 | "6",
16 | 0
17 | ],
18 | "negative": [
19 | "7",
20 | 0
21 | ],
22 | "latent_image": [
23 | "5",
24 | 0
25 | ]
26 | },
27 | "class_type": "KSampler",
28 | "_meta": {
29 | "title": "KSampler"
30 | }
31 | },
32 | "4": {
33 | "inputs": {
34 | "ckpt_name": "v1-5-pruned-emaonly.ckpt"
35 | },
36 | "class_type": "CheckpointLoaderSimple",
37 | "_meta": {
38 | "title": "Load Checkpoint"
39 | }
40 | },
41 | "5": {
42 | "inputs": {
43 | "width": 512,
44 | "height": 512,
45 | "batch_size": 1
46 | },
47 | "class_type": "EmptyLatentImage",
48 | "_meta": {
49 | "title": "Empty Latent Image"
50 | }
51 | },
52 | "6": {
53 | "inputs": {
54 | "text": "beautiful scenery nature glass bottle landscape, , purple galaxy bottle,",
55 | "clip": [
56 | "4",
57 | 1
58 | ]
59 | },
60 | "class_type": "CLIPTextEncode",
61 | "_meta": {
62 | "title": "CLIP Text Encode (Prompt)"
63 | }
64 | },
65 | "7": {
66 | "inputs": {
67 | "text": "text, watermark",
68 | "clip": [
69 | "4",
70 | 1
71 | ]
72 | },
73 | "class_type": "CLIPTextEncode",
74 | "_meta": {
75 | "title": "CLIP Text Encode (Prompt)"
76 | }
77 | },
78 | "8": {
79 | "inputs": {
80 | "samples": [
81 | "3",
82 | 0
83 | ],
84 | "vae": [
85 | "4",
86 | 2
87 | ]
88 | },
89 | "class_type": "VAEDecode",
90 | "_meta": {
91 | "title": "VAE Decode"
92 | }
93 | },
94 | "9": {
95 | "inputs": {
96 | "filename_prefix": "ComfyUI",
97 | "images": [
98 | "8",
99 | 0
100 | ]
101 | },
102 | "class_type": "SaveImage",
103 | "_meta": {
104 | "title": "Save Image"
105 | }
106 | }
107 | }
```
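In API-format workflows like the one above, a two-element list input such as `["4", 0]` encodes a link: the source node ID and the index of its output slot. A small sketch (not part of this repo) that checks every link points at an existing node:

```python
def validate_links(workflow: dict) -> list[str]:
    """Return a list of errors for inputs that reference missing node IDs."""
    errors = []
    for node_id, node in workflow.items():
        for input_name, value in node.get("inputs", {}).items():
            # A link is encoded as [source_node_id, output_slot_index].
            if isinstance(value, list) and len(value) == 2:
                source_id = value[0]
                if source_id not in workflow:
                    errors.append(f"{node_id}.{input_name} -> missing node {source_id}")
    return errors

# A small fragment in the same format as basic_api_test.json; node "99" is
# deliberately missing to show a failing link.
fragment = {
    "4": {"inputs": {"ckpt_name": "v1-5-pruned-emaonly.ckpt"},
          "class_type": "CheckpointLoaderSimple"},
    "6": {"inputs": {"text": "hello", "clip": ["4", 1]},
          "class_type": "CLIPTextEncode"},
    "7": {"inputs": {"text": "bad", "clip": ["99", 1]},
          "class_type": "CLIPTextEncode"},
}
print(validate_links(fragment))  # ['7.clip -> missing node 99']
```

Running a check like this before posting to `/prompt` turns ComfyUI's server-side validation errors into earlier, clearer client-side ones.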
--------------------------------------------------------------------------------
/server.py:
--------------------------------------------------------------------------------
```python
1 | import asyncio
2 | import json
3 | import logging
4 | from typing import AsyncIterator
5 | from contextlib import asynccontextmanager
6 | import websockets
7 | from mcp.server.fastmcp import FastMCP
8 | from comfyui_client import ComfyUIClient
9 |
10 | # Configure logging
11 | logging.basicConfig(level=logging.INFO)
12 | logger = logging.getLogger("MCP_Server")
13 |
14 | # Global ComfyUI client (fallback since context isn’t available)
15 | comfyui_client = ComfyUIClient("http://localhost:8188")
16 |
17 | # Define application context (for future use)
18 | class AppContext:
19 | def __init__(self, comfyui_client: ComfyUIClient):
20 | self.comfyui_client = comfyui_client
21 |
22 | # Lifespan management (placeholder for future context support)
23 | @asynccontextmanager
24 | async def app_lifespan(server: FastMCP) -> AsyncIterator[AppContext]:
25 | """Manage application lifecycle"""
26 | logger.info("Starting MCP server lifecycle...")
27 | try:
28 | # Startup: Could add ComfyUI health check here in the future
29 | logger.info("ComfyUI client initialized globally")
30 | yield AppContext(comfyui_client=comfyui_client)
31 | finally:
32 | # Shutdown: Cleanup (if needed)
33 | logger.info("Shutting down MCP server")
34 |
35 | # Initialize FastMCP with lifespan
36 | mcp = FastMCP("ComfyUI_MCP_Server", lifespan=app_lifespan)
37 |
38 | # Define the image generation tool
39 | @mcp.tool()
40 | def generate_image(params: str) -> dict:
41 | """Generate an image using ComfyUI"""
42 | logger.info(f"Received request with params: {params}")
43 | try:
44 | param_dict = json.loads(params)
45 | prompt = param_dict["prompt"]
46 | width = param_dict.get("width", 512)
47 | height = param_dict.get("height", 512)
48 | workflow_id = param_dict.get("workflow_id", "basic_api_test")
49 | model = param_dict.get("model", None)
50 |
51 | # Use global comfyui_client (since mcp.context isn’t available)
52 | image_url = comfyui_client.generate_image(
53 | prompt=prompt,
54 | width=width,
55 | height=height,
56 | workflow_id=workflow_id,
57 | model=model
58 | )
59 | logger.info(f"Returning image URL: {image_url}")
60 | return {"image_url": image_url}
61 | except Exception as e:
62 | logger.error(f"Error: {e}")
63 | return {"error": str(e)}
64 |
65 | # WebSocket server
66 | async def handle_websocket(websocket, path=None):  # older websockets versions also pass a path argument
67 | logger.info("WebSocket client connected")
68 | try:
69 | async for message in websocket:
70 | request = json.loads(message)
71 | logger.info(f"Received message: {request}")
72 | if request.get("tool") == "generate_image":
73 | result = generate_image(request.get("params", ""))
74 | await websocket.send(json.dumps(result))
75 | else:
76 | await websocket.send(json.dumps({"error": "Unknown tool"}))
77 | except websockets.ConnectionClosed:
78 | logger.info("WebSocket client disconnected")
79 |
80 | # Main server loop
81 | async def main():
82 | logger.info("Starting MCP server on ws://localhost:9000...")
83 | async with websockets.serve(handle_websocket, "localhost", 9000):
84 | await asyncio.Future() # Run forever
85 |
86 | if __name__ == "__main__":
87 | asyncio.run(main())
```
--------------------------------------------------------------------------------
/workflows/basic.json:
--------------------------------------------------------------------------------
```json
1 | {"last_node_id":9,"last_link_id":9,"nodes":[{"id":7,"type":"CLIPTextEncode","pos":[413,389],"size":[425.27801513671875,180.6060791015625],"flags":{},"order":3,"mode":0,"inputs":[{"name":"clip","localized_name":"clip","type":"CLIP","link":5}],"outputs":[{"name":"CONDITIONING","localized_name":"CONDITIONING","type":"CONDITIONING","links":[6],"slot_index":0}],"properties":{"cnr_id":"comfy-core","ver":"0.3.22","Node name for S&R":"CLIPTextEncode"},"widgets_values":["text, watermark"]},{"id":6,"type":"CLIPTextEncode","pos":[415,186],"size":[422.84503173828125,164.31304931640625],"flags":{},"order":2,"mode":0,"inputs":[{"name":"clip","localized_name":"clip","type":"CLIP","link":3}],"outputs":[{"name":"CONDITIONING","localized_name":"CONDITIONING","type":"CONDITIONING","links":[4],"slot_index":0}],"properties":{"cnr_id":"comfy-core","ver":"0.3.22","Node name for S&R":"CLIPTextEncode"},"widgets_values":["beautiful scenery nature glass bottle landscape, , purple galaxy bottle,"]},{"id":5,"type":"EmptyLatentImage","pos":[473,609],"size":[315,106],"flags":{},"order":0,"mode":0,"inputs":[],"outputs":[{"name":"LATENT","localized_name":"LATENT","type":"LATENT","links":[2],"slot_index":0}],"properties":{"cnr_id":"comfy-core","ver":"0.3.22","Node name for S&R":"EmptyLatentImage"},"widgets_values":[512,512,1]},{"id":3,"type":"KSampler","pos":[863,186],"size":[315,262],"flags":{},"order":4,"mode":0,"inputs":[{"name":"model","localized_name":"model","type":"MODEL","link":1},{"name":"positive","localized_name":"positive","type":"CONDITIONING","link":4},{"name":"negative","localized_name":"negative","type":"CONDITIONING","link":6},{"name":"latent_image","localized_name":"latent_image","type":"LATENT","link":2}],"outputs":[{"name":"LATENT","localized_name":"LATENT","type":"LATENT","links":[7],"slot_index":0}],"properties":{"cnr_id":"comfy-core","ver":"0.3.22","Node name for S&R":"KSampler"},"widgets_values":[156680208700286,"randomize",20,8,"euler","normal",1]},{"id":8,"type":"VAEDecode","pos":[1209,188],"size":[210,46],"flags":{},"order":5,"mode":0,"inputs":[{"name":"samples","localized_name":"samples","type":"LATENT","link":7},{"name":"vae","localized_name":"vae","type":"VAE","link":8}],"outputs":[{"name":"IMAGE","localized_name":"IMAGE","type":"IMAGE","links":[9],"slot_index":0}],"properties":{"cnr_id":"comfy-core","ver":"0.3.22","Node name for S&R":"VAEDecode"},"widgets_values":[]},{"id":9,"type":"SaveImage","pos":[1451,189],"size":[210,58],"flags":{},"order":6,"mode":0,"inputs":[{"name":"images","localized_name":"images","type":"IMAGE","link":9}],"outputs":[],"properties":{"cnr_id":"comfy-core","ver":"0.3.22"},"widgets_values":["ComfyUI"]},{"id":4,"type":"CheckpointLoaderSimple","pos":[26,474],"size":[315,98],"flags":{},"order":1,"mode":0,"inputs":[],"outputs":[{"name":"MODEL","localized_name":"MODEL","type":"MODEL","links":[1],"slot_index":0},{"name":"CLIP","localized_name":"CLIP","type":"CLIP","links":[3,5],"slot_index":1},{"name":"VAE","localized_name":"VAE","type":"VAE","links":[8],"slot_index":2}],"properties":{"cnr_id":"comfy-core","ver":"0.3.22","Node name for S&R":"CheckpointLoaderSimple"},"widgets_values":["v1-5-pruned-emaonly.ckpt"]}],"links":[[1,4,0,3,0,"MODEL"],[2,5,0,3,3,"LATENT"],[3,4,1,6,0,"CLIP"],[4,6,0,3,1,"CONDITIONING"],[5,4,1,7,0,"CLIP"],[6,7,0,3,2,"CONDITIONING"],[7,3,0,8,0,"LATENT"],[8,4,2,8,1,"VAE"],[9,8,0,9,0,"IMAGE"]],"groups":[],"config":{},"extra":{"ds":{"scale":1,"offset":[0,0]}},"version":0.4}
```
--------------------------------------------------------------------------------
/comfyui_client.py:
--------------------------------------------------------------------------------
```python
1 | import requests
2 | import json
3 | import time
4 | import logging
5 |
6 | logging.basicConfig(level=logging.INFO)
7 | logger = logging.getLogger("ComfyUIClient")
8 |
9 | DEFAULT_MAPPING = {
10 | "prompt": ("6", "text"),
11 | "width": ("5", "width"),
12 | "height": ("5", "height"),
13 | "model": ("4", "ckpt_name")
14 | }
15 |
16 | class ComfyUIClient:
17 | def __init__(self, base_url):
18 | self.base_url = base_url
19 | self.available_models = self._get_available_models()
20 |
21 | def _get_available_models(self):
22 | """Fetch list of available checkpoint models from ComfyUI"""
23 | try:
24 |             response = requests.get(f"{self.base_url}/object_info/CheckpointLoaderSimple", timeout=5)
25 | if response.status_code != 200:
26 | logger.warning("Failed to fetch model list; using default handling")
27 | return []
28 | data = response.json()
29 | models = data["CheckpointLoaderSimple"]["input"]["required"]["ckpt_name"][0]
30 | logger.info(f"Available models: {models}")
31 | return models
32 | except Exception as e:
33 | logger.warning(f"Error fetching models: {e}")
34 | return []
35 |
36 | def generate_image(self, prompt, width, height, workflow_id="basic_api_test", model=None):
37 | try:
38 | workflow_file = f"workflows/{workflow_id}.json"
39 | with open(workflow_file, "r") as f:
40 | workflow = json.load(f)
41 |
42 | params = {"prompt": prompt, "width": width, "height": height}
43 | if model:
44 | # Validate or correct model name
45 | if model.endswith("'"): # Strip accidental quote
46 | model = model.rstrip("'")
47 | logger.info(f"Corrected model name: {model}")
48 | if self.available_models and model not in self.available_models:
49 | raise Exception(f"Model '{model}' not in available models: {self.available_models}")
50 | params["model"] = model
51 |
52 | for param_key, value in params.items():
53 | if param_key in DEFAULT_MAPPING:
54 | node_id, input_key = DEFAULT_MAPPING[param_key]
55 | if node_id not in workflow:
56 | raise Exception(f"Node {node_id} not found in workflow {workflow_id}")
57 | workflow[node_id]["inputs"][input_key] = value
58 |
59 | logger.info(f"Submitting workflow {workflow_id} to ComfyUI...")
60 |             response = requests.post(f"{self.base_url}/prompt", json={"prompt": workflow}, timeout=10)
61 | if response.status_code != 200:
62 | raise Exception(f"Failed to queue workflow: {response.status_code} - {response.text}")
63 |
64 | prompt_id = response.json()["prompt_id"]
65 | logger.info(f"Queued workflow with prompt_id: {prompt_id}")
66 |
67 | max_attempts = 30
68 | for _ in range(max_attempts):
69 |                 history = requests.get(f"{self.base_url}/history/{prompt_id}", timeout=5).json()
70 | if history.get(prompt_id):
71 | outputs = history[prompt_id]["outputs"]
72 | logger.info("Workflow outputs: %s", json.dumps(outputs, indent=2))
73 | image_node = next((nid for nid, out in outputs.items() if "images" in out), None)
74 | if not image_node:
75 | raise Exception(f"No output node with images found: {outputs}")
76 | image_filename = outputs[image_node]["images"][0]["filename"]
77 | image_url = f"{self.base_url}/view?filename={image_filename}&subfolder=&type=output"
78 | logger.info(f"Generated image URL: {image_url}")
79 | return image_url
80 | time.sleep(1)
81 |             raise Exception(f"Workflow {prompt_id} didn’t complete after {max_attempts} polling attempts")
82 |
83 | except FileNotFoundError:
84 | raise Exception(f"Workflow file '{workflow_file}' not found")
85 | except KeyError as e:
86 | raise Exception(f"Workflow error - invalid node or input: {e}")
87 | except requests.RequestException as e:
88 | raise Exception(f"ComfyUI API error: {e}")
```
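The history-polling loop inside `generate_image` can be isolated for testing by injecting the fetch function. This is a sketch, not code from the repo: `wait_for_outputs` and `fake_fetch` are hypothetical names, and `fetch_history` stands in for the `requests.get(f"{base_url}/history/{prompt_id}")` call:

```python
import time

def wait_for_outputs(prompt_id, fetch_history, max_attempts=30, delay=0.0):
    """Poll until the prompt appears in history; return its outputs dict."""
    for _ in range(max_attempts):
        # Expected history shape: {prompt_id: {"outputs": {...}}}, empty until done.
        history = fetch_history(prompt_id)
        if history.get(prompt_id):
            return history[prompt_id]["outputs"]
        time.sleep(delay)
    raise TimeoutError(f"Workflow {prompt_id} did not complete in {max_attempts} attempts")

# Fake fetcher that "completes" on the third poll, simulating ComfyUI.
calls = {"n": 0}
def fake_fetch(prompt_id):
    calls["n"] += 1
    if calls["n"] < 3:
        return {}
    return {prompt_id: {"outputs": {"9": {"images": [{"filename": "ComfyUI_00001_.png"}]}}}}

outputs = wait_for_outputs("abc123", fake_fetch)
print(outputs["9"]["images"][0]["filename"])  # ComfyUI_00001_.png
```

Factoring the loop this way would also make it easy to swap polling for ComfyUI's WebSocket progress events later without touching the rest of `generate_image`.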