- Go to `ComfyUI/custom_nodes`
- Clone this repo; the resulting path should be `ComfyUI/custom_nodes/x-flux-comfyui/*`, where * is all the files in this repo
- Go to `ComfyUI/custom_nodes/x-flux-comfyui/` and run `python setup.py`
- Run ComfyUI after installing and enjoy!
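For reference, the same steps as shell commands (a minimal sketch; it assumes ComfyUI is installed in the current directory, and `<this-repo-url>` is a placeholder for this repository's clone URL):

```bash
# Assumes ComfyUI lives in ./ComfyUI; replace <this-repo-url> with this repo's clone URL.
cd ComfyUI/custom_nodes
git clone <this-repo-url> x-flux-comfyui
cd x-flux-comfyui
python setup.py
```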
After the first launch, the `ComfyUI/models/xlabs/loras` and `ComfyUI/models/xlabs/controlnets` folders will be created automatically.
To use a LoRA or ControlNet, just put the models in these folders.
After that, you may need to click "Refresh" in the ComfyUI interface before the models appear.
For ControlNet you also need to install https://github.com/Fannovel16/comfyui_controlnet_aux
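Placing downloaded checkpoints might look like this (a sketch with hypothetical filenames; the actual checkpoint files are listed on the HuggingFace pages referenced at the end of this section):

```bash
# Hypothetical filenames; use the actual checkpoint names from HuggingFace.
cp ~/Downloads/flux-lora-example.safetensors ComfyUI/models/xlabs/loras/
cp ~/Downloads/flux-controlnet-example.safetensors ComfyUI/models/xlabs/controlnets/
# Then click "Refresh" in ComfyUI so the new files show up in the loader nodes.
```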
You can launch Flux with around 12GB of VRAM usage:
- Follow the installation steps described in https://github.com/city96/ComfyUI-GGUF
- Use `flux1-dev-Q4_0.gguf` from the same repo, https://github.com/city96/ComfyUI-GGUF
- Launch ComfyUI with the parameters:
```bash
python3 main.py --lowvram --preview-method auto --use-split-cross-attention
```
- In our workflows, replace the "Load Diffusion Model" node with "Unet Loader (GGUF)"
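A rough end-to-end sketch of the low-memory setup, under a couple of assumptions: the GGUF placement path follows the ComfyUI-GGUF README's convention of `ComfyUI/models/unet` (verify against that repo if it has changed), and the source path of the quantized file is a placeholder for wherever you downloaded it:

```bash
# Install the GGUF loader nodes (per the ComfyUI-GGUF README).
cd ComfyUI/custom_nodes
git clone https://github.com/city96/ComfyUI-GGUF
# That repo may also need its Python dependencies installed; see its README.
cd ../..

# Place the quantized checkpoint where ComfyUI-GGUF expects UNet files
# (ComfyUI/models/unet, per that repo's docs).
cp /path/to/flux1-dev-Q4_0.gguf ComfyUI/models/unet/

# Launch ComfyUI in low-VRAM mode.
cd ComfyUI
python3 main.py --lowvram --preview-method auto --use-split-cross-attention
```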
We trained Canny ControlNet, Depth ControlNet, HED ControlNet, and LoRA checkpoints for FLUX.1 [dev].
You can download them on HuggingFace: