FourierDiff

Official implementation of Fourier Priors-Guided Diffusion for Zero-Shot Joint Low-Light Enhancement and Deblurring

Installation

Environment

conda env create --file environment.yml
conda activate FourierDiff

Pre-Trained Models

Download this model (from guided-diffusion) and put it into FourierDiff/exp/logs/imagenet/.

wget https://openaipublic.blob.core.windows.net/diffusion/jul-2021/256x256_diffusion_uncond.pt
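As a quick sanity check (a minimal sketch; only the checkpoint path comes from the commands above, everything else is an assumption), you can confirm the download completed and loads as a PyTorch state dict:

import torch

# Load the guided-diffusion checkpoint downloaded above; it is expected to be
# a mapping from parameter names to tensors (the file is roughly 2 GB).
ckpt = torch.load(
    "FourierDiff/exp/logs/imagenet/256x256_diffusion_uncond.pt",
    map_location="cpu",
)
print(f"Loaded {len(ckpt)} parameter tensors")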

Quick Start

Place input images in FourierDiff/exp/datasets/test/low. The results will be saved to FourierDiff/exp/image_samples/output.

 python main.py --config llve.yml --path_y test -i output
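If you need to stage your own images first, a small helper along these lines can copy them into the expected input folder and list the results afterwards (a hypothetical sketch; only the low/ and output/ paths come from this README, the source folder name is a placeholder):

import shutil
from pathlib import Path

# Directories expected by the Quick Start command above.
input_dir = Path("FourierDiff/exp/datasets/test/low")
output_dir = Path("FourierDiff/exp/image_samples/output")

# Copy your low-light, blurry images into the input folder.
input_dir.mkdir(parents=True, exist_ok=True)
for img in Path("my_images").glob("*.png"):  # "my_images" is a placeholder
    shutil.copy(img, input_dir / img.name)

# After running main.py, the enhanced results land in output_dir.
for result in sorted(output_dir.glob("*")):
    print(result)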

TODO

  • low-light enhancement branch
  • deblurring branch

Citation

@inproceedings{lv2024fourier,
  title={Fourier Priors-Guided Diffusion for Zero-Shot Joint Low-Light Enhancement and Deblurring},
  author={Lv, Xiaoqian and Zhang, Shengping and Wang, Chenyang and Zheng, Yichen and Zhong, Bineng and Li, Chongyi and Nie, Liqiang},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  pages={25378--25388},
  year={2024}
}