[APSIPA ASC 2025] An improved method for Image Shadow Removal by Combining Deterministic and Stochastic Models

Introduction

ShadowDiffusion is a unified diffusion framework that integrates both image and degradation priors for highly effective shadow removal. It progressively refines the estimated shadow mask as an auxiliary task of the diffusion generator, leading to more accurate and robust shadow-free image generation. For more details, please refer to the original paper.

To improve both efficiency and effectiveness, we make two key contributions: (1) we integrate the Nonlinear Activation Free Network (NAFNet) as the degradation prior within the diffusion framework, combining NAFNet's efficient deterministic modeling with the stochastic generative capacity of diffusion models and leveraging their complementary strengths for high-quality, detail-preserving reconstructions; (2) we optimize the penalty parameter in the unrolling process, yielding improved restoration quality without additional training cost. Our improved model, ShadowDiffusion+, is evaluated on two public shadow removal datasets. On the SRD dataset, it improves PSNR from 34.73 dB to 36.01 dB and SSIM from 0.970 to 0.979, demonstrating both accuracy and efficiency.
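The combination of the two components can be sketched as follows. This is an illustrative toy only, not the repository's actual code: `estimator` and `denoise_step` are hypothetical placeholders standing in for NAFNet and one reverse-diffusion step.

```python
import numpy as np

def restore(shadow_img, estimator, denoise_step, steps=5):
    """Deterministic prior + stochastic refinement (schematic only).

    `estimator` and `denoise_step` are illustrative placeholders,
    not this repository's actual APIs.
    """
    coarse = estimator(shadow_img)             # deterministic NAFNet-style estimate
    rng = np.random.default_rng(0)
    x = rng.standard_normal(shadow_img.shape)  # stochastic initialization
    for t in reversed(range(steps)):
        # each reverse step refines x, conditioned on the deterministic prior
        x = denoise_step(x, t, coarse)
    return x
```

The deterministic estimate anchors the stochastic sampler, which is the intuition behind using NAFNet as the degradation prior.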

ShadowDiffusion framework:

NAFNet framework as estimation network:

Requirements

  • Python 3.7
  • PyTorch 1.7
  • CUDA 11.1
pip install -r requirements.txt

Datasets

Pretrained NAFNet

[Link]

Pretrained diffusion model

[Link]

Please download the corresponding pretrained models and modify resume_state and degradation_model_path (optional) in shadow.json.
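For illustration only (the surrounding structure and the paths are guesses and may not match the actual shadow.json), the entries to edit look roughly like:

```json
{
    "path": {
        "resume_state": "experiments/pretrained/diffusion_SRD"
    },
    "degradation_model_path": "experiments/pretrained/nafnet_SRD.pth"
}
```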

Test

You can directly test the performance of the pre-trained model as follows:

  1. Modify the paths to the dataset and the pre-trained model in shadow.json:
resume_state # pretrained model or training state -- Line 12
dataroot # validation dataset path -- Line 30
  2. Test the model:
python sr.py -p val -c config/shadow_SRD.json

We use DDIM sampling to speed up inference. The number of steps can be set via T_sampling, e.g. 5 or 25.
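The effect of T_sampling can be illustrated with a generic evenly spaced DDIM timestep schedule (a common choice; the exact schedule used in this repository may differ):

```python
def ddim_timesteps(num_train_steps, t_sampling):
    # Pick an evenly spaced subsequence of the training timesteps, so the
    # reverse process runs t_sampling steps instead of num_train_steps.
    stride = num_train_steps // t_sampling
    return list(range(0, num_train_steps, stride))[:t_sampling]
```

With 1000 training steps, T_sampling = 5 visits timesteps [0, 200, 400, 600, 800] (traversed in reverse at inference), a 200x reduction in denoising steps.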

Train

  1. Download the datasets and arrange them in the following structure:
|-- SRD_Dataset
    |-- train
        |-- train_A # shadow image
        |-- train_B # shadow mask
        |-- train_C # shadow-free GT
    |-- test
        |-- test_A # shadow image
        |-- test_B # shadow mask
        |-- test_C # shadow-free GT
  2. Modify the following terms in option.py:
"resume_state": null # if training from scratch
"dataroot"   # training and testing set path
"gpu_ids": [0] # Our model can be trained on a single RTX A5000 GPU. To train on multiple GPUs, add more GPU ids here.
  3. Train the diffusion network:
python sr.py -p train -c config/shadow.json
  4. Train the NAFNet network:
python train_NAFNet.py -p train -c config/shadow.json --dataset SRD
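Before training, the dataset layout above can be verified with a small script (a hypothetical helper for convenience, not part of the repository):

```python
import os

# Expected SRD layout, matching the directory tree above
EXPECTED = {
    "train": ["train_A", "train_B", "train_C"],
    "test":  ["test_A", "test_B", "test_C"],
}

def check_srd_layout(root):
    """Return a list of missing sub-directories (an empty list means OK)."""
    missing = []
    for split, subdirs in EXPECTED.items():
        for sub in subdirs:
            path = os.path.join(root, split, sub)
            if not os.path.isdir(path):
                missing.append(path)
    return missing
```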

Evaluation

The results reported in the paper are calculated with the MATLAB script used in previous methods; for details, refer to evaluation.m.
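For reference, the PSNR metric reported above follows the standard definition below. Note this is not a reimplementation of evaluation.m, which may differ (e.g. in color space or image resizing):

```python
import numpy as np

def psnr(gt, pred, data_range=255.0):
    """Peak signal-to-noise ratio in dB between two images."""
    mse = np.mean((gt.astype(np.float64) - pred.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")                  # identical images
    return 10.0 * np.log10(data_range ** 2 / mse)
```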

Results

Visual results

Reference

[CVPR 2023] ShadowDiffusion: When Degradation Prior Meets Diffusion Model for Shadow Removal [Paper]

ShadowDiffusion: When Degradation Prior Meets Diffusion Model for Shadow Removal
Lanqing Guo, Chong Wang, Wenhan Yang, Siyu Huang, Yufei Wang, Hanspeter Pfister, Bihan Wen
In CVPR'2023

Citation

Hongjun Sheng, Lanqing Guo, Xinggan Peng, Zhiping Lin and Bihan Wen, "An improved method for Image Shadow Removal by Combining Deterministic and Stochastic Models", Asia Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA ASC) 2025. [Link]
