Spatial Attention GAN for Image-to-Image Translation

A PyTorch implementation of SPA-GAN.

Overview

SPA-GAN computes spatial attention maps in the discriminator and feeds them back to the generator, so the generator focuses on the most discriminative regions between the source and target domains. It builds on the CycleGAN framework and additionally uses a feature-map loss to help preserve domain-specific features during translation.

Architecture


CycleGAN (a) and SPA-GAN (b) architecture
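The core idea above, transferring the discriminator's spatial attention back to weight the generator's input, can be sketched as follows. This is a minimal illustration in NumPy rather than PyTorch; the function names and shapes are made up for the example, and the upsampling of the attention map to full input resolution (needed in practice) is omitted:

```python
import numpy as np

def attention_map(feats):
    """Collapse discriminator feature maps (C, H, W) into one spatial
    attention map, min-max normalized to [0, 1]."""
    amap = np.abs(feats).sum(axis=0)  # aggregate activation over channels
    amap = (amap - amap.min()) / (amap.max() - amap.min() + 1e-8)
    return amap

def apply_attention(image, amap):
    """Weight an input image (C, H, W) element-wise by the attention map."""
    return image * amap[None, :, :]  # broadcast the map over channels

# Toy example: 8 discriminator feature channels over a 4x4 spatial grid.
feats = np.random.randn(8, 4, 4)
img = np.random.rand(3, 4, 4)
attended = apply_attention(img, attention_map(feats))
print(attended.shape)  # (3, 4, 4)
```

The attended image, rather than the raw input, is what the generator translates, which is what pushes it to concentrate capacity on the regions the discriminator finds most informative.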

Prerequisites

The required packages are listed in requirements.txt. To install them, run:

pip install -r requirements.txt

Usage

To run the script, open a terminal and navigate to the directory containing the script. Then, run the following command:

python main.py [OPTIONS]

The following options are available:

  • --dataset: specify the dataset name (string, default='facades')
  • --epochs: specify the number of epochs (integer, default=10)
  • --lr: specify the learning rate (float, default=0.0002)
  • --beta1: specify the beta1 parameter for the Adam optimizer (float, default=0.5)
  • --beta2: specify the beta2 parameter for the Adam optimizer (float, default=0.999)
  • --generate_source: specify the source for generating images (string, default=None)
  • --generate_target: specify the target for generating images (string, default=None)
  • --save_checkpoint: specify whether to save the model checkpoints during training (Boolean, default=True)
  • --checkpoint_dir: specify the directory where the model checkpoints will be saved (string, default='checkpoints')
  • --load_checkpoint: specify whether to load the model checkpoint before training (Boolean, default=True)
  • --wandb: specify whether to use the Weights and Biases platform for visualization and logging (Boolean, default=False)
  • --evaluate: specify whether to evaluate the model (Boolean, default=False)
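The flags above map naturally onto a standard argparse setup. A minimal sketch follows; the flag names and defaults are taken from the list above, while `str2bool` and everything else is illustrative rather than the repo's actual code (a helper like it is needed because plain `type=bool` would treat the string "False" as true):

```python
import argparse

def str2bool(v):
    # Accept "True"/"False"-style values, as in `--wandb True`.
    return str(v).lower() in ("true", "1", "yes")

def build_parser():
    p = argparse.ArgumentParser(description="SPA-GAN training/generation")
    p.add_argument("--dataset", type=str, default="facades")
    p.add_argument("--epochs", type=int, default=10)
    p.add_argument("--lr", type=float, default=2e-4)
    p.add_argument("--beta1", type=float, default=0.5)
    p.add_argument("--beta2", type=float, default=0.999)
    p.add_argument("--generate_source", type=str, default=None)
    p.add_argument("--generate_target", type=str, default=None)
    p.add_argument("--save_checkpoint", type=str2bool, default=True)
    p.add_argument("--checkpoint_dir", type=str, default="checkpoints")
    p.add_argument("--load_checkpoint", type=str2bool, default=True)
    p.add_argument("--wandb", type=str2bool, default=False)
    p.add_argument("--evaluate", type=str2bool, default=False)
    return p

# Example: the training invocation shown below.
args = build_parser().parse_args(["--dataset", "facades", "--epochs", "100"])
print(args.dataset, args.epochs)  # facades 100
```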

For example, to train the model on the Facades dataset for 100 epochs, run the following command:

python main.py --dataset facades --epochs 100

To generate images from the trained model, run the following command:

python main.py --generate_source [PATH_TO_SOURCE] --generate_target [PATH_TO_TARGET]

For example, to generate images from the trained model on the Facades dataset, run the following command:

python main.py --generate_source datasets/facades/testA/1.jpg --generate_target datasets/facades/testB/1.jpg

To visualize the training process using the Weights and Biases platform, run the following command:

python main.py --wandb True

Results

1) Facades

(sample outputs) From Facades to Map · From Map to Facades
