imics-lab/EnsAug

EnsAug: Augmentation-Driven Ensembles for Human Motion Sequence Analysis

EnsAug is a research framework for augmentation-driven ensemble learning, aimed at human motion sequence analysis such as sign language recognition and multi-modal action recognition. It provides reproducible experiments combining advanced data augmentations with Transformer models.


Data Description

The data folder contains preprocessed datasets:

  • data_100_xy: WLASL 100 (Sign Language)
  • data_300_xy: WLASL 300 (Sign Language)
  • data_signum: Signum (Sign Language)
  • data_mhad: UTD-MHAD (multi-modal action recognition)
  • Note: the sign language datasets have been removed from the repository due to space constraints.
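
The on-disk format of the preprocessed data is not documented in this README. Assuming each sample is a NumPy array of 2-D keypoints with shape (frames, joints, 2) — a common layout for "_xy" skeleton data — loading a sequence might look like the sketch below (the file name and shape are hypothetical):

```python
import numpy as np

# Hypothetical example: the repository does not document the exact file
# format, so we create a dummy skeleton sequence with the assumed shape
# (frames, joints, 2) for x/y keypoints and round-trip it through disk.
seq = np.random.rand(64, 27, 2).astype(np.float32)  # 64 frames, 27 joints
np.save("sample_sequence.npy", seq)

loaded = np.load("sample_sequence.npy")
print(loaded.shape)  # (64, 27, 2)
```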

Code Structure

  • All augmentation techniques and the Transformer architecture are defined in the src folder.
  • Example experiment and training scripts are located in the scripts folder, subdivided per dataset.

Dependencies

This project requires Python >= 3.7 and the following packages (see requirements.txt for exact versions):

  • PyTorch
  • scikit-learn
  • transformers
  • matplotlib
  • numpy
  • pandas
  • tqdm
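
A typical setup, assuming a POSIX shell and that requirements.txt sits at the repository root:

```shell
# Create an isolated environment and install the pinned dependencies
python3 -m venv .venv
. .venv/bin/activate
pip install -r requirements.txt
```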

Running Experiments

1. Sign Language Experiments (WLASL / Signum)

  • Navigate to the scripts/sign_language folder:

    cd scripts/sign_language
    
  • Configure your experiment in config.py:

    • Set the data folder and select the desired augmentation (numbered as in the paper).
  • To run with augmentations:

    • Execute sign_transformer_augment.py, selecting a different augmentation each run.
  • After running all augmentation experiments:

    • Run sign_ensemble.py to compute ensemble results.
  • Other scripts:

    • sign_augment_traditional.py: runs traditional augmentations.
    • sign_transformer_baseline.py: runs the standard Transformer model without augmentations.
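
The keys inside config.py are not documented here; as a rough sketch of the kind of settings it holds (the names below are assumptions — check the actual file in scripts/sign_language):

```python
# Hypothetical config.py fields -- the real names in the repository may differ.
DATA_DIR = "data/data_100_xy"  # which preprocessed dataset folder to use
AUGMENTATION = 3               # augmentation index, numbered as in the paper
```

Edit these values, run sign_transformer_augment.py, change the augmentation index, and repeat until every augmentation has been trained.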

2. MHAD Experiments

  • Navigate to the scripts/mhad folder:

    cd scripts/mhad
    
  • Adjust config.py to select the data folder and augmentation.

  • To run with augmentations:

    • Execute mhad_transformer_augment.py, selecting a different augmentation each run.
  • After all augmentations:

    • Run mhad_ensemble.py for ensemble results.
  • Other scripts:

    • mhad_augment_traditional.py: runs traditional augmentations.
    • mhad_transformer_baseline.py: runs the standard Transformer model.
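
This README does not specify how sign_ensemble.py and mhad_ensemble.py combine the per-augmentation models. One common scheme for augmentation-driven ensembles is soft voting: average each model's class probabilities and take the argmax. A minimal sketch with dummy predictions (not the repository's actual code):

```python
import numpy as np

# Soft-voting sketch: average per-model class probabilities, then argmax.
# The probability arrays below are dummies standing in for the softmax
# outputs of Transformers trained with different augmentations.
rng = np.random.default_rng(0)
n_models, n_samples, n_classes = 3, 5, 4
probs = rng.random((n_models, n_samples, n_classes))
probs /= probs.sum(axis=-1, keepdims=True)   # normalize rows to distributions

ensemble_probs = probs.mean(axis=0)           # average over models
predictions = ensemble_probs.argmax(axis=-1)  # final class per sample
print(predictions.shape)  # (5,)
```

Averaging probabilities (rather than hard labels) lets confident models outweigh uncertain ones, which is why it is a common default for ensembles of models trained on differently-augmented data.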

Notes

  • Ensure you edit config.py before each experiment to select the correct data and augmentation.
  • Augmentation modules are numbered corresponding to those described in the associated paper.

For details about preprocessing, training hyperparameters, or reproducing results, please refer to the comments in individual scripts or reach out to the maintainers.
