This is the official implementation of the paper "STEP: Spatial Temporal Graph Convolutional Networks for Emotion Perception from Gaits". Please use the following citation if you find our work useful:
```
@inproceedings{bhattacharya2020step,
  author = {Bhattacharya, Uttaran and Mittal, Trisha and Chandra, Rohan and Randhavane, Tanmay and Bera, Aniket and Manocha, Dinesh},
  title = {STEP: Spatial Temporal Graph Convolutional Networks for Emotion Perception from Gaits},
  year = {2020},
  publisher = {AAAI Press},
  booktitle = {Proceedings of the Thirty-Fourth AAAI Conference on Artificial Intelligence},
  pages = {1342--1350},
  numpages = {9},
  series = {AAAI'20}
}
```
We have also released the Emotion-Gait dataset with this code, which is available for download here: https://go.umd.edu/emotion-gait.
The code consists of the following sub-directories:

- `generator_cvae` is the generator.
- `classifier_stgcn_real_only` is the baseline classifier using only the 342 real gaits.
- `classifier_stgcn_real_and_synth` is the baseline classifier using both the 342 real and the N synthetic gaits.
- `classifier_hybrid` is the hybrid classifier using both deep and physiologically-motivated features.
- `compute_aff_features` consists of the set of scripts to compute the affective features from 16-joint pose sequences. Calling `main.py` with the correct data path computes the features and saves them in the `affectiveFeatures<f_type>.h5` file, where `f_type` is the desired type of features: `''` for the original data (default), or `4DCVAEGCN` for the data generated by the CVAE (see the loading sketch after this list).
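As an illustration, here is a minimal sketch of inspecting one of these output files with `h5py`. It only assumes the `affectiveFeatures<f_type>.h5` naming pattern described above; the dataset keys inside the file are not documented in this README, so the snippet simply enumerates whatever is stored rather than relying on specific names:

```python
# Minimal sketch: inspect the affective features saved by compute_aff_features.
# Assumes the .h5 file sits in the current directory; the internal dataset
# names are not documented here, so we enumerate every stored dataset.
import h5py
import numpy as np

f_type = ''  # '' for the original data (default), '4DCVAEGCN' for CVAE-generated data

with h5py.File('affectiveFeatures{}.h5'.format(f_type), 'r') as f:
    # Collect each top-level dataset in the file as a NumPy array.
    features = {key: np.asarray(f[key]) for key in f.keys()}

for key, feat in features.items():
    print('{}: shape {}'.format(key, feat.shape))
```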