This is the official implementation of the paper STEP: Spatial Temporal Graph Convolutional Networks for Emotion Perception from Gaits. Please use the following citation if you find our work useful:
```
@inproceedings{bhattacharya2020step,
  author = {Bhattacharya, Uttaran and Mittal, Trisha and Chandra, Rohan and Randhavane, Tanmay and Bera, Aniket and Manocha, Dinesh},
  title = {STEP: Spatial Temporal Graph Convolutional Networks for Emotion Perception from Gaits},
  year = {2020},
  publisher = {AAAI Press},
  booktitle = {Proceedings of the Thirty-Fourth AAAI Conference on Artificial Intelligence},
  pages = {1342--1350},
  numpages = {9},
  series = {AAAI'20}
}
```
We have also released the Emotion-Gait dataset with this code, which is available for download here: https://go.umd.edu/emotion-gait.
The codebase consists of the following sub-directories:

- `generator_cvae` is the generator.
- `classifier_stgcn_real_only` is the baseline classifier using only the 342 real gaits.
- `classifier_stgcn_real_and_synth` is the baseline classifier using both the 342 real gaits and the N synthetic gaits.
- `classifier_hybrid` is the hybrid classifier using both deep and physiologically-motivated features.
- `compute_aff_features` consists of the set of scripts to compute the affective features from 16-joint pose sequences. Calling `main.py` with the correct data path computes the features and saves them in the `affectiveFeatures<f_type>.h5` file, where `f_type` is the desired type of features (a sketch for reading this file follows the list):
  - `''`: original data (default),
  - `4DCVAEGCN`: data generated by the CVAE.
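As a minimal sketch of consuming the output, the snippet below opens the generated feature file with `h5py` and enumerates its contents. The file name assumes the default `f_type` of `''`; the internal key layout is not documented here, so the code simply iterates over whatever datasets the file contains.

```python
import h5py

# Minimal sketch: inspect the feature file written by compute_aff_features/main.py.
# Assumes the default f_type '' (original data); the key layout inside the file
# is an assumption, so we just enumerate the stored entries.
with h5py.File('affectiveFeatures.h5', 'r') as f:
    for key in f.keys():
        feats = f[key][()]  # load the stored feature array for this entry
        print(key, getattr(feats, 'shape', type(feats)))
```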