Exploring Versatile Prior for Human Motion via Motion Frequency Guidance

[Video Demo] [Paper]

Installation

Requirements

  • Python 3.6
  • PyTorch 1.1.0

Please clone this repository and run the following command:

pip install -r requirements.txt

Then download the checkpoints and human body models listed below and place them in human_motion_prior/models.

Specifically,

  • Download the pretrained VPoser v1.0 model (2.5 MB) here.
  • Download the SMPLH model here.
  • Download the SMPL model here for MALE and FEMALE and here for NEUTRAL.
  • Download the regressor weight files 1) J_regressor_h36m.npy and 2) J_regressor_extra.npy here.

The models directory should be organized as follows:

human_motion_prior/models
├── smpl
│   ├── SMPL_FEMALE.pkl
│   ├── SMPL_MALE.pkl
│   ├── SMPL_NEUTRAL.pkl
│   ├── J_regressor_h36m.npy
│   └── J_regressor_extra.npy
├── smplh
│   ├── SMPLH_FEMALE_AMASS.npz
│   ├── SMPLH_MALE_AMASS.npz
│   └── SMPLH_NEUTRAL_AMASS.npz
└── pre_trained
    └── vposer_v1_0
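
If you want to sanity-check the layout before training, a small script such as the one below can be used. This is a minimal sketch that simply mirrors the tree above; the check itself is not part of this repository.

import os

# Expected model files, mirroring the directory tree above.
MODELS_ROOT = "human_motion_prior/models"
EXPECTED = [
    "smpl/SMPL_FEMALE.pkl",
    "smpl/SMPL_MALE.pkl",
    "smpl/SMPL_NEUTRAL.pkl",
    "smpl/J_regressor_h36m.npy",
    "smpl/J_regressor_extra.npy",
    "smplh/SMPLH_FEMALE_AMASS.npz",
    "smplh/SMPLH_MALE_AMASS.npz",
    "smplh/SMPLH_NEUTRAL_AMASS.npz",
]

missing = [f for f in EXPECTED if not os.path.isfile(os.path.join(MODELS_ROOT, f))]
if missing:
    print("Missing model files:")
    for f in missing:
        print("  -", f)
else:
    print("All model files found.")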

Data Pre-process

Follow the instructions here.

Note that, because our motion prior takes joints as input, we pre-process the generated AMASS sequences into global-orientation-normalized joint sequences and store them to speed up training.

You can download the pre-computed joints here and put them in human_motion_prior/data. Alternatively, you can generate them on the fly by uncommenting L355-L361 and L439.
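
For reference, "global-orientation normalized" means the per-frame root rotation is removed from the joint sequence. The sketch below illustrates that normalization; it assumes joints are stored as a (T, J, 3) array with the root joint first and the global orientation given as per-frame axis-angle vectors, which may differ from the exact format produced by the pre-processing script.

import numpy as np
from scipy.spatial.transform import Rotation as R

def normalize_global_orientation(joints, root_orient):
    """Remove the per-frame global (root) orientation from a joint sequence.

    joints:      (T, J, 3) array of 3D joint positions (root joint assumed first).
    root_orient: (T, 3) array of axis-angle global orientations (e.g. the SMPL root).
    Returns a (T, J, 3) array of root-centered, orientation-normalized joints.
    """
    # Center every frame at its root joint.
    centered = joints - joints[:, :1, :]

    out = np.empty_like(centered)
    for t in range(centered.shape[0]):
        # Rotate the frame by the inverse of its global orientation.
        out[t] = R.from_rotvec(root_orient[t]).inv().apply(centered[t])
    return out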

Training

cd human_motion_prior/train
sh run_script.sh 4

We train our human motion prior on 4 GTX 1080 Ti GPUs with a batch size of 15 per GPU.

You can modify human_motion_prior/train/motion_prior_defaults.ini for different training settings.
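
If you are unsure which options the config exposes, the .ini file can be inspected with Python's standard configparser. This is only a convenience sketch; it prints whatever sections and keys the file actually defines and assumes nothing about their names.

import configparser

# List every section and option defined in the training config,
# so you can see which settings are available to override.
config = configparser.ConfigParser()
config.read("human_motion_prior/train/motion_prior_defaults.ini")

for section in config.sections():
    print(f"[{section}]")
    for key, value in config[section].items():
        print(f"  {key} = {value}")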

Inference

You can use the motion prior trained on AMASS to evaluate the VAE reconstruction loss on the unseen 3DPW dataset as follows:

cd human_motion_prior/test
export PYTHONPATH=../../
python test_3dpw.py
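
For context, the reported number is essentially a VAE reconstruction error between the input joint sequence and its decoded reconstruction. The sketch below shows the standard computation with placeholder tensors; the function and tensor names are illustrative and not the repository's API or exact loss.

import torch
import torch.nn.functional as F

def vae_losses(recon_joints, target_joints, mu, logvar):
    """Standard VAE objective terms (illustrative only).

    recon_joints, target_joints: (B, T, J, 3) joint sequences.
    mu, logvar:                  (B, D) latent Gaussian parameters.
    """
    # Reconstruction error between decoded and ground-truth joints.
    recon_loss = F.mse_loss(recon_joints, target_joints)

    # KL divergence between the approximate posterior and a unit Gaussian.
    kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
    return recon_loss, kl

# Example with random tensors standing in for real 3DPW data.
B, T, J, D = 2, 128, 22, 256
recon, target = torch.randn(B, T, J, 3), torch.randn(B, T, J, 3)
mu, logvar = torch.randn(B, D), torch.randn(B, D)
print(vae_losses(recon, target, mu, logvar))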

Application

Citation

@inproceedings{human_motion_prior,
  title = {Exploring Versatile Prior for Human Motion via Motion Frequency Guidance},
  author = {Jiachen Xu and Min Wang and Jingyu Gong and Wentao Liu and Chen Qian and Yuan Xie and Lizhuang Ma},
  booktitle = {2021 International Conference on 3D Vision (3DV)},
  year = {2021}
}

Acknowledgments

We thank the authors of VPoser for releasing their code; our codebase is largely built on theirs.
