Implementation for the paper: Wen Guo, Xiaoyu Bie, Xavier Alameda-Pineda, Francesc Moreno-Noguer, "Multi-Person Extreme Motion Prediction", CVPR 2022.
[paper] [Project page]
This repo has been tested on CUDA 9, Python 3.6, and PyTorch 1.5.1.
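If you want a quick sanity check before running anything, a snippet like the one below (not part of the repo, just a convenience) prints the versions actually installed:

```python
# Quick environment sanity check (not part of the repo): prints the Python,
# PyTorch and CUDA versions, for comparison with the tested configuration.
import sys
import torch

print(f"Python : {sys.version.split()[0]}")    # tested with 3.6
print(f"PyTorch: {torch.__version__}")         # tested with 1.5.1
print(f"CUDA available: {torch.cuda.is_available()}")
if torch.cuda.is_available():
    print(f"CUDA   : {torch.version.cuda}")    # tested with 9.x
```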
```
ROOT
|-- datasets
|   `-- pi
|       |-- acro1
|       `-- acro2
|-- run_exps
|-- main
|-- model
|-- utils
|-- checkpoint
|   `-- pretrain_ckpt
|-- tensorboard
`-- outputs
```
Please request and download the data from ExPI and put it in ./datasets.
Note: if you are NOT affiliated with an institution from a country offering an adequate level of data protection (most countries outside the EU; please check the list), you have to sign the "Standard Contractual Clauses" when applying for the data. Please follow the instructions on the download page.
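To confirm the data ended up in the layout shown in the tree above, a quick check like the following can be run from ROOT (the paths are taken from the directory tree; nothing inside the folders is validated):

```python
# Verify the expected ExPI layout under ./datasets (folder names taken from
# the directory tree above; adjust if your copy is organized differently).
import os

expected = [
    "datasets/pi/acro1",
    "datasets/pi/acro2",
]
for path in expected:
    status = "ok" if os.path.isdir(path) else "MISSING"
    print(f"{path:25s} {status}")
```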
- Please download pretrained models from model and put them in ./checkpoint/pretrain_ckpt/.
- Run ./run_exps/run_pro1.sh to test on Common-Action-Split. (To test on Single-Action-Split and Unseen-Action-Split, please run run_pro2.sh and run_pro3.sh respectively.)
- To train/test on Common-Action-Split, please look at ./run_exps/run_pro1.sh and uncomment the corresponding lines.
- When testing, the '--save_result' option can be used to save the results of different experiments into a single file, ./outputs/results.json. Then ./outputs/write_results.py can be used to generate the result table shown in our paper; see the sketch after this list.
- The same applies to Single-Action-Split/Unseen-Action-Split.
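For reference, here is a minimal sketch of how ./outputs/results.json could be inspected by hand. The actual JSON schema is defined by write_results.py and is not documented here, so the structure assumed below (a mapping from experiment name to a dict of metrics) is purely hypothetical; use ./outputs/write_results.py to reproduce the paper's table.

```python
# Minimal sketch: load ./outputs/results.json and print one row per experiment.
# NOTE: the real schema is defined by write_results.py; the structure assumed
# here ({experiment_name: {metric: value}}) is a hypothetical placeholder.
import json

with open("outputs/results.json") as f:
    results = json.load(f)

for exp_name, metrics in results.items():
    row = "  ".join(f"{k}={v}" for k, v in metrics.items())
    print(f"{exp_name}: {row}")
```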
If you find our code or data helpful, please cite our work:
```bibtex
@article{guo2021multi,
  title   = {Multi-Person Extreme Motion Prediction},
  author  = {Guo, Wen and Bie, Xiaoyu and Alameda-Pineda, Xavier and Moreno-Noguer, Francesc},
  journal = {arXiv preprint arXiv:2105.08825},
  year    = {2021}
}
```
Some code is adapted from HisRepItself.
This code is released under the GPL license.