3D Clothed Human Reconstruction in the Wild
Gyeongsik Moon*, Hyeongjin Nam*, Takaaki Shiratori, Kyoung Mu Lee (* equal contribution)
European Conference on Computer Vision (ECCV), 2022
- We recommend using an Anaconda virtual environment. Install Python >= 3.7.0 and PyTorch >= 1.8.0.
- Install PyTorch3D following here, depending on your environment.
- Then, run `sh requirements.sh`. You should slightly change the `torchgeometry` kernel code following here. (A quick environment check is sketched right after this list.)
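If the steps above went through, a quick import check like the sketch below (plain Python, nothing repo-specific) can confirm the environment before running the demo or training.

```python
# Quick environment check (a minimal sketch; only verifies that the packages
# installed above import correctly and reports their versions).
import torch
import torchgeometry  # imported only to confirm it is available
import pytorch3d

print('PyTorch:', torch.__version__)            # expected >= 1.8.0
print('CUDA available:', torch.cuda.is_available())
print('PyTorch3D:', pytorch3d.__version__)
```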
- Download the pre-trained weight from here and place it in the `demo` folder.
- Prepare the `base_data` folder following the `Directory` part below.
- Prepare `input.png` and edit its `bbox` in `demo/demo.py` (an input-check sketch follows this list).
- Prepare the SMPL parameter as `pose2pose_result.json`. You can get the SMPL parameter by running the off-the-shelf method [code].
- Run `python demo.py --gpu 0`.
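The bbox convention and the contents of `pose2pose_result.json` are not spelled out above, so the sketch below only shows one plausible way to sanity-check the demo inputs; the `[xmin, ymin, width, height]` pixel format and the example values are assumptions, not necessarily the demo's actual convention.

```python
# Sanity-check the demo inputs (a sketch, not demo/demo.py itself).
# Assumption: bbox is [xmin, ymin, width, height] in pixels; match whatever demo/demo.py expects.
import json
import cv2

img = cv2.imread('input.png')
assert img is not None, 'input.png not found'

bbox = [100, 50, 200, 400]  # hypothetical values; copy the ones you set in demo/demo.py
x, y, w, h = (int(v) for v in bbox)
crop = img[y:y + h, x:x + w]
print('image size:', img.shape, '| crop size:', crop.shape)

# Peek at the SMPL parameter file produced by the off-the-shelf method.
with open('pose2pose_result.json') as f:
    smpl_param = json.load(f)
print('pose2pose_result.json top-level type:', type(smpl_param).__name__)
```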
Refer to here.
In `main/config.py`, you can change the datasets to use.
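The attribute names inside `main/config.py` are not shown here, so the snippet below is purely a hypothetical illustration of the kind of edit this refers to (a list of dataset names you swap out); `trainset` and `testset` are assumed names, not the file's real ones.

```python
# Hypothetical illustration only: attribute names are assumptions, not main/config.py's real ones.
class Config:
    trainset = ['MSCOCO', 'DeepFashion2']  # datasets mentioned in this README
    testset = '3DPW'

cfg = Config()
print('training on:', cfg.trainset, '| testing on:', cfg.testset)
```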
cd ${ROOT}/main
python train.py --gpu 0
Place the trained model at `output/model_dump` and follow the instructions below.
To evaluate CD (Chamfer Distance) on 3DPW, run
cd ${ROOT}/main
python test.py --gpu 0 --test_epoch 7 --type cd
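For reference, CD here is the bidirectional Chamfer Distance between the reconstructed and ground-truth surfaces. The snippet below is a minimal point-cloud sketch of that formula, not the repository's evaluation code (which handles mesh sampling, alignment, and units); conventions also differ on summing versus averaging the two directional terms.

```python
# Minimal bidirectional Chamfer Distance between two point clouds (a sketch,
# not the repo's evaluation pipeline; no alignment or mesh sampling is done here).
import numpy as np
from scipy.spatial import cKDTree

def chamfer_distance(pred_pts, gt_pts):
    """Sum of average nearest-neighbor distances in both directions (in the input units)."""
    d_pred_to_gt, _ = cKDTree(gt_pts).query(pred_pts)   # for each predicted point, closest GT point
    d_gt_to_pred, _ = cKDTree(pred_pts).query(gt_pts)   # for each GT point, closest predicted point
    return d_pred_to_gt.mean() + d_gt_to_pred.mean()

# Toy usage with random points standing in for sampled mesh surfaces.
pred = np.random.rand(1000, 3)
gt = np.random.rand(1000, 3)
print('CD:', chamfer_distance(pred, gt))
```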
To evaluate BCC (Body-Cloth Correspondence) on MSCOCO, run
cd ${ROOT}/main
python test.py --gpu 0 --test_epoch 7 --type bcc
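BCC (Body-Cloth Correspondence) is reported as a correspondence score on MSCOCO. The sketch below only illustrates a generic per-pixel agreement ratio between a predicted cloth-label map and a ground-truth segmentation, assuming both are integer label maps of the same size with an ignore value; it is not the repository's exact BCC protocol.

```python
# Generic per-pixel agreement ratio between two label maps (an assumption-laden
# sketch of a correspondence-style score, not the repo's BCC implementation).
import numpy as np

def label_agreement(pred_labels, gt_labels, ignore_label=255):
    """Fraction of valid pixels whose predicted cloth label matches the ground truth."""
    valid = gt_labels != ignore_label                    # assumed ignore value
    return float((pred_labels[valid] == gt_labels[valid]).mean())

# Toy usage with random label maps standing in for rendered / annotated cloth labels.
pred = np.random.randint(0, 5, size=(256, 256))
gt = np.random.randint(0, 5, size=(256, 256))
print('agreement:', label_agreement(pred, gt))
```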
You can download the checkpoint trained on MSCOCO+DeepFashion2 from here.
Refer to the paper's main manuscript and supplementary material for diverse qualitative results!
@InProceedings{Moon_2022_ECCV_ClothWild,
author = {Moon, Gyeongsik and Nam, Hyeongjin and Shiratori, Takaaki and Lee, Kyoung Mu},
title = {3D Clothed Human Reconstruction in the Wild},
booktitle = {European Conference on Computer Vision (ECCV)},
year = {2022}
}