This module contains scripts to fit SMPLH or SMPLH+D models to 3D scans, or to point clouds captured by Kinect sensors. It covers the following:
- Organizing SMPLH model files
- Fitting SMPLH to scans
- Fitting SMPLH+D to scans
- Fitting SMPLH to point clouds
- Fitting SMPLH+D to point clouds using IP-Net
SMPLH model file structure:
|model root
|--grab # folder containing hand priors computed from the GRAB dataset
|----lh_prior.pkl
|----rh_prior.pkl
|--regressors # folder for body, face, and hand regressors
|--SMPLH_female.pkl # SMPLH female model blending weights
|--SMPLH_male.pkl
|--SMPLH_neutral.pkl
|--template # folder for the template mesh files
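You can sanity-check this layout with a short script. The snippet below is a minimal sketch, not part of this repo; `check_model_root` is a hypothetical helper that only verifies that the files and folders listed above exist.

```python
from pathlib import Path

# Files and folders expected under the SMPLH model root (see the layout above).
EXPECTED = [
    "grab/lh_prior.pkl",
    "grab/rh_prior.pkl",
    "regressors",
    "SMPLH_female.pkl",
    "SMPLH_male.pkl",
    "SMPLH_neutral.pkl",
    "template",
]

def check_model_root(model_root):
    """Hypothetical helper: report any missing entries under the model root."""
    root = Path(model_root)
    missing = [rel for rel in EXPECTED if not (root / rel).exists()]
    if missing:
        print("Missing from", root, ":", ", ".join(missing))
    else:
        print("SMPLH model root looks complete:", root)

if __name__ == "__main__":
    check_model_root("assets/smplh")  # example path, replace with your model root
```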
With the model files ready, you can run fitting with:
python smpl_registration/fit_SMPLH.py [scan_path] [pose_file] [save_path]
[-gender male/female/neutral] [-mr root path to SMPLH model]
Fitting SMPLH+D builds on the SMPLH fitting, so the command is very similar; in addition, you can provide existing SMPLH parameters as input.
python smpl_registration/fit_SMPLH+D.py [scan_path] [pose_file] [save_path]
[-smpl_pkl existing SMPLH parameters]
[-gender male/female/neutral] [-mr root path to SMPLH model]
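As a concrete example of this two-step workflow, the sketch below first fits SMPLH and then refines it with SMPLH+D, reusing the saved SMPLH parameters via -smpl_pkl. All file paths are placeholders, and the name of the saved parameter file depends on your [save_path] and the script's output naming.

```python
import subprocess

# Placeholder paths -- replace with your scan, pose file, output folder, and model root.
scan, pose, out, model_root = "data/scan.obj", "data/scan_pose.json", "output/", "assets/smplh"

# Step 1: fit SMPLH to the scan.
subprocess.run([
    "python", "smpl_registration/fit_SMPLH.py", scan, pose, out,
    "-gender", "male", "-mr", model_root,
], check=True)

# Step 2: refine with SMPLH+D, reusing the SMPLH parameters saved in step 1.
# The parameter file name below is a placeholder; check your [save_path] for the actual name.
subprocess.run([
    "python", "smpl_registration/fit_SMPLH+D.py", scan, pose, out,
    "-smpl_pkl", "output/scan_smplh.pkl",
    "-gender", "male", "-mr", model_root,
], check=True)
```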
The fitting procedure is very similar to scan fitting. However, Kinect point clouds are noisy and incomplete, and the body poses captured by Kinects are much more diverse than those in scans, so we recommend providing a 3D pose estimate to initialize the SMPL model. Such initial pose estimates can be obtained from monocular pose estimation methods, for example FrankMocap.
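Because of the sensor noise, it can also help to clean the raw point cloud before fitting. This is optional and not part of the fitting script; the sketch below uses Open3D (an assumed extra dependency) with example parameter values.

```python
import open3d as o3d

# Load a raw Kinect point cloud (example path).
pcd = o3d.io.read_point_cloud("data/kinect_frame.ply")

# Downsample to a manageable density; 5 mm voxels is just an example value.
pcd = pcd.voxel_down_sample(voxel_size=0.005)

# Remove isolated outlier points typical of depth-sensor noise.
pcd, _ = pcd.remove_statistical_outlier(nb_neighbors=20, std_ratio=2.0)

# Normals are useful if the fitting objective uses point-to-plane terms.
pcd.estimate_normals(
    search_param=o3d.geometry.KDTreeSearchParamHybrid(radius=0.02, max_nn=30)
)

o3d.io.write_point_cloud("data/kinect_frame_clean.ply", pcd)
```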
Run fitting:
python smpl_registration/fit_SMPLH_pcloud.py [pc_path] [j3d_file] [pose_init] [save_path]
[-gender male/female/neutral] [-mr root path to SMPLH model]
This fitting is based on the IP-Net project. You can download the pretrained IP-Net model here. The SMPLH model file structure is the same as above. Run fitting:
python smpl_registration/fit_SMPLH_IPNet.py [pc_path] [checkpoint_path] [save_path]
[-gender male/female/neutral] [-mr root path to SMPLH model]
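Before running the IP-Net fitting, you may want to sanity-check the downloaded checkpoint. The sketch below only confirms that the file loads as a PyTorch checkpoint and lists its top-level keys; the exact contents depend on the IP-Net release, and the path is an example.

```python
import torch

# Example path to the downloaded pretrained IP-Net checkpoint.
ckpt_path = "checkpoints/ipnet_model.pth"

# On recent PyTorch versions (>= 2.6), fully pickled checkpoints may need weights_only=False.
ckpt = torch.load(ckpt_path, map_location="cpu")

if isinstance(ckpt, dict):
    print("Checkpoint keys:", list(ckpt.keys()))
else:
    print("Loaded object of type:", type(ckpt))
```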