The official implementation for our CVPR 2025 paper, Geometry in Style: 3D Stylization via Surface Normal Deformation.
paper | project page | arXiv | use in your projects! | bibtex
Run this on a system with a GPU (i.e. `$ nvidia-smi` works):
```
conda env create -f environment-minimal-geomstyle.yml
```
NOTE: there is a small chance you may need micromamba instead of conda to solve this environment in a reasonable amount of time. Your cluster may not have it installed; follow the instructions here (I prefer the manual installation, putting the `micromamba` executable somewhere in your home directory if you don't have root permissions, such as on a cluster). Afterwards, run conda commands using `micromamba` instead of `conda`.
Optimizations were tested to work on an NVIDIA A40 (48GB) GPU. Your GPU should ideally have capacity to hold both the DeepFloyd/IF-I-XL-v1.0 and DeepFloyd/IF-II-L-v1.0 models, though configuring `--deform_by_csd.guidance.cpu_offload true` may help with lower memory capacities.
Either your CPU or a much less powerful GPU can be used to run `apply_saved_deform_qty_npz.py`, which applies a saved deformation (a `nrmls-*.npz` file) to its associated mesh, with a tunable λ hyperparameter.
No discrete GPU is required or used for playing back `psrec-*.npz` Polyscope recording files of results, as long as the playback machine has a screen (so you can run it on your local machine, but not on a headless system).
To set up HuggingFace Hub and your account for downloading the DeepFloyd stages (instructions from DeepFloyd IF):
- If you do not already have one, create a Hugging Face account
- Accept the license on the model card of DeepFloyd/IF-I-XL-v1.0
- Log in to Hugging Face locally. In the conda environment you just created, install `huggingface_hub`:
  ```
  pip install huggingface_hub --upgrade
  ```
  then run the login function in a Python shell
  ```python
  from huggingface_hub import login
  login()
  ```
  and enter your Hugging Face Hub access token.
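If you need a non-interactive login (e.g. inside a batch job), here is a minimal sketch; the `HF_TOKEN` environment variable name is an assumption of this example, not something the repo requires.
```python
# Non-interactive alternative to the interactive login() prompt above.
# HF_TOKEN is an assumed environment variable name holding your access token.
import os
from huggingface_hub import login

login(token=os.environ["HF_TOKEN"])
```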
`variants/deformations_MINIMAL.py` is a self-contained variant of `deformations_dARAP.py` with minimal dependencies for easy inclusion in your projects (specifically, no dependency on pytorch3d; the only dependencies are numpy, scipy, torch, cholespy, and libigl). See the docstring at the top of the file for info and usage.
Usage in a nutshell (check `deformations_MINIMAL.py` for more information):
```python
# supports batching. for example, operating on a batch of two meshes
# (mesh1, mesh2), not necessarily of the same vertex and face counts
(verts1, faces1) = mesh1
(verts2, faces2) = mesh2
verts_list = [verts1, verts2]
faces_list = [faces1, faces2]

# prepare the solver
solvers = SparseLaplaciansSolvers.from_meshes(
    verts_list, faces_list,
    pin_first_vertex=True,
    compute_poisson_rhs_lefts=False,
    compute_igl_arap_rhs_lefts=None,
)

# find per-vertex matrices with the local step
procrustes_precompute = ProcrustesPrecompute.from_meshes(
    local_step_procrustes_lambda=lamb,
    arap_energy_type="spokes_and_rims_mine",
    laplacians_solvers=solvers,
    verts_list=verts_list,
    faces_list=faces_list,
)
per_vertex_3x3matrices_packed = calc_rot_matrices_with_procrustes(
    procrustes_precompute,
    torch.cat(verts_list),
    torch.cat(target_normals_list),
)
# (assuming target_normals_list is a list of tensors each (n_verts, 3);
# each tensor is the target per-vertex normals of a single mesh in the batch)
per_vertex_3x3matrices_list = torch.split(
    per_vertex_3x3matrices_packed, [len(v) for v in verts_list]
)

# solve for deformations given per-vertex 3x3 matrices
deformed_verts_list = calc_ARAP_global_solve(
    verts_list, faces_list, solvers, per_vertex_3x3matrices_list,
    arap_energy_type="spokes_and_rims_mine",
    postprocess="recenter_only",
)
```

- Read and edit the variables at the top of `prep_sds_run_configs.sh` (up until the line `# END VARIABLES FOR CONFIG`) for this batch of runs.
- Run `bash prep_sds_run_configs.sh`. This will open an interactive prompt where you will enter the prompt and shortname (a no-space nickname to describe the prompt-and-shape pair in the filename) for each mesh in the dataset.
- Once you've entered prompts and shortnames for every mesh in the set, you will have a bunch of config JSON files as reported by the printout; each JSON file is the configuration for one run.
- Use your preferred sbatch/job scheduling method to run each of these configs with `python deform_with_csd_dARAP.py -c CONFIGFILE`, where `CONFIGFILE` is the JSON config filename for the run (a minimal launcher sketch follows this list).
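If you don't have a job scheduler handy, here is a minimal sequential-launcher sketch; the `confg-*.json` glob pattern is an assumption, so adjust it to match the config filenames that `prep_sds_run_configs.sh` reported.
```python
# Illustrative sequential launcher, not part of the repo: runs
# `python deform_with_csd_dARAP.py -c CONFIGFILE` for each generated config.
import glob
import subprocess

for config_path in sorted(glob.glob("confg-*.json")):  # assumed naming pattern
    print(f"running {config_path}")
    subprocess.run(
        ["python", "deform_with_csd_dARAP.py", "-c", config_path],
        check=True,
    )
```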
- Read `confg-base.json`, the current latest base config.
- Edit fields with your prompts, paths to your meshes, and run descriptions in filenames.
- Polyscope recording and result save filenames must have a valid parent directory (the code will not automatically `mkdir` any parent directory structure to fulfill requested save paths!)
- To run, just do `python deform_with_csd_dARAP.py -c confg-base.json` (or whatever your config filename is).
- On the cluster (where Polyscope init is not available), export the environment variable `NO_POLYSCOPE=1` before running.
Overriding fields on the command line is supported. For instance, you can do `python deform_with_csd_dARAP.py -c CONFIGFILE --deform_by_csd.n_iters 1000` to override the `deform_by_csd.n_iters` field to be 1000 instead of the value in `CONFIGFILE`.
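As an illustration of what a dotted override key means (this is not the repo's actual argument parser, just a sketch of the idea): the key names a nested field of the config JSON, so `--deform_by_csd.n_iters 1000` replaces `config["deform_by_csd"]["n_iters"]`.
```python
# Sketch only: how a dotted override key addresses a nested config field.
# The dict below uses placeholder values, not the shipped defaults.
def apply_override(config: dict, dotted_key: str, value) -> None:
    *parents, leaf = dotted_key.split(".")
    node = config
    for key in parents:
        node = node[key]   # descend into nested dicts
    node[leaf] = value     # overwrite the leaf field

config = {"deform_by_csd": {"n_iters": 12345, "guidance": {"cpu_offload": False}}}
apply_override(config, "deform_by_csd.n_iters", 1000)
assert config["deform_by_csd"]["n_iters"] == 1000
```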
(replace `/usr/local/cuda-12.1` with your system's installation of the CUDA dev toolkit if it lives elsewhere)
```
NO_POLYSCOPE=1 CUDA_HOME=/usr/local/cuda-12.1/ python deform_with_csd_dARAP.py -c example-run/confg-example-cuteanimalthemedchair.json
```
- After this is done, you should get the recording file `example-run/psrec-deform-csdv3-example-cuteanimalthemedchair.npz` and a saved deformation quantity `example-run/nrmls-deform-csdv3-example-cuteanimalthemedchair.npz`.
- You can extract the final mesh out of the psrec recording using `python scratch_extract_final_mesh_from_psrec.py example-run/psrec-deform-csdv3-example-cuteanimalthemedchair.npz` (that should save `example-run/reslt-deform-csdv3-example-cuteanimalthemedchair.obj`; you can compare that result with `example-run/reslt-expected-cuteanimalthemedchair.obj`. We don't fix the seed, so there may be minor aesthetic differences, but the result (namely the ears) should be mostly the same.)
- You can also play back the Polyscope recording (on a system with a display, not a headless server) with `python thlog.py replay psrec-deform-csdv3-example-cuteanimalthemedchair.npz`. (This will show the usual Polyscope window but with a "Playback controls" window. Step through the show-frames of the recording by clicking the button on the "Playback controls" window that shows up.)
- To apply the saved deformation (an optimized set of normals, saved alongside the original source mesh), such as the provided `nrmls-expected-cuteanimalthemedchair.npz` or the `nrmls-deform-csdv3-example-cuteanimalthemedchair.npz` file that is saved after the example optimization run, simply do `python apply_saved_deform_qty_npz.py example-run/nrmls-expected-cuteanimalthemedchair.npz` (a sketch of doing the same with the MINIMAL API follows this list).
- You can also add `--cpu true` to run on a local, CPU-only machine; you can add `--lamb x` to set a different lambda hyperparameter (deformation strength) with which to apply this saved deformation (higher is stronger). The saved lambda used during optimization is 8.0; try setting it to 5 or 10.
- Add `NO_POLYSCOPE=1` to the start of the command to run this on a headless system; you might also want to specify the argument `--save_fname YOUR_RESULT_FNAME.obj` to save the resulting mesh to a file.
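If you'd rather apply a saved deformation from your own code via `deformations_MINIMAL.py` (assuming you've copied `variants/deformations_MINIMAL.py` somewhere importable), a rough sketch follows; the npz key names (`verts`, `faces`, `target_normals`) are hypothetical, so check `apply_saved_deform_qty_npz.py` for the actual keys and loading logic.
```python
# Rough sketch (hypothetical npz keys) of applying a saved deformation quantity
# with the MINIMAL API described in the usage snippet above.
import numpy as np
import torch
from deformations_MINIMAL import (
    SparseLaplaciansSolvers, ProcrustesPrecompute,
    calc_rot_matrices_with_procrustes, calc_ARAP_global_solve,
)

data = np.load("example-run/nrmls-expected-cuteanimalthemedchair.npz")
verts = torch.as_tensor(data["verts"], dtype=torch.float32)                    # hypothetical key
faces = torch.as_tensor(data["faces"], dtype=torch.int64)                      # hypothetical key
target_normals = torch.as_tensor(data["target_normals"], dtype=torch.float32)  # hypothetical key

solvers = SparseLaplaciansSolvers.from_meshes(
    [verts], [faces], pin_first_vertex=True,
    compute_poisson_rhs_lefts=False, compute_igl_arap_rhs_lefts=None,
)
precompute = ProcrustesPrecompute.from_meshes(
    local_step_procrustes_lambda=8.0,  # the lambda used during optimization; try 5 or 10
    arap_energy_type="spokes_and_rims_mine",
    laplacians_solvers=solvers, verts_list=[verts], faces_list=[faces],
)
rot_matrices = calc_rot_matrices_with_procrustes(precompute, verts, target_normals)
deformed_verts_list = calc_ARAP_global_solve(
    [verts], [faces], solvers, [rot_matrices],
    arap_energy_type="spokes_and_rims_mine", postprocess="recenter_only",
)
deformed_verts = deformed_verts_list[0]
```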
- We use pytorch3d's cotangent laplacian function, which happens to use `cot a + cot b` weights rather than `0.5 * (cot a + cot b)` like libigl's `cotmatrix` function. This had no effect on our implementation of the ARAP global solve right-hand side, since the same weights are in the matrix and the right-hand-side construction; but when using the prefactored IGL `arap_rhs` right-hand-side constructor, a `2 *` correction is needed on the resulting `rhs`, assuming no rescaling back to the source shape's bounding box diagonal extent.
- In practice, since we rescale the solved deformed shape to keep the same bounding box diagonal length as the source shape's bounding box diagonal length, this doesn't matter.
- This does, however, affect the scale of the `lambda` hyperparameter for the Procrustes local step. The `lambda` values we report and use are with respect to these `cot a + cot b` laplacian weights, not the `igl.cotmatrix` weights, which would require a 0.5x adjustment to the lambda hyperparameter.
- To keep parity with the pytorch3d cot laplacian that we've been using, the `deformations_MINIMAL.py` file (with no pytorch3d dependencies, for ease of use in other projects) will compute the laplacian with `-2 * igl.cotmatrix(v, f)`. (The `-` is also because igl's `cotmatrix` follows the negative-definite convention, but we need the positive-definite matrix for the solve; a quick sanity-check sketch follows this notes list.)
- In the paper we use a fixed FOV of 60 degrees for all runs, but a FOV range of (55, 80) is also good and can lessen any "global shearing" effect of the deformation in some cases.
- (Mentioned in the paper, but noted here for convenience) Note that for human shapes (and tall/slim shapes like candles), different view settings are preferred over the defaults: for human shapes, try `"elev_minmax": [0, 30]` and `"dist_minmax": [1.4, 2.6]`. These are mentioned in `confg-base.json` but commented out.
- In the `cholespy`, `pytorch3d`, and `igl` folders are type stubs (`.pyi` files) containing type signatures for the functions from those libraries that we use (for a better experience with static type checkers). Feel free to use them in your own projects involving these libraries if you use a static type checker.
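As mentioned in the cotangent Laplacian note above, here is a quick sanity-check sketch of the `-2 * igl.cotmatrix(v, f)` convention; the mesh path is a placeholder.
```python
# Sketch: the positive-(semi)definite Laplacian used by deformations_MINIMAL.py
# equals -2 * igl.cotmatrix(v, f), i.e. cot a + cot b off-diagonal weights
# (pytorch3d's convention) rather than igl's 0.5 * (cot a + cot b).
import igl
import numpy as np

v, f = igl.read_triangle_mesh("your_mesh.obj")  # placeholder path
L_igl = igl.cotmatrix(v, f)   # negative semi-definite, 0.5 * (cot a + cot b) weights
L = -2.0 * L_igl              # positive semi-definite, cot a + cot b weights

# every row of a cotangent Laplacian sums to (approximately) zero
assert np.allclose(np.asarray(L.sum(axis=1)).ravel(), 0.0, atol=1e-6)
```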
```
@InProceedings{Dinh_2025_CVPR,
    author    = {Dinh, Nam Anh and Lang, Itai and Kim, Hyunwoo and Stein, Oded and Hanocka, Rana},
    title     = {Geometry in Style: 3D Stylization via Surface Normal Deformation},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2025},
    pages     = {28456-28467}
}
```