High-fidelity Voxel Reconstruction via Neural Architecture Search and Hierarchical Implicit Representation
This repository contains the implementation of the paper (under review):
High-fidelity Voxel Reconstruction via Neural Architecture Search and Hierarchical Implicit Representation.
Authors: Yulong Wang, Yongdong Huang, Yujie Lu, Nayu Ding, Siyu Zhang, Xianan Xu, Shen Cai*, Ting Lu.
We propose a novel neural architecture search (NAS) based hierarchical voxel reconstruction technique. Leveraging NAS, our method searches for a tailored multi-layer perceptron (MLP) network that accurately predicts the binary occupancy probability of each voxel, enabling efficient end-to-end reconstruction of individual voxel models.
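For orientation only, the sketch below shows the kind of coordinate-based MLP occupancy classifier that such a search space contains: a network mapping a 3D voxel-center coordinate to a binary occupancy probability. The layer widths and depth here are placeholder assumptions, not the architecture found by the search.

```python
import torch
import torch.nn as nn

class OccupancyMLP(nn.Module):
    """Illustrative MLP: maps a 3D coordinate to a binary occupancy logit."""
    def __init__(self, hidden_dims=(128, 128, 128)):  # widths are placeholders
        super().__init__()
        layers, in_dim = [], 3
        for h in hidden_dims:
            layers += [nn.Linear(in_dim, h), nn.ReLU(inplace=True)]
            in_dim = h
        layers.append(nn.Linear(in_dim, 1))  # one occupancy logit per point
        self.net = nn.Sequential(*layers)

    def forward(self, xyz):
        # xyz: (N, 3) voxel-center coordinates, normalized to [-1, 1]
        return self.net(xyz).squeeze(-1)

model = OccupancyMLP()
coords = torch.rand(8, 3) * 2 - 1                 # example query points
probs = torch.sigmoid(model(coords))              # occupancy probabilities in [0, 1]
```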
The initial conference version of this paper (Huang et al., 2022) [code], presented as an oral presentation at ICPR 2022, was limited to a single-stage voxel reconstruction process exclusively for watertight objects. This journal version introduces several enhancements that enable high-fidelity reconstruction of a broad range of models, including non-watertight ones.
| Bird Cage | T-shirt | Room1 |
|---|---|---|
| ![]() | ![]() | ![]() |

| Ship | Pants | Room2 |
|---|---|---|
| ![]() | ![]() | ![]() |
Additional voxel reconstruction results are presented in the paper.
Set up the conda environment:

```
conda env create -f environment.yml
```

Install the dependencies:
```
cd dependencies/libdualVoxel
pip install .
```

The mesh models are loaded by trimesh, and the voxelized data are output in .npz format; a rough illustration of this step is sketched below.
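The actual preprocessing is handled by scripts/prepare_dataset.py; the following is only a minimal sketch of what such a step might look like with trimesh. The mesh path, output path, and the "voxel" array key are assumptions and may not match the repository's real file layout.

```python
import numpy as np
import trimesh

# Load a mesh and voxelize it over its bounding box (target resolution: 256 voxels
# along the longest axis). The input path is a placeholder.
mesh = trimesh.load("data/thingi32/441708.stl", force="mesh")
pitch = mesh.extents.max() / 256              # edge length of one voxel
vox = mesh.voxelized(pitch=pitch).fill()      # fill() marks interior voxels as occupied
occupancy = vox.matrix.astype(np.uint8)       # dense binary occupancy grid

# Save in .npz format; the key name "voxel" is an assumption, not the repo's layout.
np.savez_compressed("441708.npz", voxel=occupancy)
```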
Prepare the dataset:

```
python scripts/prepare_dataset.py --mesh_dir data/thingi32 --voxel_out data/thingi32_voxel --name 441708 --resolution 256
```

Train the network:

```
python scripts/train.py --voxel data/thingi32_voxel/256/441708.npz --exp_name logs/441708
```

We put the pre-trained network in ./logs.
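To make the training objective concrete, here is a minimal, self-contained PyTorch sketch that fits a plain MLP to a binary occupancy grid with a binary cross-entropy loss. It is an illustration of the idea rather than scripts/train.py: the .npz key, grid resolution, network size, and hyperparameters are all assumptions.

```python
import numpy as np
import torch
import torch.nn.functional as F

# Load a dense occupancy grid; the array key is unknown, so take the first one.
data = np.load("data/thingi32_voxel/256/441708.npz")
grid = torch.from_numpy(data[data.files[0]].astype(np.float32))  # assumed shape (256, 256, 256)
res = grid.shape[0]

# Plain MLP stand-in for the searched architecture.
model = torch.nn.Sequential(
    torch.nn.Linear(3, 128), torch.nn.ReLU(),
    torch.nn.Linear(128, 128), torch.nn.ReLU(),
    torch.nn.Linear(128, 1),
)
optim = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(1000):
    idx = torch.randint(0, res, (4096, 3))              # random voxel indices
    coords = (idx.float() + 0.5) / res * 2.0 - 1.0      # voxel centers mapped to [-1, 1]
    labels = grid[idx[:, 0], idx[:, 1], idx[:, 2]]      # 0/1 occupancy targets
    logits = model(coords).squeeze(-1)
    loss = F.binary_cross_entropy_with_logits(logits, labels)
    optim.zero_grad()
    loss.backward()
    optim.step()
```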
This project is licensed under the terms of the GPL-3.0 License (see LICENSE for details).







