
Commit

Open source from Alibaba.
robertsheng committed May 14, 2021
1 parent 2d421ba commit 102e1d9
Showing 36 changed files with 75,949 additions and 0 deletions.
93 changes: 93 additions & 0 deletions README.md


# UniFuse (RAL+ICRA2021)

Official source code of the paper **UniFuse: Unidirectional Fusion for 360° Panorama Depth Estimation** ([arXiv](https://arxiv.org/abs/2102.03550), [Demo](https://youtu.be/9vm9OMksvrc)).



# Preparation

#### Installation

Environment:

* Python 3.6
* PyTorch >= 1.0.0
* CUDA >= 9.0


Install the requirements:

```bash
pip install -r requirements.txt
```
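One possible way to set this up with conda, as a sketch rather than a requirement of the repository (any environment meeting the versions above works):

```bash
# Hypothetical setup; the repository only requires the versions listed above.
conda create -n unifuse python=3.6 -y
conda activate unifuse
# Install a PyTorch build matching your CUDA version if requirements.txt does not pin one.
pip install -r requirements.txt
```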

#### Datasets

Please download the datasets you need: [Matterport3D](https://niessner.github.io/Matterport/), [Stanford2D3D](http://3dsemantics.stanford.edu/), [3D60](https://vcl3d.github.io/3D60/), and [PanoSUNCG](https://fuenwang.ml/project/360-depth/). For Matterport3D, preprocess it following [M3D-README.md](UniFuse/Matterport3D/README.md).
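The training and evaluation commands below refer to the dataset root as `$DATA_PATH`; a minimal sketch with a hypothetical local path:

```bash
# Hypothetical location; point this at wherever the (preprocessed) dataset lives.
export DATA_PATH=$HOME/data/Matterport3D
```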



# Training

#### UniFuse on Matterport3D

```bash
python train.py --data_path $DATA_PATH \
                --dataset matterport3d \
                --model_name Matterport3D_UniFuse \
                --batch_size 6 \
                --num_epochs 100 \
                --height 512 \
                --width 1024 \
                --imagenet_pretrained \
                --net UniFuse
```

#### Equirectangular baseline on Matterport3D

```bash
python train.py --data_path $DATA_PATH \
                --dataset matterport3d \
                --model_name Matterport3D_Equi \
                --batch_size 6 \
                --num_epochs 100 \
                --height 512 \
                --width 1024 \
                --imagenet_pretrained \
                --net Equi
```

Training on the other datasets follows the same pattern; a hedged example for Stanford2D3D is sketched below.
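This sketch only follows the pattern of the commands above: the `--dataset` value, the model name, and the resolution for Stanford2D3D are assumptions, so check the dataset loader and the paper for the exact settings before a real run.

```bash
# Assumed flag values for Stanford2D3D; verify the --dataset name and the
# resolution against the dataset loader before launching a real run.
python train.py --data_path $DATA_PATH \
                --dataset stanford2d3d \
                --model_name Stanford2D3D_UniFuse \
                --batch_size 6 \
                --num_epochs 100 \
                --height 512 \
                --width 1024 \
                --imagenet_pretrained \
                --net UniFuse
```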


# Evaluation

#### Pre-trained models

Pre-trained UniFuse models are available for all four datasets: [Matterport3D](PretrainedModels/Matterport3D_UniFuse_cee_se_b), [Stanford2D3D](PretrainedModels/Stanford2D3D_UniFuse_cee_se_b), [3D60](PretrainedModels/3D60_UniFuse_cee_se), and [PanoSUNCG](PretrainedModels/PanoSunCG_UniFuse_cee_se).

#### Evaluate a pre-trained model

```bash
python evaluate.py --data_path $DATA_PATH --dataset matterport3d --load_weights_folder $MODEL_PATH
```
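For instance, to evaluate the released Matterport3D model listed above (a sketch with placeholder paths; `$DATA_PATH` is the preprocessed Matterport3D root and the weights folder is the one shipped in `PretrainedModels/`):

```bash
# Placeholder paths: adjust $DATA_PATH and the repository location as needed.
python evaluate.py --data_path $DATA_PATH \
                   --dataset matterport3d \
                   --load_weights_folder PretrainedModels/Matterport3D_UniFuse_cee_se_b
```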



## Citation

Please cite our paper if you find our work useful in your research.

```
@article{jiang2021unifuse,
  title={UniFuse: Unidirectional Fusion for 360$^{\circ}$ Panorama Depth Estimation},
  author={Hualie Jiang and Zhe Sheng and Siyu Zhu and Zilong Dong and Rui Huang},
  journal={IEEE Robotics and Automation Letters},
  year={2021},
  publisher={IEEE}
}
```

24 changes: 24 additions & 0 deletions UniFuse/Matterport3D/README.md
# Matterport3D Preprocessing for 360° Depth Estimation



# Steps

* Download the Matterport3D dataset

* Copy *extract.sh* to the Matterport3D folder

* Open a terminal in the Matterport3D folder and execute:

```bash
chmod u+x extract.sh && ./extract.sh
```

* Download [PanoBasic](https://github.com/yindaz/PanoBasic)

* Copy *stitching_Matterport3D.m* to PanoBasic

* Modify the directories in *stitching_Matterport3D.m*: set **source_dir** to the Matterport3D folder and **target_dir** to the output folder for the stitched panorama images and depth maps.

* Run *stitching_Matterport3D.m* in MATLAB; a command-line invocation is sketched below.
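One assumed way to run the stitching step from a terminal rather than the MATLAB desktop (requires MATLAB on the PATH; run it inside the PanoBasic folder so that both the copied script and the PanoBasic helpers are found):

```bash
# Assumed command-line invocation; run inside the PanoBasic folder after
# copying and editing stitching_Matterport3D.m as described above.
matlab -nodisplay -nosplash -r "stitching_Matterport3D; exit"
```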

21 changes: 21 additions & 0 deletions UniFuse/Matterport3D/extract.sh
#!/bin/bash
# Extract the per-scan Matterport3D zip archives.
# Run from the directory that contains one sub-folder per scan, each holding
# the downloaded *.zip files.

echo "Extracting"

for entry in */; do                  # one folder per scan
    echo "$entry"
    cd "$entry" || continue

    # Unzip every archive of this scan; each archive contains a nested folder
    # named after the scan.
    for file in *.zip; do
        echo "extracting $file"
        unzip "$file"
    done

    # Flatten the nested scan folder created by unzip, then remove it.
    cd "$entry"
    mv * ../
    cd ..
    rm -r "$entry"

    cd ..
done
20 changes: 20 additions & 0 deletions UniFuse/Matterport3D/stitching_Matterport3D.m
%% Stitch Matterport3D panorama images and depth maps
% Set source_dir to the folder of extracted Matterport3D scans and target_dir
% to the output folder for the stitched panoramas and depth maps (see README.md).
clear; close all;

source_dir = '/home/wolian/disk2/Matterport';   % folder of extracted scans
target_dir = '/home/wolian/disk1/Matterport3D'; % output folder

% List the scan folders, dropping the '.' and '..' entries.
listing = dir(source_dir);
listing(ismember({listing.name}, {'.', '..'})) = [];

for i = 1:length(listing)

    % Skip scans that already have an output folder.
    if ~exist(strcat(target_dir, '/', listing(i).name), 'dir')
        mkdir(strcat(target_dir, '/', listing(i).name))
    else
        continue
    end

    % Stitch this scan into a panorama image and depth map.
    stitch(strcat(source_dir, '/', listing(i).name), strcat(target_dir, '/', listing(i).name), listing(i).name);

end
93 changes: 93 additions & 0 deletions UniFuse/README.md
