Semi-Supervised Segmentation of Salt Bodies in Seismic Images using an Ensemble of Convolutional Neural Networks
German Conference on Pattern Recognition (GCPR), 2019
Yauhen Babakhin, Artsiom Sanakoyeu, Hirotoshi Kitamura
https://arxiv.org/abs/1904.04445
Kaggle post about the solution: link.
The solution is available as a Docker container. The following dependencies should be installed on the host:
- Python 3.5.2
- CUDA 9.0
- cuDNN 7
- NVIDIA drivers v384
- Docker
- nvidia-docker
Download and unzip the competition data into the data/ directory.
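For example, the data can be fetched with the Kaggle CLI. This is only a sketch: the competition slug `tgs-salt-identification-challenge` and a configured Kaggle API token are assumptions, not part of this repository.

```shell
# Sketch: fetch and unpack the competition data into data/.
# The competition slug is assumed; requires Kaggle API credentials in ~/.kaggle.
mkdir -p data
if command -v kaggle >/dev/null 2>&1; then
  kaggle competitions download -c tgs-salt-identification-challenge -p data
  unzip -o data/tgs-salt-identification-challenge.zip -d data
else
  echo "kaggle CLI not found; download and unzip the archive into data/ manually"
fi
```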
A local path to new test images can be specified in the NEW_TEST_IMAGES_DATA field of the SETTINGS.json file. The competition test data is used by default.
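A minimal SETTINGS.json fragment could look as follows; only the NEW_TEST_IMAGES_DATA field is taken from this README, the path value is a placeholder, and any other fields of the real file are omitted here:

```json
{
  "NEW_TEST_IMAGES_DATA": "/path/to/new/test/images/"
}
```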
To get the weights of the final-stage models, download them from Google Drive and unzip them into the corresponding bes/weights/ and phalanx/weights/ directories.
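Unpacking could be sketched as below; the archive file names are hypothetical placeholders, since the actual names on Google Drive are not stated in this README.

```shell
# Sketch: unzip the downloaded weight archives into the expected directories.
# The archive names below are hypothetical placeholders.
mkdir -p bes/weights phalanx/weights
for pair in "bes_weights.zip:bes/weights" "phalanx_weights.zip:phalanx/weights"; do
  archive=${pair%%:*}
  dest=${pair##*:}
  if [ -f "$archive" ]; then
    unzip -o "$archive" -d "$dest"
  else
    echo "missing $archive; download it from the Google Drive link first"
  fi
done
```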
To build and start a Docker container, run:
cd docker
./build.sh
./run.sh
There are two ways to reproduce the results:
- Train models from scratch (./train.sh):
  a) trains all models from scratch
  b) expect this to run for about 16 days on a single GTX1080Ti
- Make predictions (./predict.sh):
  a) uses the weights of the final-stage models to make predictions
  b) expect this to run for about 3.5 hours for 18,000 test images on a single GTX1080Ti
- Model weights are saved in bes/weights and phalanx/weights for the b.e.s. and phalanx models, respectively.
- Individual model predictions before ensembling are stored in bes/predictions (lots of .png images) and phalanx/predictions (.npy files).
- Scripts to generate the initial folds and jigsaw mosaics are located in bes/datasets: generate_folds.py and Generate_Mosaic.R.
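As a quick sanity check after prediction, the per-model output files can be counted. This is a sketch based only on the directory layout and file types listed above:

```shell
# Count per-model prediction files; errors are suppressed so the check
# also runs cleanly before any predictions exist.
png_count=$(find bes/predictions -name '*.png' 2>/dev/null | wc -l)
npy_count=$(find phalanx/predictions -name '*.npy' 2>/dev/null | wc -l)
echo "bes: $png_count png masks, phalanx: $npy_count npy arrays"
```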
If you find this code useful, please cite our paper:
@inproceedings{tgsSaltBodiesSegmentation2019,
  title={Semi-Supervised Segmentation of Salt Bodies in Seismic Images using an Ensemble of Convolutional Neural Networks},
  author={Babakhin, Yauhen and Sanakoyeu, Artsiom and Kitamura, Hirotoshi},
  booktitle={German Conference on Pattern Recognition (GCPR)},
  year={2019}
}