3D U-Net FP32 inference

Description

This document has instructions for running 3D U-Net FP32 inference using Intel-optimized TensorFlow.

Datasets

The following instructions are based on the BraTS2018 dataset preprocessing steps in the 3D U-Net repository.

  1. Download the BraTS2018 dataset. Follow the steps to register and request the training and validation data of the BraTS 2018 challenge.

  2. Create a virtual environment and install the dependencies:

    # create a python3.6 based venv
    virtualenv --python=python3.6 brats18_env
    . brats18_env/bin/activate
    
    # install dependencies
    pip install intel-tensorflow==1.15.2
    pip install SimpleITK==1.2.0
    pip install keras==2.2.4
    pip install nilearn==0.6.2
    pip install tables==3.4.4
    pip install nibabel==2.3.3
    pip install nipype==1.7.0
    pip install numpy==1.16.3
    

    Install ANTs N4BiasFieldCorrection and add the location of the ANTs binaries to the PATH environment variable:

    wget https://github.com/ANTsX/ANTs/releases/download/v2.1.0/Linux_Debian_jessie_x64.tar.bz2
    tar xvjf Linux_Debian_jessie_x64.tar.bz2
    cd debian_jessie
    export PATH=${PATH}:$(pwd)
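
    # optional sanity check (not in the original steps): confirm that the
    # N4BiasFieldCorrection binary is now resolvable from PATH
    command -v N4BiasFieldCorrection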
    
  3. Clone the 3D U-Net repository and run the dataset preprocessing:

    git clone https://github.com/ellisdg/3DUnetCNN.git
    cd 3DUnetCNN
    git checkout update_to_brats18
    
    # add the repository directory to the PYTHONPATH system variable
    export PYTHONPATH=${PWD}:$PYTHONPATH
    

    After downloading the dataset file MICCAI_BraTS_2018_Data_Training.zip (from step 1), extract its contents into the brats/data/original directory:

    # extract the dataset
    mkdir -p brats/data/original && cd brats
    unzip MICCAI_BraTS_2018_Data_Training.zip -d data/original
    
    # import the conversion function and run the preprocessing:
    python
    >>> from preprocess import convert_brats_data
    >>> convert_brats_data("data/original", "data/preprocessed")
    >>> exit()
    
    # run training with the original U-Net model so that `validation_ids.pkl` is created in the `brats` directory
    python train.py 
    

After training finishes, set the DATASET_DIR environment variable to the path that contains the preprocessed dataset and the validation_ids.pkl file.

export DATASET_DIR=/home/<user>/3DUnetCNN/brats
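
A quick sanity check (optional, assuming the directory layout from the preprocessing steps above) that the expected files are in place:

# the validation ID file and the preprocessed data should both exist
ls ${DATASET_DIR}/validation_ids.pkl
ls ${DATASET_DIR}/data/preprocessed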

Quick Start Scripts

Script name         Description
fp32_inference.sh   Runs inference with a batch size of 1 using the BraTS dataset and a pretrained model

Run the model

Set up your environment using the instructions below, depending on whether you are using AI Kit:

Setup using AI Kit

AI Kit does not currently support TF 1.15.2 models.

Setup without AI Kit

To run without AI Kit you will need the following (a sample environment setup sketch follows the list):

  • Python 3.6 or 3.7
  • intel-tensorflow==1.15.2
  • numactl
  • Keras==2.6.0rc3
  • numpy==1.16.3
  • nilearn==0.6.2
  • tables==3.4.4
  • nibabel==2.3.3
  • SimpleITK==1.2.0
  • h5py==2.10.0
  • A clone of the Model Zoo repo
    git clone https://github.com/IntelAI/models.git
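
One way to satisfy the list above (a sketch rather than official instructions; the virtualenv name is arbitrary and the version pins are copied verbatim from the list):

# create and activate a Python 3.6 based virtual environment
virtualenv --python=python3.6 unet_infer_env
. unet_infer_env/bin/activate

# install the pinned dependencies
pip install intel-tensorflow==1.15.2 Keras==2.6.0rc3 numpy==1.16.3 \
    nilearn==0.6.2 tables==3.4.4 nibabel==2.3.3 SimpleITK==1.2.0 h5py==2.10.0

# clone the Model Zoo repo
git clone https://github.com/IntelAI/models.git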

Download the pre-trained model from the 3DUnetCNN repository. In this example, we are using the "Original U-Net" model, trained using the BRATS 2017 data. Set the PRETRAINED_MODEL env var to the path of the tumor_segmentation_model.h5 file.

wget https://www.dropbox.com/s/m99rqxunx0kmzn7/tumor_segmentation_model.h5
export PRETRAINED_MODEL=$(pwd)/tumor_segmentation_model.h5
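
A quick optional check that the download completed and the variable points at the file:

ls -lh ${PRETRAINED_MODEL}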

After your environment is set up, set the DATASET_DIR environment variable to the dataset path and OUTPUT_DIR to a directory where log files will be written. Ensure that the PRETRAINED_MODEL path is still set from the previous command. Once the environment variables are all set, you can run the quickstart script.

# cd to your model zoo directory
cd models

export DATASET_DIR=<path to the dataset>
export OUTPUT_DIR=<directory where log files will be written>
export PRETRAINED_MODEL=<path to the pretrained model>

./quickstart/image_segmentation/tensorflow/3d_unet/inference/cpu/fp32/fp32_inference.sh
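
For example, with purely illustrative placeholder paths (adjust to your own locations):

cd models

export DATASET_DIR=/home/<user>/3DUnetCNN/brats
export OUTPUT_DIR=/home/<user>/3dunet_fp32_logs
export PRETRAINED_MODEL=/home/<user>/tumor_segmentation_model.h5

./quickstart/image_segmentation/tensorflow/3d_unet/inference/cpu/fp32/fp32_inference.sh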

Additional Resources