
Commit e3c3b62

End to end training script
1 parent 8752f15 commit e3c3b62

2 files changed: 49 additions, 3 deletions

single-node/run_brats_model.sh

Lines changed: 46 additions & 0 deletions
@@ -0,0 +1,46 @@
#!/usr/bin/env bash
# -*- coding: utf-8 -*-
#
# Copyright (c) 2018 Intel Corporation
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
# SPDX-License-Identifier: EPL-2.0
#

# You'll need to download the Decathlon dataset.
# You can get the data here: http://medicaldecathlon.com/
# Currently it is on Google Drive:
# https://drive.google.com/drive/folders/1HqEgzS8BV2c7xYNrZdEAnrHk7osJJ--2
# Download Task01_BrainTumour.tar
# Then untar it into the Decathlon directory:
#
# tar -xvf Task01_BrainTumour.tar
#

DECATHLON_DIR="../../data/decathlon/Task01_BrainTumour/"
HDF5_DIR="../../data/decathlon/"
IMG_SIZE=128
MODEL_OUTPUT_DIR="./output/"

echo "Converting Decathlon raw data to HDF5 file."
# Run Python script to convert to a single HDF5 file
python convert_raw_to_hdf5.py --data_path $DECATHLON_DIR \
    --save_path $HDF5_DIR --resize=$IMG_SIZE

echo "Run U-Net training on BraTS Decathlon dataset"
# Run training script
# The settings.py file contains the model training settings.
python train.py --data_path $HDF5_DIR/${IMG_SIZE}x${IMG_SIZE} \
    --output_path $MODEL_OUTPUT_DIR
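
Assuming the Task01_BrainTumour archive has been untarred so that ../../data/decathlon/Task01_BrainTumour/ resolves from the single-node directory (the defaults above), the end-to-end run is simply:

cd single-node
bash run_brats_model.sh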

single-node/train.py

Lines changed: 3 additions & 3 deletions
@@ -159,7 +159,7 @@ def dice_coef(y_true, y_pred, smooth=1.):
     numerator = tf.constant(2.) * intersection + smooth
     denominator = union + smooth
     coef = numerator / denominator
-
+
     return tf.reduce_mean(coef)

 def dice_coef_loss(y_true, y_pred, smooth=1.):
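
For context, the hunk above is the tail of the soft Dice metric used for evaluation. A minimal sketch of how such a dice_coef is typically written is below; the reductions that produce intersection and union (over the height, width, and channel axes) are an assumption about the rest of the function, not part of this diff:

import tensorflow as tf

def dice_coef(y_true, y_pred, smooth=1.):
    # Soft Dice: (2*|A.B| + smooth) / (|A| + |B| + smooth), averaged over the batch.
    intersection = tf.reduce_sum(y_true * y_pred, axis=(1, 2, 3))
    union = tf.reduce_sum(y_true, axis=(1, 2, 3)) + tf.reduce_sum(y_pred, axis=(1, 2, 3))
    numerator = tf.constant(2.) * intersection + smooth
    denominator = union + smooth
    coef = numerator / denominator
    return tf.reduce_mean(coef)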
@@ -226,13 +226,13 @@ def unet_model(img_height=128,
     pool2 = K.layers.MaxPooling2D(name="pool2", pool_size=(2, 2))(conv2)

     conv3 = K.layers.Conv2D(name="conv3a", filters=fms*4, **params)(pool2)
-    conv3 = K.layers.Dropout(dropout)(conv3)
+    conv3 = K.layers.SpatialDropout2D(dropout, data_format=data_format)(conv3)
     conv3 = K.layers.Conv2D(name="conv3b", filters=fms*4, **params)(conv3)

     pool3 = K.layers.MaxPooling2D(name="pool3", pool_size=(2, 2))(conv3)

     conv4 = K.layers.Conv2D(name="conv4a", filters=fms*8, **params)(pool3)
-    conv4 = K.layers.Dropout(dropout)(conv4)
+    conv4 = K.layers.SpatialDropout2D(dropout, data_format=data_format)(conv4)
     conv4 = K.layers.Conv2D(name="conv4b", filters=fms*8, **params)(conv4)

     pool4 = K.layers.MaxPooling2D(name="pool4", pool_size=(2, 2))(conv4)
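
The substantive change in train.py is swapping element-wise Dropout for SpatialDropout2D in the encoder. SpatialDropout2D zeroes entire feature maps rather than individual activations, which tends to regularize better when neighbouring pixels within a channel are strongly correlated, as they are after a convolution. A standalone illustration of the difference (not the repository's code; K is assumed here to alias tf.keras, matching the K.layers usage in the diff, and the tensor shape is arbitrary):

import numpy as np
import tensorflow as tf

K = tf.keras
x = tf.convert_to_tensor(np.ones((1, 8, 8, 4), dtype=np.float32))  # NHWC batch of ones

# Element-wise dropout: zeroes individual activations independently.
y_elem = K.layers.Dropout(0.5)(x, training=True)

# Spatial dropout: zeroes whole channels, so a dropped feature map is
# removed everywhere in the image rather than at scattered pixels.
y_spat = K.layers.SpatialDropout2D(0.5, data_format="channels_last")(x, training=True)

print(y_elem.numpy()[0, :, :, 0])  # scattered zeros within the channel
print(y_spat.numpy()[0, :, :, 0])  # the whole channel is either kept (scaled) or zeroed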
