Commit 78933df

1015 Move examples to tutorials repo (#1051)
* [DLMED] move examples to tutorials repo
  Signed-off-by: Nic Ma <nma@nvidia.com>
* [DLMED] update docs
  Signed-off-by: Nic Ma <nma@nvidia.com>
1 parent 1bdd287 commit 78933df

36 files changed (+11, -5261 lines)

README.md

Lines changed: 1 addition & 1 deletion
@@ -51,7 +51,7 @@ For more details, please refer to [the installation guide](https://docs.monai.io

 [MedNIST demo](https://colab.research.google.com/drive/1wy8XUSnNWlhDNazFdvGBHLfdkGvOHBKe) and [MONAI for PyTorch Users](https://colab.research.google.com/drive/1boqy7ENpKrqaJoxFlbHIBnIODAs1Ih1T) are available on Colab.

-Examples are located at [monai/examples](https://github.com/Project-MONAI/MONAI/tree/master/examples), notebook tutorials are located at [Project-MONAI/Tutorials](https://github.com/Project-MONAI/Tutorials).
+Examples and notebook tutorials are located at [Project-MONAI/tutorials](https://github.com/Project-MONAI/tutorials).

 Technical documentation is available at [docs.monai.io](https://docs.monai.io).

docs/source/highlights.md

Lines changed: 10 additions & 10 deletions
@@ -47,7 +47,7 @@ transformations. These currently include, for example:
 - `Rand2DElastic`: Random elastic deformation and affine in 2D
 - `Rand3DElastic`: Random elastic deformation and affine in 3D

-[2D transforms tutorial](https://github.com/Project-MONAI/Tutorials/blob/master/transforms_demo_2d.ipynb) shows the detailed usage of several MONAI medical image specific transforms.
+[2D transforms tutorial](https://github.com/Project-MONAI/tutorials/blob/master/modules/transforms_demo_2d.ipynb) shows the detailed usage of several MONAI medical image specific transforms.
 ![image](../images/medical_transforms.png)
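A minimal sketch of applying one of the random transforms listed above, `Rand2DElastic`, to a single-channel 2D image; the image shape and all parameter values are illustrative assumptions, not taken from the tutorial.

```python
import numpy as np
from monai.transforms import Rand2DElastic

# a single-channel 2D image in channel-first layout: (channels, H, W)
image = np.random.rand(1, 256, 256).astype(np.float32)

rand_elastic = Rand2DElastic(
    spacing=(30, 30),            # control-point spacing of the deformation grid
    magnitude_range=(5, 6),      # range of the sampled deformation magnitude
    prob=1.0,                    # always deform, for demonstration
    rotate_range=(np.pi / 6,),   # the affine part: random rotation up to 30 degrees
    scale_range=(0.1, 0.1),
    padding_mode="zeros",
)
deformed = rand_elastic(image, spatial_size=(256, 256), mode="bilinear")
```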

 ### 3. Fused spatial transforms and GPU acceleration
@@ -66,13 +66,13 @@ affine = Affine(
 # convert the image using bilinear interpolation
 new_img = affine(image, spatial_size=(300, 400), mode='bilinear')
 ```
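The hunk above only shows the tail of the `Affine` snippet. A self-contained sketch of the same idea follows; the rotation, scale and translation values are illustrative, and the GPU device is optional (it falls back to CPU when CUDA is unavailable).

```python
import numpy as np
import torch
from monai.transforms import Affine

# a 2D image in channel-first layout: (channels, H, W)
image = np.random.rand(1, 480, 640).astype(np.float32)

# create an Affine transform; parameter values are illustrative
affine = Affine(
    rotate_params=np.pi / 4,
    scale_params=(1.2, 1.2),
    translate_params=(200, 40),
    padding_mode="zeros",
    device=torch.device("cuda:0") if torch.cuda.is_available() else None,
)
# resample the image onto a (300, 400) grid using bilinear interpolation
new_img = affine(image, spatial_size=(300, 400), mode="bilinear")
```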
-Experiments and test results are available at [Fused transforms test](https://github.com/Project-MONAI/Tutorials/blob/master/transform_speed.ipynb).
+Experiments and test results are available at [Fused transforms test](https://github.com/Project-MONAI/tutorials/blob/master/acceleration/transform_speed.ipynb).

-Currently all the geometric image transforms (Spacing, Zoom, Rotate, Resize, etc.) are designed based on the PyTorch native interfaces. [Geometric transforms tutorial](https://github.com/Project-MONAI/Tutorials/blob/master/3d_image_transforms.ipynb) indicates the usage of affine transforms with 3D medical images.
+Currently all the geometric image transforms (Spacing, Zoom, Rotate, Resize, etc.) are designed based on the PyTorch native interfaces. [Geometric transforms tutorial](https://github.com/Project-MONAI/tutorials/blob/master/modules/3d_image_transforms.ipynb) indicates the usage of affine transforms with 3D medical images.
 ![image](../images/affine.png)

 ### 4. Randomly crop out batch images based on positive/negative ratio
-Medical image data volume may be too large to fit into GPU memory. A widely-used approach is to randomly draw small size data samples during training and run a “sliding window” routine for inference. MONAI currently provides general random sampling strategies including class-balanced fixed ratio sampling which may help stabilize the patch-based training process. A typical example is in [Spleen 3D segmentation tutorial](https://github.com/Project-MONAI/Tutorials/blob/master/spleen_segmentation_3d.ipynb), which achieves the class-balanced sampling with `RandCropByPosNegLabel` transform.
+Medical image data volume may be too large to fit into GPU memory. A widely-used approach is to randomly draw small size data samples during training and run a “sliding window” routine for inference. MONAI currently provides general random sampling strategies including class-balanced fixed ratio sampling which may help stabilize the patch-based training process. A typical example is in [Spleen 3D segmentation tutorial](https://github.com/Project-MONAI/tutorials/blob/master/3d_segmentation/spleen_segmentation_3d.ipynb), which achieves the class-balanced sampling with `RandCropByPosNegLabel` transform.
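A minimal sketch of the class-balanced sampling described above, using the dictionary-based `RandCropByPosNegLabeld` variant; the volume shape, patch size and 1:1 positive/negative ratio are illustrative assumptions.

```python
import numpy as np
from monai.transforms import RandCropByPosNegLabeld

data = {
    "image": np.random.rand(1, 128, 128, 64).astype(np.float32),
    "label": (np.random.rand(1, 128, 128, 64) > 0.95).astype(np.float32),
}

# draw 4 patches per volume, half centred on foreground and half on background voxels
crop = RandCropByPosNegLabeld(
    keys=["image", "label"],
    label_key="label",
    spatial_size=(96, 96, 64),
    pos=1, neg=1,          # 1:1 positive/negative ratio
    num_samples=4,
)
patches = crop(data)       # a list of 4 dictionaries, one per sampled patch
print(len(patches), patches[0]["image"].shape)
```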

 ### 5. Deterministic training for reproducibility
 Deterministic training support is necessary and important for deep learning research, especially in the medical field. Users can easily set the random seed to all the random transforms in MONAI locally and will not affect other non-deterministic modules in the user's program.
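A sketch of fixing the random behaviour, either globally via `monai.utils.set_determinism` or per transform via `set_random_state`; the seed values are arbitrary.

```python
from monai.utils import set_determinism
from monai.transforms import RandRotate90

# seed the Python, NumPy and PyTorch random number generators used by MONAI transforms
set_determinism(seed=0)

# or seed a single random transform locally, leaving global state untouched
rotate = RandRotate90(prob=1.0)
rotate.set_random_state(seed=123)
```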
@@ -110,23 +110,23 @@ MONAI also provides post-processing transforms for handling the model outputs. C
 - Removing segmentation noise based on Connected Component Analysis, as below figure (c).
 - Extracting contour of segmentation result, which can be used to map to original image and evaluate the model, as below figure (d) and (e).

-After applying the post-processing transforms, it's easier to compute metrics, save model output into files or visualize data in the TensorBoard. [Post transforms tutorial](https://github.com/Project-MONAI/Tutorials/blob/master/post_transforms.ipynb) shows an example with several main post transforms.
+After applying the post-processing transforms, it's easier to compute metrics, save model output into files or visualize data in the TensorBoard. [Post transforms tutorial](https://github.com/Project-MONAI/tutorials/blob/master/modules/post_transforms.ipynb) shows an example with several main post transforms.
 ![image](../images/post_transforms.png)
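A sketch of chaining a few of these post transforms on a raw two-class segmentation output; the batch-first tensor shape follows the convention of the MONAI release this commit targets, and the label and threshold choices are illustrative.

```python
import torch
from monai.transforms import Activations, AsDiscrete, KeepLargestConnectedComponent, LabelToContour

# raw network output for a 2-class segmentation task, batch-first: (batch, channel, H, W)
logits = torch.randn(1, 2, 96, 96)

post_act = Activations(softmax=True)                            # logits -> class probabilities
post_pred = AsDiscrete(argmax=True)                             # probabilities -> discrete label map
largest_cc = KeepLargestConnectedComponent(applied_labels=[1])  # drop small spurious islands
to_contour = LabelToContour()                                   # extract the segmentation boundary

pred = post_pred(post_act(logits))
pred = largest_cc(pred)
contour = to_contour(pred)
```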

 ### 9. Integrate third-party transforms
 The design of MONAI transforms emphasis code readability and usability. It works for array data or dictionary-based data. MONAI also provides `Adaptor` tools to accommodate different data format for 3rd party transforms. To convert the data shapes or types, utility transforms such as `ToTensor`, `ToNumpy`, `SqueezeDim` are also provided. So it's easy to enhance the transform chain by seamlessly integrating transforms from external packages, including: `ITK`, `BatchGenerator`, `TorchIO` and `Rising`.

-For more details, please check out the tutorial: [integrate 3rd party transforms into MONAI program](https://github.com/Project-MONAI/Tutorials/blob/master/integrate_3rd_party_transforms.ipynb).
+For more details, please check out the tutorial: [integrate 3rd party transforms into MONAI program](https://github.com/Project-MONAI/tutorials/blob/master/modules/integrate_3rd_party_transforms.ipynb).
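A sketch of mixing a plain third-party callable with MONAI utility transforms in one `Compose` chain; `third_party_noise` is a hypothetical stand-in for an external NumPy-based transform.

```python
import numpy as np
from monai.transforms import AddChannel, Compose, ToNumpy, ToTensor

def third_party_noise(img):
    # hypothetical external transform that expects and returns a NumPy array
    return img + 0.1 * np.random.rand(*img.shape).astype(img.dtype)

chain = Compose([
    AddChannel(),        # (H, W, D) -> (1, H, W, D)
    ToNumpy(),           # make sure the external callable receives a NumPy array
    third_party_noise,   # plain callables can sit directly in a MONAI Compose
    ToTensor(),          # hand the result back to the PyTorch side of the pipeline
])

image = np.random.rand(64, 64, 32).astype(np.float32)
result = chain(image)
```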

 ## Datasets
 ### 1. Cache IO and transforms data to accelerate training
 Users often need to train the model with many (potentially thousands of) epochs over the data to achieve the desired model quality. A native PyTorch implementation may repeatedly load data and run the same preprocessing steps for every epoch during training, which can be time-consuming and unnecessary, especially when the medical image volumes are large.

-MONAI provides a multi-threads `CacheDataset` to accelerate these transformation steps during training by storing the intermediate outcomes before the first randomized transform in the transform chain. Enabling this feature could potentially give 10x training speedups in the [Datasets experiment](https://github.com/Project-MONAI/Tutorials/blob/master/dataset_type_performance.ipynb).
+MONAI provides a multi-threads `CacheDataset` to accelerate these transformation steps during training by storing the intermediate outcomes before the first randomized transform in the transform chain. Enabling this feature could potentially give 10x training speedups in the [Datasets experiment](https://github.com/Project-MONAI/tutorials/blob/master/acceleration/dataset_type_performance.ipynb).
 ![image](../images/cache_dataset.png)
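A sketch of a `CacheDataset` set up so that everything before the first random transform is cached once, and only the random flip is re-executed each epoch; the NIfTI file names are hypothetical placeholders.

```python
from torch.utils.data import DataLoader
from monai.data import CacheDataset
from monai.transforms import (
    AddChanneld, Compose, LoadNiftid, RandFlipd, ScaleIntensityd, ToTensord,
)

# hypothetical file list; each dict feeds the dictionary-based transforms
train_files = [{"image": "img0.nii.gz", "label": "seg0.nii.gz"}]

train_transforms = Compose([
    LoadNiftid(keys=["image", "label"]),                   # deterministic: cached once
    AddChanneld(keys=["image", "label"]),                  # deterministic: cached once
    ScaleIntensityd(keys="image"),                         # deterministic: cached once
    RandFlipd(keys=["image", "label"], prob=0.5, spatial_axis=0),  # random: rerun every epoch
    ToTensord(keys=["image", "label"]),
])

# cache everything up to (but not including) the first random transform
train_ds = CacheDataset(data=train_files, transform=train_transforms, cache_rate=1.0)
train_loader = DataLoader(train_ds, batch_size=2, shuffle=True, num_workers=2)
```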

 ### 2. Cache intermediate outcomes into persistent storage
-The `PersistentDataset` is similar to the CacheDataset, where the intermediate cache values are persisted to disk storage for rapid retrieval between experimental runs (as is the case when tuning hyperparameters), or when the entire data set size exceeds available memory. The `PersistentDataset` could achieve similar performance when comparing to `CacheDataset` in [Datasets experiment](https://github.com/Project-MONAI/Tutorials/blob/master/dataset_type_performance.ipynb).
+The `PersistentDataset` is similar to the CacheDataset, where the intermediate cache values are persisted to disk storage for rapid retrieval between experimental runs (as is the case when tuning hyperparameters), or when the entire data set size exceeds available memory. The `PersistentDataset` could achieve similar performance when comparing to `CacheDataset` in [Datasets experiment](https://github.com/Project-MONAI/tutorials/blob/master/acceleration/dataset_type_performance.ipynb).
 ![image](../images/datasets_speed.png)
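Assuming the same `train_files` and `train_transforms` as in the sketch above, switching to disk-backed caching is a one-line change; `./cache_dir` is an arbitrary local path.

```python
from monai.data import PersistentDataset

# intermediate (pre-random-transform) results are hashed and stored under cache_dir,
# so they survive between training runs and do not need to fit in memory
train_ds = PersistentDataset(data=train_files, transform=train_transforms, cache_dir="./cache_dir")
```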

 ### 3. Zip multiple PyTorch datasets and fuse the output
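The next hunk header quotes `dataset = ZipDataset([DatasetA(), DatasetB()], transform)`. A minimal sketch of that pattern follows, with `DatasetA` and `DatasetB` defined here as hypothetical stand-ins and the optional `transform` (which would receive the fused items) left out.

```python
import torch
from torch.utils.data import Dataset
from monai.data import ZipDataset

class DatasetA(Dataset):  # hypothetical stand-in: yields an "image"
    def __len__(self):
        return 4
    def __getitem__(self, index):
        return torch.full((2, 2), float(index))

class DatasetB(Dataset):  # hypothetical stand-in: yields a matching label
    def __len__(self):
        return 4
    def __getitem__(self, index):
        return index % 2

# each item of the zipped dataset fuses the outputs of both datasets
dataset = ZipDataset([DatasetA(), DatasetB()])
img, label = dataset[0]
```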
@@ -148,7 +148,7 @@ dataset = ZipDataset([DatasetA(), DatasetB()], transform)
 ### 4. Predefined Datasets for public medical data
 To quickly get started with popular training data in the medical domain, MONAI provides several data-specific Datasets(like: `MedNISTDataset`, `DecathlonDataset`, etc.), which include downloading, extracting data files and support generation of training/evaluation items with transforms. And they are flexible that users can easily modify the JSON config file to change the default behaviors.

-MONAI always welcome new contributions of public datasets, please refer to existing Datasets and leverage the download and extracting APIs, etc. [Public datasets tutorial](https://github.com/Project-MONAI/Tutorials/blob/master/public_datasets.ipynb) indicates how to quickly set up training workflows with `MedNISTDataset` and `DecathlonDataset` and how to create a new `Dataset` for public data.
+MONAI always welcome new contributions of public datasets, please refer to existing Datasets and leverage the download and extracting APIs, etc. [Public datasets tutorial](https://github.com/Project-MONAI/tutorials/blob/master/modules/public_datasets.ipynb) indicates how to quickly set up training workflows with `MedNISTDataset` and `DecathlonDataset` and how to create a new `Dataset` for public data.

 The common workflow of predefined datasets:
 ![image](../images/dataset_progress.png)
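A sketch of pulling one of the predefined datasets mentioned above; `./data` is an arbitrary local directory, and the first call downloads and extracts the Decathlon spleen task before applying the transform chain.

```python
from monai.apps import DecathlonDataset
from monai.transforms import AddChanneld, Compose, LoadNiftid, ScaleIntensityd, ToTensord

transform = Compose([
    LoadNiftid(keys=["image", "label"]),
    AddChanneld(keys=["image", "label"]),
    ScaleIntensityd(keys="image"),
    ToTensord(keys=["image", "label"]),
])

train_ds = DecathlonDataset(
    root_dir="./data",        # hypothetical local path
    task="Task09_Spleen",
    transform=transform,
    section="training",
    download=True,            # fetch and extract the archive on first use
)
print(len(train_ds))
```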
@@ -187,7 +187,7 @@ A typical process is:
 4. Save the results to file or compute some evaluation metrics.
 ![image](../images/sliding_window.png)

-The [Spleen 3D segmentation tutorial](https://github.com/Project-MONAI/Tutorials/blob/master/spleen_segmentation_3d.ipynb) leverages `SlidingWindow` inference for validation.
+The [Spleen 3D segmentation tutorial](https://github.com/Project-MONAI/tutorials/blob/master/3d_segmentation/spleen_segmentation_3d.ipynb) leverages `SlidingWindow` inference for validation.
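A sketch of sliding-window inference over a volume larger than the network's input patch; the tiny `Conv3d` stands in for a trained segmentation model, and the ROI size and overlap are illustrative.

```python
import torch
from monai.inferers import sliding_window_inference

model = torch.nn.Conv3d(1, 2, kernel_size=3, padding=1)  # hypothetical stand-in for a trained network
model.eval()

# (batch, channel, H, W, D): the full volume is larger than the ROI fed to the network
image = torch.rand(1, 1, 160, 160, 80)

with torch.no_grad():
    output = sliding_window_inference(
        inputs=image,
        roi_size=(96, 96, 64),   # patch size passed to the network
        sw_batch_size=4,         # number of windows evaluated at once
        predictor=model,
        overlap=0.25,            # fraction of overlap between adjacent windows
    )
print(output.shape)  # torch.Size([1, 2, 160, 160, 80])
```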

 ### 2. Metrics for medical tasks
 Various useful evaluation metrics have been used to measure the quality of medical image specific models. MONAI already implemented mean Dice score for segmentation tasks and the area under the ROC curve for classification tasks. We continue to integrate more options.
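A sketch of the mean Dice computation on already one-hot, binarized predictions and labels; the shapes and the choice to exclude the background channel are illustrative assumptions.

```python
import torch
from monai.metrics import DiceMetric

# one-hot, binarized predictions and ground truth: (batch, class, H, W, D)
y_pred = torch.randint(0, 2, (2, 2, 32, 32, 16)).float()
y = torch.randint(0, 2, (2, 2, 32, 32, 16)).float()

dice_metric = DiceMetric(include_background=False, reduction="mean")
score = dice_metric(y_pred=y_pred, y=y)
print(score)
```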

examples/README.md

Lines changed: 0 additions & 39 deletions
This file was deleted.

examples/classification_3d/densenet_evaluation_array.py

Lines changed: 0 additions & 77 deletions
This file was deleted.

0 commit comments
