
Attention-Based Deep Multiple Instance Learning with within-bag sampling

Nadezhda Koriakina✉️, Nataša Sladoje and Joakim Lindblad

PyTorch implementation of Attention-Based Deep Multiple Instance Learning (ABMIL) [1] with within-bag sampling, following the ISPA 2021 paper "The Effect of Within-Bag Sampling on End-to-End Multiple Instance Learning" [2].

Table of Contents

  1. Overview
  2. Dependencies
  3. How to use
  4. Citation
  5. References
  6. Acknowledgements

Overview

  • Code for creating the QMNIST-bags and Imagenette-bags datasets
  • Code for training and evaluating ABMIL with/without within-bag sampling on QMNIST-bags and Imagenette-bags (see the attention-pooling sketch below)
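
For background, ABMIL [1] aggregates the embeddings of a bag's instances into a single bag embedding using learned attention weights, which also serve as per-instance relevance scores. Below is a minimal PyTorch sketch of that pooling operator; the class name and dimensions are ours for illustration, not the ones used in this repository's attention_model.py.

import torch
import torch.nn as nn

class AttentionPooling(nn.Module):
    # Attention-based MIL pooling from [1]: the bag embedding is
    # z = sum_k a_k h_k, with attention weights
    # a_k = softmax_k( w^T tanh(V h_k) ) computed from the instance
    # embeddings h_k themselves.
    def __init__(self, feat_dim=500, attn_dim=128):
        super().__init__()
        self.V = nn.Linear(feat_dim, attn_dim)  # V in [1]
        self.w = nn.Linear(attn_dim, 1)         # w in [1]

    def forward(self, H):                       # H: (num_instances, feat_dim)
        A = self.w(torch.tanh(self.V(H)))       # (num_instances, 1)
        A = torch.softmax(A, dim=0)             # attention over instances
        z = (A * H).sum(dim=0)                  # (feat_dim,) bag embedding
        return z, A.squeeze(-1)                 # bag embedding and per-instance weights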

Additions to the original implementation of ABMIL:

  • Within-bag sampling option (see the sketch after this list)
  • Option of GPU training with one or three GPUs
  • Validation technique using a moving average of the validation metric
  • Evaluation by computing the Area Under the Receiver Operating Characteristic curve (AUC) at bag and instance level
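
A minimal sketch of the within-bag sampling idea from [2], together with moving-average validation: at each training iteration only a random subset of a bag's instances is passed through the network, which keeps end-to-end training of large bags within GPU memory, and the noisy per-epoch validation scores are smoothed before model selection. The function names, sample size, window size, and the training-loop fragment are illustrative assumptions, not this repository's exact API.

import numpy as np
import torch

def sample_bag(bag, sample_size, training=True):
    # Within-bag sampling as studied in [2] (illustrative sketch).
    # bag: tensor of shape (num_instances, C, H, W).
    # During training, draw sample_size instances uniformly at random
    # without replacement; at evaluation time use the full bag.
    n = bag.shape[0]
    if not training or sample_size >= n:
        return bag
    idx = torch.randperm(n)[:sample_size]  # random subset, no repeats
    return bag[idx]

def moving_average(values, window=5):
    # Smooth per-epoch validation scores before model selection, to
    # reduce the noise introduced by the random within-bag subsets
    # (the window size here is an assumption).
    return np.convolve(values, np.ones(window) / window, mode='valid')

# Illustrative use inside a training step:
# subset = sample_bag(bag, sample_size=100)   # e.g. 100 of 1000 instances
# bag_logit, attention = model(subset)        # end-to-end ABMIL forward
# loss = criterion(bag_logit, bag_label)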

Dependencies

Python version 3.6.13. Main dependencies are listed in reqs.txt.

How to use

  • Create_QMNIST_bags_dataset.ipynb: code for creating the QMNIST-bags dataset. There are options to choose the number of bags, the number of instances per bag, and the percentage of key instances in a positive bag. The code creates bags without repeating instances within a bag (see the sketch after this list)
  • Create_IMAGENETTE_bags_dataset.ipynb: code for creating the Imagenette-bags dataset. There are options to choose the number of bags, the number of instances per bag, the percentage of key instances in a positive bag, and augmentation of the images
  • MAIN_ABMIL_with_within_bag_sampling_QMNIST.ipynb and MAIN_ABMIL_with_within_bag_sampling_IMAGENETTE.ipynb: code for training and evaluating ABMIL with/without within-bag sampling on the QMNIST-bags and Imagenette-bags datasets, respectively.
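
To make the dataset options above concrete, here is a minimal sketch of how such bags can be formed. The function name, parameters, and the choice of key digit are illustrative assumptions; the notebooks above are the authoritative recipe.

import numpy as np

def make_bags(images, labels, n_bags, bag_size, key_pct, key_digit=9, seed=0):
    # Illustrative MIL-bag construction in the spirit of
    # Create_QMNIST_bags_dataset.ipynb. A bag is positive iff it
    # contains instances of the key digit; key_pct is the percentage
    # of key instances in a positive bag. Instances are drawn without
    # replacement within a bag (no repeated instances).
    rng = np.random.default_rng(seed)
    key_idx = np.flatnonzero(labels == key_digit)
    other_idx = np.flatnonzero(labels != key_digit)
    bags, bag_labels = [], []
    for b in range(n_bags):
        positive = b % 2 == 0  # alternate positive/negative bags
        n_key = max(1, round(bag_size * key_pct / 100)) if positive else 0
        idx = np.concatenate([
            rng.choice(key_idx, n_key, replace=False),
            rng.choice(other_idx, bag_size - n_key, replace=False),
        ])
        rng.shuffle(idx)
        bags.append(images[idx])
        bag_labels.append(int(positive))
    return bags, bag_labels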

Restart the kernel if changes are made to the internal modules attention_model.py, dataloaders.py and evaluation.py.

Note: the code is written for the QMNIST and Imagenette data and might require changes if custom data is used.

Citation

@INPROCEEDINGS{9552170,
author={Koriakina, Nadezhda and Sladoje, Nataša and Lindblad, Joakim},
booktitle={2021 12th International Symposium on Image and Signal Processing and Analysis (ISPA)},
title={The Effect of Within-Bag Sampling on End-to-End Multiple Instance Learning},
year={2021},
pages={183-188},
doi={10.1109/ISPA52656.2021.9552170}}

References

[1] M. Ilse, J. Tomczak, and M. Welling, "Attention-based deep multiple instance learning," in International Conference on Machine Learning. PMLR, 2018, pp. 2127–2136.
[2] N. Koriakina, N. Sladoje and J. Lindblad, "The Effect of Within-Bag Sampling on End-to-End Multiple Instance Learning," 2021 12th International Symposium on Image and Signal Processing and Analysis (ISPA), 2021, pp. 183-188, doi: 10.1109/ISPA52656.2021.9552170.

Acknowledgements

This work is supported by: VINNOVA MedTech4Health grant 2017-02447 and Swedish Research Council grants 2015-05878 and 2017-04385. Part of the computations was enabled by resources provided by the Swedish National Infrastructure for Computing (SNIC) at Chalmers Centre for Computational Science and Engineering (C3SE), partially funded by the Swedish Research Council through grant no. 2018-05973.
