Language Grounded Single Source Domain Generalization in Medical Image Segmentation [ISBI 2024]

Shahina Kunhimon, Muzammal Naseer, Salman Khan, and Fahad Shahbaz Khan · [Paper]

Abstract

Single source domain generalization (SDG) holds promise for more reliable and consistent image segmentation across real-world clinical settings. Textual cues describing the anatomical structures, their appearances, and their variations across imaging modalities can guide the model in domain adaptation, ultimately contributing to more robust and consistent segmentation. In this paper, we propose an approach that explicitly leverages textual information by incorporating a contrastive learning mechanism, guided by text encoder features, to learn a more robust feature representation. We assess the effectiveness of our text-guided contrastive feature alignment technique in various scenarios, including cross-modality, cross-sequence, and cross-site settings for different segmentation tasks.
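
For intuition, below is a minimal PyTorch sketch of a text-guided contrastive alignment term: pooled visual features are pulled toward the frozen text embedding of their class with an InfoNCE-style loss. This is an illustrative sketch, not the exact implementation in this repository; the function name, feature dimensionality, and temperature are assumptions.

    # Minimal sketch (not the repository's implementation): align pooled image
    # features with frozen text-encoder embeddings via an InfoNCE-style loss.
    import torch
    import torch.nn.functional as F

    def text_guided_contrastive_loss(img_feats, text_embeds, labels, temperature=0.07):
        # img_feats: (N, D) pooled visual features, one per sampled region
        # text_embeds: (C, D) frozen text-encoder embeddings, one per class
        # labels: (N,) class index of each visual feature
        img_feats = F.normalize(img_feats, dim=-1)
        text_embeds = F.normalize(text_embeds, dim=-1)
        logits = img_feats @ text_embeds.t() / temperature  # (N, C) cosine similarities
        return F.cross_entropy(logits, labels)  # pull each feature toward its class text embedding

    # Toy usage: 8 visual features of dimension 512, 4 anatomical classes.
    loss = text_guided_contrastive_loss(
        torch.randn(8, 512), torch.randn(4, 512), torch.randint(0, 4, (8,)))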


Citation

If you find our work, this repository, or the pretrained models useful, please consider giving a star ⭐ and citing our paper.

@INPROCEEDINGS{10635823,
  author={Kunhimon, Shahina and Naseer, Muzammal and Khan, Salman and Khan, Fahad Shahbaz},
  booktitle={2024 IEEE International Symposium on Biomedical Imaging (ISBI)}, 
  title={Language Guided Domain Generalized Medical Image Segmentation}, 
  year={2024},
  volume={},
  number={},
  pages={1-5},
  keywords={Image segmentation;Visualization;Adaptation models;Data privacy;Costs;Correlation;Robustness;Multi-modal contrastive learning;Medical image segmentation;Single source domain generalization},
  doi={10.1109/ISBI56570.2024.10635823}}

Installation

Create and activate the conda environment:
    conda env create -f lgsdg.yml
    conda activate lgsdg
Run the setup for the CCSDG module:
    cd CCSDG
    pip install -e .

Download Datasets and Text Embeddings

Fundus dataset: download the CCSDG Fundus dataset.
Abdominal and Cardiac datasets: download the SLAug processed datasets and follow the instructions in the SLAug repository to organize the data.
Text embeddings: download Text_Embeddings and unzip it to use the embeddings directly, or download the Jupyter notebooks from Notebooks, unzip them, update the text annotations, and generate the embeddings yourself (a hedged sketch of this step is shown below).
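
The exact text encoder and prompt set used in the released notebooks are not reproduced here; the sketch below assumes a CLIP text encoder from Hugging Face transformers, illustrative fundus prompts, and an illustrative output path.

    # Hedged sketch: generating text embeddings with a CLIP text encoder.
    # The encoder checkpoint, prompts, and output path are assumptions, not the
    # exact setup used in the released notebooks.
    import torch
    from transformers import CLIPTokenizer, CLIPModel

    prompts = [
        "a fundus photograph showing the optic cup",
        "a fundus photograph showing the optic disc",
    ]  # illustrative text annotations

    tokenizer = CLIPTokenizer.from_pretrained("openai/clip-vit-base-patch16")
    model = CLIPModel.from_pretrained("openai/clip-vit-base-patch16").eval()

    with torch.no_grad():
        tokens = tokenizer(prompts, padding=True, return_tensors="pt")
        text_embeds = model.get_text_features(**tokens)  # (num_prompts, 512)

    torch.save(text_embeds, "text_embeddings.pt")  # illustrative output path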

Inference using Pretrained Models

Fundus Dataset

Download the pretrained model weights and place them in the following directory:

OUTPUT_FOLDER/unet_ccsdg_source_Magrabia/checkpoints/ 

To run the inference:

   cd CCSDG
   python ccsdg/inference/run_inference.py --model unet_ccsdg --gpu 0 --tag source_Magrabia --log_folder OUTPUT_FOLDER -r ./CCSDG_DATA --ts_csv ./CCSDG_DATA/MESSIDOR_Base1_test.csv
Abdominal and Cardiac Datasets

Download the pretrained models and run the inference:

   cd SLAug
   python test.py -r $CHECKPOINT

Training the Models

Fundus Dataset

Update the paths and run the bash script:

   cd CCSDG
   bash train.sh
Abdominal Dataset

For CT -> MRI:

      cd SLAug
      python main.py --base configs/efficientUnet_SABSCT_to_CHAOS.yaml --seed 23

For MRI -> CT:

    cd SLAug
    python main.py --base configs/efficientUnet_CHAOS_to_SABSCT.yaml --seed 23
Cardiac Dataset

For bSSFP -> LEG:

    cd SLAug
    python main.py --base configs/efficientUnet_bSSFP_to_LEG.yaml --seed 23

For LEG -> bSSFP:

    cd SLAug
    python main.py --base configs/efficientUnet_LEG_to_bSSFP.yaml --seed 23

Contact

Should you have any questions, please create an issue in this repository or contact shahina.kunhimon@mbzuai.ac.ae.

References

Our code is built on the SLAug and CCSDG repositories. We thank the authors for releasing their code.

