This repository is the official source code for 'Mask-Free Neuron Concept Annotation for Interpreting Neural Networks in Medical Domain', published at the International Conference on Medical Image Computing and Computer Assisted Intervention (MICCAI).
- Create a virtual environment with conda and install the dependencies:
conda create -n MAMMI python=3.10
conda activate MAMMI
pip install torch==1.13.1+cu116 torchvision==0.14.1+cu116 torchaudio==0.13.1 --extra-index-url https://download.pytorch.org/whl/cu116
pip install -r requirements.txt
- Note: MedCLIP and CLIP are installed directly from GitHub:
  - MedCLIP: pip install git+https://github.com/RyanWangZf/MedCLIP.git
  - CLIP: pip install git+https://github.com/openai/CLIP.git
Prepare the following resources to run the code.
Data
- Probing set: NIH14, ChestX-det (for visualization)
- Concept set: MIMIC-CXR reports (following R2Gen).
  We provide a concept set preprocessed from the MIMIC-CXR report test data ('./dataset/concept_set/nouns.txt').
  We also provide the processing code for concept set construction ('prepare_mimic_nouns.py').
Pre-trained model
Model (link): DenseNet121 (MoCo v2), ResNet50 (MoCo v2)
Put the downloaded checkpoint in pretrained/target_model/{TARGET_MODEL.pth}.
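MoCo v2 checkpoints typically nest the backbone weights under a query-encoder key prefix, so the keys must be remapped before they can be loaded into a plain torchvision model. Below is a minimal Python sketch of that remapping; the 'state_dict' key and the 'encoder_q.' prefix are assumptions based on the reference MoCo implementation, so inspect the provided checkpoint to confirm them.

```python
def strip_moco_prefix(state_dict, prefix="encoder_q."):
    """Keep only backbone keys and drop the MoCo query-encoder prefix.

    The "encoder_q." prefix is an assumption from the reference MoCo
    implementation; check the actual checkpoint keys before relying on it.
    """
    return {k[len(prefix):]: v for k, v in state_dict.items()
            if k.startswith(prefix)}

# Hypothetical usage (checkpoint path and key layout are assumptions):
# ckpt = torch.load("pretrained/target_model/TARGET_MODEL.pth", map_location="cpu")
# model = torchvision.models.densenet121(weights=None)
# model.load_state_dict(strip_moco_prefix(ckpt["state_dict"]), strict=False)
```

strict=False is used in the commented usage because MoCo checkpoints usually lack the classifier head weights.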
We already provide the concept set file (dataset/concept_set/nouns.txt).
If you want to build the concept set yourself, run 'prepare_mimic_nouns.py'.
- # of MIMIC nouns = 1361
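The construction step in 'prepare_mimic_nouns.py' is not reproduced here; the sketch below only illustrates the general shape of building a concept list from report text. It uses a plain frequency filter over lowercase tokens, whereas the actual script extracts nouns (e.g. via POS tagging), so treat this as a simplification.

```python
import re
from collections import Counter

def build_concept_set(reports, min_freq=2):
    """Collect frequent tokens from a list of report strings.

    Simplified stand-in for noun extraction: the real pipeline keeps only
    nouns, while this sketch keeps any token seen at least `min_freq` times.
    """
    counts = Counter()
    for report in reports:
        counts.update(re.findall(r"[a-z]+", report.lower()))
    return sorted(w for w, c in counts.items() if c >= min_freq)

# Example: only tokens appearing at least twice survive the filter
concepts = build_concept_set(
    ["Lung opacity in the left lung.", "No focal opacity is seen."]
)
print(concepts)  # ['lung', 'opacity']
```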
- Run 'example_selection.py'
- Run 'concept_matching.py'
- Run 'target_model_perform.py' for multi-label classification.
- Run 'bbox_img.py'
- Run 'visualization.py'
This work was supported by the IITP grant funded by the Korea government (MSIT) (No. RS-2022-00155911, Artificial Intelligence Convergence Innovation Human Resources Development (Kyung Hee University)).