AIM-Harvard/ssl_thymus_quantification


SSL Thymus: Self-Supervised Learning for Thymus Analysis

This repository provides a SwAV (Swapping Assignments between Views) implementation for self-supervised pretraining on medical images, built on PyTorch Lightning and MONAI. It focuses on technical ablation studies and feature extraction for thymus grading.

Features

  • SwAV Pretraining: Self-supervised pretraining via Swapping Assignments between Views.
  • ResNet Backbone: Uses MONAI's ResNet50 adaptation for 3D/2D medical imaging.
  • Lighter Integration: Configurable training pipeline using Lighter.
  • Technical Ablations: Includes notebooks and scripts for technical ablation studies.
  • YAML Configuration: Centralized configuration for training and feature extraction.

Directory Structure

.
├── train.yaml              # Main SwAV training configuration
├── get_features.yaml       # Feature extraction configuration
├── models/
│   └── swav.py             # SwAV model implementation
├── losses/
│   └── swav_loss.py        # SwAV loss function
├── transforms/             # Data augmentations
├── datasets/               # Data loading wrappers
├── analysis/               # Notebooks for technical ablations
├── README.md
└── pyproject.toml          # Project dependencies

Installation

This project uses uv for dependency management.

  1. Clone the repository:

    git clone https://github.com/AIM-Harvard/ssl_thymus_quantification.git
    cd ssl_thymus_quantification
  2. Install dependencies:

    pip install .
    # OR using uv
    uv sync

Usage

Pretraining (SwAV)

Run the training using the root configuration file:

lighter fit train.yaml
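The exact contents of train.yaml are specific to this repository, but a Lighter-style config nests settings under top-level sections such as trainer, which is how the CLI overrides below address them. A hypothetical sketch, with every key and value purely illustrative:

```yaml
# Hypothetical sketch only -- train.yaml in this repository is the
# source of truth for the actual keys and values.
trainer:
  max_epochs: 100      # addressable from the CLI as trainer::max_epochs
  devices: 1           # addressable from the CLI as trainer::devices
model:
  name: swav_resnet50  # illustrative: SwAV head on MONAI's ResNet50
```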

You can override parameters via the CLI:

lighter fit train.yaml trainer::max_epochs=100 trainer::devices=1
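The `section::key=value` syntax maps a CLI override onto the nested YAML structure. A minimal stdlib-only sketch of that mapping (illustrative only; Lighter's actual parser may differ, e.g. in how it coerces value types):

```python
import ast

def apply_override(config: dict, override: str) -> dict:
    """Apply one `section::key=value` override to a nested config dict.

    Illustrative re-implementation of the override syntax shown above;
    not Lighter's actual parser.
    """
    path, _, raw = override.partition("=")
    keys = path.split("::")
    try:
        # Parse Python/YAML-like literals: "100" -> 100, "0.1" -> 0.1.
        value = ast.literal_eval(raw)
    except (ValueError, SyntaxError):
        value = raw  # fall back to the raw string
    node = config
    for key in keys[:-1]:
        node = node.setdefault(key, {})
    node[keys[-1]] = value
    return config

cfg = {"trainer": {"max_epochs": 10, "devices": 8}}
apply_override(cfg, "trainer::max_epochs=100")
apply_override(cfg, "trainer::devices=1")
```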

Feature Extraction

Extract features from a trained model using the feature extraction config:

lighter predict get_features.yaml
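As with training, get_features.yaml holds the real configuration; a hypothetical outline of what a predict-mode config might contain (the checkpoint path here is invented for illustration):

```yaml
# Hypothetical outline -- consult get_features.yaml for the actual keys.
trainer:
  devices: 1
ckpt_path: path/to/pretrained.ckpt  # illustrative location of SwAV weights
```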


License

This project is intended for research purposes only.
