
CAMP: Continuous and Adaptive Learning Model in Pathology

Overview

Implementation of the paper (under review):

CAMP: Classify Anything Model in Pathology
Anh Tien Nguyen, Keunho Byeon, Kyungeun Kim, Boram Song, Seoung Wan Chae, and Jin Tae Kwak

Abstract

There exist numerous diagnostic tasks in pathology. Conventional computational pathology formulates and tackles them as independent and individual image classification problems, thereby resulting in computational inefficiency and high costs. To address these challenges, we propose a generic, unified, and universal framework, called a continuous and adaptive learning model in pathology (CAMP), for pathology image classification. CAMP is a generative, efficient, and adaptive classification model that can continuously adapt to any classification task by leveraging pathology-specific prior knowledge and learning task-specific knowledge with minimal computational cost and without forgetting the knowledge from the existing tasks. We evaluated CAMP on 22 datasets, including 1,171,526 patches and 11,811 pathology slides, across 17 classification tasks. CAMP achieves state-of-the-art classification performance on a wide range of datasets and tasks at both patch and slide levels, and reduces computation time by up to 94% and storage memory by up to 85% compared to conventional classification models. Our results demonstrate that CAMP can offer a fundamental transformation in pathology image classification, paving the way for fully digitized and computerized pathology practice.

Architecture

Environment set up

git clone https://github.com/QuIIL/CAMP
cd CAMP
conda create --name CAMP python
conda activate CAMP
pip install -r requirements.txt

Datasets

Models

  • ConvNeXt-B: link
  • RegNet: link
  • ResNeXt50: link
  • SwinV2-B: link
  • ViT-B: link
  • PLIP: link
  • CTransPath: link
  • UNI: link
  • Phikon: link
  • GPC: link
  • GIT-B: link
Step-by-Step Instructions

    Step 1: Training

    Training is mainly handled by train.py. The important arguments for the training setup include dataset (the dataset to train on), lora_r (the rank of LoRA), lora_alpha (the alpha of LoRA), and out_dir (the directory to save the training results). Please refer to train.py for the default values of the other arguments.

    Sample command for training with colon-1.

    python train.py \
        --dataset colon-1 \
        --device 0 \
        --lora_r 6 \
        --lora_alpha 12 \
        --out_dir <train_result_saving_dir>
    
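To get a rough sense of why the LoRA arguments keep adaptation cheap, the sketch below counts the trainable parameters that a rank-r adapter adds to a single linear layer. The hidden size (768, typical of a ViT-B-like backbone) and the parameter formulas are illustrative assumptions for standard LoRA, not CAMP's actual implementation.

```python
# Illustrative only: parameter cost of one LoRA adapter (not CAMP's code).
d_in, d_out = 768, 768  # assumed hidden size of a ViT-B-like backbone layer
r, alpha = 6, 12        # --lora_r and --lora_alpha from the sample command

full_params = d_in * d_out        # full fine-tuning of one linear layer
lora_params = r * (d_in + d_out)  # low-rank factors A (d_in x r), B (r x d_out)
scaling = alpha / r               # standard LoRA scales the adapter output by alpha / r

print(full_params, lora_params, scaling)   # 589824 9216 2.0
print(f"{lora_params / full_params:.2%}")  # 1.56%
```

The adapter trains under 2% of the layer's parameters while the pretrained weights stay frozen, which is what makes continual per-task adaptation inexpensive in both compute and storage.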

    Step 2: Testing

    Testing is mainly handled by test.py. The important arguments for the testing setup include dataset (the dataset to test on), model_pth (the path to the model checkpoint), and out_dir (the directory to save the testing results). Please refer to test.py for the default values of the other arguments.

    Sample command for testing with colon-1.

    python test.py \
        --dataset colon-1 \
        --device 0 \
        --model_pth <ckpt_path> \
        --out_dir <test_result_saving_dir>
    
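After testing, you will typically aggregate the saved predictions into metrics. The snippet below is a hypothetical post-processing sketch that assumes the results in out_dir can be read as a CSV with label and prediction columns; the actual output format written by test.py may differ, so treat the file layout as an assumption.

```python
# Hypothetical: compute accuracy from a CSV of labels and predictions.
# The column names and file layout are assumptions, not CAMP's actual format.
import csv
import io

# Stand-in for a results file written to --out_dir.
sample = io.StringIO("label,prediction\n0,0\n1,1\n2,1\n1,1\n")
rows = list(csv.DictReader(sample))
correct = sum(row["label"] == row["prediction"] for row in rows)
accuracy = correct / len(rows)
print(f"accuracy: {accuracy:.2%}")  # accuracy: 75.00%
```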
