
ProjectedEx

This is the code repository for the paper:

ProjectedEx: Enhancing Generation in Explainable AI for Prostate Cancer

Xuyin Qi*, Zeyu Zhang*, Aaron Berliano Handoko*, Huazhan Zheng, Mingxi Chen, Ta Duc Huy, Vu Minh Hieu Phan, Lei Zhang, Linqi Cheng, Shiyu Jiang, Zhiwei Zhang, Zhibin Liao, Yang Zhao#, Minh-Son To

*Equal contribution. Project lead. #Corresponding author.

CBMS 2025

[arXiv] [Papers with Code] [HF Paper]

Presentation video: presentation_compressed_10mb.mp4

YouTube Video

If you'd like to learn more about our paper, check out this YouTube video on the Open Life Science AI channel.

Watch the video

Citation

@article{qi2025projectedex,
  title={ProjectedEx: Enhancing Generation in Explainable AI for Prostate Cancer},
  author={Qi, Xuyin and Zhang, Zeyu and Handoko, Aaron Berliano and Zheng, Huazhan and Chen, Mingxi and Huy, Ta Duc and Phan, Vu Minh Hieu and Zhang, Lei and Cheng, Linqi and Jiang, Shiyu and others},
  journal={arXiv preprint arXiv:2501.01392},
  year={2025}
}

Figure: T2 modality.

Introduction

Prostate cancer, a growing global health concern, necessitates precise diagnostic tools; Magnetic Resonance Imaging (MRI) offers high-resolution soft-tissue imaging that significantly enhances diagnostic accuracy. Recent advances in explainable AI and representation learning have improved prostate cancer diagnosis by enabling automated, precise lesion classification. However, existing explainable AI methods, particularly those built on frameworks like generative adversarial networks (GANs), are predominantly developed for natural image generation, and their application to medical imaging often yields suboptimal performance due to the unique characteristics and complexity of medical images. To address these challenges, our paper makes three key contributions. First, we propose ProjectedEx, a generative framework that provides interpretable, multi-attribute explanations, effectively linking medical image features to classifier decisions. Second, we enhance the encoder module with feature pyramids, enabling multiscale feedback that refines the latent space and improves the quality of generated explanations. Third, we conduct comprehensive experiments on both the generator and the classifier, demonstrating the clinical relevance and effectiveness of ProjectedEx in enhancing interpretability and supporting the adoption of AI in medical settings.

Overview of the proposed network.

Environment Setup

For easy setup, pull the prebuilt Docker image:

docker pull qiyi007/prostateca1:latest

Data

The dataset should be organized as follows:

  data
   └── ProstateCa
       ├── train
       │   ├── 1.jpg
       │   ├── 2.jpg
       │   ├── ...
       └── valid
           ├── ...
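A minimal sketch for sanity-checking that the data follows the layout above. The helper name and the assumption that all images are `.jpg` files directly under each split directory are ours, not part of the repository's code:

```python
from pathlib import Path

def list_split_images(root, split):
    """Return sorted .jpg paths under <root>/ProstateCa/<split>.

    Mirrors the directory layout described in the README; `split` is
    expected to be 'train' or 'valid'.
    """
    split_dir = Path(root) / "ProstateCa" / split
    return sorted(split_dir.glob("*.jpg"))

if __name__ == "__main__":
    for split in ("train", "valid"):
        imgs = list_split_images("data", split)
        print(f"{split}: {len(imgs)} images")
```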

Training

Modify the parameters in cli.py and run:

python cli.py

Inference

First, use run_attfind_combined.ipynb to find the top 4 coordinates that have the most impact on the classifier output.

Then use all_results_notebook.ipynb to visualize the results.
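The idea behind the first step can be sketched as a simple one-at-a-time sensitivity analysis: perturb each latent coordinate in turn and rank coordinates by how much the classifier output changes. The toy linear classifier, step size, and function names below are illustrative assumptions, not the repository's actual implementation:

```python
import numpy as np

def top_impact_coordinates(classifier, z, step=1.0, k=4):
    """Return indices of the k latent coordinates whose one-at-a-time
    perturbation changes classifier(z) the most, largest impact first."""
    base = classifier(z)
    impact = np.empty(z.shape[0])
    for i in range(z.shape[0]):
        z_pert = z.copy()
        z_pert[i] += step          # perturb a single coordinate
        impact[i] = abs(classifier(z_pert) - base)
    return np.argsort(impact)[::-1][:k]  # indices sorted by descending impact

if __name__ == "__main__":
    # Toy linear classifier: coordinate impact is proportional to |weight|.
    w = np.array([0.1, 5.0, 0.0, 2.0, 0.5, 3.0])
    clf = lambda z: float(w @ z)
    print(top_impact_coordinates(clf, np.zeros(6)))  # -> [1 5 3 4]
```

In the repository this role is played by run_attfind_combined.ipynb, operating on the trained generator's latent space rather than a toy vector.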
