🔍 Explainability in Vision Transformers

This repository contains an interpretability pipeline for Vision Transformers (ViTs) using a pretrained DeiT-Tiny model. It implements both Attention Rollout and Gradient Attention Rollout, and introduces a quantitative metric, the Foreground/Background Attention Fractions (FAF/BAF), which uses Mask R-CNN segmentation masks to measure how well ViTs focus on meaningful object regions.
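For context, a minimal sketch of Attention Rollout (Abnar & Zuidema, 2020): per layer, the attention heads are averaged, an identity matrix is added to account for the residual connection, rows are renormalized, and the resulting matrices are multiplied across layers. The snippet below assumes the per-layer attention tensors have already been captured (for example via forward hooks on a timm `deit_tiny_patch16_224` model); the `attention_rollout` name and tensor shapes are illustrative, not this repository's exact API.

```python
import torch

def attention_rollout(attentions):
    """Attention Rollout sketch.

    attentions: list of per-layer attention tensors, each of shape
    (batch, heads, tokens, tokens), e.g. 12 layers of (1, 3, 197, 197)
    for DeiT-Tiny (196 patches + 1 CLS token).
    """
    result = torch.eye(attentions[0].size(-1))
    with torch.no_grad():
        for attn in attentions:
            # Fuse heads by averaging, then add the identity for the
            # residual connection and renormalize each row.
            fused = attn.mean(dim=1)[0]                      # (tokens, tokens)
            fused = fused + torch.eye(fused.size(-1))
            fused = fused / fused.sum(dim=-1, keepdim=True)
            result = fused @ result                          # accumulate layers
    # Attention flowing from the CLS token to each image patch,
    # reshaped onto the 14x14 patch grid.
    return result[0, 1:].reshape(14, 14)
```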

The project also compares explanation behaviour between CNNs 🧠 and Vision Transformers 🤖, highlighting how transformers rely on global patch interactions rather than localized feature extraction. Experiments show that gradient-based rollout achieves ~25% higher foreground alignment, producing more precise and class-specific attribution maps.
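Gradient Attention Rollout makes the map class-specific by weighting each layer's attention with the gradient of the target class score before rolling out, which is what drives the improved foreground alignment reported above. A hedged sketch, assuming matching lists of attentions and their gradients have been captured during a forward/backward pass (names are illustrative):

```python
import torch

def gradient_rollout(attentions, gradients):
    """Gradient Attention Rollout sketch.

    attentions, gradients: matching lists of (batch, heads, tokens, tokens)
    tensors recorded while backpropagating the target class score.
    """
    result = torch.eye(attentions[0].size(-1))
    with torch.no_grad():
        for attn, grad in zip(attentions, gradients):
            # Keep only positively contributing attention, fuse heads,
            # then add the residual identity and renormalize as before.
            weighted = (grad * attn).clamp(min=0).mean(dim=1)[0]
            weighted = weighted + torch.eye(weighted.size(-1))
            weighted = weighted / weighted.sum(dim=-1, keepdim=True)
            result = weighted @ result
    return result[0, 1:].reshape(14, 14)  # CLS-to-patch map
```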


✨ Features

  • 🔄 Attention Rollout & Gradient Attention Rollout implementations
  • 🧩 Patch-level attention visualization on the 14×14 ViT grid
  • 🎯 FAF/BAF metric using Mask R-CNN for quantitative semantic focus evaluation (sketched after this list)
  • ⚖️ Comparison of interpretability behaviour between CNNs and ViTs
  • 📈 Reproducible experiments with visualizations
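One plausible reading of the FAF/BAF metric, consistent with the description above: upsample the 14×14 rollout map to image resolution, take the union of confident Mask R-CNN instance masks as foreground, and report the fractions of total attention mass falling inside (FAF) and outside (BAF) that region. The sketch below uses torchvision's pretrained Mask R-CNN; the `faf_baf` name and the 0.5 score threshold are assumptions, not necessarily this repository's exact implementation.

```python
import torch
import torch.nn.functional as F
from torchvision.models.detection import maskrcnn_resnet50_fpn

def faf_baf(attention_map, image, score_thresh=0.5):
    """Compute (FAF, BAF) for one image, under the assumptions above.

    attention_map: (14, 14) rollout map; image: (3, H, W) float in [0, 1].
    """
    model = maskrcnn_resnet50_fpn(weights="DEFAULT").eval()
    with torch.no_grad():
        pred = model([image])[0]
    # Union of confident instance masks -> binary foreground mask.
    masks = pred["masks"][pred["scores"] > score_thresh]  # (N, 1, H, W)
    if masks.numel() == 0:
        return 0.0, 1.0  # no detected object: everything counts as background
    fg = (masks.sum(dim=0)[0] > 0.5).float()              # (H, W)
    # Upsample the patch-level attention map to pixel resolution and
    # normalize it into a distribution over pixels.
    attn = F.interpolate(attention_map[None, None], size=fg.shape,
                         mode="bilinear", align_corners=False)[0, 0]
    attn = attn / attn.sum()
    faf = (attn * fg).sum().item()
    return faf, 1.0 - faf
```

By this definition FAF + BAF = 1, so a higher FAF directly indicates that more attention mass lands on semantically meaningful object regions.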
