bfshi/TOAST

Official code for "TOAST: Transfer Learning via Attention Steering"

TOAST: Top-Down Attention Steering

This is the official codebase of TOAST, from the following paper:

TOAST: Transfer Learning via Attention Steering
Baifeng Shi, Siyu Gai, Trevor Darrell, and Xin Wang
UC Berkeley, Microsoft Research

*(Figure: attention visualizations of different transfer learning methods on a downstream task.)*

Motivation

We find that previous transfer learning methods (fine-tuning, LoRA, prompt tuning, etc.) often fail to focus on the features relevant to the downstream task (see figure above). We show that refocusing the attention on task-relevant features improves downstream performance.

What is TOAST?

TOAST is a transfer learning algorithm that adapts a large pre-trained model to a downstream task by refocusing the model's attention on task-relevant features. Specifically, TOAST freezes the pre-trained backbone and tunes only a top-down attention module to refocus the attention (see figure below).

*(Figure: TOAST architecture — frozen pre-trained backbone with a tuned top-down attention module.)*
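The idea above can be pictured as a two-pass forward: a frozen bottom-up pass, followed by a second pass whose input is modulated by a small tunable feedback module. The NumPy sketch below is purely conceptual — the function names, the mean-pooled feedback signal, and the single-layer setup are illustrative assumptions, not the repo's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, Wq, Wk, Wv):
    # Standard bottom-up self-attention (stand-in for a frozen backbone layer).
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    attn = softmax(q @ k.T / np.sqrt(k.shape[-1]))
    return attn @ v

n, d = 8, 16                      # tokens, feature dimension
x = rng.normal(size=(n, d))       # input token features

# Frozen pre-trained weights: never updated during transfer.
Wq, Wk, Wv = (rng.normal(scale=d**-0.5, size=(d, d)) for _ in range(3))

# Hypothetical tunable top-down module: the only trained parameters.
# It maps the pooled first-pass output to a per-token feedback signal.
W_feedback = rng.normal(scale=d**-0.5, size=(d, d))

# Pass 1: ordinary bottom-up forward pass.
h = self_attention(x, Wq, Wk, Wv)

# Pass 2: the pooled output is projected and added back to the input,
# steering attention in the second pass toward task-relevant features.
task_signal = h.mean(axis=0) @ W_feedback
h_steered = self_attention(x + task_signal, Wq, Wk, Wv)

print(h_steered.shape)  # (8, 16)
```

Because the backbone weights stay frozen, only `W_feedback` would receive gradients during transfer, which is what makes this kind of steering parameter-efficient.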

This repo contains

  • visual_classification: TOAST for visual classification (including transfer learning on FGVC and VTAB)
  • language_generation: TOAST for language generation (including transfer learning on Alpaca)

Links

This codebase is largely built upon:

Citation

If you find this code helpful, please consider citing our work:

```bibtex
@article{shi2023toast,
  title={TOAST: Transfer Learning via Attention Steering},
  author={Shi, Baifeng and Gai, Siyu and Darrell, Trevor and Wang, Xin},
  journal={arXiv preprint arXiv:2305.15542},
  year={2023}
}
```
