Add AdaMSS tuner with Adaptive Subspace Allocation (ASA) support #2967
Conversation
Thank you for your PR @LonglongaaaGo. We're currently off for the holidays, so a proper review will have to wait for next year. I did skim the code though and just wanted to add a few comments:
@LonglongaaaGo Please ping me when the PR is ready for review.
Hey @BenjaminBossan, sure! I will let you know once it is ready. Thank you!!!
This PR was replaced by #2987.
Paper title: AdaMSS: Adaptive Multi-Subspace Approach for Parameter-Efficient Fine-Tuning
Paper: https://neurips.cc/virtual/2025/loc/san-diego/poster/119606
GitHub page: https://github.com/jzheng20/AdaMSS/tree/main
Summary
This PR adds AdaMSS (Adaptive Multi-Subspace Approach) as a new PEFT tuner with optional ASA (Adaptive Subspace Allocation) for dynamic subspace selection during training.
Implementation
New Tuner: AdaMSS
ASA Features (Optional)
- ASACallback for the Transformers Trainer
- update_and_allocate() method following the AdaLora convention

Files Modified
Added:
- src/peft/tuners/adamss/config.py
- src/peft/tuners/adamss/layer.py
- src/peft/tuners/adamss/model.py
- src/peft/tuners/adamss/asa_callback.py
- src/peft/tuners/adamss/__init__.py

Modified:
- src/peft/__init__.py - Export AdaMSSConfig, AdaMSSModel, ASACallback
- src/peft/tuners/__init__.py - Export AdaMSS tuner
- src/peft/utils/peft_types.py - Add ADAMSS PeftType
- src/peft/tuners/adamss/__init__.py - Register AdaMSS with register_peft_method()

Usage Examples
Basic Usage (No ASA)
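The original code block under this heading did not survive extraction. Below is a hedged sketch of what a minimal, ASA-free setup likely looks like, following the usual PEFT pattern (config object plus get_peft_model). The AdaMSSConfig field names (r, num_subspaces, target_modules, use_asa) are assumptions, not the PR's verified signature; a stand-in dataclass is used so the sketch runs without the PR branch installed.

```python
from dataclasses import dataclass, field

# Stand-in for peft.AdaMSSConfig (exported by this PR via src/peft/__init__.py).
# All field names below are assumptions for illustration only.
@dataclass
class AdaMSSConfig:
    r: int = 8                      # per-subspace rank (assumed)
    num_subspaces: int = 4          # number of candidate subspaces (assumed)
    target_modules: list = field(default_factory=lambda: ["q_proj", "v_proj"])
    use_asa: bool = False           # ASA disabled for basic usage

config = AdaMSSConfig()

# With the PR branch installed, usage would follow the standard PEFT pattern:
# from peft import AdaMSSConfig, get_peft_model
# model = get_peft_model(base_model, config)
# model.print_trainable_parameters()

print(config.use_asa)
```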
With ASA - Callback Pattern
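The callback-pattern example is also missing from this extract. With the PR installed, ASACallback would presumably be passed to the Trainer through the standard transformers callbacks list; the runnable stand-in below only illustrates the mechanism such a callback likely wraps (calling update_and_allocate() once per optimizer step), with fake classes in place of the PR's AdaMSSModel and ASACallback.

```python
# Real usage would look roughly like (not runnable without the PR branch):
# from transformers import Trainer
# from peft import ASACallback
# trainer = Trainer(model=model, args=args, train_dataset=ds,
#                   callbacks=[ASACallback()])

class FakeAdaMSSModel:
    """Stand-in for the PR's AdaMSSModel; counts allocation calls."""
    def __init__(self):
        self.allocate_calls = 0

    def update_and_allocate(self, global_step):
        self.allocate_calls += 1  # the real model would reselect subspaces here

class StandInASACallback:
    """Illustrative callback: fires the ASA hook at the end of each step."""
    def __init__(self, model):
        self.model = model

    def on_step_end(self, global_step):
        self.model.update_and_allocate(global_step)

model = FakeAdaMSSModel()
cb = StandInASACallback(model)
for step in range(10):   # toy "training loop" standing in for Trainer.train()
    cb.on_step_end(step)

print(model.allocate_calls)  # -> 10
```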
With ASA - Standard Pattern
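The "standard" (callback-free) pattern presumably mirrors AdaLora's manual training loop, since the PR's design notes state that update_and_allocate() follows the AdaLora convention: the hook is invoked right after each optimizer step. The sketch below is runnable via a stand-in model; the real forward/backward and optimizer calls are shown as comments.

```python
class FakeAdaMSSModel:
    """Stand-in for the PR's AdaMSSModel, recording the last step seen."""
    def __init__(self):
        self.last_step_seen = None

    def update_and_allocate(self, global_step):
        self.last_step_seen = global_step  # real model reallocates subspaces here

model = FakeAdaMSSModel()
num_steps = 5
for global_step in range(num_steps):
    # loss = model(batch); loss.backward()       # real forward/backward pass
    # optimizer.step(); optimizer.zero_grad()    # real optimizer update
    model.update_and_allocate(global_step)       # ASA hook, AdaLora-style

print(model.last_step_seen)  # -> 4
```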
Algorithm Details
AdaMSS Decomposition
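The decomposition details were not preserved in this extract. As a rough, unverified sketch: a multi-subspace tuner plausibly expresses the weight update as a gated sum of low-rank subspace terms, delta_W = sum_k g_k * B_k @ A_k, generalizing LoRA's single B @ A pair, with the gates g_k selecting which subspaces are active. The pure-Python toy below illustrates only that assumed structure, not the PR's actual math.

```python
def matmul(X, Y):
    """Plain-Python matrix multiply for the toy example."""
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*Y)]
            for row in X]

def adamss_delta(subspaces, gates):
    """Assumed form of the update: delta_W = sum_k g_k * (B_k @ A_k),
    where each (B_k, A_k) is a low-rank pair and g_k is a selection gate."""
    delta = None
    for (B, A), g in zip(subspaces, gates):
        term = matmul(B, A)
        if delta is None:
            delta = [[g * v for v in row] for row in term]
        else:
            delta = [[dv + g * v for dv, v in zip(drow, row)]
                     for drow, row in zip(delta, term)]
    return delta

# Two rank-1 subspaces on a 2x2 weight; the gate vector keeps only subspace 0.
B1, A1 = [[1.0], [0.0]], [[2.0, 3.0]]
B2, A2 = [[0.0], [1.0]], [[5.0, 7.0]]
delta = adamss_delta([(B1, A1), (B2, A2)], gates=[1.0, 0.0])
print(delta)  # -> [[2.0, 3.0], [0.0, 0.0]]
```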
ASA Schedule
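The schedule specifics are likewise missing here. Because the PR states that update_and_allocate() follows the AdaLora convention, a plausible ASA schedule is AdaLora's three-phase budget plan: hold the full budget during an initial warmup, shrink it with a cubic decay, then hold the final budget for the remaining steps. The function below sketches that assumed schedule; the phase boundaries and parameter names are illustrative, not the PR's actual hyperparameters.

```python
def asa_budget(step, total_steps, init_budget, final_budget, t_init, t_final):
    """Assumed AdaLora-style budget schedule for ASA:
    warmup -> cubic decay -> final fine-tuning phase."""
    if step <= t_init:
        return init_budget                       # warmup: keep full budget
    if step >= total_steps - t_final:
        return final_budget                      # final phase: target budget
    frac = (step - t_init) / (total_steps - t_final - t_init)
    mul = (1.0 - frac) ** 3                      # cubic decay toward target
    return int(final_budget + (init_budget - final_budget) * mul)

# Example: 8 active subspaces shrinking to 2 over 1000 training steps.
print(asa_budget(0, 1000, 8, 2, 100, 100))      # warmup -> 8
print(asa_budget(950, 1000, 8, 2, 100, 100))    # final phase -> 2
```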
Design Notes
- BaseTuner and BaseTunerLayer patterns consistent with other PEFT tuners
- update_and_allocate() method follows the AdaLora convention for dynamic allocation