A Unified Library for Parameter-Efficient and Modular Transfer Learning
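AdapterHub's adapters library is the reference implementation here. As a minimal sketch of the typical workflow, assuming the current adapters package (the checkpoint and adapter name below are arbitrary placeholders):

```python
# Sketch: attach and train a bottleneck adapter with AdapterHub's `adapters`.
# Checkpoint and adapter names are arbitrary placeholders.
import adapters
from transformers import AutoModelForSequenceClassification

model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased")
adapters.init(model)                       # retrofit adapter support onto the model
model.add_adapter("task_adapter")          # insert new bottleneck adapter modules
model.train_adapter("task_adapter")        # freeze the backbone; train only the adapter
model.set_active_adapters("task_adapter")  # route forward passes through the adapter
```

Only the small adapter modules receive gradients, which is what makes the transfer parameter-efficient.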
Code for our EMNLP 2023 Paper: "LLM-Adapters: An Adapter Family for Parameter-Efficient Fine-Tuning of Large Language Models"
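LLM-Adapters builds on HuggingFace PEFT. A hedged LoRA sketch in that style (the base checkpoint, rank, and target modules are illustrative assumptions, not the repo's defaults):

```python
# Sketch: wrap a causal LM with LoRA via HuggingFace `peft`.
# Checkpoint, rank, and target modules are illustrative assumptions.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

base = AutoModelForCausalLM.from_pretrained("gpt2")
config = LoraConfig(
    r=8,                        # rank of the low-rank update
    lora_alpha=16,              # scaling applied to the update
    target_modules=["c_attn"],  # GPT-2's fused attention projection
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(base, config)
model.print_trainable_parameters()  # only the LoRA weights are trainable
```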
Trankit is a lightweight, Transformer-based Python toolkit for multilingual Natural Language Processing
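A minimal Trankit usage sketch, following the pipeline pattern from its README (the language and example text are arbitrary):

```python
# Sketch: run Trankit's English pipeline end to end on raw text.
from trankit import Pipeline

nlp = Pipeline("english")  # downloads pretrained weights on first use
doc = nlp("Trankit segments, tags, and parses raw text.")
# The result is a plain dict: sentences -> tokens with POS tags and heads.
print(doc["sentences"][0]["tokens"][:3])
```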
PyTorch code for "VL-Adapter: Parameter-Efficient Transfer Learning for Vision-and-Language Tasks" (CVPR2022)
Collection of tools and papers related to Adapters / Parameter-Efficient Transfer Learning / Fine-Tuning
Research code for "What to Pre-Train on? Efficient Intermediate Task Selection", EMNLP 2021
Code for the NAACL 2022 paper "Efficient Hierarchical Domain Adaptation for Pretrained Language Models"
X-MIC: Cross-Modal Instance Conditioning for Egocentric Action Generalization, CVPR 2024
Code for EACL'23 paper "Udapter: Efficient Domain Adaptation Using Adapters"
This repository hosts my experiments for the project I did with OffNote Labs.
Systems submitted to IWSLT 2022 by the MT-UPC group.
Integration of Adapters into HuggingFace's Transformers (implemented in TensorFlow 2.0)
Master Thesis on "Comparing Modular Approaches for Parameter-Efficient Fine-Tuning"
Ready-to-run training scripts for Transformers and Adapters on >50 NLP tasks
Python samples for the Reports.PYTHON report builder (Python 3.10 and higher), with an embedded report designer and report viewer
Experiments for channel-based Structured Pruning Adapters
Sequencing adapters and contaminants collected from different tools for easy download