🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning (a minimal LoRA usage sketch appears after this list).
A Unified Library for Parameter-Efficient and Modular Transfer Learning
An optimized deep prompt tuning strategy comparable to fine-tuning across scales and tasks
A plug-and-play library for parameter-efficient tuning (Delta Tuning)
A novel method for tuning language models. Code and datasets for the paper "GPT Understands, Too" (the basic idea, trainable continuous prompts with a frozen backbone, is sketched after this list).
Live Training for Open-source Big Models
A collection of parameter-efficient transfer learning papers focusing on computer vision and multimodal domains.
Research Trends in LLM-guided Multimodal Learning.
Collection of Tools and Papers related to Adapters / Parameter-Efficient Transfer Learning / Fine-Tuning (a minimal adapter module is sketched after this list)
K-CAI NEURAL API - A Keras-based neural network API that lets you create parameter-efficient, memory-efficient, FLOPs-efficient multipath models with new layer types. Plenty of examples and documentation are included.
CodeUp: A Multilingual Code Generation Llama2 Model with Parameter-Efficient Instruction-Tuning on a Single RTX 3090
This repository surveys papers focusing on Prompting and Adapters for Speech Processing.
On Transferability of Prompt Tuning for Natural Language Processing
[CVPR2024] The code of "UniPT: Universal Parallel Tuning for Transfer Learning with Efficient Parameter and Memory"
[arXiv] Cross-Modal Adapter for Text-Video Retrieval
Code for the ACL 2022 paper "Continual Sequence Generation with Adaptive Compositional Modules"
INTERSPEECH 2023 - Repurpose Whisper to recognize new tasks with adapters!
Code for the EACL 2023 paper "UDAPTER: Efficient Domain Adaptation Using Adapters"
CAMERO: Consistency Regularized Ensemble of Perturbed Language Models with Weight Sharing (ACL 2022)
This repository contains the source code for the paper "Grouped Pointwise Convolutions Reduce Parameters in Convolutional Neural Networks".
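As a concrete taste of the flagship 🤗 PEFT library listed above, here is a minimal LoRA fine-tuning sketch. The base model ("gpt2") and the hyperparameter values are illustrative assumptions, not recommendations:

```python
from transformers import AutoModelForCausalLM
from peft import LoraConfig, TaskType, get_peft_model

# Load a base model; any causal LM on the Hub works the same way.
base = AutoModelForCausalLM.from_pretrained("gpt2")

config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=8,              # rank of the low-rank update matrices (illustrative)
    lora_alpha=16,    # scaling factor applied to the update
    lora_dropout=0.1,
)

model = get_peft_model(base, config)
model.print_trainable_parameters()
# Prints something like:
# trainable params: 294,912 || all params: 124,734,720 || trainable%: 0.2364
```

The wrapped model trains with a standard transformers Trainer, and model.save_pretrained(...) stores only the small adapter weights rather than a full model copy.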
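Several entries above (P-Tuning, deep prompt tuning) share the idea of training continuous prompt embeddings while the language model stays frozen. The wrapper below is a hypothetical, shallow illustration of that idea; deep prompt tuning additionally injects prompts at every transformer layer rather than only at the input:

```python
import torch
import torch.nn as nn

class SoftPromptWrapper(nn.Module):
    """Prepend trainable prompt embeddings to the input of a frozen backbone.
    Assumes a Hugging Face-style model exposing get_input_embeddings() and
    accepting inputs_embeds; the class name is invented for this sketch."""

    def __init__(self, model, num_virtual_tokens=20):
        super().__init__()
        self.model = model
        for p in self.model.parameters():  # freeze every backbone weight
            p.requires_grad = False
        hidden = model.config.hidden_size
        self.prompt = nn.Parameter(0.02 * torch.randn(num_virtual_tokens, hidden))

    def forward(self, input_ids, attention_mask, **kwargs):
        embeds = self.model.get_input_embeddings()(input_ids)
        batch = embeds.size(0)
        prompt = self.prompt.unsqueeze(0).expand(batch, -1, -1)
        inputs_embeds = torch.cat([prompt, embeds], dim=1)
        prompt_mask = attention_mask.new_ones(batch, self.prompt.size(0))
        mask = torch.cat([prompt_mask, attention_mask], dim=1)
        return self.model(inputs_embeds=inputs_embeds, attention_mask=mask, **kwargs)
```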
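Likewise, many of the adapter repositories above build on one core block: a small bottleneck module inserted inside each transformer layer, with everything else frozen. A generic Houlsby-style sketch (sizes are illustrative):

```python
import torch
import torch.nn as nn

class BottleneckAdapter(nn.Module):
    """Generic bottleneck adapter: down-project, nonlinearity, up-project,
    residual connection. Only these small matrices are trained."""

    def __init__(self, hidden_size, bottleneck_size=64):
        super().__init__()
        self.down = nn.Linear(hidden_size, bottleneck_size)
        self.up = nn.Linear(bottleneck_size, hidden_size)
        self.act = nn.GELU()
        nn.init.zeros_(self.up.weight)  # start as a near-identity function
        nn.init.zeros_(self.up.bias)

    def forward(self, hidden_states):
        return hidden_states + self.up(self.act(self.down(hidden_states)))
```

Because the up-projection starts at zero, inserting the adapter leaves the pretrained model's behavior unchanged at initialization, which stabilizes early training.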