Code for paper "EBFT: Effective and Block-Wise Fine-Tuning for Sparse LLMs"

sunggo/EBFT

EBFT: Effective and Block-Wise Fine-Tuning for Sparse LLMs

Fine-tuning

1. LlamaV1-7B
python main.py --model decapoda-research/llama-7b-hf \
    --prune_method wanda \
    --density 0.5 \
    --sparsity_type unstructured \
    --learning_rate 0.0002 \
    --eval_zero_shot
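The command above prunes LLaMA-7B with Wanda at 0.5 density and then fine-tunes the sparse model. As a rough illustration of the block-wise idea in the paper's title — tuning each pruned block to minimize its reconstruction error against the dense block's output while keeping the sparsity mask fixed — here is a minimal NumPy sketch. The single linear layer standing in for a "block", the shapes, and the hyperparameters are all illustrative assumptions, not the repository's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Dense "teacher" weights of one block and some calibration inputs
# (shapes are arbitrary for illustration).
W_dense = rng.standard_normal((16, 32))
X = rng.standard_normal((32, 64))

# Unstructured magnitude pruning at 50% density:
# keep the larger-magnitude half of the weights.
density = 0.5
threshold = np.quantile(np.abs(W_dense), 1.0 - density)
mask = (np.abs(W_dense) >= threshold).astype(W_dense.dtype)
W = W_dense * mask  # sparse initialization

# Block-wise fine-tuning: gradient descent on the reconstruction
# error ||W X - W_dense X||^2, with the gradient masked so the
# sparsity pattern is preserved throughout.
target = W_dense @ X
init_loss = float(np.mean((W @ X - target) ** 2))
lr = 1e-3
for _ in range(200):
    err = W @ X - target
    grad = 2.0 * err @ X.T / X.shape[1]
    W -= lr * grad * mask  # only surviving weights are updated

final_loss = float(np.mean((W @ X - target) ** 2))
```

After the loop, the reconstruction loss is lower than at the sparse initialization and the pruned entries of `W` are still exactly zero; in the real setting this would be repeated block by block over the model with calibration data.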

Acknowledgments

Our implementation partially reuses Wanda's code.

