
Conversation

@calpt calpt commented Jul 13, 2024

Changes needed for sync:

  • remove setting `_hf_peft_config_loaded` for HF Trainer
  • fix BEiT `interpolate_pos_encoding`
  • add sdpa to GPT-2
  • add `LlamaForTokenClassification` head conversion
  • copy changes to Mistral implementation
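
For reference, a minimal smoke test exercising two of the items above (GPT-2 sdpa support and the BEiT `interpolate_pos_encoding` fix). The checkpoint names and adapter setup are illustrative assumptions, not part of this PR:

```python
# Illustrative sketch only; checkpoints and setup are assumptions, not from this PR.
import torch
import adapters
from transformers import BeitModel, GPT2Model

# GPT-2 can now be loaded with the sdpa attention implementation.
gpt2 = GPT2Model.from_pretrained("gpt2", attn_implementation="sdpa")
adapters.init(gpt2)  # make the model adapter-capable

# BEiT accepts interpolate_pos_encoding for inputs larger than the pretraining resolution.
beit = BeitModel.from_pretrained("microsoft/beit-base-patch16-224")
pixel_values = torch.randn(1, 3, 384, 384)
outputs = beit(pixel_values, interpolate_pos_encoding=True)
print(outputs.last_hidden_state.shape)
```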

@calpt calpt added the sync label Jul 13, 2024
lenglaender and others added 5 commits July 16, 2024 17:21
If this should result in any new errors, put the line back in and set `self._hf_peft_config_loaded = False` in the `save_pretrained` function
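
If that fallback ever becomes necessary, it would look roughly like the sketch below; the enclosing mixin class is hypothetical and not taken from the adapters source:

```python
# Schematic fallback only; the enclosing class is hypothetical.
class AdapterSaveMixin:
    def save_pretrained(self, save_directory, **kwargs):
        # Reset the flag so the HF Trainer does not route saving through
        # the PEFT code path for adapter models.
        self._hf_peft_config_loaded = False
        return super().save_pretrained(save_directory, **kwargs)
```
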
@calpt calpt marked this pull request as ready for review July 20, 2024 18:12
@calpt calpt merged commit 1a7d24e into adapter-hub:main Jul 27, 2024
@calpt calpt deleted the sync/v4.42.x branch August 4, 2024 17:25
dainis-boumber added a commit to ReDASers/adapters that referenced this pull request Aug 30, 2024
Co-authored-by: Leon Engländer <leon.englaender@gmail.com>