[O] Allow backward-compatibility for torch state dict
hykilpikonna committed Oct 11, 2023
1 parent d472899 commit ea5629d
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion src/CLAPWrapper.py
@@ -98,7 +98,7 @@ def load_clap(self):

        # We unwrap the DDP model before saving. If the model is saved without being unwrapped, it must be unwrapped before `load_state_dict`:
        # Reference link: https://discuss.pytorch.org/t/how-to-load-dataparallel-model-which-trained-using-multiple-gpus/146005
-       clap.load_state_dict(model_state_dict)
+       clap.load_state_dict(model_state_dict, strict=False)

        clap.eval()  # set clap in eval mode
        tokenizer = AutoTokenizer.from_pretrained(args.text_model)
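For context on what this one-word change does: with the default `strict=True`, `Module.load_state_dict` raises a `RuntimeError` if the checkpoint's keys do not exactly match the model's parameters, so older checkpoints break as soon as the model gains or loses a submodule. With `strict=False`, mismatched keys are skipped and reported instead. The sketch below is not from the CLAP repo; `Old` and `New` are hypothetical modules invented purely to illustrate the behavior.

```python
import torch.nn as nn

class Old(nn.Module):
    """Stand-in for the architecture an old checkpoint was saved from."""
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(4, 2)

class New(nn.Module):
    """Same model after gaining a submodule absent from old checkpoints."""
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(4, 2)
        self.extra = nn.Linear(2, 2)  # not present in the old state dict

state = Old().state_dict()  # simulates loading an old checkpoint
model = New()

# strict=True (the default) would raise a RuntimeError here, complaining
# about missing keys. strict=False loads what matches and reports the rest.
result = model.load_state_dict(state, strict=False)
print(sorted(result.missing_keys))  # ['extra.bias', 'extra.weight']
print(result.unexpected_keys)       # []
```

The return value is a named tuple of `missing_keys` (parameters the model has but the checkpoint lacks, left at their freshly initialized values) and `unexpected_keys` (checkpoint entries with no matching parameter). The trade-off is that `strict=False` also silences genuine mismatches, so it is worth logging those two lists rather than discarding them.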
