Commit

fix tp split
DesmonDay committed Jul 23, 2024
1 parent f1d65ab commit e7d96a0
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion paddlenlp/experimental/transformers/utils.py
@@ -135,7 +135,7 @@ def load_tp_checkpoint(folder, cls, config, return_numpy=False):
     """

     config = AutoConfig.from_pretrained(folder)
-    if config.tensor_parallel_degree == 1:
+    if config.tensor_parallel_degree == 1 or config.tensor_parallel_degree == -1:
         return load_sharded_checkpoint(folder, return_numpy=return_numpy)
     else:
         rank_model_path = os.path.join(folder, f"model_state.tp0{config.tensor_parallel_rank}.pdparams")
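
The change treats a tensor_parallel_degree of -1 (commonly the value when tensor parallelism is not configured) the same as 1, so checkpoint loading falls back to the full sharded checkpoint instead of looking for a per-rank TP shard. Below is a minimal sketch of that selection logic; the select_checkpoint_path helper and the model_state.pdparams fallback filename are hypothetical simplifications, not the actual PaddleNLP API, which calls load_sharded_checkpoint for the no-TP case.

import os

def select_checkpoint_path(folder, tensor_parallel_degree, tensor_parallel_rank):
    # Mirror of the fixed guard: a degree of 1 (TP disabled) or -1 (TP unset)
    # means there is no rank-specific shard, so fall back to the full
    # checkpoint (hypothetical filename used here for illustration).
    if tensor_parallel_degree in (1, -1):
        return os.path.join(folder, "model_state.pdparams")
    # Otherwise pick the shard written by this tensor-parallel rank,
    # matching the f-string in the diff above.
    return os.path.join(folder, f"model_state.tp0{tensor_parallel_rank}.pdparams")

# Example: with TP unset (degree -1) the full checkpoint is selected.
print(select_checkpoint_path("./ckpt", -1, 0))  # ./ckpt/model_state.pdparams
print(select_checkpoint_path("./ckpt", 4, 2))   # ./ckpt/model_state.tp02.pdparams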
