[Unified checkpoint] Turn off unified checkpoint when using sharding stage3 (PaddlePaddle#7969)

* turn off unified checkpoint when sharding stage3

* fix
DesmonDay committed Feb 6, 2024
1 parent b39e701 commit fc1a81b
Showing 1 changed file with 10 additions and 0 deletions.
paddlenlp/trainer/training_args.py (10 additions, 0 deletions)
@@ -1290,6 +1290,16 @@ def is_segment_parallel_supported():
         else:
             paddle.distributed.init_parallel_env()
 
+        if (
+            self.unified_checkpoint
+            and self.sharding_parallel_degree > 0
+            and ShardingOption.FULL_SHARD in self.sharding
+        ):
+            logger.warning(
+                "Unified checkpoint currently does not support sharding stage3; setting `unified_checkpoint` to False."
+            )
+            self.unified_checkpoint = False
+
         if self.unified_checkpoint:
             unified_checkpoint_config = set(self.unified_checkpoint_config.split(" "))
             for x in unified_checkpoint_config:
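As a usage illustration (not part of this commit; the exact argument spellings below are assumptions based on the fields visible in the diff), a configuration that combines sharding stage3 with the unified checkpoint would now hit this guard:

    # Hypothetical sketch: TrainingArguments with sharding stage3 and unified
    # checkpoint enabled. "stage3" resolves to ShardingOption.FULL_SHARD during
    # __post_init__, so the new check logs the warning and flips the flag.
    from paddlenlp.trainer import TrainingArguments

    args = TrainingArguments(
        output_dir="./checkpoints",
        sharding="stage3",        # maps to ShardingOption.FULL_SHARD
        unified_checkpoint=True,  # expected to be forced back to False
    )
    # In a multi-card launch where sharding_parallel_degree resolves to > 0,
    # args.unified_checkpoint ends up False and the warning above is emitted.
    print(args.unified_checkpoint)

In a single-process run the sharding degree may not exceed 0, so the guard may not fire; the reset is only expected under an actual sharded launch.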
