
Commit d67b21a

Quentin-Anthony, dashstander, tjruwase, and loadams authored and committed
Remove PP Grad Tail Check (deepspeedai#2538)
* Only communicate grad tail if it exists

Co-authored-by: Dashiell Stander <dash.stander@gmail.com>

* Revert previous patch and just always send the grad tail

* Formatting

---------

Co-authored-by: Dashiell Stander <dash.stander@gmail.com>
Co-authored-by: Olatunji Ruwase <olruwase@microsoft.com>
Co-authored-by: Logan Adams <114770087+loadams@users.noreply.github.com>
1 parent 9145fdd commit d67b21a

File tree

1 file changed: +1 −1 lines changed


deepspeed/runtime/pipe/engine.py

Lines changed: 1 addition & 1 deletion
@@ -988,7 +988,7 @@ def _exec_send_grads(self, buffer_id):
         if isinstance(inputs, tuple):
             first_input = inputs[0]
             assert all([torch.is_tensor(elt) for elt in inputs[1:]])
-            inputs_grad_tail = [elt.grad for elt in inputs[1:] if elt.grad is not None]
+            inputs_grad_tail = [elt.grad for elt in inputs[1:]]
         elif torch.is_tensor(inputs):
             first_input = inputs
             inputs_grad_tail = []
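
Below is a minimal sketch in plain PyTorch (not the DeepSpeed engine; the tensor names `a` and `b` are hypothetical) of why the `elt.grad is not None` filter was dropped: if the grad tail only includes populated gradients, its length depends on which inputs actually received gradients during backward, whereas collecting one entry per tail input keeps the number of tensors fixed.

# Minimal sketch (not DeepSpeed code): the tail length should not depend
# on which grads happen to exist. All names below are hypothetical.
import torch

first = torch.ones(2)                      # handled separately as inputs[0]
a = torch.ones(2, requires_grad=True)      # participates in the loss
b = torch.ones(2, requires_grad=True)      # does not, so b.grad stays None
a.sum().backward()

inputs = (first, a, b)

# Old behavior: the filter drops None grads, so the tail length varies.
old_tail = [elt.grad for elt in inputs[1:] if elt.grad is not None]
# New behavior: always one entry per tail input (entries may be None).
new_tail = [elt.grad for elt in inputs[1:]]

print(len(old_tail), len(new_tail))  # 1 2

Under this sketch's assumptions, the new list comprehension guarantees len(new_tail) == len(inputs) - 1 regardless of the backward pass, which appears to be the motivation for always sending the grad tail rather than checking it first.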
