When using Lightning with `RayDDPStrategy`, we found that the Trainer's `limit_train_batches` parameter is applied per worker rather than globally: with N workers, roughly N × `limit_train_batches` batches are processed in total (see the sketch below).
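A minimal sketch of the kind of setup where we observed this, using Ray's `ray.train.lightning` integration; the toy module, dataset sizes, and worker count are illustrative assumptions, not from our actual run:

```python
import torch
import pytorch_lightning as pl
from torch.utils.data import DataLoader, TensorDataset
from ray.train import ScalingConfig
from ray.train.torch import TorchTrainer
from ray.train.lightning import RayDDPStrategy, RayLightningEnvironment, prepare_trainer


class ToyModule(pl.LightningModule):
    # Illustrative model; any LightningModule exhibits the same behavior.
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(8, 1)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.mse_loss(self.layer(x), y)

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.01)


def train_loop_per_worker(config):
    dataset = TensorDataset(torch.randn(10_000, 8), torch.randn(10_000, 1))
    trainer = pl.Trainer(
        max_epochs=1,
        # Observed: this limit is applied per worker, so with 4 workers
        # ~400 batches are processed in total, not 100.
        limit_train_batches=100,
        strategy=RayDDPStrategy(),
        plugins=[RayLightningEnvironment()],
        enable_progress_bar=False,
    )
    trainer = prepare_trainer(trainer)
    trainer.fit(ToyModule(), train_dataloaders=DataLoader(dataset, batch_size=32))


TorchTrainer(
    train_loop_per_worker,
    scaling_config=ScalingConfig(num_workers=4),
).fit()
```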
I'd like to confirm with the developers whether:

1. this is also the case with other parallel training strategies, and
2. this applies to the other `limit_*_batches` parameters of the Trainer (e.g. `limit_val_batches`, `limit_test_batches`).