
Trainer parameter limit_train_batches was meant to be per-worker #21022

@wingkitlee0

Description


📚 Documentation

When using Lightning with RayDDPStrategy, we found that the Trainer's limit_train_batches parameter is applied per worker rather than globally.

I want to confirm with the developers whether:

  1. this is the case for other parallel training strategies, and
  2. this applies to the other limit_*_batches parameters of the Trainer.

It would be great if the documentation mentioned this, e.g.:
https://lightning.ai/docs/pytorch/stable/common/trainer.html#limit-train-batches
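For concreteness, here is a minimal sketch of the behavior in question; the worker and batch counts are hypothetical, and the per-worker interpretation is what we observed, not confirmed behavior:

```python
# Minimal sketch (hypothetical counts): limit_train_batches appears to count
# batches per worker, so the global batch count scales with the worker count.
import lightning.pytorch as pl

trainer = pl.Trainer(
    max_epochs=1,
    limit_train_batches=100,  # with e.g. 4 Ray workers: up to 4 * 100 = 400 batches globally
)
# Under RayDDPStrategy (from ray.train.lightning), each of the N workers runs
# its own fit loop, and each appears to stop after 100 batches locally.
```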

Thanks

cc @lantiga @Borda @justusschock
