
Finetune Transformers Models with PyTorch Lightning: documentation error? #139

Open
@yfeng24816

Description


When calculating the total steps, shouldn't we use the number of batches per epoch multiplied by the number of epochs? In that case it would be self.total_steps = (len(train_loader.dataset) // tb_size) * ab_size instead of self.total_steps = (len(train_loader.dataset) // tb_size) // ab_size.

Please correct me if I am wrong anywhere.

[Screenshot of the total-steps calculation in setup() from the tutorial]

https://pytorchlightning.github.io/lightning-tutorials/notebooks/lightning_examples/text-transformers.html
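For clarity, here is a minimal sketch of how I would expect setup() to compute it. The attribute names (self.trainer.gpus, self.hparams.train_batch_size, self.trainer.datamodule) follow the tutorial and may differ across Lightning versions; I have also split the tutorial's ab_size so that gradient accumulation divides the step count while max_epochs multiplies it:

```python
def setup(self, stage=None):
    """Compute the total number of optimizer steps for the LR scheduler."""
    if stage != "fit":
        return

    # The datamodule's train dataloader is available once setup() runs.
    train_loader = self.trainer.datamodule.train_dataloader()

    # Effective batch size per optimizer step: per-device batch size times
    # the number of devices, then scaled by gradient accumulation below.
    tb_size = self.hparams.train_batch_size * max(1, self.trainer.gpus)
    ab_size = self.trainer.accumulate_grad_batches

    # Optimizer steps per epoch, then multiplied (not divided) by max_epochs.
    steps_per_epoch = (len(train_loader.dataset) // tb_size) // ab_size
    self.total_steps = steps_per_epoch * self.trainer.max_epochs
```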

cc @Borda @rohitgr7

Metadata

Assignees

No one assigned

    Labels

    Example (Example / Demo / Tutorial), good first issue (Good for newcomers), question (Further information is requested)

    Type

    No type

    Projects

    No projects

    Milestone

    No milestone

    Relationships

    None yet

    Development

    No branches or pull requests
