[bugfix] show_fix should set num_workers to 0 #226
Conversation
Codecov Report
@@            Coverage Diff            @@
##             master     #226   +/-   ##
=========================================
  Coverage          ?   86.61%
=========================================
  Files             ?       57
  Lines             ?     2750
  Branches          ?        0
=========================================
  Hits              ?     2382
  Misses            ?      368
  Partials          ?        0
=========================================
LGTM 😃 small nits
iterator = datamodule._reset_iterator(RunningStage.TRAINING)
assert isinstance(iterator, torch.utils.data.dataloader._SingleProcessDataLoaderIter)
iterator = iter(datamodule.train_dataloader())
assert isinstance(iterator, torch.utils.data.dataloader._MultiProcessingDataLoaderIter)
I'd add a small assert checking that internally the number of workers has been kept as expected.
That's what the `_MultiProcessingDataLoaderIter` assertion already checks: PyTorch only builds that iterator type when `num_workers > 0`.
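For reference, a version of the suggested assertion might look like this; a minimal sketch reusing the fixtures from the test above (`datamodule`, `RunningStage`) and relying on PyTorch's private `_num_workers` attribute on the base iterator class, which is not a public API:

```python
import torch

# Sketch only: `datamodule` and `RunningStage` come from the test under
# review, and `_num_workers` is a private attribute of PyTorch's
# `_BaseDataLoaderIter`, so it may change between releases.
iterator = datamodule._reset_iterator(RunningStage.TRAINING)
assert isinstance(iterator, torch.utils.data.dataloader._SingleProcessDataLoaderIter)
assert iterator._num_workers == 0  # the show path runs in-process

iterator = iter(datamodule.train_dataloader())
assert isinstance(iterator, torch.utils.data.dataloader._MultiProcessingDataLoaderIter)
assert iterator._num_workers > 0  # user-configured workers are kept
```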
What does this PR do?
Fixes # (issue)
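Based on the title, the change forces the `show_*` visualization path to fetch batches in-process by rebuilding its dataloader with `num_workers=0`; a minimal sketch of that pattern follows, with all names hypothetical rather than taken from the actual diff:

```python
from torch.utils.data import DataLoader

# Hypothetical helper illustrating the fix the title describes: the show
# path gets a dataloader with num_workers=0, so iterating it yields a
# _SingleProcessDataLoaderIter, while train_dataloader() keeps the
# user-configured num_workers.
def build_show_dataloader(loader: DataLoader) -> DataLoader:
    return DataLoader(
        loader.dataset,
        batch_size=loader.batch_size,
        collate_fn=loader.collate_fn,
        num_workers=0,  # fetch batches in-process for visualization
    )
```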
Before submitting
PR review
Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.
Did you have fun?
Make sure you had fun coding 🙃