
Fix CSVLogger hyperparameter is logged at every write which increase latency significantly. #20594


Merged
merged 10 commits into from
Feb 26, 2025

Conversation

duydl
Contributor

@duydl duydl commented Feb 18, 2025

What does this PR do?

In CSVLogger, the save method of the experiment's ExperimentWriter is overridden so that hyperparameters are saved at every write, while log_hyperparams only updates the hparams dict, which is inefficient. Training becomes very slow when hparams contains a significantly large object and metrics are logged regularly (e.g., every hundred steps). So I believe this change is necessary.
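To make the described inefficiency concrete, here is a minimal, self-contained sketch of the pattern this PR changes. The class and attribute names are illustrative only, not the actual Lightning internals (the real writer lives in the CSV logger module and serializes to hparams.yaml):

```python
# Simplified sketch: counting how often hparams get re-serialized.
# Names here are hypothetical stand-ins for the real ExperimentWriter.

class ExperimentWriterBefore:
    """Old behavior: hparams are re-written on every metrics flush."""

    def __init__(self):
        self.hparams = {}
        self.hparams_writes = 0  # stands in for save_hparams_to_yaml calls

    def log_hparams(self, params):
        self.hparams.update(params)  # only updates the in-memory dict

    def save(self):
        # metrics flush ALSO re-serializes hparams every time
        self._save_hparams()

    def _save_hparams(self):
        self.hparams_writes += 1


class ExperimentWriterAfter:
    """New behavior: hparams are written once, when they are logged."""

    def __init__(self):
        self.hparams = {}
        self.hparams_writes = 0

    def log_hparams(self, params):
        self.hparams.update(params)
        self._save_hparams()  # serialized here, once per log_hparams call

    def save(self):
        pass  # only metrics would be flushed here

    def _save_hparams(self):
        self.hparams_writes += 1
```

With logging every hundred steps, the old writer's hparams serialization cost scales with the number of flushes and the size of the hparams object; the new writer pays it once per log_hyperparams call.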

Before submitting
  • Was this discussed/agreed via a GitHub issue? (not for typos and docs)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together?
  • Did you make sure to update the documentation with your changes? (if necessary)
  • Did you write any new necessary tests? (not for typos and docs)
  • Did you verify new and existing tests pass locally with your changes?
  • Did you list all the breaking changes introduced by this pull request?
  • Did you update the CHANGELOG? (not for typos, docs, test updates, or minor internal changes/refactors)

PR review

Anyone in the community is welcome to review the PR.
Before you start reviewing, make sure you have read the review guidelines. In short, see the following bullet-list:

Reviewer checklist
  • Is this pull request ready for review? (if not, please submit in draft mode)
  • Check that all items from Before submitting are resolved
  • Make sure the title is self-explanatory and the description concisely explains the PR
  • Add labels and milestones (and optionally projects) to the PR so it can be classified

📚 Documentation preview 📚: https://pytorch-lightning--20594.org.readthedocs.build/en/20594/

@github-actions github-actions bot added the pl Generic label for PyTorch Lightning package label Feb 18, 2025
@duydl duydl changed the title Fix CSVLogger hyperparameter is logged at every write Fix CSVLogger hyperparameter is logged at every write which increase latency significantly. Feb 18, 2025
@duydl
Contributor Author

duydl commented Feb 20, 2025

I searched the issues and believe this would fix #19240.

Member

@ethanwharris ethanwharris left a comment


Thanks @duydl! Could you add a test for this please? Let us know if you need a hand 🙂

@duydl
Contributor Author

duydl commented Feb 21, 2025

@ethanwharris Hi, for the test only one line needed adjustment, which I added in the newest commit. Now, to save hyperparams with CSVLogger, we no longer need to call logger.save; logger.log_hyperparams() alone is enough. This is more consistent with the other loggers.
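A dependency-free sketch of the updated call pattern described above: hyperparameters reach disk as soon as log_hyperparams() is called, with no explicit save() needed. The class below is a hypothetical toy mimicking CSVLogger's hparams.yaml layout, using plain-text serialization for illustration:

```python
import os
import tempfile


class TinyCSVLoggerSketch:
    """Toy stand-in for CSVLogger's hparams handling (illustrative only)."""

    def __init__(self, root):
        self.log_dir = root
        os.makedirs(root, exist_ok=True)

    def log_hyperparams(self, params):
        # hparams hit disk immediately, once per call -- no save() required
        path = os.path.join(self.log_dir, "hparams.yaml")
        with open(path, "w") as f:
            for key, value in params.items():
                f.write(f"{key}: {value}\n")


root = tempfile.mkdtemp()
logger = TinyCSVLoggerSketch(root)
logger.log_hyperparams({"lr": 0.01, "batch_size": 32})
```

Under this pattern, a separate save() is only responsible for flushing metrics, which is the consistency with other loggers the comment refers to.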


codecov bot commented Feb 22, 2025

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 79%. Comparing base (b313cd9) to head (dd828c4).
Report is 112 commits behind head on master.

❗ There is a different number of reports uploaded between BASE (b313cd9) and HEAD (dd828c4). Click for more details.

HEAD has 1821 fewer uploads than BASE
Flag               BASE (b313cd9)   HEAD (dd828c4)
cpu                           430               24
lightning                     324               18
pytest                        217                0
python3.9                     108                6
lightning_fabric               54                0
python3.11                    108                6
python3.10                     54                3
python3.12.7                  160                9
gpu                             2                0
pytorch2.1                     81                9
pytest-full                   215               24
pytorch2.2.2                   27                3
pytorch_lightning              54                6
pytorch2.3                     27                3
pytorch2.5.1                   54                6
pytorch2.4.1                   26                3
Additional details and impacted files
@@            Coverage Diff            @@
##           master   #20594     +/-   ##
=========================================
- Coverage      88%      79%     -9%     
=========================================
  Files         267      264      -3     
  Lines       23369    23311     -58     
=========================================
- Hits        20478    18364   -2114     
- Misses       2891     4947   +2056     

@duydl
Contributor Author

duydl commented Feb 22, 2025

@ethanwharris Hi, do you think this is ready to be merged?
Could you also look at #20593 to see whether the change is welcome or whether any adjustments are needed?
Thank you.

@Borda Borda merged commit 1f5add3 into Lightning-AI:master Feb 26, 2025
77 checks passed
Borda pushed a commit that referenced this pull request Mar 12, 2025
…latency significantly. (#20594)

* Move save_hparams_to_yaml to log_hparams instead of auto save with metric
* Fix params to be optional
* Adjust test
* Fix test_csv, test_no_name

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
(cherry picked from commit 1f5add3)
lexierule pushed a commit that referenced this pull request Mar 18, 2025
…latency significantly. (#20594)

* Move save_hparams_to_yaml to log_hparams instead of auto save with metric
* Fix params to be optional
* Adjust test
* Fix test_csv, test_no_name

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
(cherry picked from commit 1f5add3)
Labels
pl Generic label for PyTorch Lightning package

3 participants