
Unable to access dataloaders within configure_optimizers #10430

Closed
@rohitgr7

Description


Proposed refactoring or deprecation

Before v1.5, the dataloader hooks were patched onto the model and were easily accessible within configure_optimizers to set up the total number of training steps for a scheduler. Now that they are no longer patched,
https://github.com/PyTorchLightning/pytorch-lightning/blob/0ed5e3dc8abcec40aacd64cc9175590bb1409759/pytorch_lightning/trainer/connectors/data_connector.py#L213-L224
they are no longer directly available if a datamodule is used or if the dataloaders are passed directly to .fit. Nor can they be accessed via self.trainer.train_dataloaders, because the dataloaders are loaded inside the Fit Loop:
https://github.com/PyTorchLightning/pytorch-lightning/blob/0ed5e3dc8abcec40aacd64cc9175590bb1409759/pytorch_lightning/loops/fit_loop.py#L194-L197
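
For reference, here is a minimal sketch of the pattern this breaks; the scheduler choice and hyperparameters are illustrative, not part of the report:

```python
import torch
import pytorch_lightning as pl


class LitModel(pl.LightningModule):
    def configure_optimizers(self):
        optimizer = torch.optim.AdamW(self.parameters(), lr=1e-3)
        # Before v1.5, this call worked even when the dataloaders came
        # from a datamodule or were passed directly to .fit, because the
        # hook was patched onto the model. That is what no longer works.
        num_batches = len(self.train_dataloader())
        total_steps = num_batches * self.trainer.max_epochs
        scheduler = torch.optim.lr_scheduler.OneCycleLR(
            optimizer, max_lr=1e-3, total_steps=total_steps
        )
        return [optimizer], [{"scheduler": scheduler, "interval": "step"}]
```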

Motivation

I'd suggest that these dataloaders be available to users, no matter how they were passed to .fit.

Pitch

If possible, we should call configure_optimizers after the dataloaders are loaded for the first time within the fit loop. I'm not sure whether that would introduce complications or failures, since we load the optimizers differently for DeepSpeed.
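
Under that ordering, something like the following would work uniformly. This is a hypothetical sketch, assuming the fit loop has already populated self.trainer.train_dataloaders (the attribute referenced above) by the time configure_optimizers runs:

```python
import torch
import pytorch_lightning as pl


class LitModel(pl.LightningModule):
    def configure_optimizers(self):
        optimizer = torch.optim.AdamW(self.parameters(), lr=1e-3)
        # Hypothetical: with configure_optimizers called after the fit
        # loop has loaded the dataloaders, this would work regardless of
        # whether the loaders came from the model, from a datamodule, or
        # were passed directly to .fit.
        num_batches = len(self.trainer.train_dataloaders)  # assumed to be the loaded loader
        total_steps = num_batches * self.trainer.max_epochs
        scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(
            optimizer, T_max=total_steps
        )
        return [optimizer], [{"scheduler": scheduler, "interval": "step"}]
```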

Additional context

As always, alternative suggestions/thoughts would be appreciated :)

cc: @karthikrangasai @awaelchli

