Closed
Labels: bug (Something isn't working)
Description
Describe the bug
When loading a model directly from a checkpoint I get an error: "OSError: Checkpoint does not contain hyperparameters. Are your model hyperparameters stored in self.hparams?"
But my model clearly has the hparams.
To Reproduce
Just create a model, save a checkpoint, and try to load it as explained in the documentation:
pretrained_model = MyLightningModule.load_from_checkpoint(
    checkpoint_path='/path/to/pytorch_checkpoint.ckpt'
)
Possible reason
I found this code in trainer_io.py at line 301:
try:
    torch.save(checkpoint, filepath)
except AttributeError:
    if 'hparams' in checkpoint:
        del checkpoint['hparams']
    torch.save(checkpoint, filepath)
Obviously, if the code that saves the checkpoint deletes the hparams, the load function will not find them later...
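For context on why that except branch fires at all: torch.save serializes with pickle under the hood, so any unpicklable value inside hparams (a lambda, a local function, an open file handle, etc.) makes the whole save raise AttributeError, and the fallback then silently drops hparams. A minimal sketch of that failure mode, using plain pickle instead of torch.save (the checkpoint dict here is a hypothetical stand-in for what Lightning writes):

```python
import pickle


def make_unpicklable():
    # functions defined inside another function cannot be pickled,
    # which is one common way an AttributeError sneaks into torch.save
    def local_fn():
        pass
    return local_fn


# hypothetical checkpoint dict mirroring the structure Lightning saves
checkpoint = {
    "state_dict": {},  # model weights would go here
    "hparams": {"lr": 0.01, "callback": make_unpicklable()},
}

try:
    pickle.dumps(checkpoint)  # torch.save pickles the same dict
except AttributeError as err:
    # this is the error path in trainer_io.py that silently deletes 'hparams'
    print("save failed:", err)
```

So the user never sees an error at save time, only the confusing OSError at load time.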
Expected behavior
A more concise way to easily load a checkpoint without the need for the load_from_metrics function.
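One possible direction for a fix, instead of deleting hparams wholesale: keep only the entries that pickle cleanly, so load_from_checkpoint still finds the rest. A minimal sketch (the helper name is hypothetical, not Lightning API):

```python
import pickle


def picklable_hparams(hparams: dict) -> dict:
    """Return a copy of hparams with only the entries that pickle cleanly,
    rather than dropping the whole dict on the first failure."""
    clean = {}
    for key, value in hparams.items():
        try:
            pickle.dumps(value)
            clean[key] = value
        except (AttributeError, TypeError, pickle.PicklingError):
            pass  # skip only the unpicklable entry
    return clean
```

With this, a checkpoint saved from hparams containing one bad entry would still retain 'lr' and friends, and loading would work without load_from_metrics.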