ReturnnTrainingJob info, more robust LR log loading (#549)
In my case, I had some np.float64(...) entries in it (for the effective LR).
I am not sure how I got those (maybe due to a newer PyTorch/NumPy),
and I already pushed a RETURNN change to avoid this in the future,
but I think we can handle them here as well.
albertz authored Oct 17, 2024
1 parent 229d490 commit 7e067e8
Showing 1 changed file with 3 additions and 1 deletion.
returnn/training.py: 3 additions, 1 deletion
@@ -287,6 +287,8 @@ def _get_run_cmd(self):
         return run_cmd

     def info(self):
+        import numpy as np
+
         def try_load_lr_log(file_path: str) -> Optional[dict]:
             # Used in parsing the learning rates
             @dataclass
@@ -296,7 +298,7 @@ class EpochData:

             try:
                 with open(file_path, "rt") as file:
-                    return eval(file.read().strip())
+                    return eval(file.read().strip(), {"EpochData": EpochData, "np": np})
             except FileExistsError:
                 return None
             except FileNotFoundError:

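For context, the fix can be sketched as a small standalone script. The idea is that the learning-rate log is a Python expression containing `EpochData(...)` and, with newer PyTorch/NumPy, sometimes `np.float64(...)` entries; passing both names into `eval`'s globals lets such logs be reconstructed. Note this is a hedged sketch: the `EpochData` fields below are hypothetical stand-ins, not the exact RETURNN definitions.

```python
import os
import tempfile
from dataclasses import dataclass, field
from typing import Optional

import numpy as np


@dataclass
class EpochData:
    # Hypothetical minimal stand-in for the EpochData class used in the real log.
    learningRate: float
    error: dict = field(default_factory=dict)


def try_load_lr_log(file_path: str) -> Optional[dict]:
    """Parse a learning-rate log, tolerating np.float64(...) entries."""
    try:
        with open(file_path, "rt") as file:
            # Passing EpochData and np into eval's globals allows both
            # EpochData(...) and np.float64(...) entries to be evaluated.
            return eval(file.read().strip(), {"EpochData": EpochData, "np": np})
    except FileNotFoundError:
        return None


# Demo: a log entry containing an np.float64 value.
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as f:
    f.write("{1: EpochData(learningRate=np.float64(0.001), error={'dev_score': 0.5})}")
    path = f.name

data = try_load_lr_log(path)
os.unlink(path)
print(float(data[1].learningRate))
```

Without the extra globals dict, `eval` would raise a `NameError` on `np` (and on `EpochData`), which is exactly the failure mode this commit makes more robust.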