Thinking about what @quantumjot mentioned today about separating training from inference, this is exactly where this could happen.
To keep the run script flexible in the short term, we could say that if a `model_save_fn` is given in the config file (and maybe an inference-only flag), then run the script from here. Further down the line (once we know Grace is working as expected), we can think about how best to abstract each piece.
This comment is not for this PR, just something to keep in mind for the next step.
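The config-driven branch described above could look something like this minimal sketch. All names here (`main`, `run_training`, `run_inference`, the `model_save_fn` and `inference_only` config keys) are illustrative assumptions, not the project's actual API:

```python
def run_training(save_path):
    # Placeholder for the training entry point; a real implementation
    # would train the model and save it to save_path.
    return f"training (will save to {save_path})"


def run_inference(model_path):
    # Placeholder for the inference entry point; a real implementation
    # would load the saved model from model_path and run predictions.
    return f"inference (loading {model_path})"


def main(config):
    """Dispatch to training or inference based on the config file.

    Hypothetical sketch: if the config carries a model_save_fn and an
    inference-only flag, skip training and run inference directly.
    """
    model_save_fn = config.get("model_save_fn")
    if config.get("inference_only", False):
        if model_save_fn is None:
            raise ValueError(
                "inference-only mode needs model_save_fn in the config"
            )
        return run_inference(model_path=model_save_fn)
    return run_training(save_path=model_save_fn)
```

This keeps a single entry point while the training/inference split is worked out, so existing configs without the flag keep behaving as before.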
Originally posted by @crangelsmith in #250 (comment)