docs: Fit models with standard for loop + notebook on training loops. #382
Labels: documentation, good first issue, no-stale
We have a convenient `fit` function to train GPs against objectives. It would be good, though, to show (e.g. in the regression notebook) a simple Python for loop and a simple `lax.scan` training loop, to demonstrate that users can write their own training loops. This gives the insight that, e.g., the ConjugateMLL is something you can take `jax.grad` against and simply do gradient descent on. It would then be good to link this to a more extensive notebook exposing users to stopping gradients, bijector transformations, etc., and showing how to add a progress bar to the loop.
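A minimal sketch of the two loop styles described above, using a toy quadratic loss as a stand-in for the ConjugateMLL (the real notebook would evaluate the GPJax objective at the model's parameters instead — the `loss` function here is purely illustrative):

```python
import jax
import jax.numpy as jnp
from jax import lax

# Hypothetical stand-in objective: in the actual notebook this would be
# the ConjugateMLL evaluated on the model and data.
def loss(params):
    return jnp.sum((params - 3.0) ** 2)

grad_fn = jax.grad(loss)
lr = 0.1

# 1) Plain Python for loop: repeatedly step down the gradient.
params = jnp.zeros(2)
for _ in range(100):
    params = params - lr * grad_fn(params)

# 2) The same loop expressed with lax.scan, which traces the whole
#    trajectory once and also collects the per-step loss history.
def step(params, _):
    new_params = params - lr * grad_fn(params)
    return new_params, loss(new_params)

params_scan, history = lax.scan(step, jnp.zeros(2), None, length=100)
```

Both loops converge to the same minimiser; the `lax.scan` version has the added benefit of being JIT-compatible end to end and returning the loss history for free.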