Issues: FluxML/Optimisers.jl
#167 Adam optimizer can produce NaNs with Float16 due to small epsilon (opened Feb 17, 2024 by pevnak)
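A minimal sketch of the arithmetic behind this report, in Python/NumPy rather than the package's Julia code: the conventional Adam default epsilon of 1e-8 is smaller than Float16's smallest subnormal (about 6e-8), so it rounds to zero, and a zero gradient on the first step then yields a 0/0 update. The update formula `m / (sqrt(v) + eps)` is the standard Adam rule; the variable names here are illustrative, not Optimisers.jl's.

```python
import numpy as np

# Adam's conventional default epsilon underflows to zero in half precision:
eps = np.float16(1e-8)
print(eps)  # 0.0

# One Adam step with a zero gradient: both moment estimates stay zero,
# so the update is 0 / (sqrt(0) + 0) = NaN.
g = np.float16(0.0)
m = np.float16(0.9) * np.float16(0.0) + np.float16(0.1) * g      # first moment
v = np.float16(0.999) * np.float16(0.0) + np.float16(0.001) * g**2  # second moment
with np.errstate(invalid="ignore"):
    update = m / (np.sqrt(v) + eps)
print(update)  # nan
```

The usual fix is an epsilon at or above the precision's subnormal range (or accumulating optimizer state in a wider type).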
#161 Document destructure handling shared parameters differently to ComponentArrays.jl [documentation] (opened Oct 9, 2023 by mcabbott)
#143 Utility for walking a tree (e.g. gradients) w.r.t. a model [enhancement] (opened Apr 17, 2023 by darsnack)
#140 nothing does not correspond to updating the state with a zero gradient (opened Apr 7, 2023 by CarloLucibello)
#123 Wrong model update for BatchNorm for some specific syntax [bug] (opened Dec 2, 2022 by jeremiedb)
#108 Split out the rules.jl as a sub-package (or a separate package)? (opened Aug 29, 2022 by chengchingwen)
#95 Consistency in the type behavior of restructure [documentation] (opened Jul 1, 2022 by ChrisRackauckas)
#84 doc improvement: working with custom model types [documentation] (opened Jun 2, 2022 by CarloLucibello)
#77 Add ArrowTypes.jl dependency to serialize optimizers? [enhancement] (opened May 20, 2022 by ericphanson)
#72 destructure's gradient is confused by trainable [bug] (opened May 1, 2022 by mcabbott)