Simple gradient descent works for callables without gradient #30
Conversation
Codecov Report
Base: 94.95% // Head: 94.02% // Decreases project coverage by -0.93%.

@@           Coverage Diff            @@
##             main     #30     +/-   ##
==========================================
- Coverage   94.95%   94.02%   -0.93%
==========================================
  Files          28       28
  Lines        1011     1038      +27
==========================================
+ Hits          960      976      +16
- Misses         51       62      +11
Description
Previously, the simple gradient descent optimiser accepted both plain callables and callables with gradient. However, when a plain callable was passed, the optimiser did not fall back to a numerical approximation of the gradient, so the optimisation failed. This PR wraps an incoming cost function that is a plain callable, converting it into a callable with gradient using finite differences.
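As a rough illustration of the wrapping idea (a minimal sketch, not the actual orquestra-opt implementation; the names `CallableWithGradient` and `finite_differences_gradient` below are hypothetical stand-ins), a plain callable can be given a `.gradient` method via central finite differences:

```python
import numpy as np


def finite_differences_gradient(cost_function, epsilon=1e-5):
    """Return a function approximating the gradient of cost_function
    with central finite differences (hypothetical helper, for illustration)."""

    def gradient(parameters):
        parameters = np.asarray(parameters, dtype=float)
        grad = np.zeros_like(parameters)
        for i in range(parameters.size):
            shift = np.zeros_like(parameters)
            shift[i] = epsilon
            # Central difference: (f(x + eps*e_i) - f(x - eps*e_i)) / (2*eps)
            grad[i] = (
                cost_function(parameters + shift)
                - cost_function(parameters - shift)
            ) / (2 * epsilon)
        return grad

    return gradient


class CallableWithGradient:
    """Wraps a plain callable so it exposes a .gradient method,
    matching the interface a gradient-based optimiser expects
    (illustrative stand-in, not necessarily the library's class)."""

    def __init__(self, function, gradient):
        self._function = function
        self.gradient = gradient

    def __call__(self, parameters):
        return self._function(parameters)


# Usage: wrap a gradient-free cost function before optimisation.
cost = lambda x: float(np.sum(x ** 2))
wrapped = CallableWithGradient(cost, finite_differences_gradient(cost))
print(wrapped(np.array([1.0, 2.0])))           # 5.0
print(wrapped.gradient(np.array([1.0, 2.0])))  # approximately [2.0, 4.0]
```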