
Simple gradient descent works for callables without gradient #30

Conversation

@VolodyaCO (Contributor) commented Dec 29, 2022

Description

Previously, the simple gradient descent optimiser accepted both plain callables and callables with gradient. However, when a plain callable was passed, gradients were never computed by a numerical approximation routine, so the optimisation failed. This PR wraps an incoming cost function that is a plain callable, converting it into a callable with gradient by using finite differences.
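The wrapping described above can be sketched as follows. This is a minimal illustration of converting a plain callable into a callable with gradient via central finite differences; the class name `FiniteDifferenceFunction` and its interface are hypothetical, not the actual orquestra-opt API.

```python
import numpy as np


class FiniteDifferenceFunction:
    """Illustrative sketch (not the orquestra-opt implementation): wraps a
    plain cost function so it also exposes a gradient method, approximated
    by central finite differences."""

    def __init__(self, cost_function, epsilon=1e-7):
        self.cost_function = cost_function
        self.epsilon = epsilon

    def __call__(self, parameters):
        # Calls pass straight through to the wrapped cost function.
        return self.cost_function(parameters)

    def gradient(self, parameters):
        parameters = np.asarray(parameters, dtype=float)
        grad = np.empty_like(parameters)
        for i in range(parameters.size):
            shift = np.zeros_like(parameters)
            shift[i] = self.epsilon
            # Central difference: (f(x + h*e_i) - f(x - h*e_i)) / (2h)
            grad[i] = (
                self.cost_function(parameters + shift)
                - self.cost_function(parameters - shift)
            ) / (2 * self.epsilon)
        return grad
```

For example, wrapping `lambda x: np.sum(x ** 2)` yields a gradient of approximately `[2.0, 4.0]` at the point `[1.0, 2.0]`, matching the analytic gradient `2x`.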

Please verify that you have completed the following steps

  • I have self-reviewed my code.
  • I have included test cases validating introduced feature/fix.
  • I have updated documentation.

@VolodyaCO added the bug (Something isn't working) and enhancement (New feature or request) labels on Dec 29, 2022
@VolodyaCO self-assigned this on Dec 29, 2022
@codecov-commenter commented Jan 3, 2023

Codecov Report

Base: 94.95% // Head: 94.02% // Decreases project coverage by 0.93% ⚠️

Coverage data is based on head (f5f1fa4) compared to base (76e96c9).
Patch coverage: 63.33% of modified lines in pull request are covered.

Additional details and impacted files
@@            Coverage Diff             @@
##             main      #30      +/-   ##
==========================================
- Coverage   94.95%   94.02%   -0.93%     
==========================================
  Files          28       28              
  Lines        1011     1038      +27     
==========================================
+ Hits          960      976      +16     
- Misses         51       62      +11     
Impacted Files Coverage Δ
...rquestra/opt/optimizers/simple_gradient_descent.py 78.43% <62.06%> (-21.57%) ⬇️
...tra/opt/optimizers/pso/continuous_pso_optimizer.py 84.21% <100.00%> (ø)


@VolodyaCO VolodyaCO requested review from AthenaCaesura and removed request for mstechly January 13, 2023 14:19
VolodyaCO and others added 3 commits January 16, 2023 08:59
  • Co-authored-by: Athena Caesura <athena.caesura@zapatacomputing.com>
  • …-to-callable-with-gradient' of https://github.com/zapatacomputing/orquestra-opt into fix/volodyaco/simple-gradient-descent-converts-function-to-callable-with-gradient
@VolodyaCO VolodyaCO merged commit 4f3e561 into main Mar 15, 2023
@VolodyaCO VolodyaCO deleted the fix/volodyaco/simple-gradient-descent-converts-function-to-callable-with-gradient branch March 15, 2023 23:00
3 participants