Add Sequential Monte Carlo Optimizer
Andreas Størksen Stordal committed Oct 9, 2023
1 parent 9ee335a commit f5bbe75
Showing 2 changed files with 496 additions and 1 deletion.
30 changes: 29 additions & 1 deletion popt/update_schemes/optimizers.py
@@ -85,7 +85,35 @@ def apply_update(self, control, gradient, **kwargs):
        self.temp_velocity = beta*self.velocity + alpha*gradient
        new_control = control + self.temp_velocity
        return new_control


    def apply_smc_update(self, control, gradient, **kwargs):
        """
        Apply a Sequential Monte Carlo (SMC) update to the control parameter.

        Arguments
        -------------------------------------------------------------------------------------
        control : 1-D array_like
            The current value of the parameter being optimized.
        gradient : 1-D array_like (same shape as control)
            The gradient of the objective function with respect to the control parameter.
        **kwargs : dict
            Additional keyword arguments.

        Returns
        -------------------------------------------------------------------------------------
        1-D array_like (same shape as control)
            The new value of the control parameter after the update.
        """
        alpha = self._step_size

        # apply update: convex combination of the current control and the gradient
        new_control = alpha * control + (1 - alpha) * gradient
        return new_control

    def apply_backtracking(self):
        """
        Apply backtracking by reducing step size and momentum temporarily.
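For context, the new `apply_smc_update` rule is a convex combination of the current control and the supplied gradient, weighted by the step size. Below is a minimal standalone sketch of that rule; the `step_size` value and the toy `control`/`gradient` arrays are illustrative assumptions and not part of this commit, and the sketch does not reproduce the repository's optimizer class.

import numpy as np

# illustrative values, not taken from the commit
step_size = 0.1                        # plays the role of self._step_size (alpha)
control = np.array([0.5, 1.0, 2.0])    # current control vector
gradient = np.array([0.4, 0.8, 1.5])   # "gradient" input as used by apply_smc_update

# the SMC-style update from the diff: convex combination of control and gradient
new_control = step_size * control + (1 - step_size) * gradient
print(new_control)  # [0.41 0.82 1.55]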
