[feature] wrapping cost function to return best value explored #33

Merged
Changes from 10 commits

Commits (25)
8d8cf99
[feature] wrapping cost function to return best value explored
VolodyaCO Jan 2, 2023
ff18230
[fix] complying with minimize signature
VolodyaCO Jan 2, 2023
4ce8bd7
[feature] saving best value and params only if params are bounded and…
VolodyaCO Jan 3, 2023
0a5ce54
Merge branch 'main' into fix/#29/wrap-cost-function-to-return-best-va…
VolodyaCO Jan 3, 2023
29fa0a3
[fix] correct way of checking bounds
VolodyaCO Jan 3, 2023
03e016f
[test] testing edge case of the langermann function
VolodyaCO Jan 3, 2023
0bb0216
[fix] fixes test
VolodyaCO Jan 3, 2023
a7bfb2b
[fix] inequality constraint fails if function is less than 0 only
VolodyaCO Jan 3, 2023
7026892
[style] length of comment
VolodyaCO Jan 3, 2023
c2311d0
[feature] expose argument to control if best parameters are stored
VolodyaCO Jan 16, 2023
3c84347
[feature] typing
VolodyaCO Jan 16, 2023
fa7da9b
[feature] wrapping cost function to return best value explored
VolodyaCO Jan 2, 2023
718322c
[fix] complying with minimize signature
VolodyaCO Jan 2, 2023
b015210
[feature] saving best value and params only if params are bounded and…
VolodyaCO Jan 3, 2023
f0afb09
[fix] correct way of checking bounds
VolodyaCO Jan 3, 2023
a4ff2a7
[test] testing edge case of the langermann function
VolodyaCO Jan 3, 2023
f17b02d
[fix] fixes test
VolodyaCO Jan 3, 2023
de20411
[fix] inequality constraint fails if function is less than 0 only
VolodyaCO Jan 3, 2023
e46d423
[style] length of comment
VolodyaCO Jan 3, 2023
c878851
[feature] expose argument to control if best parameters are stored
VolodyaCO Jan 16, 2023
9818a7a
[feature] typing
VolodyaCO Jan 16, 2023
3ee4bad
[test] asserting that the opt value is the minimum of the history
VolodyaCO Jun 2, 2023
3905209
Merge branch 'fix/#29/wrap-cost-function-to-return-best-val-scipy-opt…
VolodyaCO Jun 2, 2023
8192721
[style] mypy corrections
VolodyaCO Jun 2, 2023
6e6a611
[fix] seeding ttopt
VolodyaCO Jun 2, 2023
108 changes: 104 additions & 4 deletions src/orquestra/opt/optimizers/scipy_optimizer.py
@@ -1,6 +1,7 @@
################################################################################
# © Copyright 2022 Zapata Computing Inc.
################################################################################
import warnings
from typing import Callable, Dict, Optional, Sequence, Tuple, Union

import numpy as np
@@ -14,6 +15,88 @@
)
from ..history.recorder import RecorderFactory
from ..history.recorder import recorder as _recorder
from .pso.continuous_pso_optimizer import _get_bounds_like_array


class _CostFunctionWithBestValueType(type):
def __instancecheck__(cls, obj: object) -> bool:
if (
isinstance(obj, Callable)
and hasattr(obj, "best_value")
and hasattr(obj, "best_params")
):
return True
return False
Review comment (Contributor):

This is a nice little piece of code! But I'm afraid it might be needlessly opaque, since isinstance is only used once here, and just for an assert statement. Metaclasses are considered "heavy machinery", so it's best not to use them when they could be replaced with a simple if statement.

Contributor Author (VolodyaCO):

If I remove this and use a plain instance check instead, I get some linting problems 😢

Contributor:

Yes, but since you only use this isinstance once, it's not really worth overriding the __instancecheck__ method, right?

Contributor Author (VolodyaCO, May 2, 2023):

I had to do this trick for the tensor train optimiser PR too (#31). Do you want me to instead have a custom isinstanceCostFunctionWithBestValue function, or something like that?

Contributor:

No, it's probably fine. Not worth holding up the PR for this.
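
For readers unfamiliar with the pattern under discussion, here is a minimal, self-contained sketch (illustrative only, not part of the diff) of what the metaclass buys: `isinstance` becomes a structural, duck-typed check, so any callable exposing `best_value` and `best_params` passes, with no subclassing required.

```python
import numpy as np


class _BestValueMeta(type):
    """Same pattern as _CostFunctionWithBestValueType in the diff above."""

    def __instancecheck__(cls, obj: object) -> bool:
        return (
            callable(obj)
            and hasattr(obj, "best_value")
            and hasattr(obj, "best_params")
        )


class BestValueLike(metaclass=_BestValueMeta):
    pass


class MyWrappedCost:  # note: does NOT inherit from BestValueLike
    best_value: float = np.inf
    best_params: np.ndarray = np.empty(1)

    def __call__(self, x: np.ndarray) -> float:
        return float(np.sum(x**2))


# The metaclass makes the check purely structural:
assert isinstance(MyWrappedCost(), BestValueLike)
assert not isinstance(lambda x: x, BestValueLike)  # lacks the two attributes
```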



class _CostFunctionWithBestValue(metaclass=_CostFunctionWithBestValueType):
best_value: float = np.inf
best_params: np.ndarray = np.empty(1)

def __init__(
self,
cost_function: Union[CallableWithGradient, Callable],
constraints: Optional[Tuple[Dict[str, Callable]]],
bounds: Union[
scipy.optimize.Bounds,
Sequence[Tuple[float, float]],
None,
] = None,
) -> None:
self.cost_function = cost_function
# Inherit all attributes from the cost function
for attr in dir(cost_function):
if not attr.startswith("__"):
if attr in [
"best_value",
"best_params",
"_are_params_bounded",
"_are_params_constrained",
]:
warnings.warn(
f"Attribute {attr} of the cost function is being overwritten "
"by the optimizer.",
UserWarning,
)
setattr(self, attr, getattr(cost_function, attr))
self.best_params.fill(np.nan)
self.constraints = constraints
self.bounds = _get_bounds_like_array(bounds) if bounds is not None else None

def __call__(self, parameters: np.ndarray) -> float:
value = self.cost_function(parameters)
if (
value < self.best_value
and self._are_params_bounded(parameters)
and self._are_params_constrained(parameters)
):
self.best_value = value
self.best_params = parameters
return value

def _are_params_bounded(self, params: np.ndarray) -> bool:
if self.bounds:
return bool(
np.all(
np.bitwise_and(params >= self.bounds[0], params <= self.bounds[1])
)
)
return True

def _are_params_constrained(self, params: np.ndarray) -> bool:
if self.constraints:
for constraint in self.constraints:
constraint_args = constraint.get("args", ())
assert isinstance(
constraint_args, tuple
), "If you pass args to the constraint, they must be a tuple."
if constraint["type"] == "eq":
if not np.isclose(constraint["fun"](params, *constraint_args), 0):
return False
elif constraint["type"] == "ineq":
if constraint["fun"](params, *constraint_args) < 0:
return False
return True
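
A hedged usage sketch of the wrapper defined above (the class and its signature come from this diff; the toy cost function and sample points are illustrative, and assume `_get_bounds_like_array` accepts a sequence of `(low, high)` pairs, as the tests below pass to the optimizer):

```python
import numpy as np


def cost(x: np.ndarray) -> float:
    return float(x[0])  # toy cost: just the first coordinate


wrapped = _CostFunctionWithBestValue(cost, constraints=None, bounds=[(-1.0, 1.0)])
wrapped(np.array([0.5]))   # 0.5 -> recorded as best so far
wrapped(np.array([-5.0]))  # lower value, but out of bounds: best unchanged
wrapped(np.array([-0.9]))  # in bounds and lower -> new best
print(wrapped.best_value, wrapped.best_params)  # -0.9 [-0.9]
```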


class ScipyOptimizer(Optimizer):
@@ -28,6 +111,7 @@ def __init__(
] = None,
options: Optional[Dict] = None,
recorder: RecorderFactory = _recorder,
store_best_parameters: bool = True,
):
"""
Integration with scipy optimizers. Documentation for this module is minimal,
@@ -40,6 +124,8 @@ def __init__(
options: dictionary with additional options for the optimizer.
recorder: recorder object which defines how to store
the optimization history.
store_best_parameters: whether to wrap the cost function to store the best
parameters the optimizer has seen so far.

""" # noqa: E501
super().__init__(recorder=recorder)
@@ -49,6 +135,17 @@ def __init__(
self.options = options
self.constraints = [] if constraints is None else constraints
self.bounds = bounds
self.store_best_parameters = store_best_parameters

def _preprocess_cost_function(
self, cost_function: Union[CallableWithGradient, Callable]
) -> Union[CallableWithGradient, Callable, _CostFunctionWithBestValue]:
if self.store_best_parameters:
return _CostFunctionWithBestValue(
cost_function, self.constraints, self.bounds
)
else:
return cost_function

def _minimize(
self,
@@ -66,7 +163,6 @@ def _minimize(
evaluations should be recorded.

"""

jacobian = None
if isinstance(cost_function, CallableWithGradient) and callable(
getattr(cost_function, "gradient")
@@ -82,8 +178,12 @@
bounds=self.bounds,
jac=jacobian,
)
opt_value = result.fun
opt_params = result.x
if isinstance(cost_function, _CostFunctionWithBestValue):
opt_value = cost_function.best_value
opt_params = cost_function.best_params
else:
opt_value = result.fun
opt_params = result.x

nit = result.get("nit", None)
nfev = result.get("nfev", None)
@@ -93,5 +193,5 @@
opt_params=opt_params,
nit=nit,
nfev=nfev,
**construct_history_info(cost_function, keep_history) # type: ignore
**construct_history_info(cost_function, keep_history), # type: ignore
)
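
A short usage sketch of the new `store_best_parameters` flag (the constructor argument and the best-value behaviour are taken from the diff above; the cost function and method choice are illustrative). With the flag on, `opt_value` and `opt_params` come from the best evaluation recorded by the wrapped cost function, so they can never be worse than a point the optimizer actually visited:

```python
import numpy as np

from orquestra.opt.optimizers.scipy_optimizer import ScipyOptimizer


def cost(x: np.ndarray) -> float:
    return float(np.sum(x**2))


optimizer = ScipyOptimizer(method="Powell", store_best_parameters=True)
result = optimizer.minimize(cost, np.array([1.0, 1.0]))
# opt_value/opt_params are the best ones seen during the run, not merely
# whatever scipy's result object reported.
print(result.opt_value, result.opt_params)
```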
30 changes: 29 additions & 1 deletion tests/orquestra/opt/optimizers/scipy_optimizer_test.py
@@ -20,6 +20,7 @@
{"method": "L-BFGS-B"},
{"method": "Nelder-Mead"},
{"method": "SLSQP"},
{"method": "Powell"},
{"method": "COBYLA", "options": {"maxiter": 50000, "tol": 1e-7}},
]
)
@@ -89,7 +90,7 @@ def test_SLSQP_with_inequality_constraints(self):
cost_function = FunctionWithGradient(
rosenbrock_function, finite_differences_gradient(rosenbrock_function)
)
constraints = {"type": "ineq", "fun": lambda x: x[0] + x[1] - 3}
constraints = ({"type": "ineq", "fun": lambda x: x[0] + x[1] - 3},)
optimizer = ScipyOptimizer(method="SLSQP")
initial_params = np.array([0, 0])

@@ -107,3 +108,30 @@ def test_SLSQP_with_inequality_constraints(self):
results_with_constraints.opt_value, abs=1e-1
)
assert results_with_constraints.opt_params.sum() >= 3
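
A side note on why the constraint above is now wrapped in a tuple (illustrative sketch, not from the PR): scipy's `minimize` accepts a bare constraint dict, but the new `_are_params_constrained` iterates over `constraints`, and iterating a bare dict yields its keys rather than constraint dicts.

```python
bad = {"type": "ineq", "fun": lambda x: x[0] + x[1] - 3}
for constraint in bad:
    print(constraint)  # prints "type", then "fun" -- the dict's keys

good = ({"type": "ineq", "fun": lambda x: x[0] + x[1] - 3},)
for constraint in good:
    print(constraint["type"])  # prints "ineq", as the wrapper expects
```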

# https://github.com/scipy/scipy/issues/17673 reports a function for which,
# starting from a given initial point, the value returned by the Powell
# optimizer is higher than the cost at that initial point. This test ensures
# that our layer on top of scipy is not affected by this issue.
@pytest.mark.parametrize("method", ["Powell", "L-BFGS-B", "Nelder-Mead", "SLSQP"])
def test_Langermann_function(self, method):
# Definition of the function
langer_c = np.array([6, 1, 4, 4, 8])
langer_A = np.array(
[[4, 6, 3, 5], [8, 7, 9, 9], [2, 7, 8, 8], [9, 2, 6, 9], [5, 4, 1, 4]]
)

def langermann(parameters: np.ndarray) -> float:
# This function is bounded in the [0, 10] box
sum_ = np.sum(np.subtract(langer_A, parameters) ** 2, axis=1)
vec = np.exp(-1 / np.pi * sum_) * np.cos(np.pi * sum_)
result = np.dot(langer_c, vec)
return result

# Initial point
x0 = np.array([4.65116802, 4.42985893, 1.74720157, 4.29727392])
y0 = langermann(x0)

optimiser = ScipyOptimizer(method=method, bounds=[(0, 10)] * 4)
result = optimiser.minimize(langermann, x0)
assert result.opt_value <= y0
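
For context, a hedged sketch of what the raw scipy call (without this PR's wrapper) can do on the same function, per the issue linked above (assumption: the regression reproduces with these exact inputs on a given scipy version):

```python
import numpy as np
import scipy.optimize

langer_c = np.array([6, 1, 4, 4, 8])
langer_A = np.array(
    [[4, 6, 3, 5], [8, 7, 9, 9], [2, 7, 8, 8], [9, 2, 6, 9], [5, 4, 1, 4]]
)


def langermann(parameters: np.ndarray) -> float:
    sum_ = np.sum(np.subtract(langer_A, parameters) ** 2, axis=1)
    vec = np.exp(-1 / np.pi * sum_) * np.cos(np.pi * sum_)
    return float(np.dot(langer_c, vec))


x0 = np.array([4.65116802, 4.42985893, 1.74720157, 4.29727392])
raw = scipy.optimize.minimize(langermann, x0, method="Powell", bounds=[(0, 10)] * 4)
# Per the scipy issue, raw.fun may come out *above* langermann(x0); the
# wrapper introduced in this PR prevents reporting such a value.
print(langermann(x0), raw.fun)
```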