Description
What happened?
SearchSpace.check_membership() raises a ValueError even when raise_error=False if a constraint check fails because a parameter referenced in the constraint's constraint_dict is missing from the parameterization. The pertinent code looks like this:
# In ax.core.search_space
def check_membership(..., raise_error=False):
    ...
    for constraint in self._parameter_constraints:
        if not constraint.check(numerical_param_dict):
            if raise_error:
                raise ValueError(f"Parameter constraint {constraint} is violated.")
            return False
but constraint.check() may itself raise a ValueError, which is not handled in check_membership():
# In ax.core.parameter_constraint.check()
def check(...):
    for parameter_name in self.constraint_dict.keys():
        if parameter_name not in parameter_dict.keys():
            raise ValueError(f"`{parameter_name}` not present in param_dict.")
This is usually not an issue, but check_membership() is called with only a subset of the parameters when computing a ContourPlot for a search space with parameter constraints:
# ax.analysis.plotly.surface.contour._prepare_data
...
features = [
    ObservationFeatures(
        parameters={
            x_parameter_name: x,
            y_parameter_name: y,
            **{
                parameter.name: select_fixed_value(parameter=parameter)
                for parameter in experiment.search_space.parameters.values()
                if not (
                    parameter.name == x_parameter_name
                    or parameter.name == y_parameter_name
                )
            },
        }
    )
    for x in xs
    for y in ys
    # Do not create features for any out of sample points
    if experiment.search_space.check_membership(
        parameterization={
            x_parameter_name: x,
            y_parameter_name: y,
        },
        raise_error=False,
        check_all_parameters_present=False,
    )
]
This will always raise a ValueError unless the constraints only involve x_parameter_name and y_parameter_name, since no other parameters are passed in the parameterization dict used for the membership check. In turn, TopSurfacesAnalysis will almost always fail outright for any experiment with parameter constraints, since there is no try/except around the computation of the individual ContourPlots.
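For illustration, here is a minimal sketch of the inconsistency using the core classes directly (constructor arguments are my best reading of the Ax source and may need adjusting; the parameter names match the full repro below):

from ax.core.parameter import ParameterType, RangeParameter
from ax.core.parameter_constraint import ParameterConstraint
from ax.core.search_space import SearchSpace

search_space = SearchSpace(
    parameters=[
        RangeParameter(
            name=name, parameter_type=ParameterType.FLOAT, lower=-10.0, upper=10.0
        )
        for name in ("x1", "x2", "x3")
    ],
    parameter_constraints=[
        # x1 + x2 + x3 <= 0
        ParameterConstraint(
            constraint_dict={"x1": 1.0, "x2": 1.0, "x3": 1.0}, bound=0.0
        )
    ],
)

# With raise_error=False one would expect a boolean result, never an exception,
# but the missing `x3` makes ParameterConstraint.check() raise before
# check_membership() can return False.
search_space.check_membership(
    parameterization={"x1": 0.0, "x2": 0.0},
    raise_error=False,
    check_all_parameters_present=False,
)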
Note that SlicePlot does not have a search space membership check:
# ax.analysis.plotly.slice._prepare_data
...
features = [
    ObservationFeatures(
        parameters={
            parameter_name: x,
            **{
                parameter.name: select_fixed_value(parameter=parameter)
                for parameter in experiment.search_space.parameters.values()
                if parameter.name != parameter_name
            },
        }
    )
    for x in xs
]
so it works but may contain samples outside the search space.
Please provide a minimal, reproducible example of the unexpected behavior.
from ax import Client, RangeParameterConfig
from ax.analysis.plotly.surface.contour import ContourPlot

# 1. Initialize the Client.
client = Client()

# 2. Configure where Ax will search (now with 3 parameters).
client.configure_experiment(
    name="three_param_booth_like",
    parameters=[
        RangeParameterConfig(
            name="x1",
            bounds=(-10.0, 10.0),
            parameter_type="float",
        ),
        RangeParameterConfig(
            name="x2",
            bounds=(-10.0, 10.0),
            parameter_type="float",
        ),
        RangeParameterConfig(
            name="x3",
            bounds=(-10.0, 10.0),
            parameter_type="float",
        ),
    ],
    parameter_constraints=['x1 + x2 + x3 <= 0']
)

# 3. Configure a metric Ax will target.
client.configure_optimization(objective="-1 * booth_like")

# 4. Run trials with a 3-parameter objective function.
for _ in range(20):
    for trial_index, parameters in client.get_next_trials(max_trials=1).items():
        # 3-variable Booth-like function
        x1, x2, x3 = parameters["x1"], parameters["x2"], parameters["x3"]
        objective_value = (
            (x1 + 2 * x2 - 7) ** 2
            + (2 * x1 + x2 - 5) ** 2
            + (x3 - 3) ** 2  # additional dimension
        )
        client.complete_trial(
            trial_index=trial_index,
            raw_data={"booth_like": objective_value},
        )

ContourPlot(
    x_parameter_name='x1',
    y_parameter_name='x2',
    metric_name='booth_like'
).compute(client._experiment, client._generation_strategy)
## Raises ValueError: `x3` not present in param_dict.
Please paste any relevant traceback/logs produced by the example provided.
ValueError Traceback (most recent call last)
Cell In[12], line 5
1 ContourPlot(
2 x_parameter_name='x1',
3 y_parameter_name='x2',
4 metric_name='booth_like'
----> 5 ).compute(client._experiment, client._generation_strategy)
File [.venv/lib/python3.12/site-packages/ax/analysis/plotly/surface/contour.py:86], in ContourPlot.compute(self, experiment, generation_strategy, adapter)
78 relevant_adapter = extract_relevant_adapter(
79 experiment=experiment,
80 generation_strategy=generation_strategy,
81 adapter=adapter,
82 )
84 metric_name = self.metric_name or select_metric(experiment=experiment)
---> 86 df = _prepare_data(
87 experiment=experiment,
88 model=relevant_adapter,
89 x_parameter_name=self.x_parameter_name,
90 y_parameter_name=self.y_parameter_name,
91 metric_name=metric_name,
92 )
94 fig = _prepare_plot(
95 df=df,
96 x_parameter_name=self.x_parameter_name,
(...) 105 display_sampled=self._display_sampled,
106 )
108 return [
109 self._create_plotly_analysis_card(
110 title=(
(...) 129 )
130 ]
File [.venv/lib/python3.12/site-packages/ax/analysis/plotly/surface/contour.py:177], in _prepare_data(experiment, model, x_parameter_name, y_parameter_name, metric_name)
155 ys = [*[sample[1] for sample in sampled], *unsampled_ys]
157 # Construct observation features for each parameter value previously chosen by
158 # fixing all other parameters to their status-quo value or mean.
159 features = [
160 ObservationFeatures(
161 parameters={
162 x_parameter_name: x,
163 y_parameter_name: y,
164 **{
165 parameter.name: select_fixed_value(parameter=parameter)
166 for parameter in experiment.search_space.parameters.values()
167 if not (
168 parameter.name == x_parameter_name
169 or parameter.name == y_parameter_name
170 )
171 },
172 }
173 )
174 for x in xs
175 for y in ys
176 # Do not create features for any out of sample points
--> 177 if experiment.search_space.check_membership(
178 parameterization={
179 x_parameter_name: x,
180 y_parameter_name: y,
181 },
182 raise_error=False,
183 check_all_parameters_present=False,
184 )
185 ]
187 predictions = model.predict(observation_features=features)
189 return none_throws(
190 pd.DataFrame.from_records(
191 [
(...) 204 ).drop_duplicates()
205 )
File [.venv/lib/python3.12/site-packages/ax/core/search_space.py:247], in SearchSpace.check_membership(self, parameterization, raise_error, check_all_parameters_present)
240 numerical_param_dict = {
241 name: float(none_throws(value))
242 for name, value in parameterization.items()
243 if self.parameters[name].is_numeric
244 }
246 for constraint in self._parameter_constraints:
--> 247 if not constraint.check(numerical_param_dict):
248 if raise_error:
249 raise ValueError(f"Parameter constraint {constraint} is violated.")
File [.venv/lib/python3.12/site-packages/ax/core/parameter_constraint.py:67], in ParameterConstraint.check(self, parameter_dict)
65 for parameter_name in self.constraint_dict.keys():
66 if parameter_name not in parameter_dict.keys():
---> 67 raise ValueError(f"`{parameter_name}` not present in param_dict.")
69 weighted_sum = sum(
70 float(parameter_dict[param]) * weight
71 for param, weight in self.constraint_dict.items()
72 )
73 # Expected `int` for 2nd anonymous parameter to call `int.__le__` but got
74 # `float`.
ValueError: `x3` not present in param_dict.
Ax Version
1.0.0
Python Version
3.12
Operating System
Ubuntu 24.04
(Optional) Describe any potential fixes you've considered to the issue outlined above.
Fixing this might not be that easy. I've thought of several fixes, but each has drawbacks:
- Fix ax.core.search_space.check_membership to try/except constraint.check(numerical_param_dict) and return False on exception:
# In ax.core.search_space.check_membership()
...
for constraint in self._parameter_constraints:
    try:
        ret = constraint.check(numerical_param_dict)
    except ValueError:
        if raise_error:
            raise
        else:
            return False
    else:
        if not ret:
            if raise_error:
                raise ValueError(f"Parameter constraint {constraint} is violated.")
            return False
This resolves the original bug with raise_error=False, and TopSurfacesAnalysis should then at least return completed analyses, but the ContourPlots will now be empty and may crash in predictions = model.predict(observation_features=features) when features is empty (since all constraint checks returned False).
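If this route is taken, the call site in _prepare_data would probably also need a guard along these lines (a hypothetical sketch, not existing Ax code; the empty-frame columns are placeholders, not the real schema):

# Hypothetical guard in ax.analysis.plotly.surface.contour._prepare_data,
# needed once the membership check is allowed to reject every grid point.
if not features:
    # Return an empty frame instead of calling predict() on an empty list;
    # the column names here are illustrative only.
    return pd.DataFrame(columns=[x_parameter_name, y_parameter_name, metric_name])

predictions = model.predict(observation_features=features)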
- Another option is to remove the search space membership check in ax.analysis.plotly.surface.contour._prepare_data. ContourPlot and SlicePlot would then behave the same way, but the result may contain features outside the search space. This is how I have currently worked around the issue.
- Add the fixed parameters to the membership check (and add the check to SlicePlot as well):
# ax.analysis.plotly.surface.contour._prepare_data
...
features = [
    ObservationFeatures(
        parameters={
            x_parameter_name: x,
            y_parameter_name: y,
            **{
                parameter.name: select_fixed_value(parameter=parameter)
                for parameter in experiment.search_space.parameters.values()
                if not (
                    parameter.name == x_parameter_name
                    or parameter.name == y_parameter_name
                )
            },
        }
    )
    for x in xs
    for y in ys
]
features = [
    obs_features
    for obs_features in features
    # Do not create features for any out of sample points
    if experiment.search_space.check_membership(
        parameterization=obs_features.parameters,  # <- now includes the fixed values
        raise_error=False,
        check_all_parameters_present=False,
    )
]
# if no features, return empty df?
This would in a sense fix the issue, but if select_fixed_value() returns a combination of fixed values that does not belong to the search space, all membership checks may fail and the features list will be empty (for instance, with the repro's constraint x1 + x2 + x3 <= 0, fixing x3 high in its range leaves only a small corner of the x1/x2 grid feasible).
- Make it easier for the user to select fixed values for parameters. I'm not exactly sure how this would work, but the user might know how to construct a feature set that belongs to the search space. Perhaps the search space could have a method that selects feature values given some frozen features:
# ax.core.search_space
# frozen_features = {x_parameter_name: x, y_parameter_name: y}
def select_fixed_values(self, frozen_features: dict[str, TParamValue]) -> ObservationFeatures:
    # By default, just use select_fixed_value like it works currently
    return ObservationFeatures(
        parameters={
            **frozen_features,
            **{
                parameter.name: select_fixed_value(parameter=parameter)
                for parameter in self.parameters.values()
                if parameter.name not in frozen_features
            },
        }
    )
The user could then override that method to provide parameter values based on frozen_features.
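As a purely hypothetical illustration (neither select_fixed_values nor this subclass exists in Ax today; the parameter names and constraint come from the repro above, and the sketch assumes x1 and x2 are the frozen, i.e. plotted, parameters):

from ax.core.observation import ObservationFeatures
from ax.core.search_space import SearchSpace


class ConstraintAwareSearchSpace(SearchSpace):
    """Hypothetical subclass overriding the proposed select_fixed_values() hook."""

    def select_fixed_values(self, frozen_features):
        x1 = float(frozen_features["x1"])
        x2 = float(frozen_features["x2"])
        x3_param = self.parameters["x3"]
        # Pick x3 as large as possible while keeping x1 + x2 + x3 <= 0,
        # clamped to the parameter bounds; combinations that cannot be made
        # feasible keep the lower bound and simply fail the membership check.
        x3 = max(x3_param.lower, min(x3_param.upper, -(x1 + x2)))
        return ObservationFeatures(parameters={**frozen_features, "x3": x3})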
It might also make sense to move the select_fixed_value function to a method on Parameter, so that it could be overridden by the user. At the moment I'm using mocks to select certain values when computing contour/slice plots.
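Roughly, that mock-based workaround looks like the sketch below (it assumes select_fixed_value is imported into the contour module under that name, so the patch target may need adjusting; the pinned values are just examples, and it only helps once the membership check itself no longer raises, e.g. after removing it as in the second option above). ContourPlot and client refer to the repro above:

from unittest import mock


def pinned_fixed_value(parameter):
    # Pin x3 to its lower bound so the fixed point tends to satisfy
    # x1 + x2 + x3 <= 0; any other parameter gets an arbitrary example value.
    if parameter.name == "x3":
        return -10.0
    return 0.0


with mock.patch(
    "ax.analysis.plotly.surface.contour.select_fixed_value",
    side_effect=pinned_fixed_value,
):
    ContourPlot(
        x_parameter_name="x1",
        y_parameter_name="x2",
        metric_name="booth_like",
    ).compute(client._experiment, client._generation_strategy)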
Pull Request
None
Code of Conduct
- I agree to follow Ax's Code of Conduct