Plugging new parametrization inside instrumentation and optimization #391

Merged: 150 commits, merged on Jan 17, 2020

Commits
2c50068
Prepare a new instrumentation pattern
jrapin Nov 14, 2019
f12070d
Value as property
jrapin Nov 14, 2019
fc87a3e
setter
jrapin Nov 14, 2019
da3e7b7
Random state and constraint
jrapin Nov 15, 2019
c3ab658
naming
jrapin Nov 15, 2019
41a86b4
nits
jrapin Nov 15, 2019
7f0f9a7
Split core features and secondary features
jrapin Nov 15, 2019
fce691b
More explicit structure
jrapin Nov 16, 2019
a27e0c1
missing
jrapin Nov 16, 2019
1c822e9
Feature test
jrapin Nov 16, 2019
1c7c523
Feature test
jrapin Nov 16, 2019
38020a6
fix
jrapin Nov 16, 2019
9c10389
Add name setter
jrapin Nov 16, 2019
1292002
Compliant dict
jrapin Nov 16, 2019
c185d7e
simplified spawn
jrapin Nov 16, 2019
4cbf118
Solve random_state propagation
jrapin Nov 16, 2019
20c315e
Solve random_state propagation
jrapin Nov 16, 2019
9ce1cd6
Constraint support
jrapin Nov 16, 2019
402c2f4
Array recombination
jrapin Nov 16, 2019
7b10af6
recombine
jrapin Nov 16, 2019
233dfc5
ParametersList
jrapin Nov 16, 2019
9cbfece
Starting choice parameter
jrapin Nov 16, 2019
9760530
Renaming mel-syd
jrapin Dec 1, 2019
ef9920e
Add generation + renaming (han-par)
jrapin Dec 7, 2019
6c8c55f
Added preliminary choice test
jrapin Dec 8, 2019
b9748d0
Better test
jrapin Dec 8, 2019
22d8b4a
Simplified access
jrapin Dec 9, 2019
ac71642
More simplification
jrapin Dec 9, 2019
900c025
Better access
jrapin Dec 9, 2019
3e33192
Start solving Choice test
jrapin Dec 9, 2019
ea74e47
Fixing all Choice main tests
jrapin Dec 9, 2019
6959a6a
Merge branch 'master' into instrumentation_v3
jrapin Dec 10, 2019
c4da7b9
nits
jrapin Dec 10, 2019
eafd9ce
Add instrumentation class
jrapin Dec 10, 2019
6ebc342
Restructure
jrapin Dec 10, 2019
b00178a
Merge branch 'master' into instrumentation_v3
jrapin Dec 10, 2019
7ad71a7
Simplify spawn
jrapin Dec 10, 2019
834582f
nits
jrapin Dec 10, 2019
91b08c6
Basic implementation of scalar
jrapin Dec 10, 2019
159e79d
Merge branch 'master' into instrumentation_v3
jrapin Dec 11, 2019
a01539d
Adding exponent to array
jrapin Dec 11, 2019
87c7c63
Log distributed data
jrapin Dec 11, 2019
56c7c69
Warning for bounds
jrapin Dec 11, 2019
b4ee7e1
Typing fix
jrapin Dec 11, 2019
5e78ac0
Better bound warning
jrapin Dec 11, 2019
9798394
Default to mutable sigma
jrapin Dec 11, 2019
bc3ea96
Better yet
jrapin Dec 11, 2019
e91c8f7
Samping
jrapin Dec 11, 2019
b11a59d
nit
jrapin Dec 11, 2019
4c01ca3
mypyfix
jrapin Dec 12, 2019
a2b6593
Convert other instances to std
jrapin Dec 12, 2019
9ba6758
Correction
jrapin Dec 12, 2019
4f9608e
Better tested conversion
jrapin Dec 12, 2019
51c0f00
Warning against lambdas
jrapin Dec 12, 2019
83ef082
Simplified initialization
jrapin Dec 12, 2019
a8d2511
Remove Ng prefix for simplicitys sake
jrapin Dec 12, 2019
98278f1
Disallow logarithmic with non-positive value
jrapin Dec 12, 2019
3ee9e5f
Activate other bounds, yet to be tested
jrapin Dec 12, 2019
ceacae3
Tested constraints
jrapin Dec 12, 2019
fe32abb
Removed warning
jrapin Dec 12, 2019
610effa
typing
jrapin Dec 12, 2019
813f6f3
Restructure as parametrization
jrapin Dec 13, 2019
9381212
Remove useless ignore
jrapin Dec 13, 2019
7868f87
Cast to int + beginning docstrings
jrapin Dec 13, 2019
c59d61a
Hopefully working typing
jrapin Dec 13, 2019
e16265a
Non-deterministic by default
jrapin Dec 13, 2019
77c99ec
Better docstrings
jrapin Dec 13, 2019
5aae810
More docstrings again
jrapin Dec 13, 2019
b8a9b33
Docstrings...
jrapin Dec 13, 2019
f473e54
Updated naming pattern and tested
jrapin Dec 13, 2019
c1717d9
Initial tag system
jrapin Dec 13, 2019
a559b6f
Updates and simplifications
jrapin Dec 16, 2019
83eaf5e
Add OrderedChoice
jrapin Dec 16, 2019
6040625
Share code for selection parameters
jrapin Dec 16, 2019
13e6a67
Names
jrapin Dec 16, 2019
d9a8ec7
Better docstrings
jrapin Dec 16, 2019
68c260f
Small comment
jrapin Dec 16, 2019
5f306be
file renaming
jrapin Dec 16, 2019
502b8d9
Compatibility for OrderedDiscrete
jrapin Dec 16, 2019
537efaf
[PR on instrumentation_v3] Convert all subparameters into Parameter (…
jrapin Dec 18, 2019
c98e6c1
Choice length
jrapin Dec 18, 2019
7125b77
Make Variable an Parameter/Instrumentation
jrapin Dec 19, 2019
b1a4f33
More tests
jrapin Dec 19, 2019
d3b8e2d
Constant correction
jrapin Dec 19, 2019
a2bce6f
Typing
jrapin Dec 19, 2019
5b3b0c0
Change typing to Parameter
jrapin Dec 19, 2019
28c60a3
Test both pipelines
jrapin Dec 19, 2019
3925292
Update core.py
jrapin Dec 19, 2019
d235a6f
Typing
jrapin Dec 19, 2019
2cf7566
details
jrapin Dec 19, 2019
ad839d7
Add set cheap constraint for compatiblity
jrapin Dec 19, 2019
d492182
skip
jrapin Dec 19, 2019
a2187f4
nit
jrapin Dec 19, 2019
484be3d
Merge base parameter and parameter
jrapin Dec 19, 2019
bc7ae1f
comment and renaming
jrapin Dec 19, 2019
2ba4ee6
comments
jrapin Dec 19, 2019
c1a82c1
Update description of constraint + start fixing sampling
jrapin Dec 20, 2019
0d7e108
Simplify choice
jrapin Dec 20, 2019
b9dea96
Move discretization and transforms modules
jrapin Dec 20, 2019
f8b6b8b
isinstance
jrapin Dec 20, 2019
f2e4017
Trying to remove some old code
jrapin Dec 20, 2019
1cd1756
Sanity check
jrapin Dec 20, 2019
94b2ac9
Merge branch 'instrumentation_v3' into variable_as_instrumentation
jrapin Dec 20, 2019
731ca05
Remove all Instrumentation
jrapin Dec 20, 2019
8d13ea7
Plug new instrumentation
jrapin Dec 20, 2019
f95a828
Improve perf with new Array
jrapin Dec 20, 2019
3c70b19
Flag failing instrumentations
jrapin Dec 20, 2019
a71db0d
Add a freezing mechanism
jrapin Dec 21, 2019
72080b8
Merging parametrization with freezing
jrapin Dec 21, 2019
bf0db82
Freezing and faster compatibility
jrapin Dec 21, 2019
546d51d
Start updating descriptros
jrapin Dec 21, 2019
038ec54
Updated descriptor pattern
jrapin Dec 21, 2019
8cd3eac
Enforce keyword arg
jrapin Dec 24, 2019
b6bdbb9
Solve initial value issue
jrapin Dec 24, 2019
56e308f
Update benchmark
jrapin Dec 24, 2019
1f2b9fc
Update functions
jrapin Dec 24, 2019
987cb1a
Use new choces
jrapin Dec 24, 2019
15076cb
transfor descriptors to children
jrapin Dec 24, 2019
b85b26b
Solve nan issue
jrapin Dec 24, 2019
4657833
No name shortening
jrapin Dec 24, 2019
2426f43
Merge branch 'master' into instrumentation_v3
jrapin Dec 24, 2019
d6eb182
Start merging master
jrapin Dec 24, 2019
4281962
Cleaner typing
jrapin Dec 25, 2019
01e6b9d
Add helpers
jrapin Dec 25, 2019
7bdecac
Merging master
jrapin Dec 25, 2019
0890214
Merge branch 'master' into instrumentation_v3
jrapin Dec 25, 2019
521cc56
Merge branch 'instrumentation_v3' into variable_as_instrumentation
jrapin Dec 25, 2019
3557470
Basic update of the documentation
jrapin Dec 25, 2019
0b9dbdd
Start adding deprecation warnings
jrapin Dec 25, 2019
4256956
Merge master
jrapin Dec 26, 2019
e558760
Merging master
jrapin Dec 26, 2019
160d3dc
Removing old constraint calls
jrapin Dec 27, 2019
64eea33
Remove from_value in favor of spawn_child param
jrapin Dec 27, 2019
f726de6
Deprecation warning for set_cheap_constraint_checker
jrapin Dec 27, 2019
1dab9a0
Merge branch 'instrumentation_v3' into variable_as_instrumentation
jrapin Dec 27, 2019
988ec11
Deprecation warnings for old descriptors
jrapin Dec 27, 2019
bac0150
Merging master (containing parameter)
jrapin Dec 30, 2019
c88893f
Merge master with ExperimentFunction (#432)
jrapin Jan 6, 2020
5281af4
Merge master again
jrapin Jan 6, 2020
efdf342
Minor documentation edit
jrapin Jan 6, 2020
4cd6de9
Change typing import to tp
jrapin Jan 6, 2020
6742296
Merge master
jrapin Jan 6, 2020
13ffb23
Solve compatibility bug
jrapin Jan 7, 2020
955942b
Merging master
jrapin Jan 7, 2020
7134287
Merge branch 'master' into variable_as_instrumentation
jrapin Jan 14, 2020
43d2014
merge
jrapin Jan 15, 2020
06b7c63
Merge branch 'master' into variable_as_instrumentation
jrapin Jan 15, 2020
0a17db6
Center and reduce standardized_space relatively to reference (#461)
jrapin Jan 16, 2020
675d929
Add test for offset
jrapin Jan 17, 2020
8791fb8
Merge branch 'master' into variable_as_instrumentation
jrapin Jan 17, 2020
5 changes: 5 additions & 0 deletions CHANGELOG.md
@@ -2,6 +2,11 @@

## master

- Instrumentation names are changed (possibly breaking for benchmark records)
- Instrumented functions may silently fail when initialized as: `InstrumentedFunction(func, *inst.args, **inst.kwargs)`,
  because `args` and `kwargs` are now actual parameter values (new parametrization)
- Temporary performance loss is expected in order to keep compatibility between the `Variable` and `Parameter` paradigms.

### Breaking changes

- `Instrumentation` is now a `Variable` for simplicity and flexibility. The `Variable` API has therefore heavily changed, and more (bigger yet) changes are coming. This should only impact custom-made variables.
6 changes: 3 additions & 3 deletions README.md
@@ -28,13 +28,13 @@ You can join Nevergrad users Facebook group [here](https://www.facebook.com/grou

The goals of this package are to provide:
- **gradient/derivative-free optimization algorithms**, including algorithms able to handle noise.
- **tools to instrument any code**, making it painless to optimize your parameters/hyperparameters, whether they are continuous, discrete or a mixture of continuous and discrete variables.
- **tools to parametrize any code**, making it painless to optimize your parameters/hyperparameters, whether they are continuous, discrete or a mixture of continuous and discrete parameters.
- **functions** on which to test the optimization algorithms.
- **benchmark routines** in order to compare algorithms easily.

The structure of the package follows its goal, you will therefore find subpackages:
- `optimization`: implementing optimization algorithms
- `instrumentation`: tooling to convert code into a well-defined function to optimize.
- `parametrization`: specifying the parameters you want to optimize
- `functions`: implementing both simple and complex benchmark functions
- `benchmark`: for running experiments comparing the algorithms on benchmark functions
- `common`: a set of tools used throughout the package
@@ -48,7 +48,7 @@ The structure of the package follows its goal, you will therefore find subpackag

The following README is very general, here are links to find more details on:
- [how to perform optimization](docs/optimization.md) using `nevergrad`, including using parallelization and a few recommendations on which algorithm should be used depending on the settings
- [how to instrument](docs/instrumentation.md) functions with any kind of parameters in order to convert them into a function defined on a continuous vectorial space where optimization can be performed. It also provides a tool to instantiate a script or non-Python code as a Python function, so that some of its parameters can be tuned.
- [how to parametrize](docs/parametrization.md) your problem so that the optimizers are informed of the problem to solve. This also provides a tool to instantiate a script or non-Python code as a Python function, so that some of its parameters can be tuned.
- [how to benchmark](docs/benchmarking.md) all optimizers on various test functions.
- [benchmark results](docs/benchmarks.md) of some standard optimizers on simple test cases.
- examples of [optimization for machine learning](docs/machinelearning.md).
2 changes: 1 addition & 1 deletion docs/instrumentation.md
@@ -1,6 +1,6 @@
# Instrumentation

**Please note that instrumentation is still a work in progress with heavy changes still planned. We will try to update it to make it simpler and simpler to use (all feedbacks are welcome ;) ), with the side effect that there will be breaking changes.**
**Please note that this document is deprecated, you should now refer to [parametrization.md](parametrization.md)**

The aim of instrumentation is to turn a piece of code with parameters you want to optimize into a function defined on an n-dimensional continuous data space in which the optimization can easily be performed. For this, discrete/categorical arguments must be transformed to continuous variables, and all variables concatenated. The instrumentation subpackage will help you do so thanks to:
- the `variables` module, providing priors that can be used to define each argument.
4 changes: 2 additions & 2 deletions docs/optimization.md
@@ -25,7 +25,7 @@ In this example, the optimal value will be found in `recommendation.args[0]` and
`instrumentation=n` is a shortcut to state that the function has only one variable, continuous, of dimension `n`.
Defining the following instrumentation instead will optimize on both `x` (continuous, dimension 2) and `y` (continuous, dimension 1).
```python
instrum = ng.Instrumentation(ng.var.Array(2), y=ng.var.Array(1).asscalar())
instrum = ng.Instrumentation(ng.p.Array(shape=(2,)), y=ng.p.Scalar())
optimizer = ng.optimizers.OnePlusOne(instrumentation=instrum, budget=100)
recommendation = optimizer.minimize(square)
print(recommendation)
@@ -82,7 +82,7 @@ def square(x):

optimizer = ng.optimizers.OnePlusOne(instrumentation=2, budget=100)
# define a constraint on first variable of x:
optimizer.instrumentation.set_cheap_constraint_checker(lambda x: x[0] >= 1)
optimizer.instrumentation.register_cheap_constraint(lambda x: x[0] >= 1)

recommendation = optimizer.minimize(square)
print(recommendation) # optimal args and kwargs
100 changes: 100 additions & 0 deletions docs/parametrization.md
@@ -0,0 +1,100 @@
# Parametrization

**Please note that parametrization is still a work in progress, with heavy changes coming soon! We are trying to update it to make it simpler and simpler to use (all feedback is welcome ;) ), with the side effect that there will be breaking changes.**

The aim of parametrization is to specify which parameters the optimization should be performed upon.
The parametrization subpackage will help you do so thanks to:
- the `parameter` module, providing classes that should be used to specify each parameter.
- the `Instrumentation` and `InstrumentedFunction` classes, which provide an interface for converting any arguments into the data space used for optimization, and for converting from the data space back to the arguments space.
- the `FolderFunction` class, which helps transform any code into a Python function in a few lines. This can be especially helpful to optimize parameters in non-Python 3.6+ code (C++, Octave, etc.) or parameters in scripts.

Together, these tools turn a piece of code with parameters you want to optimize into a function defined on an n-dimensional continuous data space in which the optimization can easily be performed, and they define how these parameters can be mutated and combined together.

## Variables

Five types of variables are currently provided:
- `Choice(items)`: describes a parameter which can take values within the provided list of (usually unordered categorical) items, and for which transitions are global (from one item to any other item). The returned element is sampled according to a softmax over the values of the underlying dimensions. Be cautious: this process is non-deterministic and makes the function evaluation noisy.
- `TransitionChoice(items)`: describes a parameter which can take values within the provided list of (usually ordered) items, and for which transitions are local (from one item to close items).
- `Array(shape)`: describes a `np.ndarray` of any shape. The bounds of the array and the mutation of this array can be specified (see `set_bounds`, `set_mutation`). This makes it a very flexible type of variable. E.g. `Array(shape=(2, 3)).set_bounds(0, 2)` encodes an array of shape `(2, 3)` with values bounded between 0 and 2.
- `Scalar(dtype)`: describes a float (the default) or an int. Under the hood it behaves like a one-dimensional `Array`, and all `Array` methods are therefore available. Note that `Gaussian(a, b)` is equivalent to `Scalar().affined(a, b)`.
- `Log(a_min, a_max)`: describes log-distributed data between two bounds. Under the hood this uses a `Scalar` with appropriate specifications for bounds and mutations.
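
For illustration, here is a minimal sketch instantiating each of these types through the `ng.p` namespace introduced in this PR (the concrete values are arbitrary, and the argument order of `Log` follows the `Log(a_min, a_max)` description above):

```python
import nevergrad as ng

choice = ng.p.Choice(["a", "b", "c"])                    # unordered categorical, softmax-sampled
ordered = ng.p.TransitionChoice(["low", "mid", "high"])  # ordered items with local transitions
array = ng.p.Array(shape=(2, 3)).set_bounds(0, 2)        # 2x3 array with values in [0, 2]
scalar = ng.p.Scalar()                                   # float with a Gaussian prior
logval = ng.p.Log(0.001, 1.0)                            # log-distributed between the two bounds
```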


## Instrumentation

Instrumentation helps you convert a set of arguments into variables in the data space which can be optimized. The core class performing this conversion is called `Instrumentation`. It provides arguments conversion through the `arguments_to_data` and `data_to_arguments` methods. Since `data_to_arguments` can be stochastic, the instrumentation holds a random state (`instrumentation.random_state`) which is also used by optimizers.


```python
import nevergrad as ng

# argument transformation
arg1 = ng.p.TransitionChoice(["a", "b"])  # 1st arg. = positional discrete argument
arg2 = ng.p.Choice(["a", "c", "e"])  # 2nd arg. = positional discrete argument
value = ng.p.Scalar()  # the 4th arg. is a keyword argument with a Gaussian prior

# create the instrumentation
instrum = ng.p.Instrumentation(arg1, arg2, "blublu", value=value)
# the 3rd arg. is a positional arg. which will be kept constant to "blublu"
print(instrum.dimension) # 5 dimensional space

# The dimension is 5 because:
# - the 1st discrete variable has 2 possible values, represented by a hard thresholding in
#   a 1-dimensional space, i.e. we add 1 coordinate to the continuous problem
# - the 2nd discrete variable has 3 possible values, represented by softmax, i.e. we add 3 coordinates to the continuous problem
# - the 3rd variable has no uncertainty, so it does not introduce any coordinate in the continuous problem
# - the 4th variable is a real number, represented by a single coordinate.


instrum.set_standardized_data([1, -80, -80, 80, 3])
print(instrum.args, instrum.kwargs)
# >>> ('b', 'e', 'blublu') {'value': 3.0}
# b is selected because 1 > 0 (the threshold is 0 here since there are 2 values)
# e is selected because proba(e) = exp(80) / (exp(80) + exp(-80) + exp(-80))
# value=3.0 because 3 * std + mean = 3 (the default Scalar has mean 0 and std 1)
```
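
The conversion methods named above can also be called directly. A minimal sketch, assuming the signatures implied by this section (a deterministic flag is used here since the conversion back to arguments can be stochastic):

```python
# map arguments to a point of the standardized data space, then back again
data = instrum.arguments_to_data("b", "e", "blublu", value=3.0)
args, kwargs = instrum.data_to_arguments(data, deterministic=True)
print(args, kwargs)  # should recover ('b', 'e', 'blublu') {'value': 3.0}
```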


You can then directly perform optimization on a function given its instrumentation:
```python
def myfunction(arg1, arg2, arg3, value=3):
    print(arg1, arg2, arg3)
    return value**2

optimizer = ng.optimizers.OnePlusOne(instrumentation=instrum, budget=100)
recommendation = optimizer.minimize(myfunction)
```
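
The returned recommendation then holds the best arguments found; following the optimization document above, they can be read back in argument space:

```python
print(recommendation.args)    # best positional arguments, e.g. ('b', 'e', 'blublu')
print(recommendation.kwargs)  # best keyword arguments, e.g. {'value': ...}
```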


## External code instantiation

Sometimes it is completely impractical or impossible to have a simple Python 3.6+ function to optimize. This may happen when the code you want to optimize is a script, and even more so when that code is not Python 3.6+ at all.

We provide tooling for this situation. Go through these steps to instrument your code:
- **identify the variables** (parameters, constants...) you want to optimize.
- **add placeholders** to your code. Placeholders are just tokens of the form `NG_ARG{name|comment}` where you can modify the name and comment. The name you set will be the one you will need to use as your function argument. In order to avoid breaking your code, the line containing the placeholders can be commented. To notify that the line should be uncommented for instrumentation, you'll need to add "@nevergrad@" at the start of the comment. Here is an example in C which will notify that we want to obtain a function with a `step` argument which will inject values into the `step_size` variable of the code:
```c
float step_size = 0.1;
// @nevergrad@ step_size = NG_ARG{step|any comment}
```
- **prepare the command to execute** that will run your code. Make sure that the last printed line is just a float, which is the value to base the optimization upon. We will be doing minimization here, so this value must decrease for better results.
- **instantiate** your code into a function using the `FolderFunction` class:
```python
from nevergrad.instrumentation import FolderFunction
folder = "nevergrad/instrumentation/examples" # folder containing the code
command = ["python", "examples/script.py"] # command to run from right outside the provided folder
func = FolderFunction(folder, command, clean_copy=True)
print(func.placeholders)  # will print the placeholders (i.e. the tunable variables) of the function
# prints: [Placeholder('value1', 'this is a comment'), Placeholder('value2', None), Placeholder('string', None)]
print(func(value1=2, value2=3, string="blublu"))
# prints: 12.0
```
- **instrument** the function (see the Instrumentation section just above).
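
For instance, a hypothetical Python script following the placeholder convention above could look like this (an illustration only, not the actual `examples/script.py` from the repository):

```python
# hypothetical script.py: the commented @nevergrad@ lines get uncommented at instrumentation time
value1 = 10
# @nevergrad@ value1 = NG_ARG{value1|this is a comment}
value2 = 2
# @nevergrad@ value2 = NG_ARG{value2|another comment}
print("intermediate output is ignored")
# the last printed line must be a single float: the value the optimization minimizes
print(float(value1 * value2))
```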


## Tips and caveats

- using the `FolderFunction` argument `clean_copy=True` will copy your folder, so that tampering with the original during optimization will not change the code being run.
- under the hood, with or without `clean_copy=True`, when the function is called, `FolderFunction` creates a symlinked copy of the initial folder, removes the files that have tokens, and creates new ones with the appropriate values. Symlinks are used in order to avoid duplicating large projects, but they have some drawbacks, see the next point ;)
- one can add a compilation step to `FolderFunction` (the compilation just has to be included in the script). However, be extra careful: if the initial folder contains some build files, they could be modified by the compilation step because of the symlinks. Make sure that you remove the build symlinks before compiling! **This feature has not been foolproofed yet!!!**
- the following external file types are registered by default: `[".c", ".h", ".cpp", ".hpp", ".py", ".m"]`. Custom file types can be registered using `instrumentation.register_file_type` by providing the relevant file suffix as well as the characters that indicate a comment. However, for now, variables which can provide a vector of values (`Gaussian` when provided a `shape`) will inject code with a Python format (list) by default, which may not be suitable.
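
As an illustration, registering an extra file type might look like the sketch below; the exact signature of `register_file_type` is an assumption based on the description above:

```python
from nevergrad import instrumentation

# hypothetical: register Fortran files, where "!" marks the start of a comment
# (file suffix and comment characters as described above; signature assumed)
instrumentation.register_file_type(".f90", "!")
```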
3 changes: 2 additions & 1 deletion nevergrad/__init__.py
@@ -8,7 +8,8 @@
from .optimization import callbacks
from .instrumentation.multivariables import Instrumentation
from .instrumentation import variables as var
from .parametrization import parameter as p

__all__ = ["Instrumentation", "var", "optimizers", "families", "callbacks"]
__all__ = ["Instrumentation", "var", "optimizers", "families", "callbacks", "p"]

__version__ = "0.3.0"
1 change: 0 additions & 1 deletion nevergrad/benchmark/execution.py
@@ -6,7 +6,6 @@
from collections import deque
from typing import List, Callable, Any, NamedTuple, Tuple, Dict, Optional, Deque
from nevergrad.functions import ExperimentFunction
# this object only serves to provide delays that the executor must use to order jobs


class MockedTimedJob:
14 changes: 11 additions & 3 deletions nevergrad/benchmark/experiments.py
@@ -303,6 +303,13 @@ def illcondipara(seed: Optional[int] = None) -> Iterator[Experiment]:
yield Experiment(function, optim, budget=budget, num_workers=1, seed=next(seedg))


def _positive_sum(args_kwargs: Any) -> bool:
    args, kwargs = args_kwargs
    if kwargs or len(args) != 1 or not isinstance(args[0], np.ndarray):
        raise ValueError(f"Unexpected inputs {args} and {kwargs}")
    return float(np.sum(args[0])) > 0


@registry.register
def constrained_illconditioned_parallel(seed: Optional[int] = None) -> Iterator[Experiment]:
"""All optimizers on ill cond problems
@@ -315,10 +322,12 @@ def constrained_illconditioned_parallel(seed: Optional[int] = None) -> Iterator[
    functions = [
        ArtificialFunction(name, block_dimension=50, rotation=rotation) for name in ["cigar", "ellipsoid"] for rotation in [True, False]
    ]
    for func in functions:
        func.parametrization.register_cheap_constraint(_positive_sum)
    for optim in optims:
        for function in functions:
            for budget in [400, 4000, 40000]:
                yield Experiment(function, optim, budget=budget, num_workers=1, seed=next(seedg), cheap_constraint_checker=lambda x: np.sum(x) > 0)
                yield Experiment(function, optim, budget=budget, num_workers=1, seed=next(seedg))


@registry.register
@@ -463,8 +472,7 @@ def realworld(seed: Optional[int] = None) -> Iterator[Experiment]:
    # Adding ARCoating.
    funcs += [ARCoating()]
    funcs += [PowerSystem(), PowerSystem(13)]
    funcs += [STSP(), STSP(2, 500)]
    funcs += [STSP(), STSP(500)]
    funcs += [game.Game("war")]
    funcs += [game.Game("batawaf")]
    funcs += [game.Game("flip")]
3 changes: 2 additions & 1 deletion nevergrad/benchmark/test_experiments.py
@@ -62,7 +62,8 @@ def check_seedable(maker: Any) -> None:
        simplified = [Experiment(xp.function, algo, budget=2, num_workers=min(2, xp.optimsettings.num_workers), seed=xp.seed) for xp in xps]
        np.random.shuffle(simplified)  # compute in any order
        selector = Selector(data=[xp.run() for xp in simplified])
        results.append(Selector(selector.loc[:, ["loss", "seed"]]))  # elapsed_time can vary...
        results.append(Selector(selector.loc[:, ["loss", "seed", "error"]]))  # elapsed_time can vary...
        assert results[-1].unique("error") == {""}, f"An error was raised during optimization:\n{results[-1]}"
    results[0].assert_equivalent(results[1], f"Non identical outputs for seed={random_seed}")
    np.testing.assert_raises(
        AssertionError, results[1].assert_equivalent, results[2], f"Identical output with different seeds (seed={random_seed})"
7 changes: 2 additions & 5 deletions nevergrad/benchmark/xpbase.py
@@ -119,18 +119,16 @@ class Experiment:
    def __init__(self, function: fbase.ExperimentFunction,
                 optimizer: Union[str, obase.OptimizerFamily], budget: int, num_workers: int = 1,
                 batch_mode: bool = True, seed: Optional[int] = None,
                 cheap_constraint_checker: Optional[Callable[[Any], Any]] = None,
                 ) -> None:
        assert isinstance(function, fbase.ExperimentFunction), ("All experiment functions should "
                                                                "derive from ng.functions.ExperimentFunction")
        assert function.dimension, "Nothing to optimize"
        self.function = function
        # Conjecture on the noise level.
        self.seed = seed  # depending on the inner workings of the function, the experiment may not be repeatable
        self.optimsettings = OptimizerSettings(optimizer=optimizer, num_workers=num_workers, budget=budget, batch_mode=batch_mode)
        self.result = {"loss": np.nan, "elapsed_budget": np.nan, "elapsed_time": np.nan, "error": ""}
        self.recommendation: Optional[obase.Candidate] = None
        self._optimizer: Optional[obase.Optimizer] = None  # to be able to restore stopped/checkpointed optimizer
        self._cheap_constraint_checker = cheap_constraint_checker

    def __repr__(self) -> str:
        return f"Experiment: {self.optimsettings} (dim={self.function.dimension}) on {self.function}"
@@ -197,11 +195,10 @@ def _run_with_error(self, callbacks: Optional[Dict[str, obase._OptimCallBack]] =
            torch.manual_seed(self.seed)  # type: ignore
        pfunc = self.function.copy()
        instrumentation = pfunc.parametrization
        assert len(pfunc.parametrization) == len(self.function.parametrization), "Some constraints failed to be propagated"
        # optimizer instantiation can be slow and is done only here to make xp iterators very fast
        if self._optimizer is None:
            self._optimizer = self.optimsettings.instantiate(instrumentation=instrumentation)
        if self._cheap_constraint_checker:
            self._optimizer.instrumentation.set_cheap_constraint_checker(self._cheap_constraint_checker)
        if callbacks is not None:
            for name, func in callbacks.items():
                self._optimizer.register_callback(name, func)