Ask tell API #346

Merged: 24 commits, Sep 23, 2021
Conversation

apaleyes (Collaborator)

Related issue: #215

This PR adds a high-level ask/tell interface. It is essentially a stripped-down optimize function, with the loop and all history and state tracking removed.

Contrary to the discussion in the issue, I went with a simpler approach and no coroutines. This allows for a clean way of saving and loading state, which can be important in complex scenarios where the optimization has to be paused for a considerable time to perform the experiment. There are also a few one-line fixes in here.

Once we agree on the API, the next PR will follow with docs and an example notebook.
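For illustration, a minimal usage sketch of such an interface is shown below. The class and method names (AskTellOptimizer, ask, tell, to_record, from_record) and the module path follow the diff and discussion in this PR but are assumptions, not the final API; the setup objects search_space, initial_data, model, acquisition_rule, and observer are assumed to come from a standard Trieste workflow.

```python
# Hypothetical usage sketch of the ask/tell interface. Names, signatures and the
# import path are assumptions based on this PR's diff and discussion.
import pickle

from trieste.ask_tell_optimization import AskTellOptimizer  # assumed module path

# search_space, initial_data, model, acquisition_rule and observer are assumed
# to be set up as in the usual Trieste optimization workflow.
ask_tell = AskTellOptimizer(search_space, initial_data, model, acquisition_rule)

for _ in range(num_steps):
    # Ask for the next query point(s); there is no internal loop or history tracking.
    query_points = ask_tell.ask()

    # Optionally save the optimizer state before a long-running (offline) experiment.
    record = ask_tell.to_record()
    with open("state.pkl", "wb") as f:
        pickle.dump(record, f)

    # ... time passes while the experiment is run externally ...
    new_data = observer(query_points)

    # Restore the optimizer from the saved record and feed the new observations back.
    with open("state.pkl", "rb") as f:
        record = pickle.load(f)
    ask_tell = AskTellOptimizer.from_record(search_space, acquisition_rule, record)
    ask_tell.tell(new_data)
```

The save/load step is what distinguishes this from the existing optimize loop: the optimization can be paused indefinitely between ask and tell while the experiment runs.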

@apaleyes (Collaborator, Author)

After discussion with @vpicheny, I am going to add a notebook to this PR to help reviewers see the API in action.

@uri-granta left a comment (Collaborator)

Looks good, and reassuringly not over-engineered! Various comments.


@classmethod
def from_record(
    cls, search_space: SP, acquisition_rule: AcquisitionRule[S, SP], record: Record[S]
Collaborator

Maybe put record first and make acquisition_rule optional (assuming you end up doing that for __init__)? Also it might make sense to expose search_space and acquisition_rule as @properties. (There'd probably be no harm doing that for BayesianOptimizer too, though it's more obviously useful here.)

Collaborator (Author)

yup, now that the rule can be optional this needs some reordering indeed

Collaborator (Author)

what would properties be good for?
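For concreteness, the reviewer's suggestion might look roughly like the following hypothetical sketch (S, SP, Record and AcquisitionRule are the types appearing in the diff above; the attribute names are assumed, and this is not the final signature). Read-only @properties would let callers inspect the configured search space and rule without reaching into private attributes.

```python
# Hypothetical sketch of the suggested reordering and read-only properties.
# S, SP, Record and AcquisitionRule are the types from the diff above;
# _search_space and _acquisition_rule are assumed attribute names.
from __future__ import annotations

from typing import Optional


class AskTellOptimizer:
    @classmethod
    def from_record(
        cls,
        record: Record[S],
        search_space: SP,
        acquisition_rule: Optional[AcquisitionRule[S, SP]] = None,
    ) -> AskTellOptimizer:
        """Recreate an ask/tell optimizer from a previously saved state record."""
        ...

    @property
    def search_space(self) -> SP:
        """The search space over which the optimization is performed."""
        return self._search_space

    @property
    def acquisition_rule(self) -> AcquisitionRule[S, SP]:
        """The acquisition rule used to select query points."""
        return self._acquisition_rule
```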


:return: An optimization state record.
"""
models_copy = copy.deepcopy(self._models)
@hstojic (Collaborator), Sep 22, 2021

Should we check here whether a __deepcopy__ method is available, and give a user-friendly message that the model cannot be saved? Though perhaps a better place for this check might be in the model interfaces... (Keras models don't support deepcopy, I think, and would produce a less user-friendly message.)

The docstring could perhaps include a warning to users about this, along the lines of what you have in the notebook.

Collaborator (Author)

How can this be checked? Just because an object doesn't override __deepcopy__ doesn't mean it cannot be deepcopied. What would happen if one tries to deepcopy a Keras model?

Collaborator

We could catch any exception and reraise with an additional error message, but I don't think that'll really add much value. Maybe mention in the docstring that this will fail if the models or acquisition state are not deep copyable?
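If the catch-and-reraise route were taken, a minimal sketch inside the state-saving method shown in the diff might look like this (only _models comes from the diff; _acquisition_state and the error type are assumptions for illustration):

```python
# Hypothetical sketch of catching a failed deep copy and re-raising with a
# friendlier message; _acquisition_state is an assumed attribute name.
import copy

try:
    models_copy = copy.deepcopy(self._models)
    acquisition_state_copy = copy.deepcopy(self._acquisition_state)
except Exception as e:
    raise ValueError(
        "Failed to deep copy the optimizer state. Saving state requires the models "
        "and acquisition state to support copy.deepcopy; some models (for example, "
        "certain Keras models) do not."
    ) from e
```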

@hstojic left a comment (Collaborator)

left a few minor comments, but overall looks great!

@uri-granta left a comment (Collaborator)

A few comments but overall LGTM!



@henrymoss merged commit 0a04e1d into secondmind-labs:develop on Sep 23, 2021
@apaleyes deleted the ask-tell branch on September 27, 2021