
Non-greedy async optimization #374

Merged (6 commits, Oct 19, 2021)

Conversation

apaleyes (Collaborator):

This PR contains the implementation of a second rule for async optimization, conveniently named AsynchronousOptimization. It isn't expected to be highly performant (and we warn users about that in the docs), but it gives access to non-greedy batch acquisition functions in async mode.

Although not part of the PR, the new rule was verified to run in the async notebook, in addition to passing all tests.

Plus there is a bit of cleanup for blunders I made in #366.

if state is None:
state = AsynchronousRuleState(None)

state = state.subtract_points(datasets[OBJECTIVE].query_points)
Collaborator:

I'm not sure this covers the corner case where we have duplicated observations. In that case we do want to remove the pending points that have just been queried but not the pending points that have duplicates in the past.

Collaborator (Author):

Why would we want to keep duplicate points? And why would duplicates arise in the first place?

Collaborator:

If you have noise, it can make sense to sample the same point twice. In some cases, the maximiser of the acquisition function can be an existing point.
I think that making sure we don't remove more points than the newly acquired ones should do the trick. E.g. if you have 3 identical pending points and you get 1 back, it doesn't matter which pending point you subtract, but you should not remove more than one.
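The rule suggested above can be sketched as a small helper (the name and signature are illustrative, not the actual trieste API): remove each just-observed point from the pending set at most once, so that extra duplicated pending points survive.

```python
import numpy as np

def subtract_at_most_once(pending: np.ndarray, observed: np.ndarray) -> np.ndarray:
    """Remove each row of `observed` from `pending` at most once.

    Illustrative sketch only, not the PR's implementation: with 3
    identical pending points and 1 observation back, exactly one
    copy is removed and the other two stay pending.
    """
    keep = np.ones(len(pending), dtype=bool)
    for obs in observed:
        # indices of still-pending points equal to this observation
        matches = np.flatnonzero(keep & np.all(pending == obs, axis=1))
        if matches.size > 0:
            keep[matches[0]] = False  # drop only the first match
    return pending[keep]
```

For example, with pending points [[0, 0], [0, 0], [0, 0], [1, 1]] and a single observation [0, 0], this removes one copy of [0, 0] and leaves two pending.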

Collaborator (Author):

This is now implemented, but it turned out to be very hard to do efficiently, so I suspect the current implementation may trigger TF graph evaluation prematurely. If anyone has better ideas on how to implement it, please suggest them.
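For the graph-mode concern, one possible approach (a hedged sketch under assumed names, not the PR's actual code) is to express the at-most-once subtraction entirely in TensorFlow ops, so it can stay inside a traced graph instead of forcing eager evaluation:

```python
import tensorflow as tf

def subtract_at_most_once(pending: tf.Tensor, observed: tf.Tensor) -> tf.Tensor:
    """Graph-compatible sketch: drop each observed point from pending
    at most once, preserving any extra duplicates. Illustrative only."""
    keep = tf.ones(tf.shape(pending)[0], dtype=tf.bool)

    def body(i, keep):
        # still-pending points equal to the i-th observation
        eq = tf.reduce_all(tf.equal(pending, observed[i]), axis=1)
        candidates = tf.logical_and(keep, eq)
        idx = tf.argmax(tf.cast(candidates, tf.int32))
        keep = tf.cond(
            tf.reduce_any(candidates),
            # unmark only the first matching pending point
            lambda: tf.tensor_scatter_nd_update(
                keep, tf.reshape(idx, [1, 1]), [False]),
            lambda: keep,
        )
        return i + 1, keep

    n = tf.shape(observed)[0]
    _, keep = tf.while_loop(lambda i, k: i < n, body, [0, keep])
    return tf.boolean_mask(pending, keep)
```

The tf.while_loop keeps the per-observation matching inside the graph; the trade-off is a sequential loop over observations, which is fine for small batches but not vectorized.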

@apaleyes apaleyes requested a review from vpicheny October 13, 2021 15:10
@vpicheny (Collaborator) left a comment:

We're going to want to document this in a notebook, but I guess this is another PR?

@apaleyes (Collaborator, Author):

Yes, the notebook is in progress. I'd prefer to make a separate PR for it, to keep these PRs reasonably sized; smaller, focused PRs are easier to review.

@uri-granta (Collaborator) left a comment:

Haven't looked at the tests yet, but code looks good!

@apaleyes apaleyes requested a review from vpicheny October 18, 2021 11:04
@apaleyes apaleyes merged commit 9e08daf into secondmind-labs:develop Oct 19, 2021
4 participants