
Conversation

@TobyBoyne
Contributor

Motivation

Currently, qNEHVI proposes repeated experiments in a batch when initial pending points are passed. This PR changes how this class handles pending points: X_pending is now always populated, and is only appended to the candidate set in the forward pass if those points have not yet been cached. See issue #2983 for further discussion.
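A minimal sketch of the intended caching behaviour, using illustrative class and attribute names rather than the actual BoTorch implementation:

import torch
from torch import Tensor
from typing import Optional


class PendingPointSketch:
    """Toy illustration of the pending-point handling described above."""

    def __init__(self) -> None:
        self.X_pending: Optional[Tensor] = None  # always kept populated
        self._pending_cached = False  # True once pending points are folded into the cache

    def set_X_pending(self, X_pending: Optional[Tensor] = None) -> None:
        self.X_pending = X_pending
        self._pending_cached = False

    def forward(self, X: Tensor) -> Tensor:
        if self.X_pending is not None and not self._pending_cached:
            # Pending points are not yet part of the cached baseline, so
            # evaluate them jointly with the candidate batch.
            X = torch.cat([X, self.X_pending.expand(*X.shape[:-2], -1, -1)], dim=-2)
        # Placeholder value so the sketch runs end to end.
        return X.sum(dim=(-2, -1))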

Have you read the Contributing Guidelines on pull requests?

Yes

Test Plan

I will rewrite the tests in test/acquisition/multi_objective/test_monte_carlo.py to ensure that they pass.

@meta-cla bot added the CLA Signed label on Aug 19, 2025
@hvarfner
Contributor

hvarfner commented Aug 19, 2025

@TobyBoyne Thanks for the PR! I'm pretty sure that @sdaulton meant qLogNEHVI and q(non-Log)NEHVI when he mentioned that more than one acqf has the same issue, so you're already good on that.

@facebook-github-bot
Contributor

@hvarfner has imported this pull request. If you are a Meta employee, you can view this in D80533943.

@TobyBoyne requested review from Balandat and hvarfner on August 19, 2025, 14:48
Contributor

@hvarfner left a comment


Thanks @TobyBoyne, this looks good to me!

@codecov

codecov bot commented Aug 20, 2025

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 100.00%. Comparing base (c60654b) to head (fc3a99b).
⚠️ Report is 9 commits behind head on main.

Additional details and impacted files
@@            Coverage Diff            @@
##              main     #2985   +/-   ##
=========================================
  Coverage   100.00%   100.00%           
=========================================
  Files          216       216           
  Lines        20327     20337   +10     
=========================================
+ Hits         20327     20337   +10     


@t_batch_mode_transform()
@average_over_ensemble_models
def forward(self, X: Tensor) -> Tensor:
    # Manually concatenate pending points only if:

Looks like roughly 90% of the code in the forward() method here is shared with qLogNoisyExpectedHypervolumeImprovement.forward() above. We should probably deduplicate this by moving it into a shared utility.

@TobyBoyne if you're up for it that would be great, otherwise @hvarfner let's put this on the backlog.

@TobyBoyne
Contributor Author

@Balandat, I've extracted the common code in the forward passes into a shared mixin; is this what you had in mind?

Also, it looks like the non-log acqf always adds self._prev_nehvi to its return value, even when incremental_nehvi==True. Is this correct, or shall I change this as well?
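For context, a rough sketch of the mixin pattern being discussed here; the class names are hypothetical stand-ins for the BoTorch acquisition functions and their shared base:

import torch
from torch import Tensor


class SharedNEHVIForwardSketch:
    """Hypothetical mixin holding the forward-pass logic shared by both
    acquisition functions; returns samples plus the possibly augmented X."""

    def forward(self, X: Tensor) -> tuple[Tensor, Tensor]:
        # The shared work (pending-point handling, sampler setup, posterior
        # sampling) would live here; a random draw stands in for it.
        samples = X + 0.1 * torch.randn_like(X)
        return samples, X


class NonLogSketch(SharedNEHVIForwardSketch):
    def forward(self, X: Tensor) -> Tensor:
        samples, X = super().forward(X)  # delegate the shared part to the mixin
        return samples.sum(dim=(-2, -1))  # stand-in for the HVI computation


class LogSketch(SharedNEHVIForwardSketch):
    def forward(self, X: Tensor) -> Tensor:
        samples, X = super().forward(X)  # delegate the shared part to the mixin
        return samples.sum(dim=(-2, -1)).abs().log()  # stand-in for the log variant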

Contributor

@hvarfner left a comment


Hi @TobyBoyne,

This change looks great. A few minor nits to make the forward passes less opaque, and then we should be good to merge this.

q_in = X.shape[-2] * n_w
self._set_sampler(q_in=q_in, posterior=posterior)
samples = self._get_f_X_samples(posterior=posterior, q_in=q_in)
samples, X = super().forward(X)

Nice, this is awesome. Please just add a comment in both places saying where the forward pass goes and what it does, since this forward pass is now more opaque.

q_in = X.shape[-2] * n_w
self._set_sampler(q_in=q_in, posterior=posterior)
samples = self._get_f_X_samples(posterior=posterior, q_in=q_in)
samples, X = super().forward(X)

...and same here.
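A hedged illustration of the kind of comment being asked for at both call sites (the exact wording added in the PR may differ):

# The shared pre-processing (pending-point handling, sampler setup, and
# posterior sampling) now happens in the parent class's forward(), which
# returns the Monte Carlo samples together with the possibly augmented X.
samples, X = super().forward(X)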

@TobyBoyne requested a review from hvarfner on August 22, 2025, 07:51
@hvarfner
Contributor

@TobyBoyne Thank you for being persistent with this. Looks ready to be merged!

@TobyBoyne requested a review from Balandat on August 29, 2025, 09:57
@facebook-github-bot
Contributor

@Balandat has imported this pull request. If you are a Meta employee, you can view this in D80533943.

Contributor

@Balandat left a comment


Thanks, looks great. Let me import this and make sure this works; I'll land this after the long weekend here in the US.

@facebook-github-bot
Contributor

@Balandat merged this pull request in c431c6d.
