Enable BayesOpt #120
Conversation
Looks good to me. A few comments/questions:
return f_mu.detach(), f_var.detach()

if joint:
    f_mu = f_mu.flatten()  # (batch*out)
Could this be risky when we have multi-output regression or do we not support this anyway?
It follows the standard practice in NTK papers, where `f_mu` has shape `(nk,)` and `f_cov` is `(nk, nk)`, no? Do you have another idea?
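For illustration, here is a minimal sketch (not the library code) of this NTK-style joint predictive: for `n` inputs with `k` outputs, the mean is flattened to `(nk,)` and the covariance is the full `(nk, nk)` matrix. All shapes and values below are hypothetical.

```python
import torch

n, k, p = 4, 3, 20
f_mu = torch.randn(n, k)        # per-input means, shape (n, k)
jacs = torch.randn(n, k, p)     # per-input Jacobians w.r.t. p parameters
sigma = torch.eye(p)            # hypothetical posterior covariance over parameters

f_mu = f_mu.flatten()                 # (nk,)
J = jacs.reshape(n * k, p)            # (nk, p)
f_cov = J @ sigma @ J.T               # (nk, nk) joint covariance
f_cov += 1e-6 * torch.eye(n * k)      # jitter for numerical stability

joint = torch.distributions.MultivariateNormal(f_mu, covariance_matrix=f_cov)
sample = joint.sample()               # one joint function draw, shape (nk,)
```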
Done in bc50025.
Seems to work reasonably.
There is, e.g. in Thompson sampling, see https://arxiv.org/abs/1706.01825. I added the option.
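As a hedged sketch (not code from this PR) of why a joint predictive is useful here: a Thompson-sampling step draws one coherent function sample over all candidate points and picks the argmax, which requires exactly the `(n,)` mean and `(n, n)` covariance discussed above.

```python
import torch

def thompson_pick(f_mu: torch.Tensor, f_cov: torch.Tensor) -> int:
    # One Thompson-sampling step over n candidates with joint mean/covariance.
    jitter = 1e-6 * torch.eye(f_cov.shape[0])
    joint = torch.distributions.MultivariateNormal(f_mu, covariance_matrix=f_cov + jitter)
    sample = joint.sample()        # one joint function draw across all candidates
    return int(sample.argmax())    # index of the candidate to evaluate next

# Example usage with dummy values
f_mu = torch.zeros(5)
f_cov = torch.eye(5)
next_idx = thompson_pick(f_mu, f_cov)
```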
There's already
@aleximmer, @runame: For making the last-layer Jacobians optionally backpropable, I need to do the following. I'm not sure how to make `enable_backprop` from `forward_with_features` available inside the hook:

def forward_with_features(self, x: torch.Tensor, enable_backprop=True):
    """
    Add an option `enable_backprop`.
    """
    pass

def _get_hook(self, name: str) -> Callable:
    def hook(_, input, __):
        # Depending on the above `enable_backprop`, we detach
        self._features[name] = input[0].detach()
    return hook

Should I make `enable_backprop` a class attribute?
I was asking because of this line: you detached
Yeah, I think a class-level attribute is probably the best solution. Also, maybe it makes sense to consistently name the argument which decides whether to detach or not; in some places it's called
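For concreteness, a minimal sketch of the class-level-attribute idea discussed here; the names `FeatureExtractor` and `enable_backprop` follow this thread, and the actual implementation in the library may differ.

```python
from typing import Callable, Dict

import torch
import torch.nn as nn

class FeatureExtractor(nn.Module):
    """Sketch only: the hook reads `self.enable_backprop` to decide whether to detach."""

    def __init__(self, model: nn.Module, enable_backprop: bool = False) -> None:
        super().__init__()
        self.model = model
        self.enable_backprop = enable_backprop
        self._features: Dict[str, torch.Tensor] = {}

    def _get_hook(self, name: str) -> Callable:
        def hook(_, inputs, __):
            feats = inputs[0]
            # Keep the graph only if backprop through the features is requested.
            self._features[name] = feats if self.enable_backprop else feats.detach()
        return hook
```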
@aleximmer, @runame how do you actually enable backprop in ASDL's jacobian here? (I'm unfamiliar with its API.)
See 7726b3d
@aleximmer @runame: There's no more action for me in this PR, so you can review it again. As for enabling all of this in the ASDL backend, how about we invite Michael Aerni to contribute? If he already has the code, I can also do it; in that case, he simply needs to push his changes to a new branch. In any case, let's do this in a separate PR.
Let's just merge this for now.
Fixed @runame
Changes:
- Make `detach()` optional in the GLM predictive so that one can backprop from `f(x)` to `x`, in particular for optimizing acquisition functions.
- A `functional_covariance` method for the GLM predictive that returns the `(nk,)` mean and the `(nk, nk)` covariance over `n` input points and `k` outputs.
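A hedged usage sketch of what these changes enable: optimizing an acquisition function by backpropagating through a differentiable GLM predictive. The predictive below is a stand-in, not the library call; function and argument names (e.g. `enable_backprop`) are taken from this thread and may differ in the actual API.

```python
import torch

def glm_predictive(x: torch.Tensor):
    # Stand-in for a fitted Laplace GLM predictive called without detach()
    # (e.g. with enable_backprop=True); the real API may differ.
    f_mu = (x ** 2).sum(dim=-1, keepdim=True)   # differentiable dummy mean
    f_var = 0.1 * torch.ones_like(f_mu)         # dummy predictive variance
    return f_mu, f_var

input_dim = 2
x = torch.randn(1, input_dim, requires_grad=True)
opt = torch.optim.Adam([x], lr=1e-2)

for _ in range(100):
    opt.zero_grad()
    f_mu, f_var = glm_predictive(x)
    ucb = f_mu + 2.0 * f_var.sqrt()   # upper-confidence-bound acquisition
    loss = -ucb.sum()                 # maximize UCB by minimizing its negative
    loss.backward()                   # gradients flow from f(x) back to x
    opt.step()
```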