🐛 Bug
I want to fit a Gaussian process to some samples. After optimization, the fitted lengthscales look strange. I tried normalizing the training targets, and it turned out that the normalization changes the fitted lengthscales, which is not expected.
To reproduce
**Code snippet to reproduce**
```python
import torch
from botorch import fit_gpytorch_model  # deprecated alias of fit_gpytorch_mll
from botorch.models import SingleTaskGP
from gpytorch.mlls import ExactMarginalLogLikelihood

train_X = torch.tensor(
    [[0.3660, 0.7463],
     [0.8714, 0.4299],
     [0.5104, 0.0620],
     [0.1276, 0.9511]],
    dtype=torch.double,
)
train_y = torch.tensor([10000, 11000, 12000, 20000], dtype=torch.double).reshape(-1, 1)
train_yvar = torch.tensor([100, 200, 150, 100], dtype=torch.double).reshape(-1, 1)

# Fit on the raw targets.
gpr = SingleTaskGP(train_X=train_X, train_Y=train_y, train_Yvar=train_yvar)
mll = ExactMarginalLogLikelihood(likelihood=gpr.likelihood, model=gpr)
print(gpr.covar_module.base_kernel.lengthscale)  # initialization, before fitting
fit_gpytorch_model(mll=mll)
print(gpr.covar_module.base_kernel.lengthscale)  # after fitting

# Refit on near-identically normalized targets: subtracting ~1 from targets of
# order 1e4 and dividing by 1 is almost an identity transformation, so the
# fitted lengthscales should be essentially unchanged.
mean = 1.000001  # train_y.mean()
std = 1  # train_y.std()
gpr = SingleTaskGP(
    train_X=train_X,
    train_Y=(train_y - mean) / std,
    train_Yvar=train_yvar / std**2,
)
mll = ExactMarginalLogLikelihood(likelihood=gpr.likelihood, model=gpr)
fit_gpytorch_model(mll=mll)
print(gpr.covar_module.base_kernel.lengthscale)
```
**Stack trace/error message**
The three printed lengthscale tensors are, in order: the initialization before fitting, the fit on the raw targets, and the fit on the near-identically normalized targets:
```
tensor([[0.6931, 0.6931]], dtype=torch.float64, grad_fn=<SoftplusBackward0>)
tensor([[0.0622, 1.0933]], dtype=torch.float64, grad_fn=<SoftplusBackward0>)
tensor([[0.3289, 0.0342]], dtype=torch.float64, grad_fn=<SoftplusBackward0>)
```
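One sanity check (not in the original report) is to compare the exact marginal log likelihood at the two fitted settings: if the values are essentially equal, the two lengthscale vectors are near-equally good local optima and the discrepancy is an optimization artifact rather than a modeling change. A minimal sketch, reusing `gpr` and `mll` from the snippet above:
```python
import torch

# Train mode makes gpr(*gpr.train_inputs) return the prior over the training
# inputs, which is what ExactMarginalLogLikelihood expects as its first argument.
gpr.train()
with torch.no_grad():
    mll_value = mll(gpr(*gpr.train_inputs), gpr.train_targets)
print(mll_value)  # per-data-point marginal log likelihood at the fitted parameters
```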
Expected Behavior
We expect the fitted lengthscales to be independent of the chosen output normalization; in particular, the near-identity normalization above should reproduce the first fit.
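For what it's worth, a common way to sidestep manual normalization is BoTorch's `Standardize` outcome transform, which standardizes the targets (and `train_Yvar`) internally and un-standardizes posterior predictions; whether it changes the behavior reported here is untested. A minimal sketch:
```python
from botorch.models import SingleTaskGP
from botorch.models.transforms.outcome import Standardize

# Standardize(m=1) learns the training mean/std of the single outcome and
# applies the inverse transform to posteriors, so downstream code sees
# predictions on the original scale.
gpr = SingleTaskGP(
    train_X=train_X,
    train_Y=train_y,
    train_Yvar=train_yvar,
    outcome_transform=Standardize(m=1),
)
```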
System information
Please complete the following information:
- BoTorch Version 0.10.0
- GPyTorch Version 1.11
- PyTorch Version 2.3.0+cu118
- Computer OS: Windows / Linux