Do not normalize or standardize dimension if all values are equal (#2185)
Summary:
Pull Request resolved: #2185
Issue description with the `Normalize` transform: suppose the training data has x0 as a constant (which can happen with few data points) while it is being optimized over [0, 1]. Under the current behavior, we first compute a coefficient of 0.0 and then clamp it up to 1e-8. During acqf optimization, the model is evaluated with values in [0, 1], which then get normalized to [0, 1e8]. This can cause numerical issues in GPyTorch and lead to non-PSD covariance matrices, since the model was trained on constant inputs and likely learned lengthscales that do not play well with such large values.
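A minimal sketch of the pre-fix clamping and the blow-up it causes; this is not the actual BoTorch internals, and the tensor values and variable names are illustrative:

```python
import torch

# Illustrative pre-fix behavior: a zero per-dimension range is clamped up to 1e-8.
X_train = torch.tensor([[0.5], [0.5], [0.5]])  # x0 is constant in the training data
offset = X_train.min(dim=0).values
coeff = (X_train.max(dim=0).values - offset).clamp_min(1e-8)  # 0.0 clamped to 1e-8

# Candidates drawn from the [0, 1] optimization bounds get divided by 1e-8,
# producing normalized values on the order of 1e8.
X_cand = torch.tensor([[0.0], [1.0]])
print((X_cand - offset) / coeff)  # approx. tensor([[-5.0e+07], [5.0e+07]])
```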
This diff updates the behavior of `min_range`/`min_std` in the `Normalize`/`InputStandardize` transforms to skip transforming a given dimension if the range/std of that dimension is below the minimum. This is achieved by using an offset of 0 and a coefficient of 1 for that dimension, as shown in the sketch below.
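A minimal sketch of the updated logic under a `min_range`-style threshold; the helper `compute_coeff_offset` and its signature are hypothetical stand-ins for the transform internals, not the real BoTorch API:

```python
import torch

def compute_coeff_offset(X: torch.Tensor, min_range: float = 1e-8):
    """Per-dimension offset/coefficient, skipping (near-)constant dimensions.

    Hypothetical helper: constant dimensions get offset 0 and coefficient 1,
    so they pass through the transform unchanged.
    """
    mins = X.min(dim=0).values
    ranges = X.max(dim=0).values - mins
    constant = ranges < min_range
    offset = torch.where(constant, torch.zeros_like(mins), mins)
    coeff = torch.where(constant, torch.ones_like(ranges), ranges)
    return offset, coeff

X_train = torch.tensor([[0.5, 0.1], [0.5, 0.9]])  # x0 constant, x1 varies
offset, coeff = compute_coeff_offset(X_train)
print((X_train - offset) / coeff)  # x0 is left as-is; x1 is scaled to [0, 1]
```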
Reviewed By: esantorella
Differential Revision: D53213759
fbshipit-source-id: 9f738e9c6654e184f6e8a74bb8abe8a530290691