Convert fp32 datasets to fp64 in ARIMA and AutoARIMA + update notebook to avoid deprecation warnings with positional parameters #4195
Conversation
python/cuml/tsa/arima.pyx (Outdated)

     # Get device array. Float64 only for now.
     self.d_y, self.n_obs, self.batch_size, self.dtype \
    -    = input_to_cuml_array(endog, check_dtype=np.float64)
    +    = input_to_cuml_array(endog, convert_to_dtype=np.float64)
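As a side note, here is a minimal sketch of what this change means in practice, assuming the usual semantics of cuml's input_to_cuml_array helper; the import path, the example data, and the exact error behavior of check_dtype are assumptions, not taken from this PR:

```python
import numpy as np
from cuml.common import input_to_cuml_array  # assumed import path

# A float32 batched series, as a user might pass it to ARIMA.
y32 = np.ones((10, 2), dtype=np.float32)

# Old code path: check_dtype=np.float64 only validates the dtype, so a
# float32 input is rejected instead of being converted.
# input_to_cuml_array(y32, check_dtype=np.float64)  # assumed to raise on fp32 input

# New code path: convert_to_dtype=np.float64 copies the data to float64.
d_y, n_obs, batch_size, dtype = input_to_cuml_array(
    y32, convert_to_dtype=np.float64
)
print(n_obs, batch_size, dtype)  # 10 2 float64
```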
Suggested change:

    -    = input_to_cuml_array(endog, convert_to_dtype=np.float64)
    +    = input_to_cuml_array(
    +        endog,
    +        convert_to_dtype=(np.float64 if convert_dtype else None)
    +    )
We can add a convert_dtype parameter in the __init__ (set to True by default) so that the user can disable the automatic conversion if they want, similar to fit/predict in other models:

    def fit(self, X, y, convert_dtype=True) -> "LinearRegression":
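A minimal sketch of how that suggestion could be wired up, mirroring the expression from the suggested change above; the helper name _ingest_endog, the import path, and the surrounding structure are illustrative, not the merged implementation:

```python
import numpy as np
from cuml.common import input_to_cuml_array  # assumed import path

def _ingest_endog(endog, convert_dtype=True):
    # convert_dtype=True (the proposed default) requests an implicit
    # conversion of float32 input to float64; convert_dtype=False passes
    # None so no conversion is performed.
    d_y, n_obs, batch_size, dtype = input_to_cuml_array(
        endog,
        convert_to_dtype=(np.float64 if convert_dtype else None)
    )
    return d_y, n_obs, batch_size, dtype
```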
I have added convert_dtype and removed the warning. Unless I make False the default for this option, I don't think it makes sense to have a warning, because the conversion is the intended behavior of convert_dtype=True. I guess we can implement a warning when we add float32 support.
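A hedged usage sketch of the behavior described above, assuming the convert_dtype flag added in this PR defaults to True on the ARIMA constructor; the example data and the order values are illustrative:

```python
import cupy as cp
import numpy as np
from cuml.tsa.arima import ARIMA

# A float32 batch of 4 series with 200 observations each.
y = cp.random.normal(size=(200, 4)).astype(np.float32)

# With convert_dtype=True (the default), the series is converted to float64
# before fitting, without emitting a warning.
model = ARIMA(y, order=(1, 1, 1), convert_dtype=True)
model.fit()
print(model.dtype)  # expected: float64 after the implicit conversion
```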
Codecov Report
Base: 78.02% // Head: 78.04% // Increases project coverage by +0.02%.

Additional details and impacted files:

    @@               Coverage Diff               @@
    ##           branch-22.10    #4195      +/-   ##
    ================================================
    + Coverage        78.02%    78.04%   +0.02%
    ================================================
      Files              180       180
      Lines            11385     11423      +38
    ================================================
    + Hits              8883      8915      +32
    - Misses            2502      2508       +6
@gpucibot merge
Convert fp32 datasets to fp64 in ARIMA and AutoARIMA + update notebook to avoid deprecation warnings with positional parameters (rapidsai#4195)

Authors:
  - Louis Sugy (https://github.com/Nyrio)
  - Dante Gama Dessavre (https://github.com/dantegd)

Approvers:
  - Dante Gama Dessavre (https://github.com/dantegd)

URL: rapidsai#4195
No description provided.