[FEAT] Probabilistic Forecasting Util Functions #195
Conversation
hey @dluuo! Nice work so far! I just left a comment related to the usage of torch inside the `level_to_outputs` function.
Also, it might be a good moment to include some unit tests to catch potential errors in the new functions. You can see some examples here.
The main idea is to think about the expected output of the function for a specific case, write that expected output, and then compare the actual and expected outputs. For example, in the case of `level_to_outputs`, a unit test could be something like:
```python
test_eq(
    level_to_outputs([80, 90]),
    ([0.05, 0.1, 0.5, 0.9, 0.95], ['lo-90', 'lo-80', 'median', 'hi-80', 'hi-90'])
)
```

The `test_eq` function can be imported from `fastcore.test`:

```python
from fastcore.test import test_eq
```
Thank you @dluuo!
I added three functions to utils.ipynb that are used for probabilistic forecasting.
The main one is samples_to_quantiles_df(..), which, given a numpy array of sample forecasts and a list of quantiles/levels, computes the quantiles of the sample forecasts and builds a dataframe in which each column contains the forecasts for one quantile.
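To illustrate the idea, here is a rough sketch of what such a function could look like. The signature, column naming scheme (`q-10`, `q-50`, ...), and input layout (`samples` of shape `(n_series, n_samples, horizon)`) are assumptions for this example, not the actual implementation in the PR:

```python
import numpy as np
import pandas as pd

def samples_to_quantiles_df(samples, unique_ids, dates, quantiles):
    """Sketch: turn sample forecasts into a long dataframe with one
    column per quantile. `samples` has shape (n_series, n_samples, horizon).
    Names and signature are illustrative only."""
    n_series, _, horizon = samples.shape
    # quantiles over the sample axis -> shape (n_quantiles, n_series, horizon)
    q_values = np.quantile(samples, quantiles, axis=1)
    df = pd.DataFrame({
        'unique_id': np.repeat(unique_ids, horizon),
        'ds': np.tile(dates, n_series),
    })
    for q, vals in zip(quantiles, q_values):
        # hypothetical column naming: q-10, q-50, q-90, ...
        df[f'q-{int(round(q * 100))}'] = vals.reshape(-1)
    return df

# usage on synthetic data: 2 series, 500 samples, horizon of 3
samples = np.random.default_rng(0).normal(size=(2, 500, 3))
df = samples_to_quantiles_df(
    samples, ['s1', 's2'],
    pd.date_range('2023-01-01', periods=3),
    [0.1, 0.5, 0.9],
)
```

The key step is `np.quantile(..., axis=1)`, which collapses the sample axis so each remaining (series, horizon) cell holds one quantile estimate.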
I also added level_to_outputs and quantile_to_outputs, helper functions adapted from neuralforecast that, given a list of levels or quantiles, return the quantile values and the corresponding output names.
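Based on the expected output shown in the review comment's test_eq example, level_to_outputs could work roughly as follows. This is a sketch consistent with that example, not the actual neuralforecast implementation:

```python
def level_to_outputs(level):
    """Sketch: map confidence levels to sorted quantile values and names.
    A level of, e.g., 80 yields the 0.1 and 0.9 quantiles ('lo-80', 'hi-80');
    the 0.5 quantile ('median') is always included."""
    pairs = [(0.5, 'median')]
    for lv in level:
        alpha = (100 - lv) / 200  # tail probability on each side
        pairs.append((alpha, f'lo-{lv}'))
        pairs.append((1 - alpha, f'hi-{lv}'))
    pairs.sort(key=lambda p: p[0])
    quantiles = [q for q, _ in pairs]
    names = [n for _, n in pairs]
    return quantiles, names
```

With this sketch, `level_to_outputs([80, 90])` produces the quantiles `[0.05, 0.1, 0.5, 0.9, 0.95]` and names `['lo-90', 'lo-80', 'median', 'hi-80', 'hi-90']`, matching the unit test suggested in the review.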