✨ Add Monte-Carlo Dropout #49
Conversation
Hi! Thanks again for the great work! I'll have a more thorough look at it next week. I've skimmed your PR and am wondering how we could adapt your baseline to the different variants of MC Dropout. It may be a question worth considering, given your interest in its variants!

Then there is also the question of the number of inference passes. I agree that avoiding the for loop and increasing the batch size is more efficient, but in some cases it may increase VRAM needs (for instance, with 1000 passes). We could let the user select the test batch size (which could be reduced to 1, for instance, if we want to average over many passes). That seems to be the simplest solution, but it is not completely safe, and it adds a new parameter to the CLI.
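The trade-off discussed above (for loop vs. one large batch vs. VRAM) can be sketched as a chunked MC Dropout forward pass. This is a hedged illustration, not the library's actual API: `mc_dropout_predict` and `pass_batch_size` are hypothetical names, and the sketch simply replicates the input so that `pass_batch_size` stochastic passes run in a single batched forward, bounding memory regardless of the total number of passes.

```python
import torch
import torch.nn as nn

def mc_dropout_predict(model, x, num_passes=100, pass_batch_size=None):
    """Average softmax predictions over stochastic forward passes.

    Hypothetical helper (names are illustrative, not the PR's API):
    dropout modules are switched back to train mode so they stay
    stochastic at inference, and the passes are split into chunks of
    `pass_batch_size` to bound VRAM instead of choosing between a pure
    for loop (num_passes iterations) or one giant batch.
    """
    model.eval()
    for m in model.modules():
        if isinstance(m, nn.Dropout):
            m.train()  # keep dropout active during inference
    pass_batch_size = pass_batch_size or num_passes
    batch = x.shape[0]
    chunk_sums = []
    with torch.no_grad():
        done = 0
        while done < num_passes:
            k = min(pass_batch_size, num_passes - done)
            # k copies of the batch -> one forward computes k passes
            xk = x.repeat(k, 1)
            probs = model(xk).softmax(-1)
            chunk_sums.append(probs.view(k, batch, -1).sum(0))
            done += k
    return torch.stack(chunk_sums).sum(0) / num_passes

model = nn.Sequential(nn.Linear(4, 16), nn.ReLU(),
                      nn.Dropout(0.5), nn.Linear(16, 3))
x = torch.randn(2, 4)
probs = mc_dropout_predict(model, x, num_passes=8, pass_batch_size=4)
print(probs.shape)  # torch.Size([2, 3])
```

With `pass_batch_size=1` this degrades gracefully to the sequential for loop; with `pass_batch_size=num_passes` it is the single large batch, so the user-facing knob mentioned above covers both extremes.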
Thanks for the PR. I will make some modifications, but having it in the library is great. If you want, you can join the Discord server whose link I added in the readme of the dev branch!
Co-authored-by: Adrien Lafage <adrienlafage@outlook.com>
I had to make a few changes to avoid errors. Merging now.
This PR fixes #40. The main changes are:
- dropout layers are kept active in `eval` mode;
- an `enable_last_layer_dropout` option to enable last-layer dropout.
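The last-layer variant described above can be sketched as follows. This is an assumption about the mechanism, not the PR's implementation: `set_dropout_mode` and its `last_layer_only` flag are hypothetical names (only the `enable_last_layer_dropout` idea comes from the summary), and the sketch flips either all `nn.Dropout` modules or only the final one back to train mode while the rest of the model stays in eval mode.

```python
import torch.nn as nn

def set_dropout_mode(model, last_layer_only=False):
    """Keep dropout stochastic while the model is otherwise in eval mode.

    Hypothetical helper: `last_layer_only` mirrors the PR's
    enable_last_layer_dropout idea by reactivating only the final
    nn.Dropout module instead of all of them.
    """
    dropouts = [m for m in model.modules() if isinstance(m, nn.Dropout)]
    targets = dropouts[-1:] if last_layer_only else dropouts
    for m in targets:
        m.train()  # this dropout module keeps sampling masks
    return model

model = nn.Sequential(nn.Dropout(0.2), nn.Linear(4, 4),
                      nn.Dropout(0.5), nn.Linear(4, 2))
model.eval()
set_dropout_mode(model, last_layer_only=True)
flags = [m.training for m in model.modules() if isinstance(m, nn.Dropout)]
print(flags)  # [False, True]: only the last dropout stays active
```

Restricting stochasticity to the last layer keeps most of the forward pass deterministic, which is one common way to cut the cost of repeated MC passes.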