Add option for "Same" padding to conv and pooling layers #901
Conversation
Does the gitlab CI run all tests on GPU? Some of the new tests I added use asymmetric padding, which will fail as it is not supported when using CuArrays. I couldn't find anything pointing to this in the failure reasons though. Is it just a fluke, or did I break the camel's back or something? |
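For context, "same" padding becomes asymmetric whenever the total padding is odd, which happens for even kernel sizes. A minimal sketch of such a case, assuming the SamePad API this PR adds:

```julia
using Flux

# An even kernel needs an odd total padding for "same" output size,
# so one side gets one more element than the other, e.g. pad = (1, 0).
# This asymmetric case is what the CuArrays-backed tests would trip over.
c = Conv((2,), 1 => 1, pad = SamePad())
x = rand(Float32, 10, 1, 1)  # (width, channels, batch)
size(c(x))                   # (10, 1, 1): output width == input width
```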
No, only the tests in the bors try |
Should also be fine to add a check in the GPU case imo |
bors try |
Build failed |
Odd error. Seems unlikely it's due to this patch. |
@MikeInnes: Could the gitlab failure be due to an old baseline revision vs a newer Zygote? @dhairyagandhi96: Not sure what you mean. Add test cases with CuArrays as well (using CUDA-safe kernel sizes), or? |
Ok, I tried pulling the latest master into my local branch, but I still got errors. Looking at the gitlab pipeline, it seems the same error has happened for all pipelines since yesterday: https://gitlab.com/JuliaGPU/Flux.jl/pipelines |
@DrChainsaw yes, exactly. Thanks for the fixes! |
@dhairyagandhi96 I can do that of course, but I'm not sure it would add any value. SamePad is only used to calculate the padding values in the layer's constructor (e.g. Conv), and then it goes out of scope. There are no GPU operations involved in this procedure. After construction, the layer struct looks just as if the user had supplied the padding as integers, and there is no trace of any of the changes made in this branch. |
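To illustrate the point, a hypothetical sketch assuming the API from this PR (the pad field layout is assumed from Flux's Conv struct): the two constructions below should yield identical layers, leaving nothing SamePad-specific to exercise on the GPU.

```julia
using Flux

# SamePad is resolved to concrete integers inside the constructor;
# for a (3, 3) kernel, "same" padding is 1 on each side of each dim.
a = Conv((3, 3), 1 => 8, pad = SamePad())
b = Conv((3, 3), 1 => 8, pad = (1, 1, 1, 1))

a.pad == b.pad  # true: no trace of SamePad remains in the layer struct
```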
Ah, sorry, I just meant checking the sizes after the forward pass with CUDA. But you are indeed right, so yeah, it should be fine as is. |
@dhairyagandhi96 @MikeInnes Just checking if you need anything more from me here. I perfectly understand that you probably have more pressing matters to attend to. Also, don't hesitate to close the PR if this functionality is not wanted or done in the wrong way. No feelings hurt, I promise :) |
Apologies, I know everyone is very busy, but I was wondering if there has been any movement on this issue. I am working on a project where the ability to set same padding would be very useful. I have manufactured a workaround, but it is less than ideal and this fix would be very helpful. |
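One possible workaround of this sort (a hypothetical sketch, not necessarily the one referred to above) is to compute the padding by hand before constructing the layer:

```julia
using Flux

# Without SamePad, "same" padding can be computed manually for odd
# kernels. Even kernels would need asymmetric padding, which this
# simple one-integer-per-dimension workaround cannot express.
k = (3, 3)
pad = map(ki -> (ki - 1) ÷ 2, k)  # (1, 1)
c = Conv(k, 1 => 8, pad = pad)
```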
I'm open to resolving the merge conflicts in case this functionality is wanted. |
It would be great to get this in, it's a great idea and the implementation looks solid too. It needs a NEWS.md entry, but otherwise we can merge as soon as the conflicts are fixed. |
I was a bit uncertain about where to put the NEWS.md entry, as there do not seem to be any entries newer than 0.10.0 in it. There is nothing breaking in this commit, so it should not require a bump to 0.11, but the pattern I could extract from NEWS.md is that it only covers "major" releases. Sorry if there is some convention here which I should have known about. It is a separate commit, so it can easily be lifted out if done wrong. |
Yeah, we should be better about our news coverage; you can put it under a 0.10.5 banner for now, I suppose |
Done and done! |
bors r+ |
Thanks for the patch, looking forward to using it more! |
Build succeeded: |
Awesome! Can't wait to test it out! Thanks for the quick work on getting it merged! |
Me too. Thanks for the pull! |
Fixes #813
This adds the possibility to set pad=SamePad() to automatically calculate the amount of padding to apply so that outputsize == inputsize (assuming stride == 1).
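For reference, the required amount follows from the convolution output-size formula out = in + pad_total - dilation*(k - 1) at stride 1. A minimal sketch of the calculation (the function name same_padding is illustrative and need not match this PR's internals):

```julia
# Total padding per spatial dimension so that out == in at stride 1:
# pad_total = dilation * (k - 1). An odd total must be split
# asymmetrically, giving one side an extra element.
function same_padding(k::Integer, dilation::Integer = 1)
    pad_total = dilation * (k - 1)
    return (cld(pad_total, 2), fld(pad_total, 2))  # (before, after)
end

same_padding(3)  # (1, 1): symmetric for odd kernels
same_padding(4)  # (2, 1): asymmetric for even kernels
```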
Comments on the API are more than welcome. I considered the following options:
Happy to change to any of the above or to anything else.
I don't think that "same" padding is common for pooling layers, but I added it for the sake of consistency; see the sketch below. It is a separate commit, so it can easily be removed if not wanted.
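A hedged usage sketch for the pooling case, assuming SamePad works for pooling as it does for conv layers; since MaxPool's stride defaults to the window size, "same" here makes each output dimension cld(input, stride) rather than input-sized:

```julia
using Flux

# With pad = SamePad() and the default stride (the window size),
# each output dimension becomes cld(input, stride).
p = MaxPool((2, 2), pad = SamePad())
x = rand(Float32, 7, 7, 1, 1)  # (width, height, channels, batch)
size(p(x))                     # (4, 4, 1, 1), since cld(7, 2) == 4
```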