
Adapt to changes in NNlib to use mode externally #308

Closed
wants to merge 1 commit into from

Conversation

@ayush1999 (Contributor)

This builds on the changes made in FluxML/NNlib.jl#53 (to use mode directly from Flux).

@MikeInnes (Member)

I think the API needs some thought here, and I'd rather not use 0/1 (which are unclear to read). Maybe a kwarg like reflect=true?

I'm almost tempted to just have a separate CrossCorrelation layer for this, but it's probably not worth that.

@ayush1999 (Contributor, Author)

@MikeInnes I've added reflect as an attribute of Conv. When true, it internally acts the same as mode=0.

@ayush1999 closed this Jul 6, 2018
@ayush1999 reopened this Jul 6, 2018
@philtomson

It seems like convolution mode should be the default with cross correlation as the option.

@MikeInnes (Member)

I believe we already do; mode = 0 corresponds to convolution in CUDNN.

@ayush1999 (Contributor, Author)

@philtomson The nomenclature is actually a bit confusing. Convolution means flipping the kernel and then summing the element-wise products; cross-correlation is the same as convolution, but without flipping the kernel. Flux flips the kernel by default, and hence does convolution. I think it'd be better to have a kwarg with mode="convolution" or mode="cross-correlation", so that there isn't any confusion.
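The distinction can be sketched in a few lines of plain Julia (1-D case, "valid" padding, illustrative helper names only, not the Flux/NNlib API):

```julia
# Cross-correlation: slide the kernel over the signal, no flip.
xcorr1d(x, w) = [sum(x[i + j - 1] * w[j] for j in eachindex(w))
                 for i in 1:length(x) - length(w) + 1]

# Convolution is the same sliding product, but with the kernel reversed.
conv1d(x, w) = xcorr1d(x, reverse(w))

x = [1.0, 2.0, 3.0, 4.0]
w = [1.0, 0.0, -1.0]

xcorr1d(x, w)  # == [-2.0, -2.0]
conv1d(x, w)   # == [2.0, 2.0]
```

For a symmetric kernel the two operations coincide, which is part of why the distinction is so easy to miss.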

@philtomson commented Jul 10, 2018

Yeah, I don't like plain strings for these kinds of things. (It's all the rage in Python-land, I know, but I think we can do better.)

It would be good if it were some kind of enum type; I think you can do this with @enum:

@enum Choice convolution correlation

EDIT: actually, thinking about it a bit more, I think I like the idea of a separate CrossCorrelation layer/function for this. It would just be a wrapper around Conv with the kernel flipped and passed into Conv. It just makes it cleaner all around to make it explicit, I think, and no special-casing is needed in the actual convolution function, which will get used more often anyway.
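A minimal sketch of the enum suggestion (the enum and function names here are illustrative, not anything defined in Flux or NNlib):

```julia
# Hypothetical enum replacing an opaque mode = 0/1 flag.
@enum ConvMode convolution correlation

# Example use: flip the kernel only in convolution mode,
# dispatching on the enum instead of a magic integer or string.
flipkernel(w, mode::ConvMode) = mode == convolution ? reverse(w) : w

flipkernel([1, 2, 3], convolution)  # == [3, 2, 1]
flipkernel([1, 2, 3], correlation)  # == [1, 2, 3]
```

The separate-layer idea would replace the enum entirely: CrossCorrelation would simply reverse the kernel once at construction time and delegate everything else to Conv.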

@MikeInnes (Member)

Are we doing #423 instead?

@ayush1999 (Contributor, Author)

Yes, there was a plan to add an option to Conv initially, but that changed later. Closing this.

@ayush1999 closed this Mar 26, 2019