
Default to zero'ed initial state for all RNN #590

Merged 1 commit into FluxML:master on Feb 4, 2019

Conversation

@oxinabox (Member) commented on Feb 2, 2019

Zero as the default initial state is AFAIK what every other neural network framework does.

It shouldn't actually have much (any?) effect, since the Recur remembers the initial state anyway,
but it makes things match expectations.

I also see no reason that the biases were zeroed.
I kinda suspect there may just have been a typo in the order of arguments there.

Maybe I am wrong and just don't understand the code.
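
For illustration, here is a minimal sketch of a recurrent cell whose initial hidden state defaults to zeros while the bias keeps its usual init. The `SimpleRNNCell` struct, its field names, and the constructor shape are assumptions for the example, not Flux's actual `RNNCell` code.

```julia
# Illustrative sketch (not the actual Flux code): a recurrent cell whose
# default initial hidden state is the zero vector, while the bias keeps
# its normal initialization.
struct SimpleRNNCell{F,A,V}
    σ::F    # activation function
    Wi::A   # input-to-hidden weights
    Wh::A   # hidden-to-hidden weights
    b::V    # bias
    h::V    # default initial hidden state
end

# The last two constructor arguments are where a mixed-up argument order
# would silently zero the bias instead of the state; here the state is
# explicitly the zero vector and the bias gets `init`.
SimpleRNNCell(in::Integer, out::Integer, σ = tanh; init = randn) =
    SimpleRNNCell(σ, init(out, in), init(out, out), init(out), zeros(out))

# One step: apply the cell to the previous hidden state and an input vector.
function (m::SimpleRNNCell)(h, x)
    h′ = m.σ.(m.Wi * x .+ m.Wh * h .+ m.b)
    return h′, h′
end
```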

@MikeInnes (Member) commented

Seems reasonable to me, thanks.

MikeInnes merged commit e774053 into FluxML:master on Feb 4, 2019