
Recurrent Attention (Step 2): Support functional composition of RNN Cells (incl. constants) #7981

Conversation

@andhus (Contributor) commented on Sep 24, 2017

This PR is the second step to support recurrent attention mechanisms as suggested in #7633.

In short, it adds the ability to compose RNN cells using the Keras functional API, including support for constants, by introducing the Model-like wrapper class FunctionalRNNCell.

Note: this is an extension of #7980
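
To make the idea concrete, here is a minimal sketch of how composing a cell this way might look. FunctionalRNNCell, its import path, and its constructor arguments come from this (unmerged) PR, so the exact names used below are assumptions, not part of released Keras.

```python
# Illustrative sketch only: FunctionalRNNCell and its import path come from
# this (unmerged) PR, so the exact names and arguments are assumptions.
from keras.layers import Input, Dense, concatenate, RNN
from keras.layers.recurrent import FunctionalRNNCell  # hypothetical import path

units = 32

# Define the cell's per-timestep transformation with the functional API.
x = Input((5,))          # input at time t
h_in = Input((units,))   # previous state
attended = Input((10,))  # "constant" passed unchanged to every timestep

h_out = Dense(units, activation='tanh')(concatenate([x, h_in, attended]))

# Wrap the graph into an RNN cell, declaring inputs, states and constants.
cell = FunctionalRNNCell(
    inputs=x,
    outputs=h_out,
    input_states=h_in,
    output_states=h_out,
    constants=attended)

# Use the cell with the stock RNN layer, feeding the constant alongside
# the input sequence (constants support is what #7980 adds).
sequence = Input((None, 5))
context = Input((10,))
outputs = RNN(cell)(sequence, constants=context)
```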

@farizrahman4u (Contributor) commented:
I think we should add support for optional named inputs (states, constants, etc.) for layers at the topology level, and this should be reflected in the input_spec.
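
For context, input_spec today only describes a layer's positional input tensors; a minimal sketch of current usage (my illustration, not code from this PR) is:

```python
# Minimal sketch of how a layer currently declares its expected inputs;
# the suggestion above would extend this so that optional named inputs
# (states, constants) are also reflected in input_spec.
from keras.engine import InputSpec, Layer

class MyLayer(Layer):
    def build(self, input_shape):
        # Only the positional input tensor is described today.
        self.input_spec = InputSpec(ndim=3, axes={-1: input_shape[-1]})
        super(MyLayer, self).build(input_shape)
```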

@@ -4,6 +4,8 @@
 import functools
 import warnings

+from keras.engine import Model
+from keras.layers.wrappers import Wrapper

Review comment (Contributor):
Use relative imports.
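
For reference, assuming the edited module lives under keras/layers/ (e.g. keras/layers/recurrent.py), the relative form of the two added imports would be:

```python
# Relative equivalents of the absolute imports in the hunk above,
# assuming the edited module sits in keras/layers/.
from ..engine import Model
from .wrappers import Wrapper
```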

@andhus (Contributor, Author) commented on Oct 2, 2017

We will keep all additions related to recurrent attention in a single PR (#7980), so I am closing this one.

@andhus closed this on Oct 2, 2017