This repository has been archived by the owner on Nov 17, 2023. It is now read-only.

Commit: Update rnn_layer.py
ThomasDelteil authored and piiswrong committed Apr 30, 2018
1 parent 04e00c8 commit 8fdcb85
Showing 1 changed file with 10 additions and 12 deletions: python/mxnet/gluon/rnn/rnn_layer.py
```diff
@@ -285,10 +285,10 @@ class RNN(_RNNLayer):
     Inputs:
         - **data**: input tensor with shape `(sequence_length, batch_size, input_size)`
-          when `layout` is "TNC". For other layouts dimensions are permuted accordingly.
-          Be aware that a `transpose` operation with a ndarray results in a new allocation of
-          memory. For optimal performance and when applicable, consider transposing
-          your layout to "TNC" before loading your data into a ndarray.
+          when `layout` is "TNC". For other layouts, dimensions are permuted accordingly
+          using transpose() operator which adds performance overhead. Consider creating
+          batches in TNC layout during data batching step.
         - **states**: initial recurrent state tensor with shape
           `(num_layers, batch_size, num_hidden)`. If `bidirectional` is True,
           shape will instead be `(2*num_layers, batch_size, num_hidden)`. If
```
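The wording removed above made the underlying cost explicit: `transpose` on an ndarray allocates and copies into a new array rather than returning a view. A minimal sketch of that cost, with made-up shapes:

```python
from mxnet import nd

# Hypothetical batch in NTC layout: (batch_size=8, seq_len=35, input_size=16).
x = nd.random.uniform(shape=(8, 35, 16))

# Permuting to TNC copies the data into a freshly allocated ndarray, so an
# RNN layer configured with layout='NTC' pays this copy on every forward
# pass when it rearranges the input internally.
y = nd.transpose(x, axes=(1, 0, 2))
print(y.shape)  # (35, 8, 16)
```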
```diff
@@ -388,10 +388,9 @@ class LSTM(_RNNLayer):
     Inputs:
         - **data**: input tensor with shape `(sequence_length, batch_size, input_size)`
-          when `layout` is "TNC". For other layouts dimensions are permuted accordingly.
-          Be aware that a `transpose` operation with a ndarray results in a new allocation of
-          memory. For optimal performance and when applicable, consider transposing
-          your layout to "TNC" before loading your data into a ndarray.
+          when `layout` is "TNC". For other layouts, dimensions are permuted accordingly
+          using transpose() operator which adds performance overhead. Consider creating
+          batches in TNC layout during data batching step.
         - **states**: a list of two initial recurrent state tensors. Each has shape
           `(num_layers, batch_size, num_hidden)`. If `bidirectional` is True,
           shape will instead be `(2*num_layers, batch_size, num_hidden)`. If
```
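The LSTM hunk's context lines describe the state format: a list of two tensors whose leading axis doubles when `bidirectional` is set. A short illustrative sketch, with hypothetical sizes:

```python
from mxnet.gluon import rnn

num_layers, batch_size, num_hidden = 2, 8, 100

layer = rnn.LSTM(hidden_size=num_hidden, num_layers=num_layers,
                 bidirectional=True)
layer.initialize()

# begin_state returns the list of two state tensors (hidden and cell);
# with bidirectional=True the leading axis is 2*num_layers.
states = layer.begin_state(batch_size=batch_size)
print([s.shape for s in states])  # [(4, 8, 100), (4, 8, 100)]
```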
```diff
@@ -488,10 +487,9 @@ class GRU(_RNNLayer):
     Inputs:
         - **data**: input tensor with shape `(sequence_length, batch_size, input_size)`
-          when `layout` is "TNC". For other layouts dimensions are permuted accordingly.
-          Be aware that a `transpose` operation with a ndarray results in a new allocation of
-          memory. For optimal performance and when applicable, consider transposing
-          your layout to "TNC" before loading your data into a ndarray.
+          when `layout` is "TNC". For other layouts, dimensions are permuted accordingly
+          using transpose() operator which adds performance overhead. Consider creating
+          batches in TNC layout during data batching step.
         - **states**: initial recurrent state tensor with shape
           `(num_layers, batch_size, num_hidden)`. If `bidirectional` is True,
           shape will instead be `(2*num_layers, batch_size, num_hidden)`. If
```
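Taken together, the new docstring text amounts to: batch your data in TNC layout up front so the layer consumes it without any permutation. A minimal usage sketch of that recommendation, assuming arbitrary sizes:

```python
from mxnet import nd
from mxnet.gluon import rnn

# Hypothetical dimensions: 35 time steps, batch of 8, 16 input features.
seq_len, batch_size, input_size = 35, 8, 16

# "TNC" is the default layout, so data batched as (T, N, C) is consumed
# directly, with no internal transpose on the forward pass.
layer = rnn.LSTM(hidden_size=100, num_layers=2, layout='TNC')
layer.initialize()

data = nd.random.uniform(shape=(seq_len, batch_size, input_size))
output = layer(data)   # states default to zeros when not provided
print(output.shape)    # (35, 8, 100)
```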
