Description
This might be a collection of multiple issues.
Consider the example from the test case test_loop_axis_indices:
from returnn_common import nn

class _Net(nn.Module):
  @nn.scoped
  def __call__(self, x: nn.Tensor, *, axis: nn.Dim) -> nn.Tensor:
    loop = nn.Loop(axis=axis)
    indices = nn.range_over_dim(axis)
    loop.state.x = nn.zeros([nn.batch_dim, x.feature_dim], dtype=indices.dtype)
    with loop:
      i = loop.unstack(indices)
      loop.state.x = loop.state.x + i
      loop.stack(i)  # loop needs some dummy output currently...
    return loop.state.x
There are multiple problems here:
- The nn.range_over_dim call is too complicated just to get the running index i in the loop. In RETURNN, we have ":i" for that. That support of ":i" is a bit incomplete, though, so it might make sense to use a RangeInAxesLayer internally. But this could stay internal, and we could introduce something like loop.i (see the sketch after this list).
- The loop.stack(i) at the end is a workaround because there currently must be some stacked output. This is "RecLayer without accumulated output but just last frame" (returnn#1029).
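For illustration, a rough sketch of how the example above could look if a loop.i attribute existed and the dummy loop.stack(i) were not required anymore (the latter depends on returnn#1029). Note that loop.i is just the proposal from this issue, not existing returnn_common API:

class _Net(nn.Module):
  @nn.scoped
  def __call__(self, x: nn.Tensor, *, axis: nn.Dim) -> nn.Tensor:
    loop = nn.Loop(axis=axis)
    # int32 assumed here to match the index dtype (the original used indices.dtype)
    loop.state.x = nn.zeros([nn.batch_dim, x.feature_dim], dtype="int32")
    with loop:
      # loop.i: proposed running index, replacing nn.range_over_dim + loop.unstack
      loop.state.x = loop.state.x + loop.i
    return loop.state.x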