
Loop not simple enough in some cases #151

Open
@albertz

Description

This is perhaps a collection of multiple related issues.

Consider the example from the test case test_loop_axis_indices:

from returnn_common import nn


class _Net(nn.Module):
  @nn.scoped
  def __call__(self, x: nn.Tensor, *, axis: nn.Dim) -> nn.Tensor:
    loop = nn.Loop(axis=axis)
    indices = nn.range_over_dim(axis)  # tensor over `axis` with values 0, 1, 2, ...
    loop.state.x = nn.zeros([nn.batch_dim, x.feature_dim], dtype=indices.dtype)
    with loop:
      i = loop.unstack(indices)  # running index of the current iteration
      loop.state.x = loop.state.x + i
      loop.stack(i)  # loop needs some dummy output currently...
    return loop.state.x

There are multiple problems here:

  • The nn.range_over_dim is too complicated just to get the running index i inside the loop. In RETURNN, we have :i for that, although the support for :i is a bit incomplete, so it might make sense to use RangeInAxisLayer internally. But this could stay internal, and we could introduce something like loop.i (see the sketch after this list).
  • The loop.stack(i) at the end is a workaround because the loop currently must have some stacked output. This is "RecLayer without accumulated output but just last frame" (returnn#1029).
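
For illustration, a hedged sketch of how the same test case could look if both points were addressed. Here loop.i is a hypothetical attribute for the running index, and a loop without any stacked output is assumed to be allowed; neither exists in the current API:

from returnn_common import nn


class _Net(nn.Module):
  @nn.scoped
  def __call__(self, x: nn.Tensor, *, axis: nn.Dim) -> nn.Tensor:
    loop = nn.Loop(axis=axis)
    loop.state.x = nn.zeros([nn.batch_dim, x.feature_dim], dtype="int32")
    with loop:
      # loop.i is hypothetical: the running index of the current iteration.
      # Internally it could be backed by RangeInAxisLayer or a counter in the loop state.
      loop.state.x = loop.state.x + loop.i
      # No dummy loop.stack(...) needed once a loop without accumulated output is supported (returnn#1029).
    return loop.state.x

This mirrors the original example: the only per-step computation is adding the running index to the state, and the last-frame state is returned directly.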
