
Dimension of subsequent layers in Hypernetwork #169

Open

Description

@Simply-Adi

Hi, I was reading through your implementation of HyperLSTM and the associated paper, and I got lost in the shapes of the layers after the first layer. Could you please explain why the input size is 2 * main_lstm_hidden_size?
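
For context, here is a minimal shape sketch of how I currently read it, assuming that after the first layer a HyperLSTM layer's input is the previous main layer's output (of size main_lstm_hidden_size) and that the hyper LSTM is fed the concatenation of that input with the main LSTM's previous hidden state. The variable names below are my own, not taken from the repository.

```python
import torch

# Assumed sizes for illustration only.
main_lstm_hidden_size = 64
batch_size = 8

# Input to a layer after the first: the output of the previous main LSTM layer,
# which (under my assumption) has size main_lstm_hidden_size.
x_t = torch.randn(batch_size, main_lstm_hidden_size)

# Previous hidden state of this layer's main LSTM.
h_prev = torch.randn(batch_size, main_lstm_hidden_size)

# If the hyper LSTM cell sees the concatenation [h_prev; x_t], its input size
# works out to 2 * main_lstm_hidden_size.
hyper_input = torch.cat([h_prev, x_t], dim=-1)
print(hyper_input.shape)  # torch.Size([8, 128]) == (batch, 2 * main_lstm_hidden_size)
```

If the factor of two simply comes from this concatenation, please confirm; otherwise I would appreciate a pointer to where the shapes are set in the code.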
