
[BUG] UnboundLocalError in AEAttentionBiGRUNetwork When temporal_latent_space=True #2560

Open
@lucifer4073

Description


Describe the bug

When initializing AEAttentionBiGRUNetwork with temporal_latent_space=True, calling build_network raises an UnboundLocalError. The variable shape_before_flatten is referenced in build_network but is never assigned when temporal_latent_space=True, so the later reference fails. The error does not occur when temporal_latent_space=False.
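
For illustration, here is a minimal sketch of the failure mode (illustrative only, not the actual aeon source; the function name and shape are hypothetical): the name is only bound on one branch, so referencing it afterwards fails on the other.

def _sketch_build_network(temporal_latent_space):
    if not temporal_latent_space:
        shape_before_flatten = (250, 128)  # hypothetical encoder output shape
    # ... encoder construction elided ...
    # referenced unconditionally, so this fails when temporal_latent_space=True
    return shape_before_flatten

_sketch_build_network(temporal_latent_space=True)  # raises UnboundLocalError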

Steps/Code to reproduce the bug

from aeon.networks import AEAttentionBiGRUNetwork  

# Initialize with temporal_latent_space=True
aeattentionbigru = AEAttentionBiGRUNetwork(latent_space_dim=64, temporal_latent_space=True)  
aeattentionbigru.build_network((1000, 5))  # This causes the error

Expected results

The build_network function should execute without errors when temporal_latent_space=True, correctly constructing the network.

Actual results

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
line 177, in build_network
    shape=shape_before_flatten, name="decoder_input"
          ^^^^^^^^^^^^^^^^^^^^
UnboundLocalError: cannot access local variable 'shape_before_flatten' where it is not associated with a value
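
A possible fix, as a rough sketch (assuming the encoder output shape is available before the branch on temporal_latent_space; this is not the actual aeon implementation): bind shape_before_flatten unconditionally so both code paths can construct the decoder input from it.

def _sketch_fixed_build_network(temporal_latent_space, encoder_output_shape=(250, 128)):
    # bind the shape before branching so the name always exists
    shape_before_flatten = encoder_output_shape
    if temporal_latent_space:
        pass  # keep the temporal (3D) representation as the latent space
    else:
        pass  # flatten and project to latent_space_dim
    # the decoder input can now be built from shape_before_flatten in both cases
    return shape_before_flatten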

Versions

No response

Labels

bug (Something isn't working), deep learning (Deep learning related), networks (Networks package)
