
batch_norm layer isn't affected by the updated phase (train/test) #5045

Closed
@yossibiton

Description


Issue summary

The parameter use_global_stats_ in the batch_norm layer is set automatically in layer setup, based on the current phase (train/test); see batch_norm_layer.cpp line 14 (false for the train phase, true for the test phase).
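For reference, the setup logic is roughly the following (paraphrased from batch_norm_layer.cpp; exact lines may differ across Caffe versions):

```cpp
// In BatchNormLayer<Dtype>::LayerSetUp (paraphrase):
use_global_stats_ = this->phase_ == TEST;        // decided once, at setup time
if (param.has_use_global_stats())
  use_global_stats_ = param.use_global_stats();  // explicit prototxt override
```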

However, the layer's phase is dynamic and can change between train/test iterations.
For that reason, the correct behavior is to update the use_global_stats_ value in the Forward and Backward methods (based on the current phase_ value) instead of setting it only once in the setup method, as sketched below.
This behavior would be similar to the dropout layer, which acts differently in the train and test phases.
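A minimal sketch of that fix, assuming the standard Caffe members (phase_, layer_param_, use_global_stats_) are available; this is an illustration, not a tested patch:

```cpp
// Sketch: re-derive use_global_stats_ on every forward pass instead of only
// once in LayerSetUp, so that a phase change after setup takes effect.
template <typename Dtype>
void BatchNormLayer<Dtype>::Forward_cpu(const vector<Blob<Dtype>*>& bottom,
    const vector<Blob<Dtype>*>& top) {
  const BatchNormParameter& param = this->layer_param_.batch_norm_param();
  // An explicit use_global_stats in the prototxt still wins; otherwise
  // follow the *current* phase, which may have changed since setup.
  use_global_stats_ = param.has_use_global_stats() ?
      param.use_global_stats() : (this->phase_ == TEST);
  // ... existing normalization code, unchanged ...
}
```

The same re-derivation would go at the top of Backward_cpu (and the GPU variants), mirroring how the dropout layer consults the phase on every pass.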

I will explain why this issue bothers me:
In my application, I use one prototxt network definition file, and all layers are initialized in train mode. When I change the network phase from train to test, I expect all layers to act accordingly. However, the batch_norm layer doesn't change use_global_stats_ from false to true, so it actually continues to act as if it were in the train phase.
