
[BUG] loss results are different even though random seed is set #1770

Closed
@ljm565

Description

import timm
import torch.nn as nn


class EfficientFormerV2(nn.Module):
    def __init__(self, config):
        super().__init__()
        # Pretrained timm backbone; a 2-class linear head is attached on top of its features.
        self.model = timm.create_model("efficientformerv2_s0.snap_dist_in1k", pretrained=True)
        self.fc = nn.Linear(176 * 7 * 7, 2, bias=config.bias)

    def forward(self, x):
        batch_size = x.size(0)
        output = self.model.forward_features(x)  # backbone feature map
        output = output.view(batch_size, -1)     # flatten to (batch, 176*7*7)
        output = self.fc(output)
        return output

Loss values differ between training runs even though I fix the random seed and use the EfficientFormerV2 model above. More specifically, each training epoch has 75 steps, and after a few steps (around 30) the loss values start to diverge. I think the issue comes from randomness in timm, because when I use our own custom model the loss values are identical across runs.
Is there any solution for this issue?
Do I have to use only the train.py that timm provides?

Thanks
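For context, run-to-run divergence with a "fixed" seed usually means only one RNG was seeded while other nondeterministic paths (CUDA RNGs, autotuned cuDNN kernels) stayed enabled. A minimal seeding sketch, assuming PyTorch's standard reproducibility API (not taken from this issue's training code):

```python
import random

import numpy as np
import torch


def set_seed(seed: int = 42) -> None:
    """Seed every RNG that can affect training, not just one of them."""
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)           # CPU RNG
    torch.cuda.manual_seed_all(seed)  # all GPU RNGs (no-op without CUDA)
    # Force deterministic cuDNN kernels; disables the autotuner, which can
    # otherwise pick different (nondeterministic) kernels between runs.
    torch.backends.cudnn.deterministic = True
    torch.backends.cudnn.benchmark = False


# With identical seeds, two runs draw identical random tensors.
set_seed(0)
a = torch.randn(4, 4)
set_seed(0)
b = torch.randn(4, 4)
assert torch.equal(a, b)
```

Even with all of this, some GPU ops remain nondeterministic unless `torch.use_deterministic_algorithms(True)` is also set, which makes PyTorch raise an error on any op without a deterministic implementation.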
