Remove F.normalize from SoftmaxLoss
vjoki committed Jan 29, 2021
1 parent 6f042d8 commit 6b0697b
Showing 1 changed file with 0 additions and 1 deletion.
snn/librispeech/loss/softmax.py (1 change: 0 additions & 1 deletion)
@@ -14,7 +14,6 @@ def __init__(self, nOut=512, nClasses=251, **kwargs):
     def forward(self, x, label=None):
         # TODO: Not sure what the rationale for N -> nClasses is...
         # B*W*SxN -> B*W*SxnClasses
-        x = F.normalize(x, p=2, dim=1)
         x = self.fc(x)
         nloss = self.criterion(x, label)
         prec1 = accuracy(x.detach(), label.detach(), topk=(1,))[0]
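Before this commit, F.normalize(x, p=2, dim=1) L2-normalized each embedding row before the linear classifier; removing it lets the logits scale with the embedding magnitude, the usual setup for a plain softmax loss. For reference, below is a minimal sketch of the module after the change. Only the forward body matches the diff above; the nn.Linear/nn.CrossEntropyLoss wiring and the accuracy helper are assumptions filled in to make the example self-contained and runnable, not the repository's actual code.

    import torch
    import torch.nn as nn


    def accuracy(output, target, topk=(1,)):
        # Hypothetical stand-in for the repo's accuracy helper:
        # returns top-k classification accuracy as a percentage.
        maxk = max(topk)
        _, pred = output.topk(maxk, dim=1)
        pred = pred.t()
        correct = pred.eq(target.view(1, -1).expand_as(pred))
        return [correct[:k].reshape(-1).float().sum().mul_(100.0 / target.size(0))
                for k in topk]


    class SoftmaxLoss(nn.Module):
        def __init__(self, nOut=512, nClasses=251, **kwargs):
            super().__init__()
            self.fc = nn.Linear(nOut, nClasses)      # assumed classifier head
            self.criterion = nn.CrossEntropyLoss()   # assumed criterion

        def forward(self, x, label=None):
            # B*W*SxN -> B*W*SxnClasses
            # Post-commit: x is no longer L2-normalized here, so the
            # logits retain the magnitude of the input embeddings.
            x = self.fc(x)
            nloss = self.criterion(x, label)
            prec1 = accuracy(x.detach(), label.detach(), topk=(1,))[0]
            return nloss, prec1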
