delete an assertion in batchnormalization (torch#765)
Hi all,

This assertion is not really needed, and it prevents back-propagating to the input (computing the derivative of the loss w.r.t. the input image).
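
For context, a minimal sketch of what removing the assert enables (the network and input shapes here are hypothetical, chosen only for illustration; assumes the Torch 'nn' package): gradients with respect to the input can now be computed while the module is in evaluation mode, e.g. for saliency maps or adversarial examples.

require 'nn'

local net = nn.Sequential()
net:add(nn.SpatialConvolution(3, 16, 3, 3, 1, 1, 1, 1))
net:add(nn.SpatialBatchNormalization(16))
net:add(nn.ReLU())

net:evaluate()  -- sets self.train = false; the deleted assert would have errored here

local input = torch.randn(1, 3, 32, 32)
local output = net:forward(input)                  -- must run first (see the remaining assert)
local gradOutput = torch.ones(output:size())
local gradInput = net:backward(input, gradOutput)  -- derivative of the loss w.r.t. the input image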
Kaiyu Yang authored and soumith committed Apr 13, 2016
1 parent 50deb01 commit 130955e
Showing 1 changed file with 0 additions and 1 deletion.
BatchNormalization.lua
@@ -125,7 +125,6 @@ end
 local function backward(self, input, gradOutput, scale, gradInput, gradWeight, gradBias)
    self:checkInputDim(input)
    self:checkInputDim(gradOutput)
-   assert(self.train == true, 'should be in training mode when self.train is true')
    assert(self.save_mean and self.save_std, 'must call :updateOutput() first')
 
    input, gradOutput = makeContiguous(self, input, gradOutput)
