
Commit 02dda92

Farzad Abdolhosseini authored and apaszke committed
Fixed clamp not being applied to parameters
I'm just learning about PyTorch, but I think the line `param.grad.data.clamp(-1, 1)` has no effect. I guess you wanted to do it *in-place*, which is `clamp_`.
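For illustration, a minimal standalone sketch of the difference (not part of the notebook itself): the out-of-place clamp returns a clipped copy and leaves its input untouched, while the trailing-underscore variant clamp_ mutates the tensor in place.

import torch

g = torch.tensor([-3.0, 0.5, 2.0])

g.clamp(-1, 1)   # out-of-place: returns a clipped copy, which is discarded here
print(g)         # tensor([-3.0000,  0.5000,  2.0000]) -- g is unchanged

g.clamp_(-1, 1)  # in-place: the trailing underscore mutates g itself
print(g)         # tensor([-1.0000,  0.5000,  1.0000]) -- g is now clipped

The same convention holds across PyTorch tensor ops: add_, mul_, zero_, and so on are the in-place counterparts of their underscore-free versions.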
1 parent c44bb88 · commit 02dda92

File tree

1 file changed: 1 addition, 1 deletion


Reinforcement (Q-)Learning with PyTorch.ipynb

Lines changed: 1 addition & 1 deletion
@@ -367,7 +367,7 @@
 "    optimizer.zero_grad()\n",
 "    loss.backward()\n",
 "    for param in model.parameters():\n",
-"        param.grad.data.clamp(-1, 1)\n",
+"        param.grad.data.clamp_(-1, 1)\n",
 "    optimizer.step()\n",
 "\n",
 "\n",
