Commit 4ed4291

fix: clear up wording error in numpy warmup
In the backpropagation section, a comment said the code was computing the gradients of the parameters a, b, c, d with respect to the loss. This is backwards (the code computes the gradients of the loss with respect to the parameters) and causes confusion about the math of backprop.
1 parent 6e96f1a commit 4ed4291

File tree

1 file changed: 1 addition, 1 deletion


beginner_source/examples_tensor/polynomial_numpy.py

Lines changed: 1 addition & 1 deletion
@@ -37,7 +37,7 @@
     if t % 100 == 99:
         print(t, loss)
 
-    # Backprop to compute gradients of a, b, c, d with respect to loss
+    # Backprop to compute gradients of loss with respect to parameters a, b, c, d
     grad_y_pred = 2.0 * (y_pred - y)
     grad_a = grad_y_pred.sum()
     grad_b = (grad_y_pred * x).sum()
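For context, the hunk above sits inside a training loop that fits a cubic polynomial to sin(x) with hand-written gradients. The following is a minimal, self-contained sketch of that loop in the corrected direction (gradients of the loss with respect to a, b, c, d); the learning rate, iteration count, and initialization here are assumptions for illustration, not taken from this diff.

```python
import numpy as np

# Data: approximate sin(x) on [-pi, pi] with y = a + b*x + c*x^2 + d*x^3
x = np.linspace(-np.pi, np.pi, 2000)
y = np.sin(x)

# Random parameter initialization (assumed; the diff does not show it)
a, b, c, d = np.random.randn(4)

learning_rate = 1e-6  # assumed value for illustration
for t in range(2000):
    # Forward pass: predict y and measure squared-error loss
    y_pred = a + b * x + c * x ** 2 + d * x ** 3
    loss = np.square(y_pred - y).sum()
    if t % 100 == 99:
        print(t, loss)

    # Backprop to compute gradients of loss with respect to parameters a, b, c, d
    grad_y_pred = 2.0 * (y_pred - y)
    grad_a = grad_y_pred.sum()
    grad_b = (grad_y_pred * x).sum()
    grad_c = (grad_y_pred * x ** 2).sum()
    grad_d = (grad_y_pred * x ** 3).sum()

    # Gradient-descent update
    a -= learning_rate * grad_a
    b -= learning_rate * grad_b
    c -= learning_rate * grad_c
    d -= learning_rate * grad_d
```

The direction matters because each `grad_*` is dL/d(parameter), derived by the chain rule through `grad_y_pred` = dL/d(y_pred); "gradient of a with respect to loss" would describe the inverse quantity, which is not what the code computes.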

0 commit comments
