Then `z` is equivalent to 
* Call the `backward()` function to compute gradients automatically
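
For instance, here is a minimal sketch of this step. The definition of `z` below is only an assumed example for illustration; the `z` defined earlier in this document may be different:

```python
import torch

# Leaf tensors: created directly by the user with requires_grad=True,
# so is_leaf and requires_grad are both True for them.
x_1 = torch.tensor(2.0, requires_grad=True)
x_2 = torch.tensor(3.0, requires_grad=True)

# An assumed example expression (not necessarily the document's own z).
z = x_1 ** 2 + x_2

# backward() walks the recorded computation graph and stores the
# gradient of z in the .grad attribute of every leaf tensor that
# requires gradients.
z.backward()

print(x_1.grad)  # tensor(4.)  -> dz/dx_1 = 2 * x_1 = 2 * 2.0
print(x_2.grad)  # tensor(1.)  -> dz/dx_2 = 1
```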
Calling `z.backward()` actually just computes the derivative of `z` with respect to the inputs (tensors whose `is_leaf` and `requires_grad` are both `True`).
For example, if we want to know the derivative of `z` with respect to `x_1`, it is:
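
With the assumed definition of `z` from the sketch above (again, an assumption rather than the expression used earlier in this document), that derivative would be ∂z/∂x_1 = 2 · x_1, so after `z.backward()` the value 2 · 2.0 = 4.0 can be read from `x_1.grad`.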