A `torch.tensor` is conceptually identical to a NumPy array, but with GPU support.

```python
# torch.Size([1, 2, 3])
b1.view((1, 3, 2))  # same as reshape in numpy
# tensor([[[1, 2],
#          [3, 4],
#          [5, 6]]])
b1.squeeze()  # removes all the dimensions of size 1
# tensor([[1, 2, 3],
#         [4, 5, 6]])
b1.unsqueeze(0)  # inserts a new dimension of size one at the given position
# tensor([[[[1, 2, 3],
#           [4, 5, 6]]]])
```

* Other manipulation functions are similar to those in NumPy and are omitted here for brevity; a few common ones are sketched below. For more information, see the PyTorch documentation: https://pytorch.org/docs/stable/tensors.html
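
To make the NumPy analogy concrete, here is a minimal sketch of a few such functions; the tensor `t` is a fresh illustrative example, not the `b1` used above.

```python
import torch

t = torch.tensor([[1, 2, 3],
                  [4, 5, 6]])  # shape (2, 3)
t.reshape(3, 2)                # same elements, shape (3, 2)
t.permute(1, 0)                # swap dimensions, like np.transpose
torch.cat([t, t], dim=0)       # concatenate along rows, shape (4, 3)
t.flatten()                    # tensor([1, 2, 3, 4, 5, 6])
```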

- Some important attributes of `torch.tensor`

```python
b1.grad           # gradient of the tensor, populated by backward()
b1.grad_fn        # the gradient function that produced the tensor
b1.is_leaf        # whether the tensor is a leaf node of the graph
b1.requires_grad  # if set to True, autograd starts tracking all operations performed on it
```
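
A minimal sketch of how these attributes differ between a user-created leaf tensor and a tensor produced by an operation (the names `a` and `b` are illustrative):

```python
import torch

a = torch.ones(2, 3, requires_grad=True)  # leaf tensor created by the user
b = a * 2                                 # produced by an operation, so not a leaf

print(a.is_leaf, a.grad_fn)  # True None
print(b.is_leaf, b.grad_fn)  # False <MulBackward0 object at ...>
print(b.requires_grad)       # True, inherited from a
```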

### Autograd

Autograd records the operations performed on tensors that require gradients, building a computation graph that it later traverses to compute gradients automatically. For example:

```python
z = (0.5 * x1 + x2).sum()
# x2.grad           None
# x2.grad_fn        <SumBackward0>
# x2.is_leaf        False
# x2.requires_grad  True
```

* Call the `backward()` function to compute gradients automatically, as in the sketch below.
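
As a self-contained sketch (using a fresh tensor `x` rather than the `x1`/`x2` above), calling `backward()` on a scalar populates `.grad` on the leaf tensors:

```python
import torch

x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
z = (0.5 * x).sum()  # scalar result of a tracked computation

z.backward()         # autograd computes dz/dx through the recorded graph
print(x.grad)        # tensor([0.5000, 0.5000, 0.5000])
```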