Commit b54c3b3 ("bug fixes"), 1 parent: ac1596b

1 file changed: readme.md (28 additions, 28 deletions)

@@ -21,32 +21,32 @@ Some useful functions that you may use for managing your training data. We **mus

Note that the shapes of the arrays in the sequence must be the same, except along the dimension that corresponds to the given axis.

```python
# concatenate two arrays
a1 = np.array([[1, 2], [3, 4], [5, 6]])  # shape: (3, 2)
a2 = np.array([[3, 4], [5, 6], [7, 8]])  # shape: (3, 2)

# along axis = 0
a3 = np.concatenate((a1, a2), axis=0)  # shape: (6, 2)

# along axis = 1
a4 = np.concatenate((a1, a2), axis=1)  # shape: (3, 4)
```
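
To make the shape rule concrete, here is a small illustrative sketch we added (the array `b` is not from the original README): concatenation fails when any dimension other than the concatenation axis disagrees.

```python
import numpy as np

a1 = np.array([[1, 2], [3, 4], [5, 6]])  # shape: (3, 2)
b = np.array([[7, 8, 9]])                # shape: (1, 3)

# axis=0 requires every dimension except 0 to match; here dimension 1
# differs (2 vs 3), so NumPy raises a ValueError
try:
    np.concatenate((a1, b), axis=0)
except ValueError as err:
    print(err)  # exact wording varies by NumPy version
```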

* `np.transpose(arr, axes)`

Mostly we use it to align the dimensions of our data.

```python
# transpose a 2D array
a5 = np.array([[1, 2], [3, 4], [5, 6]])  # shape: (3, 2)
np.transpose(a5)  # shape: (2, 3)
```

We can also permute multiple axes of the array.

```python
a6 = np.array([[[1, 2], [3, 4], [5, 6]]])  # shape: (1, 3, 2)
np.transpose(a6, axes=(2, 1, 0))  # shape: (2, 3, 1)
```
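
To spell out what the permutation does, a quick check we added (not part of the original README): with `axes=(2, 1, 0)`, new axis 0 is old axis 2 and vice versa, so element `a6[i, j, k]` ends up at `t[k, j, i]`.

```python
import numpy as np

a6 = np.array([[[1, 2], [3, 4], [5, 6]]])  # shape: (1, 3, 2)
t = np.transpose(a6, axes=(2, 1, 0))       # shape: (2, 3, 1)

# a6[i, j, k] == t[k, j, i] for every index triple
assert t[1, 2, 0] == a6[0, 2, 1] == 6
print(t.shape)  # (2, 3, 1)
```
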
## PyTorch
@@ -67,14 +67,14 @@ A `torch.tensor` is conceptually identical to a numpy array, but with GPU suppor

```python
# torch.Size([1, 2, 3])
b1.view((1, 3, 2))  # same as reshape in numpy
# tensor([[[1, 2],
#          [3, 4],
#          [5, 6]]])
b1.squeeze()  # removes all the dimensions of size 1
# tensor([[1, 2, 3],
#         [4, 5, 6]])
b1.unsqueeze(0)  # inserts a new dimension of size one at the given position
# tensor([[[[1, 2, 3],
#           [4, 5, 6]]]])
```
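
One caveat worth adding to "same as reshape" (our note, not from the README): `view` never copies data, so it requires a contiguous tensor, while `reshape` silently copies when it must.

```python
import torch

b1 = torch.tensor([[[1, 2, 3], [4, 5, 6]]])  # shape: (1, 2, 3)
t = b1.transpose(1, 2)                       # shape: (1, 3, 2), non-contiguous

t.reshape(6)            # works: reshape copies if the layout requires it
t.contiguous().view(6)  # works once the memory is made contiguous
# t.view(6)             # would raise RuntimeError (non-contiguous tensor)
```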

* Other manipulation functions are similar to those of NumPy; we omit them here for simplicity. For more information, please check the PyTorch documentation: https://pytorch.org/docs/stable/tensors.html
@@ -84,11 +84,11 @@ A `torch.tensor` is conceptually identical to a numpy array, but with GPU suppor

- Some important attributes of `torch.tensor`

```python
b1.grad           # gradient of the tensor
b1.grad_fn        # the gradient function that created the tensor
b1.is_leaf        # check if the tensor is a leaf node of the graph
b1.requires_grad  # if set to True, starts tracking all operations performed
```
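
A minimal runnable sketch of these attributes in action (the variable names here are ours; the README's own `x1`/`x2` example follows in the Autograd section):

```python
import torch

x = torch.ones(2, 2, requires_grad=True)  # a leaf tensor
y = (3 * x).sum()                         # an intermediate result

print(x.is_leaf, x.grad_fn)  # True None
print(y.is_leaf, y.grad_fn)  # False <SumBackward0 object at ...>
y.backward()                 # populate gradients of the leaves
print(x.grad)                # tensor([[3., 3.], [3., 3.]])
```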

### Autograd

@@ -121,10 +121,10 @@ For example:

```python
z = (0.5 * x1 + x2).sum()
# x2.grad           None
# x2.grad_fn        <SumBackward0>
# x2.is_leaf        False
# x2.requires_grad  True
```

* Call the `backward()` function to compute gradients automatically, as sketched below
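
A runnable sketch of this step (we supply leaf stand-ins for `x1` and `x2`, whose real definitions are outside this excerpt; in the README `x2` is actually a non-leaf tensor):

```python
import torch

# stand-ins for x1 and x2; shapes chosen to match the gradients
# printed below (the README defines them earlier, outside this diff)
x1 = torch.ones(1, 2, 2, requires_grad=True)
x2 = torch.ones(1, 2, 2, requires_grad=True)

z = (0.5 * x1 + x2).sum()
z.backward()  # computes dz/dx for every leaf tensor and stores it in .grad
```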
@@ -135,17 +135,17 @@

* Check the gradients using `.grad`

```python
x1.grad
x2.grad
```

The output will look something like this:

```python
tensor([[[0.5000, 0.5000],   # x1.grad
         [0.5000, 0.5000]]])
tensor([[[1., 1.],           # x2.grad
         [1., 1.]]])
```

These values match the expression above: with `z = (0.5 * x1 + x2).sum()`, every element of `x1` contributes to `z` with weight 0.5 and every element of `x2` with weight 1.