
Commit b493d7d

Commit message: misc updates
Parent: 3bce464

3 files changed: +8 -2 lines

README.md (+3)
@@ -55,3 +55,6 @@ https://deeplearningcourses.com/c/deep-learning-recurrent-neural-networks-in-python
 
 Advanced Natural Language Processing: Deep Learning in Python
 https://deeplearningcourses.com/c/natural-language-processing-with-deep-learning-in-python
+
+Artificial Intelligence: Reinforcement Learning in Python
+https://deeplearningcourses.com/c/artificial-intelligence-reinforcement-learning-in-python

ann_class/forwardprop.py (+4 -1)
@@ -29,8 +29,11 @@
 W2 = np.random.randn(M, K)
 b2 = np.random.randn(K)
 
+def sigmoid(a):
+    return 1 / (1 + np.exp(-a))
+
 def forward(X, W1, b1, W2, b2):
-    Z = 1 / (1 + np.exp(-X.dot(W1) - b1))
+    Z = sigmoid(X.dot(W1) + b1)
     A = Z.dot(W2) + b2
     expA = np.exp(A)
     Y = expA / expA.sum(axis=1, keepdims=True)
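
The rewrite is behavior-preserving, since -(X.dot(W1) + b1) = -X.dot(W1) - b1; factoring the nonlinearity into sigmoid() just makes it reusable. For context, a minimal runnable sketch of the resulting forward pass; the dimensions, the sample input, and the return statement are assumptions for illustration (the hunk does not show them):

import numpy as np

# assumed sizes for illustration: D inputs, M hidden units, K classes
D, M, K = 2, 3, 4

W1 = np.random.randn(D, M)
b1 = np.random.randn(M)
W2 = np.random.randn(M, K)
b2 = np.random.randn(K)

def sigmoid(a):
    return 1 / (1 + np.exp(-a))

def forward(X, W1, b1, W2, b2):
    Z = sigmoid(X.dot(W1) + b1)                 # hidden-layer activations
    A = Z.dot(W2) + b2                          # output logits
    expA = np.exp(A)
    Y = expA / expA.sum(axis=1, keepdims=True)  # row-wise softmax
    return Y                                    # assumed return; not shown in the hunk

X = np.random.randn(5, D)                       # 5 random sample points (assumed)
print(forward(X, W1, b1, W2, b2).sum(axis=1))   # each row of Y sums to 1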

ann_class2/tensorflow2.py (+1 -1)
@@ -70,7 +70,7 @@ def main():
     # softmax_cross_entropy_with_logits takes in the "logits"
     # if you wanted to know the actual output of the neural net,
     # you could pass "Yish" into tf.nn.softmax(logits)
-    cost = tf.reduce_sum(tf.nn.softmax_cross_entropy_with_logits(Yish, T))
+    cost = tf.reduce_sum(tf.nn.softmax_cross_entropy_with_logits(logits=Yish, labels=T))
 
     # we choose the optimizer but don't implement the algorithm ourselves
     # let's go with RMSprop, since we just learned about it.
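
This fixes a breaking API change: TensorFlow 1.0 reordered the arguments of tf.nn.softmax_cross_entropy_with_logits and added a sentinel that rejects positional calls, so logits and labels must now be passed by keyword. A minimal TF 1.x-style sketch of the corrected call in context; apart from Yish, T, and cost, the names, shapes, and learning rate here are assumptions for illustration:

import tensorflow as tf

# assumed sizes for illustration: D input features, K classes
D, K = 10, 3

X = tf.placeholder(tf.float32, shape=(None, D))
T = tf.placeholder(tf.float32, shape=(None, K))    # one-hot targets
W = tf.Variable(tf.random_normal([D, K]))
b = tf.Variable(tf.zeros([K]))

Yish = tf.matmul(X, W) + b                         # logits: unnormalized log-probabilities

# logits and labels must be named; passing them positionally raises an error
cost = tf.reduce_sum(tf.nn.softmax_cross_entropy_with_logits(logits=Yish, labels=T))

# RMSprop, matching the optimizer the file's comments mention (learning rate assumed)
train_op = tf.train.RMSPropOptimizer(0.0001).minimize(cost)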
