Commit f706bea

added note on the similarity to logistic regression cost function
1 parent 55fbe06 commit f706bea

File tree

1 file changed: +2 -0 lines changed


softmax_regression.ipynb

@@ -52,6 +52,8 @@
 "$J(\\boldsymbol{W},b) = - \\frac{1}{m} \\sum_{i=1}^m \\sum_{k=1}^{K} \\Big[ y_k^{(i)} \\log(\\hat{p}_k^{(i)})\\Big]$\n",
 "\n",
 "In this formula, the target labels are *one-hot encoded*. So $y_k^{(i)}$ is $1$ if the target class for $\\boldsymbol{x}^{(i)}$ is $k$; otherwise $y_k^{(i)}$ is $0$.\n",
+"\n",
+"Note: when there are only two classes, this cost function is equivalent to the cost function of [logistic regression](logistic_regression.ipynb).\n",
 "* * *\n",
 "\n",
 "**Step 4:** Compute the gradient of the cost function with respect to each weight vector and bias. A detailed explanation of this derivation can be found [here](http://ufldl.stanford.edu/tutorial/supervised/SoftmaxRegression/).\n",
