
Commit 9bc0cce

The regularization assignment is done and submitted.
1 parent da47af7 commit 9bc0cce

File tree

1 file changed (+18 −6 lines)

ex2/mlclass-ex2/costFunctionReg.m

Lines changed: 18 additions & 6 deletions
@@ -2,12 +2,12 @@
 %COSTFUNCTIONREG Compute cost and gradient for logistic regression with regularization
 %   J = COSTFUNCTIONREG(theta, X, y, lambda) computes the cost of using
 %   theta as the parameter for regularized logistic regression and the
-%   gradient of the cost w.r.t. to the parameters.
+%   gradient of the cost w.r.t. the parameters.
 
 % Initialize some useful values
 m = length(y); % number of training examples
 
-% You need to return the following variables correctly
+% You need to return the following variables correctly
 J = 0;
 grad = zeros(size(theta));
 
@@ -17,10 +17,22 @@
 %               Compute the partial derivatives and set grad to the partial
 %               derivatives of the cost w.r.t. each parameter in theta
 
-
-
-
-
+% calculate the cost function
+h = sigmoid(X*theta);
+% calculate the penalty
+% exclude the first theta value (the bias term is not regularized)
+theta1 = theta(2:size(theta), :);
+p = lambda*(theta1'*theta1)/(2*m);
+J = ((-y)'*log(h) - (1-y)'*log(1-h))/m + p;
+
+% calculate the gradients
+for j = 1:size(theta)
+  if (j == 1)
+    grad(j) = ((h - y)'*X(:, j))/m;
+  else
+    grad(j) = ((h - y)'*X(:, j) + lambda*theta(j))/m;
+  end
+end
 
 % =============================================================
 
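The per-component loop above can also be written in vectorized form. The following is a sketch of an equivalent computation (not part of this commit): it regularizes every component at once, then recomputes the unregularized gradient for the bias term.

% Vectorized gradient: grad(j) = sum((h - y) .* X(:, j))/m, plus
% lambda/m * theta(j) for every j except the bias term
grad = ((h - y)' * X)' / m + (lambda / m) * theta;
grad(1) = ((h - y)' * X(:, 1)) / m;   % bias term is not regularized

Both versions compute the same values; the vectorized form avoids the explicit loop over theta.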

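For reference, a minimal smoke test of the finished function (hypothetical values, not part of the commit; assumes costFunctionReg.m and the course's sigmoid.m are on the Octave/MATLAB path):

% Tiny dataset: 3 examples, 2 features plus a bias column
X = [ones(3, 1), [1 2; 3 4; 5 6]];
y = [1; 0; 1];
theta = zeros(3, 1);
lambda = 1;
[J, grad] = costFunctionReg(theta, X, y, lambda);
fprintf('Cost at zero theta: %f\n', J);   % sigmoid(0) = 0.5, so J = log(2) ~ 0.6931
fprintf('grad = %f %f %f\n', grad);

With theta all zeros the penalty p is zero, so the expected cost is -log(0.5) = log(2) regardless of lambda.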