1 file changed: +17 −6 lines changed
% COSTFUNCTIONREG Compute cost and gradient for logistic regression with regularization
%   J = COSTFUNCTIONREG(theta, X, y, lambda) computes the cost of using
%   theta as the parameter for regularized logistic regression and the
%   gradient of the cost w.r.t. the parameters.

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;
grad = zeros(size(theta));

% Compute the partial derivatives and set grad to the partial
% derivatives of the cost w.r.t. each parameter in theta

% Calculate the hypothesis
h = sigmoid(X * theta);

% Calculate the regularization penalty,
% excluding the first theta value (the bias term)
theta1 = theta(2:end);
p = lambda * (theta1' * theta1) / (2 * m);

% Regularized cost function
J = ((-y)' * log(h) - (1 - y)' * log(1 - h)) / m + p;

% Calculate the gradients; the bias term is not regularized
for j = 1:length(theta)
    if (j == 1)
        grad(j) = ((h - y)' * X(:, j)) / m;
    else
        grad(j) = ((h - y)' * X(:, j) + lambda * theta(j)) / m;
    end
end

% =============================================================
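For readers more comfortable with NumPy, the same computation can be sketched as follows. This is a translation, not part of the original file: the `sigmoid` helper and the fully vectorized gradient (rather than the per-element loop above) are assumptions made for brevity.

```python
import numpy as np

def sigmoid(z):
    # Logistic function, elementwise
    return 1.0 / (1.0 + np.exp(-z))

def cost_function_reg(theta, X, y, lam):
    """Regularized logistic-regression cost and gradient, vectorized."""
    m = len(y)
    h = sigmoid(X @ theta)

    # Penalty excludes the first theta value (the bias term)
    theta1 = theta[1:]
    p = lam * (theta1 @ theta1) / (2 * m)

    # Regularized cross-entropy cost
    J = (-y @ np.log(h) - (1 - y) @ np.log(1 - h)) / m + p

    # Gradient: regularize every component except the bias
    grad = (X.T @ (h - y)) / m
    grad[1:] += (lam / m) * theta[1:]
    return J, grad
```

With `theta = 0` the hypothesis is 0.5 for every example, so the cost reduces to `log(2) ≈ 0.693`, a handy sanity check when debugging either version.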