I have this code for the cost in logistic regression, in MATLAB:
function [J, grad] = costFunction(theta, X, y)
m = length(y); % number of training examples
J = 0;
grad = zeros(size(theta));
% cost: accumulate the per-example log-loss terms
% (use natural log, not log10 -- log10 was my earlier mistake)
sums = [];
for i = 1:m
    sums = [sums; -y(i)*log(sigmoid(theta'*X(i,:)')) - (1-y(i))*log(1-sigmoid(theta'*X(i,:)'))];
end
J = (1/m) * sum(sums);
% gradient, one step
temp_thetas = zeros(size(theta));
tempo = zeros(m, 1);
for i = 1:numel(theta)
    for j = 1:m
        tempo(j) = (sigmoid(theta'*X(j,:)') - y(j)) * X(j,i);
    end
    temp_thetas(i) = sum(tempo);
end
grad = (1/m) .* temp_thetas;
% =============================================================
end
And I need to vectorize it, but I do not know how to do it, or why. I'm a programmer, so I like for loops. But when it comes to vectorizing, I'm blank. Any help? Thanks.
Vectorizing Logistic Regression's Gradient Output
To derive a very efficient implementation of logistic regression, look at the gradient computation one example at a time: the first step is to compute \( dz^{(1)} = a^{(1)} - y^{(1)} \) for the first example, then \( dz^{(2)} = a^{(2)} - y^{(2)} \), and so on, where \( a^{(i)} = \mathrm{sigmoid}(\theta^T x^{(i)}) \) is the prediction for example \( i \).
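Stacking these per-example terms into a single column collapses the loop into one matrix product. A sketch of the standard derivation, with \( X \) laid out one example per row as in the code above:

\[
dZ = A - Y = \begin{bmatrix} a^{(1)} - y^{(1)} \\ \vdots \\ a^{(m)} - y^{(m)} \end{bmatrix},
\qquad
\nabla_\theta J = \frac{1}{m}\, X^{T}\, dZ
\]

The \( j \)-th entry of \( X^{T} dZ \) is \( \sum_{i=1}^{m} (a^{(i)} - y^{(i)})\, x^{(i)}_j \), which is exactly what the inner loop over `j = 1:m` accumulates for each `theta(i)`.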
The cost function used in logistic regression is log loss: \( J(\theta) = -\frac{1}{m}\sum_{i=1}^{m}\left[ y^{(i)}\log h_\theta(x^{(i)}) + (1-y^{(i)})\log(1-h_\theta(x^{(i)})) \right] \).
Note that in machine learning, "vectorization" can also mean a feature-extraction step that converts text into numerical vectors for a model to train on. Here, though, it means something different: replacing explicit loops with matrix operations, as in:
function [J, grad] = costFunction(theta, X, y)
hx = sigmoid(X * theta);   % m x 1 vector of predictions h_theta(x^(i))
m = size(X, 1);            % number of training examples (length(X) is unsafe: it returns the larger dimension)
J = (-y' * log(hx) - (1 - y') * log(1 - hx)) / m;
grad = X' * (hx - y) / m;  % j-th entry: (1/m) * sum_i (h_theta(x^(i)) - y^(i)) * x^(i)_j
end
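To convince yourself the loops and the matrix expressions compute the same thing, a quick sanity check on a tiny made-up data set (a sketch; `sigmoid` is defined inline here so the snippet is self-contained):

```matlab
% Tiny synthetic problem: 4 examples, intercept term plus 2 features.
sigmoid = @(z) 1 ./ (1 + exp(-z));
X = [1  0.5  1.2;
     1 -0.3  0.8;
     1  2.0 -1.1;
     1  0.7  0.3];
y = [1; 0; 1; 0];
theta = [0.1; -0.2; 0.3];
m = size(X, 1);

% Vectorized quantities the loops build one element at a time:
hx   = sigmoid(X * theta);      % m x 1 column, one prediction per example
J    = (-y' * log(hx) - (1 - y') * log(1 - hx)) / m;
grad = X' * (hx - y) / m;       % same shape as theta
```

The inner product `-y' * log(hx)` plays the role of `sum(sums)` from the looped version, and `X' * (hx - y)` collapses the double loop over `i` and `j` into one matrix-vector product; both `J` and `grad` should match the loop implementation to machine precision.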