Gradient in continuous regression using a neural network

I'm trying to implement a regression NN that has 3 layers (1 input, 1 hidden and 1 output layer with a continuous result). As a basis I took the classification NN from the coursera.org class, but changed the cost function and gradient calculation to fit a regression problem (rather than a classification one).

My nnCostFunction now is:

function [J grad] = nnCostFunctionLinear(nn_params, ...
                                   input_layer_size, ...
                                   hidden_layer_size, ...
                                   num_labels, ...
                                   X, y, lambda)

% reshape the unrolled parameter vector back into the two weight matrices
Theta1 = reshape(nn_params(1:hidden_layer_size * (input_layer_size + 1)), ...
                 hidden_layer_size, (input_layer_size + 1));

Theta2 = reshape(nn_params((1 + (hidden_layer_size * (input_layer_size + 1))):end), ...
                 num_labels, (hidden_layer_size + 1));

m = size(X, 1);   % number of training examples

% forward propagation
a1 = X;
a1 = [ones(m, 1) a1];   % add bias column
a2 = a1 * Theta1';
a2 = [ones(m, 1) a2];
a3 = a2 * Theta2';
Y = y;

J = 1/(2*m)*sum(sum((a3 - Y).^2))

th1 = Theta1;
th1(:,1) = 0;   % exclude bias weights from the regularization term
th2 = Theta2;
th2(:,1) = 0;

t1 = th1.^2;
t2 = th2.^2;
th = sum(sum(t1)) + sum(sum(t2));
th = lambda * th / (2*m);
J = J + th;     % add the regularization term

del_3 = a3 - Y;                           % output-layer error
t1 = del_3'*a2;
Theta2_grad = 2*(t1)/m + lambda*th2/m;    % regularized gradient w.r.t. Theta2

t1 = del_3 * Theta2;
del_2 = t1 .* a2;                         % back-propagated hidden-layer term
del_2 = del_2(:,2:end);                   % drop the bias column
t1 = del_2'*a1;
Theta1_grad = 2*(t1)/m + lambda*th1/m;    % regularized gradient w.r.t. Theta1

grad = [Theta1_grad(:) ; Theta2_grad(:)];
end

Then I use this function with the fmincg algorithm, but fmincg terminates after the first few iterations. I think my gradient is wrong, but I can't find the error.
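One way to localize such an error is a numerical gradient check. Below is a minimal central-difference sketch (the helper name numericalGradient and the step size are my own choices, not part of the course code):

% Central-difference check of an analytic gradient (sketch).
% costFn takes an unrolled parameter vector and returns only the cost J.
function numgrad = numericalGradient(costFn, nn_params)
  numgrad = zeros(size(nn_params));
  perturb = zeros(size(nn_params));
  e = 1e-4;                               % finite-difference step size
  for p = 1:numel(nn_params)
    perturb(p) = e;
    loss1 = costFn(nn_params - perturb);  % J at theta - e*unit_p
    loss2 = costFn(nn_params + perturb);  % J at theta + e*unit_p
    numgrad(p) = (loss2 - loss1) / (2*e); % central difference
    perturb(p) = 0;
  end
end

Comparing numgrad against the grad returned by nnCostFunctionLinear on a small random network should give a relative difference of roughly 1e-9 or less if the analytic gradient is correct.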

Can anybody help?

asked Nov 06 '12 by mishka
1 Answer

If I understand correctly, your first block of code (shown below) -

m = size(X, 1);

a1 = X;
a1 = [ones(m, 1) a1];
a2 = a1 * Theta1';
a2 = [ones(m, 1) a2];
a3 = a2 * Theta2';
Y = y;

is meant to compute the output a(3) at the output layer.

Ng's slides on NNs use the configuration below to calculate a(3), which is different from what your code does:

  • in the middle (hidden) and output layers, you are not applying the activation function g, e.g., a sigmoid function.

Forward propagation from the slide (with sigmoid activation g):

a(1) = x
z(2) = Theta(1) * a(1),   a(2) = g(z(2))
z(3) = Theta(2) * a(2),   a(3) = g(z(3)) = h_Theta(x)
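A minimal sketch of that forward pass in your notation, assuming a sigmoid g (for a regression output you would typically leave a3 linear, i.e., skip g in the last step):

g = @(z) 1 ./ (1 + exp(-z));   % sigmoid activation

a1 = [ones(m, 1) X];           % input plus bias column
z2 = a1 * Theta1';
a2 = [ones(m, 1) g(z2)];       % apply g in the hidden layer, then add bias
z3 = a2 * Theta2';
a3 = g(z3);                    % as on the slide; for regression: a3 = z3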

For the cost function J without regularization terms, Ng's slides give the following formula:

J(Theta) = -(1/m) * sum_{i=1..m} sum_{k=1..K} [ y_k(i) * log((h_Theta(x(i)))_k)
                                              + (1 - y_k(i)) * log(1 - (h_Theta(x(i)))_k) ]

I don't understand why you compute it as:

J = 1/(2*m)*sum(sum((a3 - Y).^2))

because you are not including the log function at all.
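For reference, that cost in vectorized Octave would look roughly like this (a sketch, assuming Y is an m x num_labels 0/1 matrix and a3 = h_Theta(X) comes from a sigmoid output):

% unregularized cross-entropy cost from the slide (sketch)
J = (1/m) * sum(sum(-Y .* log(a3) - (1 - Y) .* log(1 - a3)));

Note that the log terms assume a3 is a sigmoid output in (0, 1); with your linear a3 they would not apply directly.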

answered Oct 25 '22 by greeness