 

How to determine the weights in a neural network? [closed]

If I have a three-layer neural network and 3 input samples with their corresponding expected output values, how can I determine the values of the weights on all the edges?

asked Jan 16 '23 by London guy


2 Answers

Estimating the weights of an artificial neural network (ANN) is a parametric optimization problem. In general you need a nonlinear optimizer to get the job done, and the cost function being minimized typically penalizes the mismatch between the network's output and the desired output.

Backpropagation is an elegant way of applying gradient-based optimization, because it lets you estimate the error at the outputs of the hidden-layer neurons, which in turn makes it possible to update the hidden-layer weights using error gradients.
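A minimal sketch of how this looks in practice, assuming a tiny input-hidden-output network with a tanh hidden layer and a squared-error cost; the layer sizes, learning rate, and toy data are all illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(3, 2))   # 3 samples (as in the question), 2 features each
Y = rng.normal(size=(3, 1))   # corresponding expected outputs

n_in, n_hid, n_out = 2, 4, 1
W1 = rng.normal(scale=0.5, size=(n_in, n_hid)); b1 = np.zeros(n_hid)
W2 = rng.normal(scale=0.5, size=(n_hid, n_out)); b2 = np.zeros(n_out)

lr = 0.05
for epoch in range(2000):
    # forward pass
    H = np.tanh(X @ W1 + b1)        # hidden activations
    Yhat = H @ W2 + b2              # linear output layer
    err = Yhat - Y                  # d(0.5*||Yhat - Y||^2)/dYhat

    # backward pass: propagate the output error back to the hidden layer
    dW2 = H.T @ err
    db2 = err.sum(axis=0)
    dH = (err @ W2.T) * (1 - H**2)  # tanh'(z) = 1 - tanh(z)^2
    dW1 = X.T @ dH
    db1 = dH.sum(axis=0)

    # gradient-descent update of every weight and bias
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print("squared error after training:", float((err**2).sum()))
```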

To deal with the problem of local minima in gradient-based methods, it is common practice to use multi-start methods, which essentially amount to repeating the estimation procedure from a number of different initial guesses.
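As a hedged illustration of multi-start: the same kind of tiny-network loss minimized from ten random initial guesses with SciPy's BFGS, keeping the best local minimum found. All sizes and data are assumed for the example.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
X = rng.normal(size=(3, 2)); Y = rng.normal(size=(3, 1))

def loss(w):
    # unpack a flat 17-element parameter vector into a 2-4-1 network
    W1 = w[:8].reshape(2, 4); b1 = w[8:12]
    W2 = w[12:16].reshape(4, 1); b2 = w[16:]
    H = np.tanh(X @ W1 + b1)
    return float(((H @ W2 + b2 - Y) ** 2).sum())

best = None
for restart in range(10):
    w0 = rng.normal(scale=0.5, size=17)      # fresh initial guess per restart
    res = minimize(loss, w0, method="BFGS")  # one gradient-based run
    if best is None or res.fun < best.fun:
        best = res                           # keep the best local minimum
print("best loss over 10 restarts:", best.fun)
```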

Mind you, evolutionary methods such as genetic algorithms also suffer from premature convergence when the population loses diversity.

Also watch out for overfitting the network to the training data: you won't get good generalization on unseen data, which, after all, is the point of function approximation for predictive learning.

All this aside, what is disconcerting is that the number of training samples is far too low to carry much information about the function you are trying to approximate. Loosely speaking, if the ANN has a large number of free parameters, the training data must provide enough information to allow a meaningful estimate of those parameters; 3 samples is just too few for any practical function-approximation task.
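To make the parameter-counting point concrete: even a small fully connected network has far more free parameters than 3 samples can pin down. The layer sizes below are an illustrative assumption.

```python
# Free parameters of a fully connected input->hidden->output network:
# one weight per edge plus one bias per non-input neuron.
n_in, n_hid, n_out = 10, 20, 1
n_params = n_in * n_hid + n_hid + n_hid * n_out + n_out
print(n_params)  # 241 parameters vs. only 3 training samples
```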

answered Feb 05 '23 by awhan


Backpropagation is traditionally used for this. Personally, I have had much better and faster results with the Levenberg-Marquardt algorithm.
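For reference, one way to try Levenberg-Marquardt without implementing it yourself is SciPy's `least_squares` with `method='lm'`; this is a sketch under assumed toy data and network size, and the MINPACK backend requires at least as many residuals (samples) as parameters, so the fit below uses 20 samples for a 7-parameter network.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
x = np.linspace(-2, 2, 20)
y = np.sin(x)                          # toy target function

def residuals(w):
    W1, b1 = w[0:2], w[2:4]            # input -> hidden (2 units)
    W2, b2 = w[4:6], w[6]              # hidden -> output
    H = np.tanh(np.outer(x, W1) + b1)  # hidden activations, shape (20, 2)
    return H @ W2 + b2 - y             # one residual per sample

w0 = rng.normal(scale=0.5, size=7)
fit = least_squares(residuals, w0, method="lm")  # Levenberg-Marquardt
print("sum of squared residuals:", 2 * fit.cost)  # cost = 0.5 * ||r||^2
```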

You might also want to try a population-based method such as a genetic algorithm or particle swarm optimization (easy to implement, as the sketch below shows!). These are less prone to getting stuck in local optima because they are not gradient-based.
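Here is a bare-bones particle swarm optimizer over a generic `loss(w)`, just to show how little code it takes; the inertia and acceleration coefficients are common textbook defaults, and the quadratic loss at the end merely stands in for a network's training error.

```python
import numpy as np

def pso(loss, dim, n_particles=30, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    pos = rng.normal(size=(n_particles, dim))  # particle positions (weight vectors)
    vel = np.zeros_like(pos)
    pbest = pos.copy()                         # each particle's best-so-far position
    pbest_val = np.array([loss(p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()   # swarm-wide best position

    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        # inertia + pull toward personal best + pull toward global best
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos += vel
        vals = np.array([loss(p) for p in pos])
        improved = vals < pbest_val            # update personal bests
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

# Example: minimize a simple quadratic stand-in for a network loss.
w, v = pso(lambda w: float(((w - 1.0) ** 2).sum()), dim=5)
print(v)
```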

answered Feb 05 '23 by Def_Os