
How to set initial weights in MLPClassifier?

I cannot find a way to set the initial weights of the neural network. Could someone tell me how, please? I am using the Python package sklearn.neural_network.MLPClassifier.

Here is the code for reference:

from sklearn.neural_network import MLPClassifier
classifier = MLPClassifier(solver="sgd")
classifier.fit(X_train, y_train)
asked Jun 10 '17 by Mohamed ElSheikh

People also ask

What is Max_iter in MLPClassifier?

max_iter: the maximum number of epochs (passes over the training data). activation: the activation function used in the hidden layers. solver: the algorithm used to optimize the weights.
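For illustration, a minimal sketch of how these parameters are passed (the particular values here are arbitrary choices, not recommendations):

from sklearn.neural_network import MLPClassifier

# max_iter caps the number of epochs, activation is applied in the hidden
# layers, and solver picks the weight-optimization algorithm.
clf = MLPClassifier(max_iter=500, activation="relu", solver="adam")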

What is random state in MLP classifier?

Your classification scores will depend on random_state. As @Ujjwal rightly said, it is used for splitting the data into training and test sets. Not just that: many algorithms in scikit-learn use random_state to select subsets of features or samples and to determine the initial weights.
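As a quick illustration (a toy example of my own, using make_classification data): two classifiers built with the same random_state start from identical initial weights and, trained on the same data, end up with identical weights.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=100, random_state=0)

# Same random_state -> same initial weights, so after identical training
# the learned coefs_ match exactly.
a = MLPClassifier(random_state=7, max_iter=50).fit(X, y)
b = MLPClassifier(random_state=7, max_iter=50).fit(X, y)
print(all(np.allclose(w1, w2) for w1, w2 in zip(a.coefs_, b.coefs_)))  # True

# A different random_state starts from different weights (and may score differently).
c = MLPClassifier(random_state=8, max_iter=50).fit(X, y)
print(all(np.allclose(w1, w2) for w1, w2 in zip(a.coefs_, c.coefs_)))  # typically False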

Is MLP classifier good?

MLPs are suitable for classification prediction problems where inputs are assigned a class or label. They are also suitable for regression prediction problems where a real-valued quantity is predicted given a set of inputs.


1 Answer

The docs show you the attributes in use.

Attributes:
...

coefs_ : list, length n_layers - 1
    The ith element in the list represents the weight matrix corresponding to layer i.

intercepts_ : list, length n_layers - 1
    The ith element in the list represents the bias vector corresponding to layer i + 1.

Just build your classifier clf=MLPClassifier(solver="sgd") and set coefs_ and intercepts_ before calling clf.fit().

The only remaining question is: does sklearn overwrite your inits?

The code looks like:

    if not hasattr(self, 'coefs_') or (not self.warm_start and not
                                       incremental):
        # First time training the model
        self._initialize(y, layer_units)

Read the condition carefully, though: with the default warm_start=False and a plain fit() call (incremental=False), the second part of the or is true, so _initialize() runs and overwrites your coefs_ anyway. With warm_start=True, an estimator that already has coefs_ skips that re-initialization and keeps your values (the same applies to the biases in intercepts_).
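Putting that together, here is a minimal sketch of one way to wire it up. Everything beyond the warm_start idea is my own illustration, not from the question: the make_classification data, the layer sizes, and in particular the throwaway one-epoch fit, which I use only because fit() also expects other internal attributes (n_layers_, t_, loss_curve_, ...) that are created by _initialize().

import numpy as np
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

X_train, y_train = make_classification(n_samples=200, n_features=20, random_state=0)

# warm_start=True makes fit() keep the coefs_/intercepts_ already on the
# estimator instead of re-initializing them.
clf = MLPClassifier(solver="sgd", hidden_layer_sizes=(10,), warm_start=True, max_iter=1)

# A short throwaway fit creates the internal bookkeeping attributes and
# gives coefs_/intercepts_ their correct shapes.
clf.fit(X_train, y_train)

# Overwrite the weights and biases with your own initial values,
# keeping the shapes sklearn created.
rng = np.random.default_rng(42)
clf.coefs_ = [rng.normal(scale=0.1, size=w.shape) for w in clf.coefs_]
clf.intercepts_ = [np.zeros_like(b) for b in clf.intercepts_]

# Train for real; with warm_start=True these values are the starting point.
clf.set_params(max_iter=300)
clf.fit(X_train, y_train)

The one-epoch fit is just a convenient way to get correctly shaped arrays and the attributes fit() expects; you could also build the weight arrays yourself, as long as the shapes match (n_features x hidden units, hidden units x n_outputs).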

The packing and unpacking functions further indicate that this should be possible; internally they flatten the weights and biases into a single parameter vector and back.

answered Nov 13 '22 by sascha