I cannot find a way to set the initial weights of the neural network. Could someone tell me how, please? I am using the Python package sklearn.neural_network.MLPClassifier.
Here is the code for reference:
from sklearn.neural_network import MLPClassifier
classifier = MLPClassifier(solver="sgd")
classifier.fit(X_train, y_train)
max_iter: the maximum number of training iterations (for the sgd and adam solvers, this corresponds to epochs).
activation: the activation function for the hidden layers.
solver: the algorithm used for weight optimization across the nodes.
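For instance, a minimal sketch combining these parameters on toy data (the specific values and dataset are just for illustration):

from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=100, random_state=0)

clf = MLPClassifier(solver="sgd",      # weight optimization algorithm
                    activation="relu", # hidden-layer activation
                    max_iter=300)      # cap on training epochs for sgd
clf.fit(X, y)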
Your classification scores will depend on random_state. As @Ujjwal rightly said, it is used for splitting the data into training and test sets. Beyond that, many algorithms in scikit-learn use random_state to select subsets of features, subsets of samples, and to determine the initial weights. For example:
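Here is a minimal sketch showing that fixing random_state makes the random weight initialization, and therefore the results, reproducible (the toy dataset and solver settings are just for illustration):

from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=200, random_state=0)

# Same random_state -> same initial weights -> identical results
a = MLPClassifier(solver="sgd", random_state=42, max_iter=300).fit(X, y)
b = MLPClassifier(solver="sgd", random_state=42, max_iter=300).fit(X, y)
print(a.score(X, y) == b.score(X, y))  # True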
MLPs are suitable for classification prediction problems where inputs are assigned a class or label. They are also suitable for regression prediction problems where a real-valued quantity is predicted given a set of inputs.
The docs show you the attributes in use.
Attributes:
...
coefs_ : list, length n_layers - 1
    The ith element in the list represents the weight matrix corresponding to layer i.
intercepts_ : list, length n_layers - 1
    The ith element in the list represents the bias vector corresponding to layer i + 1.
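To see the shapes involved, a quick sketch (the data and hidden layer size are just for illustration; note the single output unit for binary classification):

from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=100, n_features=10, random_state=0)

clf = MLPClassifier(solver="sgd", hidden_layer_sizes=(5,), max_iter=300, random_state=0)
clf.fit(X, y)

# One weight matrix and one bias vector per layer transition
print([w.shape for w in clf.coefs_])       # [(10, 5), (5, 1)]
print([b.shape for b in clf.intercepts_])  # [(5,), (1,)]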
Just build your classifier, clf = MLPClassifier(solver="sgd"), and set coefs_ and intercepts_ before calling clf.fit().
The only remaining question is: does sklearn overwrite your inits?
The code looks like:
if not hasattr(self, 'coefs_') or (not self.warm_start and not incremental):
    # First time training the model
    self._initialize(y, layer_units)
Read the condition carefully, though: with the default warm_start=False, the second clause is true, so _initialize runs and any coefs_ you assigned are overwritten. To keep your values, construct the classifier with warm_start=True; then, as long as coefs_ already exists, initialization is skipped (the same applies to the biases in intercepts_).
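Putting this together, here is a minimal sketch of one way to do it, under the assumptions above. One caveat: assigning coefs_ to a never-fitted estimator may fail, because _initialize also creates other internal bookkeeping (n_iter_, t_, etc.), so this sketch runs one throwaway epoch first and then overwrites the weights:

import numpy as np
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

X_train, y_train = make_classification(n_samples=200, n_features=10, random_state=0)

# warm_start=True makes later fit() calls reuse the existing coefs_/intercepts_
clf = MLPClassifier(solver="sgd", hidden_layer_sizes=(5,),
                    warm_start=True, max_iter=1, random_state=0)

# One throwaway epoch so sklearn creates coefs_, intercepts_ and its internal
# state (expect a harmless ConvergenceWarning here)
clf.fit(X_train, y_train)

# Overwrite the weights with your own initial values (shapes must match)
rng = np.random.default_rng(42)
clf.coefs_ = [rng.normal(scale=0.1, size=w.shape) for w in clf.coefs_]
clf.intercepts_ = [np.zeros_like(b) for b in clf.intercepts_]

# Continue training from those weights; warm_start=True skips re-initialization
clf.set_params(max_iter=500)
clf.fit(X_train, y_train)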
The packing and unpacking functions further indicate that this should be possible; they are probably used internally for serialization via pickle.