Reset weights in Keras layer

I'd like to reset (randomize) the weights of all layers in my Keras (deep learning) model. The reason is that I want to be able to train the model several times with different data splits without having to do the (slow) model recompilation every time.

Inspired by this discussion, I'm trying the following code:

# Reset weights
for layer in KModel.layers:
    if hasattr(layer, 'init'):
        input_dim = layer.input_shape[1]
        new_weights = layer.init((input_dim, layer.output_dim), name='{}_W'.format(layer.name))
        layer.trainable_weights[0].set_value(new_weights.get_value())

However, it only partly works.

Partly, because I've inspected some layer.get_weights() values, and they do seem to change. But when I restart the training, the cost values are much lower than the initial cost values on the first run. It's almost as if I've succeeded in resetting some of the weights, but not all of them.
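(For reference, on current tf.keras the full reset I'm after would look roughly like the sketch below; this assumes TensorFlow 2.x and simply re-draws every kernel and bias from the initializer recorded on the layer. The helper name is illustrative, not part of my model.)

import tensorflow as tf

def reinitialize_weights(model):
    """Re-draw every kernel and bias from the layer's own initializer (TF 2.x sketch)."""
    for layer in model.layers:
        if isinstance(layer, tf.keras.Model):      # recurse into nested models
            reinitialize_weights(layer)
            continue
        for var_name, init_name in (('kernel', 'kernel_initializer'),
                                    ('bias', 'bias_initializer')):
            if hasattr(layer, var_name) and hasattr(layer, init_name):
                var = getattr(layer, var_name)
                initializer = getattr(layer, init_name)
                if var is not None and initializer is not None:
                    var.assign(initializer(var.shape, var.dtype))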

asked Nov 08 '16 by Tor
People also ask

How do I reset my keras weights?

Try set_weights: assign new values to the weights of each convolutional layer (note that the first layer in that example is the input layer, which has no weights you want to change, which is why the range starts from 1 rather than 0).
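A minimal sketch of that idea, assuming (as in the quote) that the first entry in model.layers is a weight-less input layer, and using an arbitrary normal distribution for the new values:

import numpy as np

# Re-randomize every weight array in place via get_weights/set_weights.
# The distribution (normal, stddev 0.05) is an arbitrary choice for illustration.
for layer in model.layers[1:]:                     # index 0 is the input layer here
    new_weights = [np.random.normal(0.0, 0.05, size=w.shape).astype(w.dtype)
                   for w in layer.get_weights()]
    layer.set_weights(new_weights)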

Does model compile reset weights?

What does compile do? Compile defines the loss function, the optimizer and the metrics. That's all. It has nothing to do with the weights, and you can compile a model as many times as you want without affecting pretrained weights.
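A quick way to convince yourself of this (a sketch, assuming an already built tf.keras model named model):

import numpy as np

before = model.get_weights()
model.compile(optimizer='sgd', loss='mse')         # recompile with different settings
after = model.get_weights()

# The weight values are untouched by compile().
assert all(np.array_equal(b, a) for b, a in zip(before, after))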

What is the default weight initialization in keras?

The default is the Glorot uniform initializer (glorot_uniform). It draws samples from a uniform distribution within [-limit, limit], where limit is sqrt(6 / (fan_in + fan_out)), fan_in being the number of input units in the weight tensor and fan_out the number of output units.
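For example, the bound for a kernel of a given shape can be checked directly (a sketch, assuming TensorFlow 2.x):

import numpy as np
import tensorflow as tf

fan_in, fan_out = 128, 64
limit = np.sqrt(6.0 / (fan_in + fan_out))          # Glorot/Xavier uniform bound

kernel = tf.keras.initializers.GlorotUniform()(shape=(fan_in, fan_out))
assert float(tf.reduce_max(tf.abs(kernel))) <= limit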


1 Answer

Save the initial weights right after compiling the model but before training it:

model.save_weights('model.h5') 

and then after training, "reset" the model by reloading the initial weights:

model.load_weights('model.h5') 

This gives you an apples-to-apples comparison across different data splits and should be quicker than recompiling the entire model.
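Put together, the workflow looks roughly like this (the model, file name, data splits and training settings are placeholders):

model.compile(optimizer='adam', loss='mse')
model.save_weights('initial_weights.h5')           # snapshot the fresh initialization

for x_train, y_train in data_splits:
    model.load_weights('initial_weights.h5')       # same starting point for every split
    model.fit(x_train, y_train, epochs=10, verbose=0)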

answered Sep 22 '22 by ezChx