I am trying to introduce keras.initializers into my net, following this link:
import keras
from keras.optimizers import RMSprop, Adam
from keras.layers import Input, Embedding, LSTM, Dense, merge, Activation
from keras.models import Model, Sequential
model = Sequential()
model.add(Dense(100, init='lecun_uniform', input_shape=(6,)))
model.add(Activation('relu'))
model.add(Dense(27, init='lecun_uniform'))
model.add(Activation('linear'))
rms = RMSprop(lr = 0.01)
keras.initializers.RandomUniform(minval=-0.05, maxval=0.05, seed=None)
model.compile(loss='mse', optimizer=rms)
And it fails with the following error:
keras.initializers.RandomUniform(minval=-0.05, maxval=0.05, seed=None)
AttributeError: module 'keras' has no attribute 'initializers'
Any ideas as to why it happens?
Kernel initializers are used to statistically initialise the weights in the model: they generate each layer's starting weights by drawing them from a chosen probability distribution before training begins.
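For instance, an initializer object can be called directly to see the kind of starting weights it would generate. A minimal sketch, assuming tf.keras from TensorFlow 2 (the shape and seed here are arbitrary):

import tensorflow as tf

# RandomUniform draws every weight independently from U(minval, maxval)
init = tf.keras.initializers.RandomUniform(minval=-0.05, maxval=0.05, seed=42)
weights = init(shape=(2, 3))  # a 2x3 tensor of values in [-0.05, 0.05]
print(weights)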
Initializers define the way to set the initial random weights of Keras layers. The keyword arguments used for passing initializers to layers depend on the layer. Usually, they are simply kernel_initializer and bias_initializer:

from tensorflow.keras import layers
from tensorflow.keras import initializers

# Pass initializer instances (or their string names) to the layer
layer = layers.Dense(
    units=64,
    kernel_initializer=initializers.RandomNormal(stddev=0.01),
    bias_initializer=initializers.Zeros()
)
Usually, it's glorot_uniform by default. Different layer types might have a different default kernel_initializer. When in doubt, just look in the source code. For example, for the Dense layer:

class Dense(Layer):
    ...
    def __init__(self, units,
                 activation=None,
                 use_bias=True,
                 kernel_initializer='glorot_uniform',
                 bias_initializer='zeros',
                 ...):
        ...
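You can also inspect the resolved default at runtime instead of reading the source. A small sketch, assuming tf.keras (the Dense(10) layer is just a throwaway instance):

from tensorflow.keras import layers

# The string default is resolved into an initializer object stored on the layer
layer = layers.Dense(10)
print(type(layer.kernel_initializer).__name__)  # GlorotUniform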
he_uniform: draws samples from a uniform distribution within [-limit, limit], where limit = sqrt(6 / fan_in) and fan_in is the number of input units in the weight tensor.
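As a worked example, for the question's first Dense layer (input_shape=(6,), so fan_in = 6) the bound comes out to 1.0:

import math

fan_in = 6                     # input units of the question's first Dense layer
limit = math.sqrt(6 / fan_in)  # sqrt(6 / 6) = 1.0
# he_uniform would draw that layer's weights from U(-1.0, 1.0)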
You have to check the version of Keras you are using. The probable mistake is that you have Keras 1.x.x installed and are trying to use initializers from Keras 2.x.x: the keras.initializers module was only introduced in Keras 2 (Keras 1 called it keras.initializations), which is why the attribute lookup fails.
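A quick way to confirm is to print the installed version. Note that Keras 1's init= keyword became kernel_initializer= in Keras 2, so after upgrading, the question's first layer would be spelled as below (a sketch, assuming Keras 2.x is installed):

import keras
from keras.layers import Dense
from keras.models import Sequential

print(keras.__version__)  # anything below 2.0.0 has no keras.initializers module

# Keras 2 spelling of the question's first layer:
model = Sequential()
model.add(Dense(100, kernel_initializer='lecun_uniform', input_shape=(6,)))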