I'm trying to set up a custom initializer for tf.layers.dense, where I initialize kernel_initializer with a weight matrix I already have.
u_1 = tf.placeholder(tf.float32, [784, 784])
first_layer_u = tf.layers.dense(X_, n_params, activation=None,
                                kernel_initializer=u_1,
                                bias_initializer=tf.keras.initializers.he_normal())
This throws the following error: ValueError: If initializer is a constant, do not specify shape. Is it a problem to assign a placeholder to kernel_initializer, or am I missing something?
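The initializer argument needs to be a callable (an Initializer object), not a tensor. tf.layers.dense creates its kernel through tf.get_variable, which always passes a shape; when it receives a plain tensor such as a placeholder instead of a callable, it treats it as a constant value and raises exactly that ValueError (and a placeholder has no value at variable-initialization time anyway). If the weight matrix already exists as a NumPy array, wrapping it in tf.constant_initializer should work. A minimal sketch, where the 784-wide shapes and the he_normal bias initializer are carried over from the question and the random array is just a stand-in for the real weights:

import numpy as np
import tensorflow as tf

u_1 = np.random.rand(784, 784).astype(np.float32)  # stand-in for the existing weight matrix

X_ = tf.placeholder(tf.float32, [None, 784])
first_layer_u = tf.layers.dense(
    X_, 784,  # units must equal u_1.shape[1]
    activation=None,
    kernel_initializer=tf.constant_initializer(u_1),
    bias_initializer=tf.keras.initializers.he_normal())

If the matrix only becomes available at run time, one workaround is to initialize the layer normally and then overwrite the kernel variable with tf.assign, fed from a placeholder.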
For reference, tf.random_normal_initializer with default parameters draws from a Gaussian with a mean of 0.0 and a standard deviation of 1.0 (in TF 1.x). And if you pass no initializers at all, tf.layers layers default to Xavier (Glorot) initialization for the kernel, which scales the variance of the random weights to the layer's fan-in and fan-out, and to tf.zeros_initializer() for the biases.
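A sketch spelling those defaults out explicitly, assuming a TF 1.x graph; the input shape, filter count, and other layer arguments here are illustrative, not from the question:

import tensorflow as tf

x = tf.placeholder(tf.float32, [None, 28, 28, 1])

# Writing out the TF 1.x defaults you get with no initializer arguments:
conv = tf.layers.conv2d(
    x, filters=32, kernel_size=3, padding="same", activation=tf.nn.relu,
    kernel_initializer=tf.glorot_uniform_initializer(),  # Xavier, the default
    bias_initializer=tf.zeros_initializer())             # zeros, the default

rn = tf.random_normal_initializer()  # mean=0.0, stddev=1.0 in TF 1.x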
Jonathan's answer worked for me on conv layers as well:
import numpy as np
import tensorflow as tf

# conv2d kernels are laid out (filter_height, filter_width, input_channels, output_channels);
# the concrete sizes here match kernel_size=1, 3 input channels, and filters=3 below
kernel_in = np.random.uniform(100, 1000, (1, 1, 3, 3)).astype(np.float32)
init = tf.constant_initializer(kernel_in)

def model(x):
    return tf.layers.conv2d(x, filters=3, kernel_size=1, strides=1, kernel_initializer=init)
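To sanity-check that the constant kernel actually ends up in the variable, you can build the model and compare the kernel against kernel_in; the session boilerplate and input shape below are illustrative (three channels to match kernel_in):

import numpy as np
import tensorflow as tf

inp = tf.placeholder(tf.float32, [None, 8, 8, 3])
out = model(inp)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    kernel = tf.trainable_variables()[0]             # the conv layer's kernel variable
    print(np.allclose(sess.run(kernel), kernel_in))  # expect True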