I am making a custom convolutional layer by subclassing a Keras Layer. I did this with a previous version of TensorFlow and received no warnings:
import tensorflow as tf

class MyCustomLayer(tf.Module):
    def __init__(self, in_channels,
                 filters,
                 kernel_size,
                 padding,
                 strides,
                 activation,
                 kernel_initializer,
                 bias_initializer,
                 use_bias):
        super(MyCustomLayer, self).__init__()
        self.filters = filters
        self.kernel_size = kernel_size
        self.activation = activation
        self.padding = padding
        self.kernel_initializer = kernel_initializer
        self.bias_initializer = bias_initializer
        self.strides = strides
        self.use_bias = use_bias
        self.in_channels = in_channels
        self.w = tf.Variable(
            initial_value=self.kernel_initializer(shape=(*self.kernel_size,
                                                         in_channels,
                                                         self.filters),
                                                  dtype='float32'),
            trainable=True)
        if self.use_bias:
            self.b = tf.Variable(
                initial_value=self.bias_initializer(shape=(self.filters,),
                                                    dtype='float32'),
                trainable=True)

    def __call__(self, inputs, training=None):
        x = tf.nn.conv2d(inputs,
                         filters=self.w,
                         strides=self.strides,
                         padding=self.padding)
        if self.use_bias:
            x = tf.add(x, self.b)
        x = self.activation(x)
        return x

x = tf.keras.Input(shape=(28, 28, 3))
y = MyCustomLayer(
    in_channels=3,
    filters=16,
    kernel_size=(3, 3),
    strides=(1, 1),
    activation=tf.nn.relu,
    padding='VALID',
    kernel_initializer=tf.initializers.GlorotUniform(),
    bias_initializer=tf.initializers.Zeros(),
    use_bias=True)(x)
model = tf.keras.Model(inputs=x, outputs=y)
With tf.__version__ == 2.4.1, I'm getting these warnings:
WARNING:tensorflow: The following Variables were used a Lambda layer's call (tf.compat.v1.nn.conv2d_12), but are not present in its tracked objects: <tf.Variable 'Variable:0' shape=(3, 3, 3, 16) dtype=float32> It is possible that this is intended behavior, but it is more likely an omission. This is a strong indication that this layer should be formulated as a subclassed Layer rather than a Lambda layer.
WARNING:tensorflow: The following Variables were used a Lambda layer's call (tf.math.add_2), but are not present in its tracked objects: <tf.Variable 'Variable:0' shape=(16,) dtype=float32> It is possible that this is intended behavior, but it is more likely an omission. This is a strong indication that this layer should be formulated as a subclassed Layer rather than a Lambda layer.
What does this mean? I am using a subclassed Layer.
Here is a hint. I've found a blog post that discusses this. It states:
tf.keras.Lambda: Note that if variables are involved in the layer created by this method, they will not be automatically added to the set of variables used for gradient calculation. Therefore, if the user-defined layer has parameters to be trained, it is recommended to build the custom layer by subclassing the base Layer class.
They showed:
weights = tf.Variable(tf.random.normal((4, 2)), name='w')
bias = tf.ones((1, 2), name='b')
print(bias)
x_input = tf.range(12.).numpy().reshape(-1, 4)
# lambda custom layer
mylayer1 = tf.keras.layers.Lambda(lambda x: tf.add(tf.matmul(x, weights),
                                                   bias), name='lambda1')
mylayer1(x_input)
tf.Tensor([[1. 1.]], shape=(1, 2), dtype=float32)
WARNING:tensorflow:
The following Variables were used a Lambda layer's call (lambda1), but
are not present in its tracked objects:
<tf.Variable 'w:0' shape=(4, 2) dtype=float32, numpy=
array([[ 2.54332 , 1.5078725],
[ 0.5291851, -1.1049112],
[ 1.475109 , -1.6525942],
[ 1.593746 , -0.4049823]], dtype=float32)>
It is possible that this is intended behavior, but it is more likely
an omission. This is a strong indication that this layer should be
formulated as a subclassed Layer rather than a Lambda layer.
<tf.Tensor: shape=(3, 2), dtype=float32, numpy=
array([[ 9.260641 , -4.6250463],
[ 33.82608 , -11.243508 ],
[ 58.391525 , -17.861969 ]], dtype=float32)>
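To illustrate what the quote recommends, here is a minimal sketch of my own (not from the blog; the name MyDense is my placeholder) that wraps the same computation in a subclassed tf.keras.layers.Layer, so the weight variable is tracked:

import tensorflow as tf

class MyDense(tf.keras.layers.Layer):
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        # Variables assigned as attributes of a Layer are tracked automatically
        self.w = tf.Variable(tf.random.normal((4, 2)), name='w')
        self.b = tf.ones((1, 2), name='b')  # plain tensor, not trainable

    def call(self, inputs):
        return tf.add(tf.matmul(inputs, self.w), self.b)

x_input = tf.range(12.).numpy().reshape(-1, 4)
mylayer2 = MyDense(name='mylayer2')
mylayer2(x_input)                    # same result, no Lambda warning
print(mylayer2.trainable_variables)  # contains w

Calling it this way should produce the same output as the Lambda version but without the warning.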
They also showed a way to bypass the warning, but I'm not sure (yet) what could be done to bypass it in your case. A quick survey of the source code of tf.compat.v1.nn.conv2d leads to a lambda expression that might be the cause:
def build_op(num_spatial_dims, padding):
    return lambda inp, _: op(inp, num_spatial_dims, padding)
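Based on that, my best guess for your case: subclass tf.keras.layers.Layer instead of tf.Module, and implement call() instead of __call__(). Keras only tracks variables owned by Layer instances; when a plain callable like your tf.Module is used inside the functional API, Keras wraps the raw ops (tf.compat.v1.nn.conv2d, tf.math.add) in Lambda layers, which is exactly what the warnings complain about. A minimal sketch, keeping your constructor arguments:

import tensorflow as tf

class MyCustomLayer(tf.keras.layers.Layer):  # Layer instead of tf.Module
    def __init__(self, in_channels, filters, kernel_size, padding,
                 strides, activation, kernel_initializer,
                 bias_initializer, use_bias, **kwargs):
        super().__init__(**kwargs)
        self.activation = activation
        self.padding = padding
        self.strides = strides
        self.use_bias = use_bias
        # Variables created here are tracked by the Layer
        self.w = tf.Variable(
            initial_value=kernel_initializer(
                shape=(*kernel_size, in_channels, filters),
                dtype='float32'),
            trainable=True)
        if use_bias:
            self.b = tf.Variable(
                initial_value=bias_initializer(shape=(filters,),
                                               dtype='float32'),
                trainable=True)

    def call(self, inputs):  # Keras invokes call(), not __call__()
        x = tf.nn.conv2d(inputs,
                         filters=self.w,
                         strides=self.strides,
                         padding=self.padding)
        if self.use_bias:
            x = tf.add(x, self.b)
        return self.activation(x)

With this, building the model the same way as in your snippet should raise no Lambda warnings, and model.trainable_variables should contain the kernel and the bias.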