TensorFlow 2.0 Keras layers with custom tensors as variables

In TF 1.x, it was possible to build layers with custom variables. Here's an example:

import numpy as np
import tensorflow as tf

def make_custom_getter(custom_variables):
    def custom_getter(getter, name, **kwargs):
        if name in custom_variables:
            variable = custom_variables[name]
        else:
            variable = getter(name, **kwargs)
        return variable
    return custom_getter

# Make a custom getter for the dense layer variables.
# Note: custom variables can result from arbitrary computation;
#       for the sake of this example, we make them just constant tensors.
custom_variables = {
    "model/dense/kernel": tf.constant(
        np.random.rand(784, 64), name="custom_kernel", dtype=tf.float32),
    "model/dense/bias": tf.constant(
        np.random.rand(64), name="custom_bias", dtype=tf.float32),
}
custom_getter = make_custom_getter(custom_variables)

# Compute hiddens using a dense layer with custom variables.
x = tf.random.normal(shape=(1, 784), name="inputs")
with tf.variable_scope("model", custom_getter=custom_getter):
    dense = tf.layers.Dense(64)
    hiddens = dense(x)

print(dense.variables)

The printed variables of the constructed dense layer will be custom tensors we specified in the custom_variables dict:

[<tf.Tensor 'custom_kernel:0' shape=(784, 64) dtype=float32>, <tf.Tensor 'custom_bias:0' shape=(64,) dtype=float32>]

This allows us to create layers/models that use the tensors provided in custom_variables directly as their weights, so that we can further differentiate the output of those layers/models with respect to any tensors that custom_variables may depend on (particularly useful for implementing modulating sub-nets, parameter generation, meta-learning, and so on).
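For concreteness, here is a minimal TF 1.x sketch of that idea (in a fresh graph; the 2.0 * z computation is an arbitrary stand-in for any differentiable parameter generator):

z = tf.get_variable("z", shape=(784, 64), dtype=tf.float32)
custom_variables = {
    "model/dense/kernel": 2.0 * z,  # any differentiable function of z
    "model/dense/bias": tf.constant(np.random.rand(64), dtype=tf.float32),
}

x = tf.random.normal(shape=(1, 784))
with tf.variable_scope("model", custom_getter=make_custom_getter(custom_variables)):
    hiddens = tf.layers.Dense(64)(x)

# The layer output is now differentiable with respect to z.
grads = tf.gradients(hiddens, z)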

Variable scopes used to make it easy to nest all of the graph building inside scopes with custom getters and to build models on top of the provided tensors as their parameters. Since sessions and variable scopes are no longer advisable in TF 2.0 (and all of that low-level machinery has moved to tf.compat.v1), what would be the best practice to implement the above using Keras and TF 2.0?

(Related issue on GitHub.)

asked Oct 07 '19 by maruan

People also ask

Does TensorFlow 2.0 include Keras?

Keras is the high-level API of TensorFlow 2: an approachable, highly-productive interface for solving machine learning problems, with a focus on modern deep learning. It provides essential abstractions and building blocks for developing and shipping machine learning solutions with high iteration velocity.

How do I create a custom layer in TensorFlow?

The best way to implement your own layer is to extend the tf.keras.layers.Layer class and implement: __init__, where you can do all input-independent initialization; build, where you know the shapes of the input tensors and can do the rest of the initialization; and call, where you do the forward computation.
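As a minimal sketch of that pattern (the class name MyDense is illustrative, not from the question):

import tensorflow as tf

class MyDense(tf.keras.layers.Layer):
    def __init__(self, units, **kwargs):
        super().__init__(**kwargs)
        self.units = units  # input-independent initialization

    def build(self, input_shape):
        # The input shape is known here, so the weights can be created.
        self.kernel = self.add_weight(
            name="kernel", shape=(int(input_shape[-1]), self.units))
        self.bias = self.add_weight(name="bias", shape=(self.units,))

    def call(self, inputs):
        # Forward computation.
        return tf.matmul(inputs, self.kernel) + self.bias

hiddens = MyDense(64)(tf.ones((1, 784)))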

When should you create a custom layer versus a custom model?

If you are building a new model architecture out of existing Keras/TF layers, build a custom model. If you are implementing your own custom tensor operations within a layer, build a custom layer.


1 Answer

Answer based on the comments

Given you have:

kernel = createTheKernelVarBasedOnWhatYouWant()  # shape (784, 64)
bias = createTheBiasVarBasedOnWhatYouWant()      # shape (64,)
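For instance (one option among many), these could simply be tf.Variables, or the outputs of another model:

kernel = tf.Variable(np.random.rand(784, 64), dtype=tf.float32)  # shape (784, 64)
bias = tf.Variable(np.random.rand(64), dtype=tf.float32)         # shape (64,)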

Make a simple function copying the code from Dense:

from tensorflow.keras import backend as K

def custom_dense(x):
    # Unpack the data tensor and the externally supplied weights.
    inputs, kernel, bias = x

    outputs = K.dot(inputs, kernel)
    outputs = K.bias_add(outputs, bias, data_format='channels_last')
    return outputs

Use the function in a Lambda layer:

from tensorflow.keras.layers import Lambda

layer = Lambda(custom_dense)
hiddens = layer([x, kernel, bias])

Warning: kernel and bias must be produced by a Keras layer, or come from kernel = Input(tensor=the_kernel_var) and bias = Input(tensor=the_bias_var).
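A minimal sketch of that Input(tensor=...) wiring, where the_kernel_var and the_bias_var stand for the variables created above (the tensor= argument was supported in tf.keras at the time of writing):

from tensorflow.keras.layers import Input, Lambda

# Wrap the existing tensors as symbolic Keras inputs; they are part of
# the graph and do not need to be fed at fit/predict time.
kernel = Input(tensor=the_kernel_var)  # shape (784, 64)
bias = Input(tensor=the_bias_var)      # shape (64,)
x = Input(shape=(784,))

hiddens = Lambda(custom_dense)([x, kernel, bias])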


If the warning above is a problem for you, you can always use kernel and bias "from outside" the function, as closure variables:

def custom_dense(inputs):
    # kernel and bias come from the enclosing scope and are no longer
    # part of the layer's arguments.
    outputs = K.dot(inputs, kernel)
    outputs = K.bias_add(outputs, bias, data_format='channels_last')
    return outputs

layer = Lambda(custom_dense)
hiddens = layer(x)

This last option makes it a bit more complicated to save/load models.

Old answer

You should probably use a Keras Dense layer and set its weights in a standard way:

layer = tf.keras.layers.Dense(64, name='the_layer')
layer.build((None, 784))  # create the weights before setting them
layer.set_weights([np.random.rand(784, 64), np.random.rand(64)])

If you need these weights to be non-trainable, set the following before compiling the Keras model:

model.get_layer('the_layer').trainable=False

If you want direct access to the variables as tensors, they are:

kernel = layer.kernel
bias = layer.bias

There are plenty of other options, but that depends on your exact intention, which is not clear in your question.

answered Nov 03 '22 by Daniel Möller