 

Colab+TPU not supporting TF 2.3.0 tf.keras.layers.experimental.preprocessing

I was updating my model using TF 2.3.0 on Colab+TPU based on https://keras.io/examples/vision/image_classification_efficientnet_fine_tuning/, specifically following the Data augmentation and Transfer learning from pre-trained weights paragraphs.

When I launch model.fit(), I get this error:

InvalidArgumentError: 9 root error(s) found.
  (0) Invalid argument: {{function_node __inference_train_function_372657}} Compilation failure: Detected unsupported operations when trying to compile graph cluster_train_function_12053586239504196919[] on XLA_TPU_JIT: ImageProjectiveTransformV2 (No registered 'ImageProjectiveTransformV2' OpKernel for XLA_TPU_JIT devices compatible with node {{node EfficientNet/img_augmentation/random_rotation_2/transform/ImageProjectiveTransformV2}}){{node EfficientNet/img_augmentation/random_rotation_2/transform/ImageProjectiveTransformV2}}
    TPU compilation failed
     [[tpu_compile_succeeded_assert/_6138790737589773377/_7]]
     [[TPUReplicate/_compile/_14198390524791994190/_6/_238]]
 

I suppose the TPU still does not support tf.keras.layers.experimental.preprocessing, because the preprocessing layers do not appear in the list of available TPU operations. Am I right?

There are multiple benefits to doing preprocessing inside the model, since it is then also applied at inference time.
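For context, the pattern I'm following from the tutorial looks roughly like this (the layer factors and the tiny head are illustrative, not the tutorial's exact values; the try/except only accommodates TF versions where the experimental namespace has moved):

```python
import tensorflow as tf
from tensorflow.keras import layers

# Augmentation layers live inside the model, so they run during
# training and act as no-ops at inference time.
try:
    # stable location in newer TF versions
    RandomRotation = layers.RandomRotation
    RandomFlip = layers.RandomFlip
except AttributeError:
    # TF 2.3 experimental namespace
    RandomRotation = layers.experimental.preprocessing.RandomRotation
    RandomFlip = layers.experimental.preprocessing.RandomFlip

img_augmentation = tf.keras.Sequential(
    [RandomRotation(factor=0.15), RandomFlip()],
    name="img_augmentation",
)

inputs = layers.Input(shape=(224, 224, 3))
x = img_augmentation(inputs)
x = layers.GlobalAveragePooling2D()(x)
outputs = layers.Dense(10, activation="softmax")(x)
model = tf.keras.Model(inputs, outputs)
```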

Where could I find an expected date for TPU support?

Thanks.

Davide

Daviddd asked Sep 10 '25

2 Answers

A possible workaround is to incorporate the layers into the input pipeline. It's a bit of a hack, but I've tested it briefly and it seems to work on a TPU. For example, if you are using the tf.data.Dataset API, you can create a layer object and then call it within Dataset.map() to apply the augmentation to the pipeline:

import tensorflow as tf

# dummy data
images = tf.random.uniform((10, 224, 224, 1))
labels = tf.zeros((10, 1))
ds = tf.data.Dataset.from_tensor_slices((images, labels))
ds = ds.batch(10)

# now incorporate the augmentation 'layer' into the pipeline
augmentor = tf.keras.layers.experimental.preprocessing.RandomRotation((-0.1, 0.1))
# augment the images, pass the labels through untouched;
# calling .call() directly defaults to training=True, so the random
# rotation is actually applied even outside model.fit()
ds = ds.map(lambda x, y: (augmentor.call(x), y))

# assume we've compiled a model elsewhere
model.fit(ds)

This doesn't compile the augmentation layers into the model as originally intended, but it should allow you to augment your training data without requiring a third party plugin. I intend to use this as a workaround until the issue is officially resolved.
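If you use several augmentation layers, the same trick scales by bundling them in a Sequential and mapping that over the dataset; passing training=True makes the random behaviour explicit instead of relying on .call()'s default. This is a hypothetical extension of the workaround, with illustrative layer choices (the try/except only covers TF versions where the experimental namespace has moved):

```python
import tensorflow as tf

try:
    RandomFlip = tf.keras.layers.RandomFlip          # newer TF versions
    RandomRotation = tf.keras.layers.RandomRotation
except AttributeError:
    prep = tf.keras.layers.experimental.preprocessing  # TF 2.3
    RandomFlip, RandomRotation = prep.RandomFlip, prep.RandomRotation

# bundle several preprocessing layers into one callable
augment = tf.keras.Sequential([RandomFlip("horizontal"), RandomRotation(0.1)])

images = tf.random.uniform((10, 224, 224, 1))
labels = tf.zeros((10, 1))
ds = tf.data.Dataset.from_tensor_slices((images, labels)).batch(10)
# training=True forces the random transforms to be applied in the pipeline
ds = ds.map(lambda x, y: (augment(x, training=True), y))
```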

knuckles answered Sep 12 '25


You are half right. The list of TPU operations includes lower-level TF functions, but not Keras layers. From the error message, it looks like your preprocessing layer tries to instantiate an ImageProjectiveTransformV2 op in the graph, which is not supported.

As a TPU-compatible alternative, I recommend you look at the official EfficientNet implementation in the TF model garden. In particular, preprocessing.py may be helpful to you.
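For a rough idea of the approach taken there, augmentation can be built from plain tf.image ops, which have XLA kernels and so compile for TPU, unlike ImageProjectiveTransformV2. This is an illustrative sketch only, not the model-garden code:

```python
import tensorflow as tf

def augment(image):
    # simple augmentation using only low-level tf.image ops
    image = tf.image.random_flip_left_right(image)
    image = tf.image.random_brightness(image, max_delta=0.1)
    image = tf.image.random_contrast(image, lower=0.9, upper=1.1)
    # brightness/contrast can push values outside [0, 1], so clip
    return tf.clip_by_value(image, 0.0, 1.0)

images = tf.random.uniform((4, 224, 224, 3))
ds = tf.data.Dataset.from_tensor_slices(images).map(augment).batch(4)
```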

Will Cromar answered Sep 13 '25