I am trying to convert a model loaded with hub.load to TFLite. The model in question is universal-sentence-encoder (version 4), found at https://tfhub.dev/google/universal-sentence-encoder/4. I tried in Python with TensorFlow versions 2.1.0 and 2.2.0:
import tensorflow as tf
import tensorflow_hub as hub
model = hub.load("https://tfhub.dev/google/universal-sentence-encoder/4")
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.experimental_new_converter = True  # tried with and without
tflite_model = converter.convert()
I get the following error:
converter = tf.lite.TFLiteConverter.from_keras_model(model)
File "...\lib\site-packages\tensorflow_core\lite\python\lite.py", line 394, in from_keras_model
if not isinstance(model.call, _def_function.Function):
AttributeError: '_UserObject' object has no attribute 'call'
From my understanding, hub.load returns a Keras SavedModel, so shouldn't it be convertible right away?
Try using hub.KerasLayer to load your model into a tf.keras.Model, and then convert it to TFLite using .from_keras_model.
There's no such thing as a "Keras SavedModel". There's the SavedModel, which is a .pb file plus an assets folder and a variables folder. It's a file format, a way to store your model; it has nothing to do with in-memory tf.keras.Model objects. hub.load does not return a tf.keras.Model, but rather "the most generic thing" you can save in the SavedModel format, namely a _UserObject. This is because you can save things other than tf.keras.Model objects in the SavedModel format.
I know this was not your question, but if you do want to get your tf.keras.Model back after loading, save it with tf.keras.models.save_model. Then it will come back as a tf.keras.Model when loaded with tf.keras.models.load_model (so it's no longer the most generic thing).
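To make the distinction concrete, here is a small sketch (assuming TF 2.x with the classic Keras SavedModel behavior, and using a tiny made-up model rather than the sentence encoder) showing that the Keras loader restores a tf.keras.Model while the generic loader does not:

```python
import tensorflow as tf

# A hypothetical toy model, just for illustration
model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])

# Save in the SavedModel format via the Keras API
tf.keras.models.save_model(model, "/tmp/demo_savedmodel")

# The Keras loader gives you a tf.keras.Model back...
restored = tf.keras.models.load_model("/tmp/demo_savedmodel")
print(isinstance(restored, tf.keras.Model))

# ...while the generic loader returns "the most generic thing"
# (an AutoTrackable/_UserObject, not a tf.keras.Model)
generic = tf.saved_model.load("/tmp/demo_savedmodel")
print(isinstance(generic, tf.keras.Model))
```

The first print should be True and the second False, which is exactly why from_keras_model chokes on the object returned by hub.load.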
EDIT:
Just the code:
import tensorflow as tf
import tensorflow_hub as hub
model = tf.keras.Sequential()
model.add(tf.keras.layers.InputLayer(dtype=tf.string, input_shape=()))
model.add(hub.KerasLayer("https://tfhub.dev/google/universal-sentence-encoder/4"))
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()
which works (it starts converting), but you get a:
2020-05-05 10:48:44.927433: I tensorflow/lite/toco/import_tensorflow.cc:659] Converting unsupported operation: StatefulPartitionedCall
So this is the code to convert a model saved in the SavedModel format to TFLite, but you get a google/universal-sentence-encoder-specific error. No idea how to fix that, though.