 

How do I convert a TensorFlow 2.0 Estimator model to TensorFlow Lite?

The code below produces a regular TensorFlow SavedModel, but when I try to convert it to TensorFlow Lite it doesn't work. I followed these documentation pages:

https://www.tensorflow.org/tutorials/estimator/linear1
https://www.tensorflow.org/lite/guide/get_started

export_dir = "tmp"
serving_input_fn = tf.estimator.export.build_parsing_serving_input_receiver_fn(
  tf.feature_column.make_parse_example_spec(feat_cols))

estimator.export_saved_model(export_dir, serving_input_fn)

# Convert the model.
converter = tf.lite.TFLiteConverter.from_saved_model("tmp/1571728920/saved_model.pb")
tflite_model = converter.convert()

Error Message

Traceback (most recent call last):
  File "C:/Users/Dacorie Smith/PycharmProjects/JamaicaClassOneNotifableModels/ClassOneModels.py", line 208, in <module>
    tflite_model = converter.convert()
  File "C:\Users\Dacorie Smith\PycharmProjects\JamaicaClassOneNotifableModels\venv\lib\site-packages\tensorflow_core\lite\python\lite.py", line 400, in convert
    raise ValueError("This converter can only convert a single "
ValueError: This converter can only convert a single ConcreteFunction. Converting multiple functions is under development.

Extract from Documentation

TensorFlow Lite converter The TensorFlow Lite converter is a tool available as a Python API that converts trained TensorFlow models into the TensorFlow Lite format. It can also introduce optimizations, which are covered in section 4, Optimize your model.

The following example shows a TensorFlow SavedModel being converted into the TensorFlow Lite format:

import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
tflite_model = converter.convert()
open("converted_model.tflite", "wb").write(tflite_model)

asked Oct 15 '22 by DacorieS

1 Answer

Try to use a concrete function:

export_dir = "tmp"
serving_input_fn = tf.estimator.export.build_parsing_serving_input_receiver_fn(
  tf.feature_column.make_parse_example_spec(feat_cols))

estimator.export_saved_model(export_dir, serving_input_fn)

# Convert the model.
saved_model_obj = tf.saved_model.load(export_dir="tmp/1571728920/")
concrete_func = saved_model_obj.signatures['serving_default']

converter = tf.lite.TFLiteConverter.from_concrete_functions([concrete_func])

# print(saved_model_obj.signatures.keys())
# converter.optimizations = [tf.lite.Optimize.DEFAULT]
# converter.experimental_new_converter = True

tflite_model = converter.convert()

serving_default is the default signature key in a SavedModel.

If that doesn't work, try uncommenting converter.experimental_new_converter = True and the two lines above it.
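For completeness, here is a self-contained sketch of the same conversion path and how to sanity-check the result with the TFLite interpreter. It is an illustration under assumptions, not part of the original answer: a toy tf.function stands in for the estimator's exported SavedModel, and the file name converted_model.tflite is made up.

```python
import numpy as np
import tensorflow as tf

# Toy stand-in for the estimator's serving_default signature:
# tracing a tf.function with a fixed input signature yields a
# ConcreteFunction, which the converter accepts.
@tf.function(input_signature=[tf.TensorSpec([1], tf.float32)])
def double(x):
    return x * 2.0

concrete_func = double.get_concrete_function()
converter = tf.lite.TFLiteConverter.from_concrete_functions([concrete_func])
tflite_model = converter.convert()

# Write the converted flatbuffer to disk.
with open("converted_model.tflite", "wb") as f:
    f.write(tflite_model)

# Load it back and run one inference to verify the conversion worked.
interpreter = tf.lite.Interpreter(model_path="converted_model.tflite")
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()
interpreter.set_tensor(input_details[0]["index"],
                       np.array([3.0], dtype=np.float32))
interpreter.invoke()
result = interpreter.get_tensor(output_details[0]["index"])
```

With the estimator model you would instead pass the concrete function obtained from saved_model_obj.signatures['serving_default'] to from_concrete_functions, as shown in the answer's code.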

Short explanation

Based on the Concrete functions guide:

Eager execution in TensorFlow 2 evaluates operations immediately, without building graphs. To save a model you need a graph (or graphs) wrapped in Python callables: concrete functions.
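As a minimal illustration of that idea (the function square here is made up for the example), a tf.function becomes a ConcreteFunction once it is traced for a specific input signature, and that is the object the converter works on:

```python
import tensorflow as tf

# A tf.function wraps a Python callable; get_concrete_function traces it
# for one input signature, producing the graph-backed ConcreteFunction
# that TFLiteConverter.from_concrete_functions expects.
@tf.function
def square(x):
    return x * x

concrete = square.get_concrete_function(tf.TensorSpec([1], tf.float32))
tflite_bytes = tf.lite.TFLiteConverter.from_concrete_functions([concrete]).convert()
```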

answered Oct 31 '22 by Trayan Momkov