
How to invoke the Flex delegate for tflite interpreters?

I have a TensorFlow model that I want to convert into a tflite model, which will be deployed on an ARM64 platform.

It turns out that two operations in my model (RandomStandardNormal, Softplus) seem to require custom implementations. Since execution time is not that important, I decided to go with a hybrid model that uses the extended runtime. I converted it via:

import tensorflow as tf  # TF 1.x API

graph_def_file = './model/frozen_model.pb'
inputs = ['eval_inputs']
outputs = ['model/y']

converter = tf.lite.TFLiteConverter.from_frozen_graph(graph_def_file, inputs, outputs)
# Allow both TFLite builtin ops and select TensorFlow (Flex) ops.
converter.target_ops = [tf.lite.OpsSet.TFLITE_BUILTINS, tf.lite.OpsSet.SELECT_TF_OPS]

tflite_file_name = 'vae_' + str(tf.__version__) + '.tflite'

tflite_model = converter.convert()
with open(tflite_file_name, 'wb') as f:
    f.write(tflite_model)

This worked, and I ended up with a seemingly valid tflite model file. However, whenever I try to load this model with an interpreter, I get the following error (it does not matter whether I use the Python or the C++ API):

ERROR: Regular TensorFlow ops are not supported by this interpreter. Make sure you invoke the Flex delegate before inference.
ERROR: Node number 4 (FlexSoftplus) failed to prepare.
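
On the C++ side, a minimal sketch of the loading path that hits this error looks like the following (the model path is a placeholder, and this is just the standard interpreter setup rather than my exact code):

#include <memory>

#include "tensorflow/lite/interpreter.h"
#include "tensorflow/lite/kernels/register.h"
#include "tensorflow/lite/model.h"

int main() {
  // Load the converted flatbuffer model (placeholder path).
  auto model = tflite::FlatBufferModel::BuildFromFile("vae.tflite");

  // Only the builtin ops are registered; nothing related to Flex is linked in.
  tflite::ops::builtin::BuiltinOpResolver resolver;
  std::unique_ptr<tflite::Interpreter> interpreter;
  tflite::InterpreterBuilder(*model, resolver)(&interpreter);

  // This is where the "FlexSoftplus ... failed to prepare" error is reported.
  interpreter->AllocateTensors();
  return 0;
}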

I have a hard time finding documentation on the TensorFlow website on how to invoke the Flex delegate for either API. I have stumbled across a header file ("tensorflow/lite/delegates/flex/delegate_data.h") which seems to be related to this issue, but including it in my C++ project yields another error:

In file included from /tensorflow/tensorflow/core/common_runtime/eager/context.h:28:0,
                 from /tensorflow/tensorflow/lite/delegates/flex/delegate_data.h:18,
                 from /tensorflow/tensorflow/lite/delegates/flex/delegate.h:19,
                 from demo.cpp:7:
/tensorflow/tensorflow/core/lib/core/status.h:23:10: fatal error: tensorflow/core/lib/core/error_codes.pb.h: No such file or directory
 #include "tensorflow/core/lib/core/error_codes.pb.h"
          ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
compilation terminated.

By any chance, has anybody encountered and resolved this before? If you have an example snippet, please share the link!

asked Aug 26 '19 by DocDriven

1 Answer

When building TensorFlow Lite libraries using the bazel pipeline, the additional TensorFlow ops library can be included and enabled as follows:

Enable monolithic builds if necessary by adding the --config=monolithic build flag.

Add the TensorFlow ops delegate library dependency to the build dependencies: tensorflow/lite/delegates/flex:delegate.

Note that the necessary TfLiteDelegate will be installed automatically when creating the interpreter at runtime as long as the delegate is linked into the client library. It is not necessary to explicitly install the delegate instance as is typically required with other delegate types.
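
To make this concrete, here is a rough sketch of what that looks like for a C++ client (the bazel target and file names are illustrative, not taken from the question); the interpreter code itself stays the plain builtin-ops setup:

// BUILD (illustrative; exact target names depend on your project):
// cc_binary(
//     name = "demo",
//     srcs = ["demo.cpp"],
//     deps = [
//         "//tensorflow/lite:framework",
//         "//tensorflow/lite/kernels:builtin_ops",
//         "//tensorflow/lite/delegates/flex:delegate",  # links in the Flex delegate
//     ],
// )

// demo.cpp: the usual interpreter setup; no Flex headers are included.
#include <memory>

#include "tensorflow/lite/interpreter.h"
#include "tensorflow/lite/kernels/register.h"
#include "tensorflow/lite/model.h"

int main() {
  auto model = tflite::FlatBufferModel::BuildFromFile("vae.tflite");  // placeholder path
  tflite::ops::builtin::BuiltinOpResolver resolver;
  std::unique_ptr<tflite::Interpreter> interpreter;
  tflite::InterpreterBuilder(*model, resolver)(&interpreter);

  // Because the Flex delegate is linked into the binary, it registers itself
  // automatically; the Flex ops (e.g. FlexSoftplus) now prepare without an
  // explicit ModifyGraphWithDelegate() call.
  interpreter->AllocateTensors();
  interpreter->Invoke();
  return 0;
}

The key point is that nothing in the C++ source references Flex explicitly; the link-time dependency alone is what makes AllocateTensors() succeed for the SELECT_TF_OPS model.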

Python pip package

Python support is actively under development.

source: https://www.tensorflow.org/lite/guide/ops_select

answered Nov 15 '22 by harsh