I have a TensorFlow model that works well, built with Python and TFLearn. Is there a way to run this model on another system without installing TensorFlow on it? It is already pre-trained, so I just need to run data through it.
I am aware of tfcompile (thread here), but it seems quite complex to set up. Are there any alternatives?
The .pb format is the protocol buffer (protobuf) format, which TensorFlow uses to store models. Protobufs are a general-purpose serialization format from Google that is well suited to transport: it packs data compactly and enforces a schema on it.
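One reason protobuf is compact is its varint integer encoding, where small numbers take fewer bytes than a fixed-width field. A minimal illustrative sketch (hand-rolled, stdlib only; real .pb files are read and written by the protobuf library, not code like this):

```python
import struct

def encode_varint(n: int) -> bytes:
    """Encode a non-negative int as a protobuf-style varint:
    7 payload bits per byte, high bit set while more bytes follow."""
    out = bytearray()
    while True:
        byte = n & 0x7F
        n >>= 7
        if n:
            out.append(byte | 0x80)  # continuation bit: more bytes follow
        else:
            out.append(byte)
            return bytes(out)

print(len(encode_varint(1)))        # 1 byte
print(len(encode_varint(300)))      # 2 bytes
print(len(struct.pack("<q", 300)))  # 8 bytes as a fixed-width int64
```

Most values in a model file (field tags, lengths, small dimensions) are small, so this scheme saves considerable space over fixed-width fields.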
A SavedModel contains a complete TensorFlow program, including trained parameters (i.e., tf.Variables) and computation. It does not require the original model-building code to run, which makes it useful for sharing or deploying with TFLite, TensorFlow.js, TensorFlow Serving, or TensorFlow Hub.
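A hedged sketch of that round trip, assuming TensorFlow 2.x (the Doubler module here is a made-up stand-in for a real trained model):

```python
import tempfile

import tensorflow as tf

class Doubler(tf.Module):
    # A tf.function with an input signature gets serialized into the
    # SavedModel, so the computation itself travels with the file.
    @tf.function(input_signature=[tf.TensorSpec([None], tf.float32)])
    def __call__(self, x):
        return 2.0 * x  # stands in for real trained computation

export_dir = tempfile.mkdtemp()
tf.saved_model.save(Doubler(), export_dir)  # writes saved_model.pb + variables/

# Reloading needs no access to the Doubler class definition at all.
restored = tf.saved_model.load(export_dir)
print(restored(tf.constant([1.0, 2.0])))
```

The reloaded object is callable even though the original Python class is gone, which is what makes the format suitable for serving from another process or system.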
Is there a way to run this model on another system without installing Tensorflow on it? It is already pre-trained, so I just need to run data through it.
Yes
After you have trained your model, use tf.python.tools.freeze_graph
and tf.python.tools.optimize_for_inference_lib
to freeze and optimize it for inference on other devices, such as Android.
These functions convert all of the model's Variables into constant operations and export the result as a protobuf file.
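The central step inside freeze_graph is convert_variables_to_constants. A minimal, hypothetical sketch using the TF 1.x-style API via tf.compat.v1 (the toy graph, node names, and output path are placeholders, not the asker's model):

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# A toy graph standing in for a trained model.
x = tf.placeholder(tf.float32, [None, 1], name="x")
w = tf.Variable([[3.0]], name="w")  # a "trained" parameter
y = tf.matmul(x, w, name="y")       # the output node we want to keep

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # Freezing folds the current Variable values into Const ops,
    # leaving a self-contained GraphDef.
    frozen = tf.graph_util.convert_variables_to_constants(
        sess, sess.graph_def, output_node_names=["y"])

# Only ops reachable from "y" survive, and none of them are Variables.
print(sorted({node.op for node in frozen.node}))
tf.train.write_graph(frozen, "/tmp", "frozen_model.pb", as_text=False)
```

The optimize_for_inference_lib pass would then be run on this frozen GraphDef to strip training-only nodes before shipping it.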
Load the optimized graph protobuf file using the inference methods available in the Java API and other TensorFlow APIs, pass in your data, and read back the output.
[Note: for this you do not need to install the complete TensorFlow package, only the inference library.]
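The inference side boils down to: parse the .pb, import the graph, run data through it. A self-contained sketch in Python via tf.compat.v1 (it builds and freezes a tiny stand-in graph first so it can run on its own; the Java and C++ inference libraries perform these same parse/import/run steps):

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# --- Stand-in for the export step: in practice you'd already have a .pb file.
x = tf.placeholder(tf.float32, [None, 1], name="x")
w = tf.Variable([[3.0]], name="w")
y = tf.matmul(x, w, name="y")
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    frozen = tf.graph_util.convert_variables_to_constants(
        sess, sess.graph_def, ["y"])
pb_bytes = frozen.SerializeToString()  # the bytes of the frozen .pb file

# --- Inference side: parse the protobuf, import the graph, feed and fetch.
graph_def = tf.GraphDef()
graph_def.ParseFromString(pb_bytes)
with tf.Graph().as_default() as g:
    tf.import_graph_def(graph_def, name="")
    with tf.Session(graph=g) as sess:
        out = sess.run("y:0", feed_dict={"x:0": [[2.0]]})
print(out)  # [[6.]]
```

Note that the inference half never touches Variables, initializers, or the original model code; it only needs graph execution, which is why a stripped-down inference library suffices.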
A simple example is demonstrated here:
https://omid.al/posts/2017-02-20-Tutorial-Build-Your-First-Tensorflow-Android-App.html
It is written for Android, but the procedure should be the same for plain Java.
For C++: click here