I am trying to use TensorFlow Serving to deploy my PyTorch model. I have already exported the model to ONNX. How can I generate the .pb model that TF Serving expects?
Use the onnx/onnx-tensorflow converter, which acts as a TensorFlow backend for ONNX.

Install onnx-tensorflow (it will pull in a compatible onnx release as a dependency): pip install onnx-tf
Then convert with the command-line tool:
onnx-tf convert -t tf -i /path/to/input.onnx -o /path/to/output.pb
Alternatively, you can convert through the Python API:
import onnx
from onnx_tf.backend import prepare

onnx_model = onnx.load("/path/to/input.onnx")  # load the ONNX model
tf_rep = prepare(onnx_model)                   # prepare the TF representation
tf_rep.export_graph("/path/to/output.pb")      # export the model
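One more step before serving: TF Serving does not load a bare .pb file from an arbitrary path; it expects a versioned directory layout of the form base_path/model_name/version/ containing the exported model. Here is a minimal sketch (standard library only) of staging the exported directory into that layout; the paths, the stage_for_serving helper, and the "my_model" name are all my own illustrative assumptions, not part of onnx-tf.

```python
import os
import shutil
import tempfile

def stage_for_serving(saved_model_dir, serving_base, model_name, version=1):
    """Copy an exported model directory into the versioned layout
    TF Serving expects: <serving_base>/<model_name>/<version>/."""
    target = os.path.join(serving_base, model_name, str(version))
    shutil.copytree(saved_model_dir, target)  # creates intermediate dirs too
    return target

# Demo with a stand-in export directory (no TensorFlow needed):
src = tempfile.mkdtemp()
open(os.path.join(src, "saved_model.pb"), "w").close()  # fake exported model
base = tempfile.mkdtemp()
dest = stage_for_serving(src, base, "my_model", version=1)
print(os.path.exists(os.path.join(dest, "saved_model.pb")))  # → True
```

After staging, you would point TF Serving's --model_base_path at base/my_model and it will pick up the highest-numbered version directory.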