
How can I convert an ONNX model to a TensorFlow SavedModel? [duplicate]

I am trying to use tf-serving to deploy my torch model. I have exported my torch model to ONNX. How can I generate the .pb model for tf-serving?
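For reference, the ONNX export step mentioned above usually looks something like the sketch below. The resnet18 stand-in, the 1x3x224x224 dummy input, the input/output names, and the opset number are placeholders rather than details from the question; substitute your own model and shapes.

import torch
import torchvision

# Any torch model works here; resnet18 is just a stand-in for the asker's model.
model = torchvision.models.resnet18().eval()
dummy_input = torch.randn(1, 3, 224, 224)   # dummy input with the model's expected shape

torch.onnx.export(
    model,
    dummy_input,
    "model.onnx",               # file the converter in the answer below consumes
    input_names=["input"],
    output_names=["output"],
    opset_version=11,           # pick an opset your converter version supports
)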

asked Nov 13 '19 by coin cheung



1 Answer

Use the onnx/onnx-tensorflow converter tool, which acts as a TensorFlow backend for ONNX.

  1. Install onnx-tensorflow: pip install onnx-tf

  2. Convert using the command line tool: onnx-tf convert -t tf -i /path/to/input.onnx -o /path/to/output.pb

Alternatively, you can convert through the Python API:

import onnx
from onnx_tf.backend import prepare

onnx_model = onnx.load("input_path")  # load the ONNX model
tf_rep = prepare(onnx_model)          # prepare the TensorFlow representation
tf_rep.export_graph("output_path")    # export the TensorFlow model
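Before deploying, it is worth loading the exported model back with TensorFlow as a sanity check. The sketch below assumes a recent onnx-tf that writes a SavedModel directory and registers a serving signature; the signature key and the "input" keyword name, shape, and dtype are placeholders that depend on your ONNX graph, so inspect the printed names first.

import numpy as np
import tensorflow as tf

loaded = tf.saved_model.load("output_path")     # directory written by export_graph
print(list(loaded.signatures.keys()))           # inspect available signature keys

infer = loaded.signatures["serving_default"]    # assumed key; use one printed above
print(infer.structured_input_signature)         # shows the expected argument names

# Placeholder input; the keyword name "input" comes from the ONNX graph's input name.
dummy = tf.constant(np.zeros((1, 3, 224, 224), dtype=np.float32))
print(infer(input=dummy))

For TF Serving itself, place the SavedModel inside a numeric version subdirectory (for example models/my_model/1/) and point --model_base_path at models/my_model.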
answered Sep 21 '22 by vini_s