I would like to use NVIDIA TensorRT to run my TensorFlow models. Currently, TensorRT supports Caffe prototxt network descriptor files.
I was not able to find source code to convert Tensorflow models to Caffe models. Are there any workarounds?
TensorFlow-TensorRT (TF-TRT) is a deep-learning compiler for TensorFlow that optimizes TF models for inference on NVIDIA devices.
Note that in TensorFlow 2.x, TF-TRT only supports models saved in the TensorFlow SavedModel format. When we call the converter's convert() method, TF-TRT converts the graph by replacing TensorRT-compatible portions of it with TRTEngineOps.
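A minimal sketch of that conversion flow, assuming a TensorFlow 2.x build with TensorRT support installed; the SavedModel directory names here are placeholders:

```python
from tensorflow.python.compiler.tensorrt import trt_convert as trt

# Load a model in SavedModel format (placeholder path).
converter = trt.TrtGraphConverterV2(input_saved_model_dir="my_saved_model")

# Replace TensorRT-compatible subgraphs with TRTEngineOps.
converter.convert()

# Write out the optimized SavedModel (placeholder path).
converter.save("my_trt_saved_model")
```

Running this requires a GPU and a TensorRT-enabled TensorFlow installation, so it is shown here as an outline rather than a drop-in script.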
Performance: we benchmarked different batch sizes and measured the average per-inference time for onnxruntime-gpu versus TensorRT. In our tests, inference with TensorRT was about 10x faster than with onnxruntime-gpu once the batch size exceeded 10.
TensorRT 3.0 supports import/conversion of TensorFlow graphs via its UFF (Universal Framework Format). Some layer implementations are missing and will require custom implementations via the IPlugin interface.
Previous versions didn't support native import of TensorFlow models/checkpoints.
What you can also do is export the layer/network description into your own intermediate format (such as a text file) and then use the TensorRT C++ API to construct the graph for inference. You'd have to export the convolution weights/biases separately. Make sure to pay attention to the weight format: TensorFlow uses NHWC while TensorRT uses NCHW. And for the weights, TF uses RSCK ([filter_height, filter_width, input_depth, output_depth]) while TensorRT uses KCRS.
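The layout conversions above are simple axis transposes. A sketch with NumPy, using a hypothetical 3x3 convolution with 8 input channels and 16 output filters:

```python
import numpy as np

# Hypothetical 3x3 convolution: 8 input channels, 16 output filters.
# TensorFlow stores conv weights as RSCK:
#   [filter_height, filter_width, input_depth, output_depth]
tf_weights = np.random.rand(3, 3, 8, 16).astype(np.float32)

# TensorRT expects KCRS:
#   [output_depth, input_depth, filter_height, filter_width]
trt_weights = tf_weights.transpose(3, 2, 0, 1)
print(trt_weights.shape)  # (16, 8, 3, 3)

# Activations: TensorFlow defaults to NHWC, TensorRT uses NCHW.
nhwc_batch = np.random.rand(1, 224, 224, 3).astype(np.float32)
nchw_batch = nhwc_batch.transpose(0, 3, 1, 2)
print(nchw_batch.shape)  # (1, 3, 224, 224)
```

The same transposes apply regardless of filter size; only the axis order matters, not the dimensions.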
See this paper for an extended discussion of tensor formats: https://arxiv.org/abs/1410.0759
This link also has useful, relevant info: https://www.tensorflow.org/versions/master/extend/tool_developers/
No workarounds are currently needed, as TensorRT 3 added support for importing TensorFlow models.