I'm a mobile developer, and I want to use various TensorFlow Lite models (.tflite) with ML Kit.
The problem is that I don't know how to find out a .tflite model's input/output feature info (these are needed as parameters for setup).
Is there any way to find that out?
Sorry for my bad English, and thanks.
Update (18.06.13):
I found this site: https://lutzroeder.github.io/Netron.
It visualizes the graph of an uploaded model (.mlmodel, .tflite, etc.), so you can find the input/output shapes there.
Here is an example screenshot! https://lutzroeder.github.io/Netron example
Create a tflite interpreter and (optionally) perform inference. tflite_interpreter.get_tensor_details() will give you a list of dictionaries containing weights, biases, their scales, zero_points, etc.
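A minimal sketch of that approach, using the TensorFlow Lite Python API. The model path "model.tflite" is a placeholder for your own file; the `summarize` helper is just an illustration I added to condense the detail dictionaries:

```python
# Sketch: inspect a TFLite model's input/output tensors.
# "model.tflite" is a hypothetical path -- substitute your own model file.

def summarize(details):
    """Reduce tensor detail dicts to (name, shape, dtype) tuples."""
    return [(d["name"], tuple(d["shape"]), d["dtype"]) for d in details]

try:
    import tensorflow as tf

    interpreter = tf.lite.Interpreter(model_path="model.tflite")
    interpreter.allocate_tensors()

    # Input/output feature info needed for ML Kit setup:
    print("inputs: ", summarize(interpreter.get_input_details()))
    print("outputs:", summarize(interpreter.get_output_details()))

    # Full per-tensor details (weights, biases, scales, zero_points, ...):
    # interpreter.get_tensor_details()
except Exception:
    pass  # TensorFlow or the model file may be absent in this environment
```

`get_input_details()` / `get_output_details()` return the same kind of dictionaries as `get_tensor_details()`, restricted to the model's inputs and outputs, which is usually all you need for setup.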
If you already have a tflite model that you did not produce yourself, and you want to look inside the tflite file to understand its inputs and outputs, you can use the flatc tool to convert the model to a .json file and read that.
First, clone the flatbuffers repo and build flatc.
git clone https://github.com/google/flatbuffers.git
Then you need the TensorFlow schema.fbs stored locally. Either check out the TensorFlow GitHub repo or download that one file.
Then you can run flatc to generate the json file from the input tflite model.
flatc -t schema.fbs -- input_model.tflite
This will create an input_model.json file that can be easily read.
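Once you have the JSON, the input/output info can be pulled out programmatically. A sketch, assuming the key names from the TFLite flatbuffer schema (`subgraphs`, `tensors`, `inputs`, `outputs`) and the input_model.json file produced above:

```python
# Sketch: extract input/output tensor info from the flatc-generated JSON.
import json

def io_info(model):
    """Return (name, shape, type) for the inputs and outputs of subgraph 0."""
    sg = model["subgraphs"][0]

    def desc(idx):
        # inputs/outputs are indices into the subgraph's tensor list
        t = sg["tensors"][idx]
        return (t.get("name"), t.get("shape"), t.get("type"))

    return {"inputs": [desc(i) for i in sg["inputs"]],
            "outputs": [desc(i) for i in sg["outputs"]]}

if __name__ == "__main__":
    try:
        with open("input_model.json") as f:
            print(io_info(json.load(f)))
    except FileNotFoundError:
        pass  # run flatc first to produce input_model.json
```

Most single-model tflite files have exactly one subgraph, which is why the sketch only looks at `subgraphs[0]`.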