I created a custom TensorFlow Lite model with retrain.py from https://github.com/tensorflow/hub/blob/master/examples/image_retraining/retrain.py using the following command:
python retrain.py --image_dir newImageDirectory --tfhub_module https://tfhub.dev/google/imagenet/mobilenet_v2_100_224/feature_vector/1
Then I convert the resulting output_graph.pb file to a .lite file with toco, using the command below:
bazel run tensorflow/contrib/lite/toco:toco -- --input_file=/tmp/output_graph.pb --output_file=/tmp/optimized.lite --input_format=TENSORFLOW_GRAPHDEF --output_format=TFLITE --input_shape=1,224,224,3 --input_array=input --output_array=final_result --inference_type=FLOAT --input_data_type=FLOAT
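(For reference, I think the same conversion can also be written with the Python converter API. This is only a rough sketch, assuming a TF 1.x build where tf.lite.TFLiteConverter.from_frozen_graph is available; the tensor names and shape are just copied from the flags above.)

    import tensorflow as tf

    # Sketch of the toco command above using the Python converter API
    # (assumes tf.lite.TFLiteConverter.from_frozen_graph exists in this TF build).
    converter = tf.lite.TFLiteConverter.from_frozen_graph(
        graph_def_file="/tmp/output_graph.pb",
        input_arrays=["input"],
        output_arrays=["final_result"],
        input_shapes={"input": [1, 224, 224, 3]})
    tflite_model = converter.convert()

    with open("/tmp/optimized.lite", "wb") as f:
        f.write(tflite_model)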
Then I take the new .lite file and the labels.txt file and drop them into the tensorflow-for-poets-2 demo (https://github.com/googlecodelabs/tensorflow-for-poets-2) to see whether it will classify the new categories. When the application launches, I receive the following error:
Caused by: java.lang.NullPointerException: Can not allocate memory for the interpreter
at org.tensorflow.lite.NativeInterpreterWrapper.createInterpreter(Native Method)
at org.tensorflow.lite.NativeInterpreterWrapper.<init>(NativeInterpreterWrapper.java:63)
at org.tensorflow.lite.NativeInterpreterWrapper.<init>(NativeInterpreterWrapper.java:51)
at org.tensorflow.lite.Interpreter.<init>(Interpreter.java:90)
at com.example.android.tflitecamerademo.ImageClassifier.<init>(ImageClassifier.java:97)
This may be due to the machine you are running on: make sure you have enough RAM available for the interpreter to allocate, or try increasing your swap file size.
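To help narrow down whether it is really a memory problem or an issue with the converted model itself, you could first try loading the .lite file with the Python TFLite interpreter on your desktop. A minimal sketch, assuming a TF build that exposes tf.lite.Interpreter (older 1.x builds use tf.contrib.lite.Interpreter instead):

    import tensorflow as tf

    # Load the converted model on the desktop and try to allocate its tensors.
    interpreter = tf.lite.Interpreter(model_path="/tmp/optimized.lite")
    interpreter.allocate_tensors()          # raises if tensors cannot be allocated

    print(interpreter.get_input_details())  # expect an input of shape [1, 224, 224, 3]
    print(interpreter.get_output_details()) # expect the final_result tensor

If allocate_tensors() succeeds there and reports the expected input shape, the .lite file is probably fine and the failure is more likely memory pressure on the device.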