
Keras deep learning model to android

I'm developing a real-time object classification app for Android. First I created a deep learning model using Keras, and I already have the trained model saved as a "model.h5" file. I would like to know how I can use that model in Android for image classification.

Vajira Prabuddhaka asked Aug 25 '17


3 Answers

You can't export Keras directly to Android; you have to save the model first:

  • Configure TensorFlow as your Keras backend.

  • Save the model using model.save(filepath) (you have already done this).

Then load it with one of the following solutions:

Solution 1: Import the model in TensorFlow

1- Build the TensorFlow model

  • Build a TensorFlow model from the Keras model using this code (link updated)

2- Build the Android app and call TensorFlow. Check this tutorial and this official demo from Google to learn how to do it.

Solution 2: Import the model in Java
1- deeplearning4j, a Java library, allows you to import Keras models: tutorial link
2- Use deeplearning4j in Android: it is easy since you are already in the Java world. Check this tutorial
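For Solution 2, the DL4J import boils down to a single call. A minimal sketch, assuming the deeplearning4j-modelimport artifact is on the classpath and the model was built as a Keras Sequential (the input shape below is a placeholder — use your model's real shape):

```java
import org.deeplearning4j.nn.modelimport.keras.KerasModelImport;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

public class KerasImportExample {
    public static void main(String[] args) throws Exception {
        // Load the trained Keras model. Sequential models use this entry
        // point; functional models use importKerasModelAndWeights instead,
        // which returns a ComputationGraph.
        MultiLayerNetwork model =
            KerasModelImport.importKerasSequentialModelAndWeights("model.h5");

        // Run inference on a dummy input (shape is an assumption --
        // replace with your model's actual input shape).
        INDArray input = Nd4j.zeros(1, 224 * 224 * 3);
        INDArray output = model.output(input);
        System.out.println(output);
    }
}
```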

Alex answered Oct 19 '22


First you need to export the Keras model to a TensorFlow model:

import tensorflow as tf
from keras import backend as K
from tensorflow.python.tools import freeze_graph
from tensorflow.python.tools import optimize_for_inference_lib

def export_model_for_mobile(model_name, input_node_names, output_node_name):
    # Write the graph definition of the current Keras session.
    tf.train.write_graph(K.get_session().graph_def, 'out', \
        model_name + '_graph.pbtxt')

    # Save the session's variables to a checkpoint.
    tf.train.Saver().save(K.get_session(), 'out/' + model_name + '.chkp')

    # Freeze the graph: fold the checkpoint variables into the graph
    # definition as constants.
    freeze_graph.freeze_graph('out/' + model_name + '_graph.pbtxt', None, \
        False, 'out/' + model_name + '.chkp', output_node_name, \
        "save/restore_all", "save/Const:0", \
        'out/frozen_' + model_name + '.pb', True, "")

    # Strip training-only nodes so the graph is optimized for inference.
    input_graph_def = tf.GraphDef()
    with tf.gfile.Open('out/frozen_' + model_name + '.pb', "rb") as f:
        input_graph_def.ParseFromString(f.read())

    output_graph_def = optimize_for_inference_lib.optimize_for_inference(
            input_graph_def, input_node_names, [output_node_name],
            tf.float32.as_datatype_enum)

    with tf.gfile.FastGFile('out/tensorflow_lite_' + model_name + '.pb', "wb") as f:
        f.write(output_graph_def.SerializeToString())

You just need to know the input_node_names and output_node_name of your graph. This will create a new folder with several files. Among them, one starts with tensorflow_lite_. This is the file you should move to your Android device.

Then import the TensorFlow library on Android and use TensorFlowInferenceInterface to run your model:

implementation 'org.tensorflow:tensorflow-android:1.5.0'
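A minimal inference call with TensorFlowInferenceInterface might look like the sketch below. The node names, asset path, and input shape are assumptions — substitute the names you passed to export_model_for_mobile and your model's actual input layout:

```java
import android.content.res.AssetManager;
import org.tensorflow.contrib.android.TensorFlowInferenceInterface;

public class Classifier {
    // Hypothetical node names -- use the ones from your own graph.
    private static final String INPUT_NODE = "input_1";
    private static final String OUTPUT_NODE = "output_node0";
    private static final String MODEL_FILE =
        "file:///android_asset/tensorflow_lite_model.pb";

    private final TensorFlowInferenceInterface inference;

    public Classifier(AssetManager assets) {
        // Loads the frozen graph from the app's assets folder.
        inference = new TensorFlowInferenceInterface(assets, MODEL_FILE);
    }

    public float[] predict(float[] pixels, int numClasses) {
        float[] output = new float[numClasses];
        // Shape {1, pixels.length} is an assumption; it must match
        // your model's input layer.
        inference.feed(INPUT_NODE, pixels, 1, pixels.length);
        inference.run(new String[]{OUTPUT_NODE});
        inference.fetch(OUTPUT_NODE, output);
        return output;
    }
}
```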

You can check my simple XOR example on GitHub:

https://github.com/OmarAflak/Keras-Android-XOR
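Whichever inference library you use, you also need to convert camera pixels into the float array the model expects and map the output scores back to a class index. A self-contained sketch (the [0,1] normalization and RGB channel order are assumptions — match whatever preprocessing the model was trained with):

```java
public class Classify {
    // Convert ARGB ints (as returned by Bitmap.getPixels) to a
    // normalized float array in [0,1], interleaved RGB.
    public static float[] toFloatTensor(int[] pixels) {
        float[] t = new float[pixels.length * 3];
        for (int i = 0; i < pixels.length; i++) {
            int p = pixels[i];
            t[i * 3]     = ((p >> 16) & 0xFF) / 255.0f; // R
            t[i * 3 + 1] = ((p >> 8)  & 0xFF) / 255.0f; // G
            t[i * 3 + 2] = ( p        & 0xFF) / 255.0f; // B
        }
        return t;
    }

    // Index of the highest score = predicted class.
    public static int argmax(float[] scores) {
        int best = 0;
        for (int i = 1; i < scores.length; i++)
            if (scores[i] > scores[best]) best = i;
        return best;
    }

    public static void main(String[] args) {
        int[] pixels = {0xFFFF0000, 0xFF00FF00}; // one red, one green pixel
        float[] t = toFloatTensor(pixels);
        System.out.println(t[0] + " " + t[4]); // 1.0 1.0
        System.out.println(argmax(new float[]{0.1f, 0.7f, 0.2f})); // 1
    }
}
```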

Omar Aflak answered Oct 19 '22


If you want an optimized way to do classification, I suggest you run inference of your model using the Arm NN Android libraries.

You have to follow a few steps:

  1. Install and set up the Arm NN libraries on Ubuntu. You can take help from the build guide below:

https://github.com/ARM-software/armnn/blob/branches/armnn_19_08/BuildGuideAndroidNDK.md

  2. Import your model and do inference. You can take help from the guide below:

https://developer.arm.com/solutions/machine-learning-on-arm/developer-material/how-to-guides/deploying-a-tensorflow-mnist-model-on-arm-nn/deploying-a-tensorflow-mnist-model-on-arm-nn-single-page

  3. After compilation you will get a binary which takes your input and gives you the output.

  4. You can run that binary inside any Android application.

This is the optimized way.

Parag Jain answered Oct 19 '22