 

Serving Keras Models With Tensorflow Serving

Right now we are successfully able to serve models using TensorFlow Serving. We used the following method to export the model and host it with TensorFlow Serving.

For exporting:

import tensorflow as tf
import keras.backend as K
from tensorflow.contrib.session_bundle import exporter

K.set_learning_phase(0)  # must be set before using the model
export_path = ... # where to save the exported graph
export_version = ... # version number (integer)

sess = K.get_session()  # the TF session Keras is using

saver = tf.train.Saver(sharded=True)
model_exporter = exporter.Exporter(saver)
signature = exporter.classification_signature(input_tensor=model.input,
                                              scores_tensor=model.output)
model_exporter.init(sess.graph.as_graph_def(),
                    default_graph_signature=signature)
model_exporter.export(export_path, tf.constant(export_version), sess)

For hosting:

bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server --port=9000 --model_name=default --model_base_path=/serving/models

However, our issue is that we want Keras to be integrated with TensorFlow Serving, i.e. we would like to serve the model through TensorFlow Serving using Keras. The reason is that in our architecture we follow a couple of different ways to train our models, such as deeplearning4j + Keras and TensorFlow + Keras, but for serving we would like to use only one serving engine, and that's TensorFlow Serving. We don't see any straightforward way to achieve that. Any comments?

Thank you.

asked Apr 27 '17 by Sumit Ghosh



1 Answer

Very recently TensorFlow changed the way it exports models, so the majority of the tutorials available on the web are outdated. I honestly don't know how deeplearning4j works, but I use Keras quite often. I managed to create a simple example that I already posted in this issue on the TensorFlow Serving GitHub.

I'm not sure whether this will help you, but I'd like to share how I did it; maybe it will give you some insights. My first trial, prior to creating my custom model, was to use a trained model available in Keras, such as VGG19. I did this as follows.

Model creation

import keras.backend as K
from keras.applications import VGG19
from keras.models import Model

# very important to do this as a first thing
K.set_learning_phase(0)

model = VGG19(include_top=True, weights='imagenet')

# The creation of a new model might be optional depending on the goal
config = model.get_config()
weights = model.get_weights()
new_model = Model.from_config(config)
new_model.set_weights(weights)
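
As a quick sanity check (not part of the original recipe), you can verify that the re-created model reproduces the original's outputs before exporting it; a minimal sketch:

import numpy as np

# dummy batch with VGG19's expected input shape
x = np.random.rand(1, 224, 224, 3).astype('float32')

# both models should produce identical predictions
assert np.allclose(model.predict(x), new_model.predict(x))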

Exporting the model

from tensorflow.python.saved_model import builder as saved_model_builder
from tensorflow.python.saved_model import tag_constants
from tensorflow.python.saved_model.signature_def_utils_impl import predict_signature_def

export_path = 'folder_to_export'
builder = saved_model_builder.SavedModelBuilder(export_path)

signature = predict_signature_def(inputs={'images': new_model.input},
                                  outputs={'scores': new_model.output})

with K.get_session() as sess:
    builder.add_meta_graph_and_variables(sess=sess,
                                         tags=[tag_constants.SERVING],
                                         signature_def_map={'predict': signature})
    builder.save()
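
To confirm the export worked, you can inspect the SavedModel directory; assuming your TensorFlow installation ships the saved_model_cli tool, this prints the available signatures with their input and output names:

saved_model_cli show --dir folder_to_export --all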

Some side notes

  • The exact steps can vary depending on the Keras, TensorFlow, and TensorFlow Serving versions; I used the latest ones at the time of writing.
  • Beware of the names of the signatures, since they must be used in the client as well.
  • When creating the client, all preprocessing steps that the model needs (preprocess_input(), for example) must be executed there. I didn't try to add such a step to the graph itself, as the Inception client example does; see the client sketch below.
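
For illustration, here is a minimal client sketch along those lines, using the pre-1.0 gRPC beta API. It assumes the server from the question (localhost:9000, model_name=default) and the 'predict' signature and 'images'/'scores' names from the export above:

from grpc.beta import implementations
import numpy as np
import tensorflow as tf
from keras.applications.vgg19 import preprocess_input
from tensorflow_serving.apis import predict_pb2, prediction_service_pb2

# connect to the running model server
channel = implementations.insecure_channel('localhost', 9000)
stub = prediction_service_pb2.beta_create_PredictionService_stub(channel)

# preprocessing happens in the client, as noted above
img = preprocess_input(np.random.rand(1, 224, 224, 3).astype('float32'))

request = predict_pb2.PredictRequest()
request.model_spec.name = 'default'
request.model_spec.signature_name = 'predict'  # must match the signature_def_map key
request.inputs['images'].CopyFrom(
    tf.contrib.util.make_tensor_proto(img, shape=img.shape))

result = stub.Predict(request, 10.0)  # 10 second timeout
print(result.outputs['scores'])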

With respect to serving different models within the same server, I think that something similar to the creation of a model_config_file might help you. To do so, you can create a config file similar to this:

model_config_list: {
  config: {
    name: "my_model_1",
    base_path: "/tmp/model_1",
    model_platform: "tensorflow"
  },
  config: {
     name: "my_model_2",
     base_path: "/tmp/model_2",
     model_platform: "tensorflow"
  }
}

Finally, you can run the server like this:

bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server --port=9000 --model_config_file=model_config.conf
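
With multiple models served from one config file, a client then picks a model by name in its request; as a hypothetical continuation of the client sketch above:

request.model_spec.name = 'my_model_2'  # select one of the configured models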
answered Sep 17 '22 by Thomas Paula