 

Keras h5 to Tensorflow serving in 2019?

I tried to follow this tutorial on how to convert a Keras H5 model to ProtoBuf and serve it using TensorFlow Serving: https://towardsdatascience.com/deploying-keras-models-using-tensorflow-serving-and-flask-508ba00f1037

That tutorial, among many other resources on the web, uses "tf.saved_model.simple_save", which is deprecated and has been removed by now (March 2019). Converting the h5 into pb using freeze_session, as shown here: How to export Keras .h5 to tensorflow .pb?

seems to miss a "serve" tag, as tensorflow_model_server outputs:

Loading servable: {name: ImageClassifier version: 1} failed: Not found: Could not find meta graph def matching supplied tags: { serve }. To inspect available tag-sets in the SavedModel, please use the SavedModel CLI: saved_model_cli

I checked it with saved_model_cli; there are no tags.

What is the way to make an h5 model servable with tensorflow_model_server nowadays?

Jens Caasen asked Mar 22 '19

People also ask

What is keras model H5?

H5 is a file format to store structured data, it's not a model by itself. Keras saves models in this format as it can easily store the weights and model configuration in a single file.
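To make the "structured data" point concrete, here is a minimal sketch using h5py (the library Keras itself uses for H5 files). The filename, attribute value, and dataset path are made up for illustration; a real Keras .h5 file stores the full model config JSON plus one dataset per weight tensor.

```python
import h5py
import numpy as np

# H5 (HDF5) is a generic container: groups act like folders, datasets hold
# arrays, and attributes hold metadata. The names below are illustrative only.
with h5py.File("example.h5", "w") as f:
    f.attrs["model_config"] = '{"class_name": "Sequential"}'  # config as metadata
    f.create_dataset("weights/dense/kernel", data=np.ones((3, 2)))

with h5py.File("example.h5", "r") as f:
    print(f.attrs["model_config"])             # the stored configuration string
    print(f["weights/dense/kernel"][:].shape)  # (3, 2)
```

This is why a single .h5 file can carry both the architecture and the weights of a Keras model.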

What is TF serving?

"TensorFlow Serving is a flexible, high-performance serving system for machine learning models, designed for production environments. TensorFlow Serving makes it easy to deploy new algorithms and experiments while keeping the same server architecture and APIs."
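Once a SavedModel is loaded by tensorflow_model_server, clients talk to it over a REST (or gRPC) API. A minimal sketch of what a predict request looks like, only building the payload rather than sending it; the model name "ImageClassifier" is taken from the error message in the question, while the port and the input shape are assumptions:

```python
import json

# Default REST port for tensorflow_model_server is 8501 (assumption: local server).
url = "http://localhost:8501/v1/models/ImageClassifier:predict"

# TF Serving's predict API expects a JSON body with an "instances" list,
# one entry per input example. The 3-element input here is a placeholder.
payload = json.dumps({"instances": [[0.0, 0.0, 0.0]]})

print(url)
print(payload)
# Sending it would be e.g.: requests.post(url, data=payload)
```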


1 Answer

NOTE: This applies to TF 2.0+

I'm assuming you have your Keras model in model.h5. First, just load the model with TensorFlow's implementation of Keras:

from tensorflow import keras
model = keras.models.load_model('model.h5')

Then, simply export a SavedModel

keras.experimental.export_saved_model(model, 'path_to_saved_model')

Finally, apply any transformation you normally do to go from SavedModel to the .pb inference file (e.g. freezing, optimizing for inference, etc.).
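The load-then-export steps above can be sketched end to end. Note that keras.experimental.export_saved_model was removed in later TF 2.x releases; tf.saved_model.save does the same job there and writes a SavedModel carrying the 'serve' tag that tensorflow_model_server looks for. The tiny stand-in model and the directory name below are illustrative assumptions:

```python
import os
import tensorflow as tf
from tensorflow import keras

# Stand-in for your real model; in practice you would only call
# keras.models.load_model('model.h5') on your existing file.
model = keras.Sequential([keras.layers.Dense(1, input_shape=(3,))])
model.save("model.h5")  # the Keras H5 format from the question
model = keras.models.load_model("model.h5")

# Export a SavedModel directory; saved_model.pb inside it holds the
# graph def tagged 'serve', which is what the server failed to find.
tf.saved_model.save(model, "saved_model_dir")
print(sorted(os.listdir("saved_model_dir")))  # includes 'saved_model.pb'
```

You can point tensorflow_model_server at a versioned parent of "saved_model_dir" and it should load without the missing-tag error.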

You can find more details and a full example in TF's official guide for saving and serializing models in TF 2.0.

GPhilo answered Nov 15 '22