 

How to export Keras .h5 to tensorflow .pb?

I have fine-tuned an Inception model with a new dataset and saved it as an ".h5" model in Keras. My goal now is to run the model on Android TensorFlow, which only accepts the ".pb" extension. Is there any library in Keras or TensorFlow to do this conversion? I have seen this post so far: https://blog.keras.io/keras-as-a-simplified-interface-to-tensorflow-tutorial.html but can't figure it out yet.

asked Aug 02 '17 by Solix

People also ask

How do I save a Keras model in TensorFlow?

Using the save_weights() method, you can save the weights of all the layers contained in the model; it stores only the weights, not the architecture. It is generally advised to use the save() method to save a full h5 model instead of the save_weights() method when saving a model with TensorFlow.
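A minimal sketch of the difference (the model and file names here are just placeholders):

from keras.models import Sequential
from keras.layers import Dense

model = Sequential([Dense(1, input_shape=(4,))])
model.compile(optimizer="adam", loss="mse")

# Saves only the layer weights; you need the original code to rebuild the architecture.
model.save_weights("weights_only.h5")

# Saves architecture, weights and compile() information in a single HDF5 file.
model.save("full_model.h5")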

Is .h5 a Keras model?

Keras H5 format: Keras also supports saving a single HDF5 file containing the model's architecture, weight values, and compile() information. It is a lightweight alternative to SavedModel.
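For example (a sketch, reusing the placeholder path from above), such a file can be restored in a single call:

from keras.models import load_model

# Rebuilds the architecture, restores the weights and the compile() state
# from the single HDF5 file written by model.save().
model = load_model("full_model.h5")
model.summary()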


1 Answer

Keras does not itself include any means of exporting a TensorFlow graph as a protocol buffers file, but you can do it using regular TensorFlow utilities. Here is a blog post explaining how to do it with the utility script freeze_graph.py included in TensorFlow, which is the "typical" way it is done.
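Roughly, that route means first exporting a checkpoint and a graph definition from the Keras session and then running freeze_graph.py on them. A sketch of the export step (TF 1.x API; the paths are placeholders):

import tensorflow as tf
from keras import backend as K

# Assumes the Keras model has already been built and trained in this session.
sess = K.get_session()

# freeze_graph.py expects a checkpoint with the variable values and a
# serialized graph definition, so write both out.
tf.train.Saver().save(sess, "export/model.ckpt")
tf.train.write_graph(sess.graph.as_graph_def(), "export", "model.pbtxt", as_text=True)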

However, I personally find it a nuisance to have to make a checkpoint and then run an external script to obtain a model, and prefer instead to do it from my own Python code, so I use a function like this:

import tensorflow as tf

def freeze_session(session, keep_var_names=None, output_names=None, clear_devices=True):
    """
    Freezes the state of a session into a pruned computation graph.

    Creates a new computation graph where variable nodes are replaced by
    constants taking their current value in the session. The new graph will be
    pruned so subgraphs that are not necessary to compute the requested
    outputs are removed.
    @param session The TensorFlow session to be frozen.
    @param keep_var_names A list of variable names that should not be frozen,
                          or None to freeze all the variables in the graph.
    @param output_names Names of the relevant graph outputs.
    @param clear_devices Remove the device directives from the graph for better portability.
    @return The frozen graph definition.
    """
    graph = session.graph
    with graph.as_default():
        freeze_var_names = list(set(v.op.name for v in tf.global_variables()).difference(keep_var_names or []))
        output_names = output_names or []
        output_names += [v.op.name for v in tf.global_variables()]
        input_graph_def = graph.as_graph_def()
        if clear_devices:
            for node in input_graph_def.node:
                node.device = ""
        frozen_graph = tf.graph_util.convert_variables_to_constants(
            session, input_graph_def, output_names, freeze_var_names)
        return frozen_graph

This is inspired by the implementation of freeze_graph.py, and the parameters are similar to the script's. session is the TensorFlow session object. keep_var_names is only needed if you want to keep some variables not frozen (e.g. for stateful models), so generally not. output_names is a list with the names of the operations that produce the outputs you want. clear_devices just removes any device directives to make the graph more portable. So, for a typical Keras model with one output, you would do something like:

from keras import backend as K

# Create, compile and train model...

frozen_graph = freeze_session(K.get_session(),
                              output_names=[out.op.name for out in model.outputs])

Then you can write the graph to a file as usual with tf.train.write_graph:

tf.train.write_graph(frozen_graph, "some_directory", "my_model.pb", as_text=False) 
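As a quick sanity check (a sketch using the TF 1.x API and the path from the example above), you can load the frozen .pb back into a fresh graph:

import tensorflow as tf

# Load the frozen GraphDef and import it into a new graph to confirm it is self-contained.
with tf.gfile.GFile("some_directory/my_model.pb", "rb") as f:
    graph_def = tf.GraphDef()
    graph_def.ParseFromString(f.read())

with tf.Graph().as_default() as graph:
    tf.import_graph_def(graph_def, name="")
    print([op.name for op in graph.get_operations()][:10])  # inspect a few node names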
answered Sep 29 '22 by jdehesa