 

TensorFlow model saving and loading


How can we save a TensorFlow model together with its graph, like we do in Keras? Instead of defining the whole graph again in the prediction file, can we save the whole model (weights and graph) and import it later?

In Keras:

checkpoint = ModelCheckpoint('RightLane-{epoch:03d}.h5', monitor='val_loss', verbose=0, save_best_only=False, mode='auto')

This will give one .h5 file that we can use for prediction:

model = load_model("RightLane-030.h5")

How do I do the same in native TensorFlow?

asked Jul 13 '18 by Shobhit Verma


People also ask

How do I save Keras model and load it?

Save your neural network model to JSON. The JSON specification can be written to a file and later loaded via the model_from_json() function, which creates a new model from it. The weights are saved directly from the model using the save_weights() function and later loaded using the symmetrical load_weights() function.
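
As a minimal sketch of that round trip (the tiny Dense model and the file names here are made up purely for illustration):

from tensorflow import keras

# a small model purely for illustration
model = keras.Sequential([keras.layers.Dense(1, input_shape=(4,))])

# serialize the architecture to JSON and the weights to HDF5
with open("model.json", "w") as f:
    f.write(model.to_json())
model.save_weights("weights.h5")

# later: rebuild the model from the JSON spec, then load the weights back in
with open("model.json") as f:
    restored = keras.models.model_from_json(f.read())
restored.load_weights("weights.h5")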

How do I save and restore model in TensorFlow?

To save and restore your variables, all you need to do is call tf.train.Saver() at the end of your graph. This will create three files (data, index, meta) with a suffix of the step at which you saved your model.

What is saved model in TensorFlow?

A SavedModel contains a complete TensorFlow program, including trained parameters (i.e., tf.Variables) and computation. It does not require the original model-building code to run, which makes it useful for sharing or deploying with TFLite, TensorFlow.js, TensorFlow Serving, or TensorFlow Hub.
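
A short sketch of the SavedModel workflow, assuming TF 2.x (the Adder module and the export path are invented for illustration):

import tensorflow as tf

class Adder(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec(shape=None, dtype=tf.float32)])
    def add_one(self, x):
        return x + 1.0

module = Adder()
tf.saved_model.save(module, "/tmp/adder")      # writes a SavedModel directory
restored = tf.saved_model.load("/tmp/adder")   # no original model code needed
print(restored.add_one(tf.constant(2.0)))      # -> 3.0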

How do I save model weights in TensorFlow?

Call tf.keras.Model.save to save a model's architecture, weights, and training configuration in a single file/folder.
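
For example (a hedged sketch with tf.keras; the model and the file name are placeholders):

import tensorflow as tf

# a placeholder model just to have something to save
model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])
model.compile(optimizer="adam", loss="mse")

model.save("my_model.h5")                             # architecture + weights + optimizer state
restored = tf.keras.models.load_model("my_model.h5")  # ready for prediction or further training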


1 Answer

Method 1: Freeze graph and weights in one file (retraining might not be possible)

This option shows how to save the graph and weights in one file. Its intended use case is for deploying/sharing a model after it has been trained. To this end, we will use the protobuf (pb) format.

Given a TensorFlow session (and graph), you can generate a protobuf with:

# freeze variables
output_graph_def = tf.graph_util.convert_variables_to_constants(
    sess=sess,
    input_graph_def=sess.graph.as_graph_def(),
    output_node_names=['myMode/conv/output'])

# write protobuf to disk
with tf.gfile.GFile('graph.pb', "wb") as f:
    f.write(output_graph_def.SerializeToString())

where output_node_names expects a list of name strings for the result nodes of the graph (cf. the TensorFlow documentation).
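
If you are unsure which names to pass, one way (not shown in the original answer) is to print every node name in the current graph and pick the output node(s) from the list:

# list all node names in the current graph
print([node.name for node in sess.graph.as_graph_def().node])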

Then, you can load the protobuf and get the graph with its weights to perform forward passes easily:

def load_graph(path_to_pb):
    # read the serialized graph definition from disk
    with tf.gfile.GFile(path_to_pb, "rb") as f:
        graph_def = tf.GraphDef()
        graph_def.ParseFromString(f.read())
    # import it into a fresh graph and return it
    with tf.Graph().as_default() as graph:
        tf.import_graph_def(graph_def, name='')
        return graph
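
A forward pass with the loaded graph might then look like the sketch below; the tensor name 'input:0' and the exact output name are assumptions, so substitute the real names from your graph and feed a NumPy array of the matching shape.

graph = load_graph('graph.pb')
x = graph.get_tensor_by_name('input:0')               # hypothetical input placeholder name
y = graph.get_tensor_by_name('myMode/conv/output:0')  # output node frozen above

with tf.Session(graph=graph) as sess:
    prediction = sess.run(y, feed_dict={x: some_input})  # some_input: NumPy array of matching shape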

Method 2: Restoring metagraph and checkpoint (easy retraining)

If you want to be able to continue training the model, you might need to restore the full graph, i.e. not only the weights but also the loss function, some gradient information (for the Adam optimiser, for instance), etc.

You need the meta and checkpoint files generated by TensorFlow when you use:

saver = tf.train.Saver(...variables...)
saver.save(sess, 'my-model')

This will generate the my-model checkpoint files (data and index) along with my-model.meta.

From these two files, you can load the graph with:

# rebuild the graph structure from the .meta file, then restore the variable values
new_saver = tf.train.import_meta_graph('my-model.meta')
new_saver.restore(sess, 'my-model')
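
After restoring, you can fetch tensors and ops by name from the default graph and resume training. The names below are hypothetical and only sketch the idea:

graph = tf.get_default_graph()
x = graph.get_tensor_by_name('input:0')             # hypothetical placeholder name
loss = graph.get_tensor_by_name('loss:0')           # hypothetical loss tensor name
train_op = graph.get_operation_by_name('train_op')  # hypothetical training op name
_, loss_value = sess.run([train_op, loss], feed_dict={x: batch})  # batch: your training data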

For more details, you can look at the official documentation.

answered Oct 11 '22 by BiBi