
Storing TensorFlow models in memory

The program I'm writing involves switching between models during run-time.

I am currently using tf.train.Saver to save and load models from disk, as documented here: https://www.tensorflow.org/api_docs/python/state_ops/saving_and_restoring_variables#Saver.

The models are fairly small and can be stored in memory, so I was wondering if anyone knows of a way to store and restore these models in-memory instead of saving them to disk.

I tried to modify the TensorFlow source to save the model to memory; however, gen_io_ops appears to be generated at compile time. Another possible approach is to use memory-mapped files. Does anyone know of an easier way?
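Since the models are small, one alternative (not from the original post, just a sketch) is to skip Saver entirely and snapshot variable values into plain NumPy arrays held in a Python dict. The helper names `save_to_memory`/`restore_from_memory` below are illustrative, and this assumes the TF 1.x graph API (available as `tf.compat.v1` on current releases):

```python
import numpy as np
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

def save_to_memory(sess):
    # Snapshot every trainable variable into a dict of numpy arrays (RAM only).
    return {v.name: sess.run(v) for v in tf.trainable_variables()}

def restore_from_memory(sess, snapshot):
    # Write the cached values back into the live variables.
    # Note: tf.assign adds new ops to the graph on each call; for frequent
    # switching you would build placeholder-fed assign ops once and reuse them.
    for v in tf.trainable_variables():
        sess.run(tf.assign(v, snapshot[v.name]))

# Tiny demo: a single scalar "model".
w = tf.get_variable("w", initializer=np.float32(1.0))
sess = tf.Session()
sess.run(tf.global_variables_initializer())

snap = save_to_memory(sess)      # keep the model in memory
sess.run(tf.assign(w, 5.0))      # mutate the model (e.g. further training)
restore_from_memory(sess, snap)  # roll back without touching disk
print(sess.run(w))               # back to 1.0
```

This trades Saver's checkpoint format for a simple in-process cache, so the snapshots disappear when the process exits.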

asked Jan 02 '17 by anubhavashok


1 Answer

I would just have two different sessions, each with its own computation graph. Alternatively, you could duplicate the computation graph (two copies of the variables, operations, etc.) within the same session. Then you would call sess.run(comp1 if useCompOne else comp2), however you'd like to set it up.
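The two-sessions idea can be sketched as follows. This is a minimal illustration of the answer's suggestion, again assuming the TF 1.x API via `tf.compat.v1`; the graphs here are trivial constants standing in for real models:

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# Each model lives in its own graph, loaded once and kept resident in memory.
g1, g2 = tf.Graph(), tf.Graph()
with g1.as_default():
    out1 = tf.constant(1.0) * 2.0   # stand-in for model 1's output op
with g2.as_default():
    out2 = tf.constant(1.0) * 3.0   # stand-in for model 2's output op

# One session per graph; both stay open, so switching is just picking a session.
sess1 = tf.Session(graph=g1)
sess2 = tf.Session(graph=g2)

def run_model(use_one):
    # Switch between models at run time without any disk I/O.
    return sess1.run(out1) if use_one else sess2.run(out2)

print(run_model(True))   # 2.0
print(run_model(False))  # 3.0
```

Keeping both sessions open avoids reloading anything from disk; the cost is that both models occupy memory simultaneously, which the question states is acceptable.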

answered Sep 22 '22 by Jacob Holloway