When I use simple_save to save a model, I get a runtime error when I try to load it.
The code to save is:
session = tf.Session()
inputs = tf.placeholder(dtype=tf.float32, shape=(None, height, width, in_channel_size), name='input_img')
model = Some_Model(inputs, num_classes=no_of_defects, is_training=False)
logits, _ = model.build_model()
predictor = tf.nn.softmax(logits, name='logits_to_softmax')
feed_dict = {inputs: input_image}  # input_image: a NumPy array of shape (batch, height, width, in_channel_size)
prediction_probabilities = session.run(predictor, feed_dict=feed_dict)
tf.saved_model.simple_save(session, path,
                           inputs={"inputs": inputs},
                           outputs={"predictor": predictor})
The code to load is:
tf.saved_model.loader.load(session, tag_constants.SERVING, path)
which gives the error:
RuntimeError: MetaGraphDef associated with tags serve could not be found in SavedModel. To inspect available tag-sets in the SavedModel, please use the SavedModel CLI: `saved_model_cli`
When I run
saved_model_cli show --dir path --tag_set serve --signature_def serving_default
I get
The given SavedModel SignatureDef contains the following input(s):
inputs['inputs'] tensor_info:
dtype: DT_FLOAT
shape: (-1, 512, 1024, 8)
name: input_img:0
The given SavedModel SignatureDef contains the following output(s):
outputs['predictor'] tensor_info:
dtype: DT_FLOAT
shape: (-1, 512, 1024, 25)
name: logits_to_softmax:0
Method name is: tensorflow/serving/predict
What am I doing wrong?
The problem is with the load call: the tags argument must be a list of tags, not a single tag. It should be:
tf.saved_model.loader.load(session, [tag_constants.SERVING], path)
where tag_constants is located at tf.saved_model.tag_constants.
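To illustrate, here is a minimal self-contained round trip: export a tiny stand-in graph with simple_save, then reload it by passing the tags as a list. The graph here (a bare placeholder plus softmax) is only a placeholder for your real model, and the sketch is written against tf.compat.v1 so it also runs under TensorFlow 2; on TensorFlow 1.x the same calls exist directly under tf.

```python
import os
import tempfile

import numpy as np
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

export_dir = os.path.join(tempfile.mkdtemp(), 'model')

# Export: a tiny stand-in graph saved with simple_save,
# mirroring the input/output names from the question.
with tf.Session(graph=tf.Graph()) as sess:
    x = tf.placeholder(tf.float32, shape=(None, 4), name='input_img')
    y = tf.nn.softmax(x, name='logits_to_softmax')
    tf.saved_model.simple_save(sess, export_dir,
                               inputs={'inputs': x},
                               outputs={'predictor': y})

# Reload: the second argument must be a LIST of tags.
# Passing tag_constants.SERVING bare (a string) triggers the
# "MetaGraphDef associated with tags ... could not be found" error.
with tf.Session(graph=tf.Graph()) as sess:
    tf.saved_model.loader.load(
        sess, [tf.saved_model.tag_constants.SERVING], export_dir)
    x = sess.graph.get_tensor_by_name('input_img:0')
    y = sess.graph.get_tensor_by_name('logits_to_softmax:0')
    probs = sess.run(y, feed_dict={x: np.zeros((1, 4), np.float32)})
```

After loading, the tensors are looked up by the names that saved_model_cli reported (here 'input_img:0' and 'logits_to_softmax:0'), so no Python model code is needed at inference time.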