Are there any tutorials available about export_savedmodel?
I have gone through the article on tensorflow.org and the unit-test code on github.com, and I still have no idea how to construct the serving_input_fn parameter of export_savedmodel.
Do it like this:
your_feature_spec = {
    "some_feature": tf.FixedLenFeature([], dtype=tf.string, default_value=""),
    "some_other_feature": tf.VarLenFeature(dtype=tf.string),
}
def _serving_input_receiver_fn():
    serialized_tf_example = tf.placeholder(dtype=tf.string, shape=None,
                                           name='input_example_tensor')
    # The key (e.g. 'examples') should be the same as the input key you
    # use when you build the request for prediction.
    receiver_tensors = {'examples': serialized_tf_example}
    features = tf.parse_example(serialized_tf_example, your_feature_spec)
    return tf.estimator.export.ServingInputReceiver(features, receiver_tensors)
estimator.export_savedmodel(export_dir, _serving_input_receiver_fn)
Then you can request the served model in batches, using the "predict" signature name.
Source: https://www.tensorflow.org/guide/saved_model#prepare_serving_inputs
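To illustrate the "request by batch" part, here is a minimal sketch of the client side, assuming the model is served over the TensorFlow Serving REST API (the placeholder bytes and the URL shape are assumptions, not from the original answer). The serialized tf.Example strings go, base64-encoded, under the same 'examples' key that was used in receiver_tensors:

```python
import base64
import json

# In a real client these bytes would come from
# tf.train.Example(...).SerializeToString(); here they are placeholder
# bytes just to show the request shape.
serialized_examples = [b"\x0a\x00", b"\x0a\x00"]

# TF Serving's REST API expects binary strings wrapped as {"b64": <base64 text>}.
# Each entry in "instances" is one example, so the list is the batch.
request_body = {
    "signature_name": "predict",
    "instances": [
        {"examples": {"b64": base64.b64encode(ex).decode("ascii")}}
        for ex in serialized_examples
    ],
}

body = json.dumps(request_body)
print(body)
```

This body would then be POSTed to the model's `:predict` endpoint; the key `'examples'` must match the key you chose in receiver_tensors when exporting.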
If you are using TensorFlow built straight from the master branch, there is a module tensorflow.python.estimator.export that provides a function for that:
from tensorflow.python.estimator.export import export

# build_raw_serving_input_receiver_fn expects a dict mapping feature names
# to Tensors; a placeholder is the usual choice, so callers can feed values
# at serving time instead of baking in a constant.
feature_spec = {'MY_FEATURE': tf.placeholder(dtype=tf.float32, shape=[1, 1])}
serving_input_fn = export.build_raw_serving_input_receiver_fn(feature_spec)
Unfortunately, at least for me, it does not go further than that, but I am not sure whether my model is really correct, so maybe you will have more luck than I did.
Alternatively, there are the following functions for the current version installed from pypi:
serving_input_fn = tf.contrib.learn.utils.build_parsing_serving_input_fn(feature_spec)
serving_input_fn = tf.contrib.learn.utils.build_default_serving_input_fn(feature_spec)
But I couldn't get them to work either.
I'm probably not understanding this correctly, so I hope you'll have more luck.
chris