We train many variations of our model with different configurations, each requiring different preprocessing of inputs (where the preprocessing is done outside of TensorFlow). I would like to export our models as SavedModels, and I am thinking that we will have an API server that provides access to the models, handles preprocessing, and talks to the TensorFlow server, using configuration that it retrieves from the model metadata via the TensorFlow server. The model metadata might be structured as JSON, or possibly it could use a protocol buffer. I am unclear what the best practices are around this. In particular, the MetaInfoDef protocol buffer has three different fields that seem designed to hold metadata (meta_graph_version, any_info, and tags), but I couldn't find any examples in the wild that use anything other than the tags field.
// User specified Version string. Can be the name of the model and revision,
// steps this model has been trained to, etc.
string meta_graph_version = 1;
[...]
// A serialized protobuf. Can be the time this meta graph is created, or
// modified, or name of the model.
google.protobuf.Any any_info = 3;
// User supplied tag(s) on the meta_graph and included graph_def.
//
// MetaGraphDefs should be tagged with their capabilities or use-cases.
// Examples: "train", "serve", "gpu", "tpu", etc.
// These tags enable loaders to access the MetaGraph(s) appropriate for a
// specific use-case or runtime environment.
repeated string tags = 4;
(Although I am not sure whether all three of these fields can be retrieved in the same way using the client API to TensorFlow Serving?)
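For what it's worth, all three fields live in the same MetaInfoDef message inside saved_model.pb, so offline I can at least read them together by parsing the file directly. A minimal sketch, assuming export_dir points at a SavedModel directory:

import os
from tensorflow.core.protobuf import saved_model_pb2

saved_model = saved_model_pb2.SavedModel()
with open(os.path.join(export_dir, "saved_model.pb"), "rb") as f:
    saved_model.ParseFromString(f.read())

for meta_graph in saved_model.meta_graphs:
    info = meta_graph.meta_info_def
    print(info.meta_graph_version)  # user-specified version string
    print(info.any_info)            # google.protobuf.Any payload
    print(info.tags)                # e.g. ["serve"]

Whether TensorFlow Serving's metadata API exposes anything beyond the signature defs over gRPC/REST is exactly the part I am unsure about.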
@gmr, adding the proto to a collection via tf.add_to_collection, together with builder.add_meta_graph_and_variables, should resolve your issue.
Example code is shown below:
import tensorflow as tf

# Path where you want the model to be stored
export_dir = "/usr/local/google/home/abc/Jupyter_Notebooks/export"

# SavedModelBuilder requires the export directory not to exist yet
if tf.gfile.Exists(export_dir):
    tf.gfile.DeleteRecursively(export_dir)
tf.reset_default_graph()

# See below for other ways of adding a proto to a collection
tf.add_to_collection("my_proto_collection", "my_proto_serialized")

builder = tf.saved_model.builder.SavedModelBuilder(export_dir)
with tf.Session() as session:
    builder.add_meta_graph_and_variables(
        session,
        tags=[tf.saved_model.tag_constants.SERVING])
builder.save()
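As a quick sanity check, the saved_model_cli tool that ships with TensorFlow can list what was exported (the exact output format varies by version):

saved_model_cli show --dir /usr/local/google/home/abc/Jupyter_Notebooks/export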
Other ways of adding a proto to a collection are shown below:
tf.add_to_collection("your_collection_name", str(your_proto))  # stores the text format
or
from google.protobuf import any_pb2

any_buf = any_pb2.Any()
any_buf.Pack(your_proto)  # Pack() fills any_buf in place and returns None
tf.add_to_collection("your_collection_name", any_buf.SerializeToString())
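Note that str(your_proto) stores the human-readable text format, while the Any route stores compact binary bytes that also preserve the type URL. A hedged sketch of parsing the text-format variant back, where YourProtoType stands in for whatever message class your_proto is:

from google.protobuf import text_format

restored = YourProtoType()                 # hypothetical message class
text_format.Parse(stored_text, restored)   # stored_text: the string read back from the collection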
The .pb file, saved_model.pb, saved in the path you specified (export_dir), looks something like the following:
{ # (tensorflow.SavedModel) size=89B
  saved_model_schema_version: 1
  meta_graphs: { # (tensorflow.MetaGraphDef) size=85B
    meta_info_def: { # (tensorflow.MetaGraphDef.MetaInfoDef) size=29B
      stripped_op_list: { # (tensorflow.OpList) size=0B
      } # meta_graphs[0].meta_info_def.stripped_op_list
      tags : [ "serve" ] # size=5
      tensorflow_version : "1.13.1" # size=9
      tensorflow_git_version: "unknown" # size=7
    } # meta_graphs[0].meta_info_def
    graph_def: { # (tensorflow.GraphDef) size=4B
      versions: { # (tensorflow.VersionDef) size=2B
        producer : 23
      } # meta_graphs[0].graph_def.versions
    } # meta_graphs[0].graph_def
    collection_def: { # (tensorflow.MetaGraphDef.CollectionDefEntry) size=46B
      key : "my_proto_collection" # size=19
      value: { # (tensorflow.CollectionDef) size=23B
        bytes_list: { # (tensorflow.CollectionDef.BytesList) size=21B
          value: [ "my_proto_serialized" ] # size=19
        } # meta_graphs[0].collection_def[0].value.bytes_list
      } # meta_graphs[0].collection_def[0].value
    } # meta_graphs[0].collection_def[0]
  } # meta_graphs[0]
}
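To read the serialized proto back, load the SavedModel and pull the bytes out of collection_def. A minimal sketch, assuming the export above:

import tensorflow as tf

with tf.Session(graph=tf.Graph()) as sess:
    meta_graph_def = tf.saved_model.loader.load(
        sess, [tf.saved_model.tag_constants.SERVING], export_dir)

# The string added via tf.add_to_collection ends up in bytes_list
value = meta_graph_def.collection_def["my_proto_collection"].bytes_list.value[0]
print(value)  # b'my_proto_serialized'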