I trained a BoostedTreesClassifier and would like to use the "directional feature contributions" laid out in this tutorial. Basically, they let you "interpret" the model's prediction and measure each feature's contribution via the experimental_predict_with_explanations method. It works great after I train the model and then call the method.
But I want to export the trained estimator with the export_saved_model method. When I load the estimator back into a Python environment with tf.saved_model.load, I apparently lose that functionality, because I can no longer call the experimental_predict_with_explanations method. The loaded model only has the "predict" signature.
Ultimately I'd like to use this trained estimator with TensorFlow Serving. I don't suppose that functionality is available through the "Predict" SignatureDef? Has anyone tried this before?
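For reference, a minimal sketch of what I'm doing (feature_columns, the input functions, and feature_spec stand in for my actual setup):
import tensorflow as tf

# Works on the live estimator: train, then explain.
est = tf.estimator.BoostedTreesClassifier(feature_columns, n_batches_per_layer=1)
est.train(train_input_fn)
for pred in est.experimental_predict_with_explanations(predict_input_fn):
    print(pred['dfc'])  # per-feature directional contributions

# After export and reload, the method is gone.
serving_fn = tf.estimator.export.build_parsing_serving_input_receiver_fn(feature_spec)
export_path = est.export_saved_model('exported_model', serving_fn)
loaded = tf.saved_model.load(export_path)
print(list(loaded.signatures.keys()))  # e.g. ['predict']; no explanations here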
A trained Estimator can be used with TensorFlow Serving via the "Predict" SignatureDef.
This can be achieved by using build_raw_serving_input_receiver_fn instead of build_parsing_serving_input_receiver_fn.
The relevant line of code is shown below:
serving_input_receiver_fn = tf.estimator.export.build_raw_serving_input_receiver_fn(feature_placeholders)
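For contrast, a rough sketch of the parsing variant (using make_parse_example_spec on the my_feature_columns list defined in the complete code below): it expects serialized tf.Example protos through a single string input, whereas the raw variant exposes one named input per feature, which is what the Predict SignatureDef further down shows.
# For contrast (not used here): the parsing receiver would make the
# SavedModel expect serialized tf.Example protos in one string input.
feature_spec = tf.feature_column.make_parse_example_spec(my_feature_columns)
parsing_serving_input_receiver_fn = tf.estimator.export.build_parsing_serving_input_receiver_fn(feature_spec)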
The complete code for a classification model with the Predict SignatureDef is shown below:
import tensorflow as tf
import iris_data

BATCH_SIZE = 100
STEPS = 1000
Export_Dir = 'Premade_Estimator_Export_Raw'  # No version number needed; export_saved_model creates a timestamped subdirectory

(train_x, train_y), (test_x, test_y) = iris_data.load_data()

# Feature columns describe how to use the input.
my_feature_columns = []
for key in train_x.keys():
    my_feature_columns.append(tf.feature_column.numeric_column(key=key))
print(my_feature_columns)

# Raw placeholders for the serving input receiver, one per feature.
columns = [('SepalLength', tf.float32), ('SepalWidth', tf.float32),
           ('PetalLength', tf.float32), ('PetalWidth', tf.float32)]
feature_placeholders = {name: tf.placeholder(dtype, [1], name=name + "_placeholder")
                        for name, dtype in columns}
print(feature_placeholders)

# Build a DNN with 2 hidden layers and 10 nodes in each hidden layer.
classifier = tf.estimator.DNNClassifier(
    feature_columns=my_feature_columns,
    hidden_units=[10, 10],  # Two hidden layers of 10 nodes each.
    n_classes=3)            # The model must choose between 3 classes.

# Train the model.
classifier.train(
    input_fn=lambda: iris_data.train_input_fn(train_x, train_y, BATCH_SIZE),
    steps=STEPS)

# Evaluate the model.
eval_result = classifier.evaluate(
    input_fn=lambda: iris_data.eval_input_fn(test_x, test_y, BATCH_SIZE))
print('\nTest set accuracy: {accuracy:0.3f}\n'.format(**eval_result))

# Generate predictions from the model.
expected = ['Setosa', 'Versicolor', 'Virginica']
predict_x = {
    'SepalLength': [5.1, 5.9, 6.9],
    'SepalWidth': [3.3, 3.0, 3.1],
    'PetalLength': [1.7, 4.2, 5.4],
    'PetalWidth': [0.5, 1.5, 2.1],
}
predictions = classifier.predict(
    input_fn=lambda: iris_data.eval_input_fn(features=predict_x, labels=None,
                                             batch_size=BATCH_SIZE))

template = '\nPrediction is "{}" ({:.1f}%), expected "{}"'
for pred_dict, expec in zip(predictions, expected):
    class_id = pred_dict['class_ids'][0]
    probability = pred_dict['probabilities'][class_id]
    print(template.format(iris_data.SPECIES[class_id], 100 * probability, expec))

# This is the important step: exporting with raw placeholders produces a
# Predict SignatureDef with one named input per feature.
serving_input_receiver_fn = tf.estimator.export.build_raw_serving_input_receiver_fn(feature_placeholders)
export_dir = classifier.export_saved_model(Export_Dir, serving_input_receiver_fn)
print('Exported to {}'.format(export_dir))
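The export can be smoke-tested in Python before serving. A minimal sketch using tf.contrib.predictor (TF 1.x), with export_dir being the timestamped path returned above:
from tensorflow.contrib import predictor

# Bind to the 'predict' signature of the exported SavedModel.
predict_fn = predictor.from_saved_model(export_dir, signature_def_key='predict')
print(predict_fn({'SepalLength': [5.1], 'SepalWidth': [3.3],
                  'PetalLength': [1.7], 'PetalWidth': [0.5]}))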
The SignatureDef of the above model is shown below.
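This is the format printed by saved_model_cli; a command of the following form should reproduce the listing (the timestamped subdirectory is a placeholder):
saved_model_cli show --dir Premade_Estimator_Export_Raw/<timestamp> --all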
MetaGraphDef with tag-set: 'serve' contains the following SignatureDefs:

signature_def['predict']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['PetalLength'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1)
        name: PetalLength_placeholder:0
    inputs['PetalWidth'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1)
        name: PetalWidth_placeholder:0
    inputs['SepalLength'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1)
        name: SepalLength_placeholder:0
    inputs['SepalWidth'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1)
        name: SepalWidth_placeholder:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['all_class_ids'] tensor_info:
        dtype: DT_INT32
        shape: (-1, 3)
        name: dnn/head/predictions/Tile:0
    outputs['all_classes'] tensor_info:
        dtype: DT_STRING
        shape: (-1, 3)
        name: dnn/head/predictions/Tile_1:0
    outputs['class_ids'] tensor_info:
        dtype: DT_INT64
        shape: (-1, 1)
        name: dnn/head/predictions/ExpandDims_2:0
    outputs['classes'] tensor_info:
        dtype: DT_STRING
        shape: (-1, 1)
        name: dnn/head/predictions/str_classes:0
    outputs['logits'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 3)
        name: dnn/logits/BiasAdd:0
    outputs['probabilities'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 3)
        name: dnn/head/predictions/probabilities:0
  Method name is: tensorflow/serving/predict
Inference can then be performed using the commands below:
sudo docker pull tensorflow/serving
sudo docker run -p 8501:8501 --mount type=bind,source=/usr/local/google/home/Jupyter_Notebooks/TF_Serving/Serving_Made_Easy/Serving_Demystified/Premade_Estimator_Export_Raw,target=/models/Premade_Estimator_Export_Raw -e MODEL_NAME=Premade_Estimator_Export_Raw -t tensorflow/serving &
curl -d '{"signature_name":"predict","instances": [{"SepalLength":[5.1],"SepalWidth":[3.3],"PetalLength":[1.7],"PetalWidth":[0.5]}]}' \
  -X POST http://localhost:8501/v1/models/Premade_Estimator_Export_Raw:predict
The output is shown below:
{"predictions": [{ "all_classes": ["0", "1", "2"], "probabilities": [0.996251881, 0.00374808488, 3.86118275e-15], "logits": [14.2761269, 8.69337177, -18.9079208], "class_ids": [0], "classes": ["0"], "all_class_ids": [0, 1, 2]}]}