I followed the steps in one of the TF beginner tutorials to create a simple classification model. Here they are:
from __future__ import absolute_import, division, print_function, unicode_literals
import numpy as np
import pandas as pd
import tensorflow as tf
from tensorflow import feature_column
from tensorflow.keras import layers
from sklearn.model_selection import train_test_split
URL = 'https://storage.googleapis.com/applied-dl/heart.csv'
dataframe = pd.read_csv(URL)
dataframe.head()
train, test = train_test_split(dataframe, test_size=0.2)
train, val = train_test_split(train, test_size=0.2)
def df_to_dataset(dataframe, shuffle=True, batch_size=32):
    dataframe = dataframe.copy()
    labels = dataframe.pop('target')
    ds = tf.data.Dataset.from_tensor_slices((dict(dataframe), labels))
    if shuffle:
        ds = ds.shuffle(buffer_size=len(dataframe))
    ds = ds.batch(batch_size)
    return ds
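To sanity-check what this helper returns, one batch can be pulled and inspected (a quick sketch, not part of the tutorial steps):

# Each element is a (dict of feature-name -> tensor, labels) pair.
for feature_batch, label_batch in df_to_dataset(train).take(1):
    print('Feature keys:', list(feature_batch.keys()))
    print('A batch of ages:', feature_batch['age'])
    print('A batch of targets:', label_batch)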
batch_size = 5 # A small batch size is used for demonstration purposes
train_ds = df_to_dataset(train, batch_size=batch_size)
val_ds = df_to_dataset(val, shuffle=False, batch_size=batch_size)
test_ds = df_to_dataset(test, shuffle=False, batch_size=batch_size)
feature_columns = []
for header in ['age', 'trestbps', 'chol', 'thalach', 'oldpeak', 'slope', 'ca']:
    feature_columns.append(feature_column.numeric_column(header))
# 'thal' is a categorical string feature; embed it in 8 dimensions
thal = feature_column.categorical_column_with_vocabulary_list(
    'thal', ['fixed', 'normal', 'reversible'])
thal_embedding = feature_column.embedding_column(thal, dimension=8)
feature_columns.append(thal_embedding)
feature_layer = tf.keras.layers.DenseFeatures(feature_columns)
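As a quick shape check (a sketch of my own, using the small-batch train_ds defined above), the DenseFeatures layer can be applied to a single example batch:

# The layer turns the dict of raw feature tensors into one dense matrix
# of shape (batch_size, total_transformed_feature_width).
example_batch = next(iter(train_ds))[0]
print(feature_layer(example_batch).numpy().shape)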
batch_size = 32
train_ds = df_to_dataset(train, batch_size=batch_size)
val_ds = df_to_dataset(val, shuffle=False, batch_size=batch_size)
test_ds = df_to_dataset(test, shuffle=False, batch_size=batch_size)
model = tf.keras.Sequential([
    feature_layer,
    layers.Dense(128, activation='relu'),
    layers.Dense(128, activation='relu'),
    layers.Dense(1, activation='sigmoid')
])
model.compile(optimizer='adam',
              loss='binary_crossentropy',
              metrics=['accuracy'],
              run_eagerly=True)
model.fit(train_ds,
          validation_data=val_ds,
          epochs=5)
And I saved the model with:
model.save("model/", save_format='tf')
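To inspect what actually got exported (and how the signature listed further down was produced), TensorFlow ships a saved_model_cli tool:

saved_model_cli show --dir model/ --all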
Then, I try to serve this model using this TF tutorial. I do the following:
docker pull tensorflow/serving
docker run -p 8501:8501 --mount type=bind,source=/path/to/model/,target=/models/model -e MODEL_NAME=model -t tensorflow/serving
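Side note: TensorFlow Serving looks for numeric version subdirectories under the mounted target path, so on disk the model directory should look roughly like this (version 1 assumed):

/path/to/model/
    1/
        saved_model.pb
        variables/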
And I try to call the model this way:
curl -d '{"inputs": {"age": [0], "trestbps": [0], "chol": [0], "thalach": [0], "oldpeak": [0], "slope": [1], "ca": [0], "exang": [0], "restecg": [0], "fbs": [0], "cp": [0], "sex": [0], "thal": ["normal"], "target": [0] }}' -X POST http://localhost:8501/v1/models/model:predict
I get the following error:
{ "error": "indices = 1 is not in [0, 1)\n\t [[{{node StatefulPartitionedCall_51/StatefulPartitionedCall/sequential/dense_features/thal_embedding/thal_embedding_weights/GatherV2}}]]" }
It seems to be related to the embedding layer for the "thal" feature. But I have no idea what "indices = 1 is not in [0, 1)" means and why it happens.
When the error occurs, here is what the TF docker server logs:
2019-09-23 12:50:43.921721: W external/org_tensorflow/tensorflow/core/framework/op_kernel.cc:1502] OP_REQUIRES failed at lookup_table_op.cc:952 : Failed precondition: Table already initialized.
Any idea where the error comes from and how I could fix it?
Python version: 3.6
tensorflow version: 2.0.0-rc0
latest TensorFlow/serving (as of 20/09/2019)
Model signature:
signature_def['__saved_model_init_op']:
  The given SavedModel SignatureDef contains the following input(s):
  The given SavedModel SignatureDef contains the following output(s):
    outputs['__saved_model_init_op'] tensor_info:
        dtype: DT_INVALID
        shape: unknown_rank
        name: NoOp
  Method name is:

signature_def['serving_default']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['age'] tensor_info:
        dtype: DT_INT32
        shape: (-1, 1)
        name: serving_default_age:0
    inputs['ca'] tensor_info:
        dtype: DT_INT32
        shape: (-1, 1)
        name: serving_default_ca:0
    inputs['chol'] tensor_info:
        dtype: DT_INT32
        shape: (-1, 1)
        name: serving_default_chol:0
    inputs['cp'] tensor_info:
        dtype: DT_INT32
        shape: (-1, 1)
        name: serving_default_cp:0
    inputs['exang'] tensor_info:
        dtype: DT_INT32
        shape: (-1, 1)
        name: serving_default_exang:0
    inputs['fbs'] tensor_info:
        dtype: DT_INT32
        shape: (-1, 1)
        name: serving_default_fbs:0
    inputs['oldpeak'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 1)
        name: serving_default_oldpeak:0
    inputs['restecg'] tensor_info:
        dtype: DT_INT32
        shape: (-1, 1)
        name: serving_default_restecg:0
    inputs['sex'] tensor_info:
        dtype: DT_INT32
        shape: (-1, 1)
        name: serving_default_sex:0
    inputs['slope'] tensor_info:
        dtype: DT_INT32
        shape: (-1, 1)
        name: serving_default_slope:0
    inputs['thal'] tensor_info:
        dtype: DT_STRING
        shape: (-1, 1)
        name: serving_default_thal:0
    inputs['thalach'] tensor_info:
        dtype: DT_INT32
        shape: (-1, 1)
        name: serving_default_thalach:0
    inputs['trestbps'] tensor_info:
        dtype: DT_INT32
        shape: (-1, 1)
        name: serving_default_trestbps:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['output_1'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 1)
        name: StatefulPartitionedCall:0
  Method name is: tensorflow/serving/predict
I have come across the same issue. Change the request to the following format:
curl -d '{"inputs": {"age": [[0]], "trestbps": [[0]], "chol": [[0]], "thalach": [[0]], "oldpeak": [[0]], "slope": [[1]], "ca": [[0]], "exang": [[0]], "restecg": [[0]], "fbs": [[0]], "cp": [[0]], "sex": [[0]], "thal": [["normal"]], "target": [[0]] }}' -X POST http://localhost:8501/v1/models/model:predict
Note: every value is wrapped in a second pair of brackets, i.e. [[0]] or [["normal"]], so that each input has the rank-2 shape (-1, 1) declared in the serving signature.
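Equivalently, here is a small Python sketch of the same request using the requests library (my own illustration; it assumes the server from the question is running on localhost:8501, and I drop the 'target' key since it is not among the signature's inputs):

import requests

# Every input is a list of lists so it arrives with shape (-1, 1),
# as required by the serving signature; 'oldpeak' is the only float.
payload = {"inputs": {
    "age": [[0]], "trestbps": [[0]], "chol": [[0]], "thalach": [[0]],
    "oldpeak": [[0.0]], "slope": [[1]], "ca": [[0]], "exang": [[0]],
    "restecg": [[0]], "fbs": [[0]], "cp": [[0]], "sex": [[0]],
    "thal": [["normal"]],
}}
resp = requests.post("http://localhost:8501/v1/models/model:predict", json=payload)
print(resp.json())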
I am also trying to serve a model containing an embedding layer, an LSTM layer, etc., but I am receiving some other errors. I have even raised an issue on TF.
Anyway, the problem I see in your code is the type of saved model you are using for serving with Docker. If you read here, it says you need "a SavedModel to serve", which is not what Keras's model.save produces but comes from another TF API; here is the page describing how to create a SavedModel from a trained Keras model. Give this a try and let us know the results.
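For concreteness, here is a minimal sketch of what that suggestion amounts to, exporting through the tf.saved_model API instead of model.save (the "1" version subdirectory is my assumption, chosen to match the layout TF Serving expects):

import tensorflow as tf

# Export the trained Keras model as a plain SavedModel; TF Serving will
# pick up the numeric version directory "1".
tf.saved_model.save(model, "/path/to/model/1/")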