I've created a Keras model with a TensorFlow backend but am having difficulty exporting my model for use on ML Engine (as a saved_model.pb). Here's what I'm doing:
dataset = tf.data.Dataset.from_tensor_slices((data_train, labels_train))
dataset = dataset.map(lambda x, y: ({'reviews': x}, y))
val_dataset = tf.data.Dataset.from_tensor_slices((data_test, labels_test))
val_dataset = val_dataset.map(lambda x, y: ({'reviews': x}, y))
dataset = dataset.batch(self.batch_size).repeat() # repeat infinitely
val_dataset = val_dataset.batch(self.batch_size).repeat()
Then I perform some preprocessing on my Dataset objects:
dataset = dataset.map(lambda x, y: preprocess_text_and_y(x,y))
val_dataset = val_dataset.map(lambda x, y: preprocess_text_and_y(x,y))
I build my Keras model and call .fit(...). It all works.
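In case it matters, the model is roughly along these lines (a simplified sketch, not the exact architecture; the vocabulary size and layer sizes here are placeholders):
import tensorflow as tf

# Sketch only: the input name must match the 'reviews' key produced by the dataset,
# and the input shape is the fixed length-100 vector of word indices.
reviews = tf.keras.layers.Input(shape=(100,), name='reviews')
embedded = tf.keras.layers.Embedding(input_dim=10000, output_dim=64)(reviews)
pooled = tf.keras.layers.GlobalAveragePooling1D()(embedded)
hidden = tf.keras.layers.Dense(16, activation='relu')(pooled)
output = tf.keras.layers.Dense(1, activation='sigmoid')(hidden)
model = tf.keras.Model(reviews, output)
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])

# The datasets repeat forever, so the step counts must be given explicitly.
model.fit(dataset,
          epochs=5,
          steps_per_epoch=1000,
          validation_data=val_dataset,
          validation_steps=100)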
Then I try to export my model, with something like this:
def export(data_vocab):
    estimator = tf.keras.estimator.model_to_estimator(model)

    def serving():
        data_table = tf.contrib.lookup.index_table_from_tensor(
            tf.constant(self.data_vocab), default_value=0)
        inputs = {
            'reviews': tf.placeholder(shape=[1], dtype=tf.string)
        }
        preproc = inputs.copy()
        preproc = preprocess_text(preproc, data_table)
        return tf.estimator.export.ServingInputReceiver(preproc, inputs)

    estimator.export_savedmodel('./test_export', serving)
And unfortunately, I get back:
ValueError: The last dimension of the inputs to `Dense` should be defined. Found `None`.
I googled around and found this: How to use TensorFlow Dataset API in combination with dense layers, which says I need to call set_shape(...) on the tensor. I'm preprocessing strings into an array of integers with length 100, so I've tried adding x['reviews'].set_shape([100]) in my preprocess_text function.
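Trimmed down, that part of the preprocessing looks like this (a sketch; the actual tokenization and vocabulary lookup are omitted):
def preprocess_text(features, data_table):
    # ... tokenize features['reviews'] and look up vocabulary indices so it
    #     becomes a length-100 integer vector (details omitted) ...
    features['reviews'].set_shape([100])  # the line I added per the linked answer
    return features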
But then that breaks training with:
ValueError: Shapes must be equal rank, but are 2 and 1
Any thoughts on how to fix?
Thanks!
If you set the shape after batching, you will need to set it to [None, 100] to include the batch axis:
x['reviews'].set_shape([None, 100])
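In your pipeline the preprocessing map is applied after .batch(...), so the tensors it receives already carry a batch dimension, which is also why set_shape([100]) fails there with the rank error. A sketch of where the call goes (preprocessing details omitted, as in the question):
def preprocess_text(features, data_table):
    # ... tokenization / vocabulary lookup as before ...
    # This map runs on already-batched tensors, so keep the batch axis unknown:
    features['reviews'].set_shape([None, 100])
    return features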