I deployed a model that uses a TF Hub module to TensorFlow Serving with Docker.
Here is the TF Hub module my model uses:
https://tfhub.dev/google/universal-sentence-encoder-multilingual/1
Here is the command I use to run the Docker container:
docker run -t --rm -p 8501:8501 \
-v "/docker_dir/model_tf_serving:/models/mymodel" \
-e MODEL_NAME=mymodel \
tensorflow/serving &
This error occurred:
Not found: Op type not registered 'SentencepieceEncodeSparse' in binary running on c5e507bf091b. Make sure the Op and Kernel are registered in the binary running in this process. Note that if you are loading a saved graph which used ops from tf.contrib, accessing (e.g.) `tf.contrib.resampler` should be done before importing the graph, as contrib ops are lazily registered when the module is first accessed.
Is there a way to fix this?
I don't know if you're still having this problem, but what worked for me was to run

pip install tensorflow-text

and then, before loading the model,

import tensorflow_text

since TF Hub modules like this one depend on that package.
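A minimal sketch of what this looks like in practice, assuming the model is loaded or re-exported from Python (the module URL is from the question; everything else here is illustrative). The key point is that `tensorflow_text` must be imported in the same process that loads the graph, because importing it is what registers the SentencePiece ops:

```python
# pip install tensorflow-text   (pick the release matching your tensorflow version)
import tensorflow as tf
import tensorflow_text  # noqa: F401 -- importing this registers the SentencePiece
                        # ops (including SentencepieceEncodeSparse) with the runtime
import tensorflow_hub as hub

# Loading the multilingual encoder would fail with
# "Op type not registered 'SentencepieceEncodeSparse'"
# if tensorflow_text had not been imported first.
encoder = hub.load(
    "https://tfhub.dev/google/universal-sentence-encoder-multilingual/1")
```

Note that the import only has a side effect (op registration), so linters may flag it as unused; it still has to stay.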