
Docker Tensorflow-Serving Predictions too large

I'm trying to serve my model using Docker + tensorflow-serving. However, due to restrictions on serving a model that uses an iterator (created with make_initializable_iterator()), I had to split up my model.

I'm using gRPC to interface with my model running in Docker. The problem is that my predicted tensor is about 10 MB serialized, well above gRPC's default 4 MB message limit. The error I'm getting is:

"grpc_message":"Received message larger than max (9830491 vs. 4194304)"
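The two numbers in the error are the serialized response size and gRPC's default receive cap of 4 MiB. As a side note, a 320×240 image with 32 float32 channels (a guessed shape, purely for illustration) comes to almost exactly the reported size:

```python
DEFAULT_GRPC_MAX = 4 * 1024 * 1024  # gRPC's default max receive size: 4194304
received = 9830491                  # message size reported in the error

# Hypothetical 320x240 prediction with 32 float32 (4-byte) channels:
tensor_bytes = 320 * 240 * 32 * 4   # 9830400 bytes of raw tensor data

print(received > DEFAULT_GRPC_MAX)      # True: the message exceeds the cap
print(received - tensor_bytes)          # 91: roughly protobuf framing overhead
```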

Is there a way to write out my predictions to disk instead of transmitting them in the gRPC response? The output is a 32-channel tensor, so I can't encode it as a PNG before saving it to disk with tf.io.write_file.

Thanks!

asked Feb 17 '19 by Jan-Michael Tressler

2 Answers

The code to set the message size to unlimited in a gRPC client request using C++ is shown below:

grpc::ChannelArguments ch_args;
ch_args.SetMaxReceiveMessageSize(-1);  // -1 removes the 4 MB receive limit
std::shared_ptr<grpc::Channel> ch = grpc::CreateCustomChannel(
    "localhost:6060", grpc::InsecureChannelCredentials(), ch_args);
answered Oct 16 '22 by Tensorflow Support


The default message length limit in gRPC is 4 MB, but you can extend it in your gRPC client (and server) in Python as shown below. You will then be able to send and receive large messages without streaming:

import grpc

MAX_MESSAGE_LENGTH = -1  # -1 means unlimited
channel = grpc.insecure_channel(
    'localhost:6060',
    options=[('grpc.max_send_message_length', MAX_MESSAGE_LENGTH),
             ('grpc.max_receive_message_length', MAX_MESSAGE_LENGTH)])
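
If you also run your own gRPC server in front of the model (the answer mentions the server side as well), the same options apply there. A minimal sketch, assuming grpcio is installed and the port and worker count are placeholders:

```python
from concurrent import futures

import grpc

MAX_MESSAGE_LENGTH = -1  # -1 removes the limit entirely

# Build a server that can accept and send arbitrarily large messages.
server = grpc.server(
    futures.ThreadPoolExecutor(max_workers=4),
    options=[('grpc.max_send_message_length', MAX_MESSAGE_LENGTH),
             ('grpc.max_receive_message_length', MAX_MESSAGE_LENGTH)])
server.add_insecure_port('[::]:6060')
```

Register your servicer with `server` before calling `server.start()`. Note this only applies to a server you control; it is not a configuration of the stock TensorFlow Serving binary.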

In Go, the equivalent settings are exposed as functions; see:

https://godoc.org/google.golang.org/grpc#MaxMsgSize https://godoc.org/google.golang.org/grpc#WithMaxMsgSize

answered Oct 16 '22 by Prem