I must be doing something wrong... my gRPC server is implemented in node.js:
function handler(call, callback) {
  console.log('Received request at ' + Date.now());
  // Respond after ~100ms; the callback takes (error, response).
  setTimeout(() => {
    callback(null, { message: 'Done and done' });
  }, 100);
}
If I call it 1,000 times from Node, I get all 1,000 responses in about 100ms:
const resps = [];
for (let i = 0; i < 1000; i += 1) {
  client.doit({ data }, (err, resp) => {
    resps.push(resp);
    if (resps.length === 1000) {
      onDone();
    }
  });
}
However, when I call the server from Python using the stub's .future API, I can see that the server only receives each request after the previous one has returned:
for _ in range(1000):
    message = Message(data=data)
    resp = client.doit.future(message)
    resp = resp.result()
    resps.append(resp)
I get that Node's I/O paradigm is different (everything is async, event loop, etc.), and the Python example above blocks on resp.result(), but my question is: how can I change/optimize the Python client so that it makes multiple calls to my server without waiting for the first one to return?
This tutorial provides a basic Python programmer's introduction to working with gRPC. By walking through this example you'll learn how to define a service in a .proto file, generate server and client code using the protocol buffer compiler, and use the Python gRPC API to write a simple client and server for your service.
It can be done asynchronously if your call to res.get can be made asynchronously (i.e. if it is defined with the async keyword). Also note that while the grpc.server documentation says it requires a futures.ThreadPoolExecutor, it will actually work with any futures.Executor that calls the behaviors submitted to it on some thread other than the one on which they were passed.
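For context, here is a minimal sketch of what such a threaded Python server looks like; the generated module, servicer, and message names are placeholders for illustration, not the asker's actual generated code:

from concurrent import futures
import grpc

# Placeholder generated modules for a service with a unary `doit` RPC.
import my_service_pb2 as pb2
import my_service_pb2_grpc as pb2_grpc

class MyServicer(pb2_grpc.MyServiceServicer):
    def doit(self, request, context):
        # Each handler runs on a worker thread from the executor below,
        # so one slow handler does not serialize the whole server.
        return pb2.Reply(message='Done and done')

def serve():
    # grpc.server takes a concurrent.futures executor; ThreadPoolExecutor
    # is the usual choice.
    server = grpc.server(futures.ThreadPoolExecutor(max_workers=10))
    pb2_grpc.add_MyServiceServicer_to_server(MyServicer(), server)
    server.add_insecure_port('[::]:50051')
    server.start()
    server.wait_for_termination()

if __name__ == '__main__':
    serve()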
If you're using an earlier version of grpcio (one where the asyncio API is not yet part of the stable surface), you can still use it via the experimental module: from grpc.experimental import aio. An asyncio hello world example has also been added to the gRPC repo.
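With the asyncio API the client can issue all 1,000 calls concurrently from a single event loop, much like the Node client above. A minimal sketch, assuming placeholder generated modules my_service_pb2/my_service_pb2_grpc with a unary doit RPC and a Message type (substitute your own generated names):

import asyncio
from grpc import aio

# Placeholder generated modules; substitute the ones produced from your .proto.
import my_service_pb2 as pb2
import my_service_pb2_grpc as pb2_grpc

async def run(data, n=1000):
    async with aio.insecure_channel('localhost:50051') as channel:
        stub = pb2_grpc.MyServiceStub(channel)
        # Start every call first, then await them together; all the calls
        # are in flight before any response is awaited.
        calls = [stub.doit(pb2.Message(data=data)) for _ in range(n)]
        return await asyncio.gather(*calls)

if __name__ == '__main__':
    resps = asyncio.run(run(data=b'payload'))  # placeholder payload
    print(len(resps), 'responses received')

On older grpcio versions the import becomes from grpc.experimental import aio, as noted above.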
You can see the complete client example in greeter_async_client.cc. On the C++ side, the server implementation requests an RPC call with a tag and then waits for the completion queue to return that tag; that request-a-tag then wait-for-the-tag cycle is the basic flow for handling an RPC asynchronously.
You can make asynchronous unary calls in Python like so:
import time
import grpc

# Generated modules; the actual module names depend on your .proto filename
# (assumed here to be asynch.proto).
import asynch_pb2 as pb2
import asynch_pb2_grpc as pb2_grpc

class RpcHandler:
    def rpc_async_req(self, stub):
        def process_response(future):
            # Runs on a gRPC worker thread once the call completes.
            duck.quack(future.result().quackMsg)

        duck = Duck()
        call_future = stub.Quack.future(pb2.QuackRequest(quackTimes=5))  # non-blocking call
        call_future.add_done_callback(process_response)  # non-blocking call
        print('sent request, we could do other stuff or wait, lets wait this time. . .')
        time.sleep(12)  # the main thread would drop out here with no results if I don't sleep
        print('exiting')

class Duck:
    def quack(self, msg):
        print(msg)

def main():
    channel = grpc.insecure_channel('localhost:12345')
    stub = pb2_grpc.DuckServiceStub(channel)
    rpc_handler = RpcHandler()
    rpc_handler.rpc_async_req(stub=stub)

if __name__ == '__main__':
    main()
The .proto file:
syntax = "proto3";
package asynch;
service DuckService {
rpc Quack (QuackRequest) returns (QuackResponse);
}
message QuackRequest {
int32 quackTimes = 1;
}
message QuackResponse {
string quackMsg = 1;
}
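Coming back to the original question, the same .future mechanism also works without asyncio: issue every call first and only collect the results afterwards, so the requests overlap on the server. A minimal sketch reusing the client, doit, and Message names from the question:

# Fire off all 1,000 RPCs before blocking on any result.
pending = [client.doit.future(Message(data=data)) for _ in range(1000)]
# Each .result() blocks only until that particular call finishes;
# by this point all the calls are already in flight concurrently.
resps = [f.result() for f in pending]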