 

How to implement an async gRPC Python server?

Tags: python, grpc

I need to call a Celery task for each gRPC request and return the result. In the default gRPC implementation, each request is processed in a separate thread from a thread pool.

In my case, the server is supposed to handle ~400 requests per second in batch mode, so a single request may have to wait up to 1 second for its result because of the batching. That means the thread pool would need more than 400 workers to avoid blocking.

Can this be done asynchronously? Thanks a lot.

class EventReporting(ss_pb2.BetaEventReportingServicer, ss_pb2.BetaDeviceMgtServicer):
    def ReportEvent(self, request, context):
        res = tasks.add.delay(1, 2)
        result = res.get()  # <-- here I have to block
        return ss_pb2.GeneralReply(message='Hello, %s!' % result.message)
Asked Jul 15 '16 by xun changqing

2 Answers

As noted by @Michael in a comment, as of version 1.32 gRPC supports asyncio in its Python API. If you're using an earlier version, you can still reach the asyncio API through the experimental package: from grpc.experimental import aio. An asyncio hello-world example has also been added to the gRPC repo. The following code is a copy of the example server:

import logging
import asyncio
from grpc import aio

import helloworld_pb2
import helloworld_pb2_grpc


class Greeter(helloworld_pb2_grpc.GreeterServicer):

    async def SayHello(self, request, context):
        return helloworld_pb2.HelloReply(message='Hello, %s!' % request.name)


async def serve():
    server = aio.server()
    helloworld_pb2_grpc.add_GreeterServicer_to_server(Greeter(), server)
    listen_addr = '[::]:50051'
    server.add_insecure_port(listen_addr)
    logging.info("Starting server on %s", listen_addr)
    await server.start()
    await server.wait_for_termination()


if __name__ == '__main__':
    logging.basicConfig(level=logging.INFO)
    asyncio.run(serve())
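
Applied to the question above, the blocking res.get() call can be pushed onto a worker thread so the event loop stays free while waiting for the batched Celery result. This is only a rough sketch: the servicer base class name (ss_pb2_grpc.EventReportingServicer) and the use of the plain task result are assumptions, since the original beta-API stubs and the tasks module aren't shown here.

import asyncio

class EventReporting(ss_pb2_grpc.EventReportingServicer):

    async def ReportEvent(self, request, context):
        res = tasks.add.delay(1, 2)
        # Wait for the Celery result in a thread-pool worker so the
        # asyncio event loop keeps handling other incoming requests.
        result = await asyncio.get_running_loop().run_in_executor(None, res.get)
        return ss_pb2.GeneralReply(message='Hello, %s!' % result)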

See my other answer for how to implement the client.
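
That answer isn't reproduced here, but for reference a minimal asyncio client against the same hello-world service looks roughly like this (a sketch based on the generated helloworld modules, not the linked answer verbatim):

import asyncio
from grpc import aio

import helloworld_pb2
import helloworld_pb2_grpc


async def run():
    # An async channel; closed automatically when the block exits.
    async with aio.insecure_channel('localhost:50051') as channel:
        stub = helloworld_pb2_grpc.GreeterStub(channel)
        response = await stub.SayHello(helloworld_pb2.HelloRequest(name='world'))
        print('Greeter client received: %s' % response.message)


if __name__ == '__main__':
    asyncio.run(run())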

Answered Sep 29 '22 by alan

It can be done asynchronously if your call to res.get can be made asynchronous (if it is defined with the async keyword).

While grpc.server says it requires a futures.ThreadPoolExecutor, it will actually work with any futures.Executor that calls the behaviors submitted to it on some thread other than the one on which they were passed. Were you to pass to grpc.server a futures.Executor implemented by you that only used one thread to carry out four hundred (or more) concurrent calls to EventReporting.ReportEvent, your server should avoid the kind of blocking that you describe.
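
To make the interface concrete, here is a minimal, hypothetical futures.Executor subclass handed to grpc.server. For simplicity this sketch runs each submitted behavior on a fresh daemon thread; the suggestion above is to replace that dispatch with a mechanism (greenlets, an event loop, etc.) that multiplexes all the concurrent calls onto a single thread, which this sketch does not attempt:

import threading
from concurrent import futures

import grpc


class ThreadPerCallExecutor(futures.Executor):
    """Bare-bones Executor: grpc.server only needs submit() to run the
    behavior on some other thread and return a Future tracking it."""

    def submit(self, fn, *args, **kwargs):
        future = futures.Future()

        def run():
            try:
                future.set_result(fn(*args, **kwargs))
            except Exception as exc:
                # Surface handler errors through the Future, as grpc expects.
                future.set_exception(exc)

        threading.Thread(target=run, daemon=True).start()
        return future


server = grpc.server(ThreadPerCallExecutor())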

Answered Sep 29 '22 by Nathaniel Manista At Google