
Python run_in_executor and forget?

How can I set a blocking function to run in an executor in such a way that the result doesn't matter, so the main thread doesn't wait for it or get slowed down by it?

To be honest, I'm not sure this is even the right solution. All I want is some kind of processing queue separate from the main process, so that it doesn't block the server application from responding to requests, since this type of web server runs one worker for many requests.

Preferably I would like to stay away from solutions like Celery, but if that's the best option I'm willing to learn it.

The context here is an async web server that generates PDF files containing large images.

import asyncio
from concurrent.futures import ProcessPoolExecutor

from sanic import Sanic, response

app = Sanic()
# App "global" worker pool
executor = ProcessPoolExecutor(max_workers=5)

@app.route('/')
async def getPdf(request):
    asyncio.create_task(renderPdfsInExecutor(request.json))
    # This should be returned "instantly" regardless of pdf generation time
    return response.text('Pdf being generated, it will be sent to your email when done')

async def renderPdfsInExecutor(json):
    asyncio.get_running_loop().run_in_executor(executor, syncRenderPdfs, json)

def syncRenderPdfs(json):
    # Some PDF library that downloads images synchronously
    pdfs = somePdfLibrary.generatePdfsFromJson(json)
    sendToDefaultMail(pdfs)

The above code gives the error below (yes, it is running as admin):

PermissionError [WinError 5] Access denied
Future exception was never retrieved

Bonus question: do I gain anything by running an asyncio loop inside the executor, so that if it is handling several PDF requests at once it distributes the processing between them? If yes, how do I do it?

Mojimi asked Feb 18 '19
People also ask

What is Run_in_executor Python?

run_in_executor is used to manage threads from within an event loop. To this end, it needs to wrap the thread into a Future, which needs to be assigned to an event loop (in one way or another). The reason the method is stored directly on a loop object is probably historical; it might just as well have been a module-level function in asyncio.
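A minimal sketch of that idea (blocking_work here is a made-up placeholder for any real synchronous call): the running loop wraps the callable into an asyncio Future bound to that loop.

import asyncio
import time

def blocking_work(seconds):
    # Hypothetical blocking callable standing in for real synchronous work
    time.sleep(seconds)
    return seconds

async def main():
    loop = asyncio.get_running_loop()
    # Passing None uses the loop's default ThreadPoolExecutor; the call
    # returns an asyncio Future bound to this loop
    fut = loop.run_in_executor(None, blocking_work, 0.1)
    print(asyncio.isfuture(fut))  # True
    await fut

asyncio.run(main())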

What does Run_in_executor return?

The run_in_executor() method of the event loop takes an executor instance, a regular callable to invoke, and any arguments to be passed to the callable. It returns a Future that can be used to wait for the function to finish its work and return something. If no executor is passed in, a ThreadPoolExecutor is created.
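For example (a sketch; cpu_or_io_bound is a made-up placeholder), awaiting the returned Future yields the callable's return value, and passing None for the executor falls back to the default ThreadPoolExecutor:

import asyncio
from concurrent.futures import ThreadPoolExecutor

def cpu_or_io_bound(x):
    # Hypothetical blocking callable
    return x * 2

async def main():
    loop = asyncio.get_running_loop()
    with ThreadPoolExecutor(max_workers=2) as pool:
        # Executor instance, the callable, then its positional arguments
        result = await loop.run_in_executor(pool, cpu_or_io_bound, 21)
    print(result)  # 42
    # With None as the executor, the loop's default ThreadPoolExecutor is used
    result = await loop.run_in_executor(None, cpu_or_io_bound, 21)

asyncio.run(main())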

How do I stop Asyncio from running?

Run an asyncio event loop:

run_until_complete(<some Future object>) – runs a given Future object, usually a coroutine defined by the async/await pattern, until it's complete.
run_forever() – runs the loop forever.
stop() – stops a running loop.
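A small self-contained sketch of those three calls:

import asyncio

loop = asyncio.new_event_loop()

# run_until_complete: drive one awaitable to completion, then return
loop.run_until_complete(asyncio.sleep(0.1))

# run_forever + stop: the loop runs until something calls loop.stop()
loop.call_later(1, loop.stop)   # schedule stop() one second from now
loop.run_forever()              # returns once stop() has been processed

loop.close()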

Which function is used to run Awaitables concurrently in Asyncio?

The gather() method runs awaitable objects (objects that can be used with the await keyword) concurrently.
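A quick sketch (fetch here is a made-up coroutine) showing gather() running awaitables concurrently and returning their results in argument order:

import asyncio

async def fetch(name, delay):
    # Hypothetical coroutine standing in for real I/O
    await asyncio.sleep(delay)
    return name

async def main():
    # Both coroutines run concurrently; results come back in argument order
    results = await asyncio.gather(fetch("a", 0.2), fetch("b", 0.1))
    print(results)  # ['a', 'b']

asyncio.run(main())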


1 Answer

Ok, so first of all there is a misunderstanding. This

async def getPdf(request):
    asyncio.create_task(renderPdfsInExecutor(request.json))
    ...

async def renderPdfsInExecutor(json):
    asyncio.get_running_loop().run_in_executor(executor, syncRenderPdfs, json)

is redundant. It is enough to do

async def getPdf(request):
    asyncio.get_running_loop().run_in_executor(executor, syncRenderPdfs, request.json)
    ...

or (since you don't want to await) even better

async def getPdf(request):
    executor.submit(syncRenderPdfs, request.json)
    ...

Now, the problem you get is that syncRenderPdfs throws PermissionError. It is not handled, so Python warns you: "Hey, some background code threw an error, but the code is not owned by anyone, so what the heck?". That's why you get Future exception was never retrieved. You have a problem with the pdf library itself, not with asyncio. Once you fix that inner problem, it is also a good idea to be safe:

def syncRenderPdfs(json):
    try:
        # Some PDF library that downloads images synchronously
        pdfs = somePdfLibrary.generatePdfsFromJson(json)
        sendToDefaultMail(pdfs)
    except Exception:
        logger.exception('Something went wrong')  # or whatever

Your "permission denied" issue is a whole different thing and you should debug it and/or post a separate question for that.

As for the final question: yes, the executor will queue tasks and distribute them evenly between workers.
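A small sketch of that queuing behaviour, with a hypothetical render job standing in for syncRenderPdfs: with max_workers=5, at most five jobs run at once and the rest wait in the executor's internal queue until a worker frees up.

import os
import time
from concurrent.futures import ProcessPoolExecutor

def render(job_id):
    # Hypothetical stand-in for syncRenderPdfs
    time.sleep(0.5)
    return job_id, os.getpid()

if __name__ == '__main__':  # required for ProcessPoolExecutor on Windows
    with ProcessPoolExecutor(max_workers=5) as executor:
        futures = [executor.submit(render, i) for i in range(12)]
        for fut in futures:
            print(fut.result())   # (job_id, pid of the worker that ran it)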

EDIT: As we've talked about in the comments, the actual problem might be with the Windows environment you work on, or more precisely with the ProcessPoolExecutor, i.e. spawning processes may change permissions. I advise using a ThreadPoolExecutor, assuming it works fine on the platform.
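For illustration, a self-contained sketch of that swap (blocking_render is a made-up placeholder for syncRenderPdfs); only the executor type changes, so no extra processes are spawned:

import asyncio
import time
from concurrent.futures import ThreadPoolExecutor

def blocking_render(payload):
    # Hypothetical stand-in for syncRenderPdfs
    time.sleep(0.5)
    return payload

async def main():
    loop = asyncio.get_running_loop()
    with ThreadPoolExecutor(max_workers=5) as executor:
        # Same run_in_executor call as before, just backed by threads
        await loop.run_in_executor(executor, blocking_render, {'job': 1})

asyncio.run(main())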

freakish answered Sep 30 '22