passing file instance as an argument to the celery task raises "ValueError: I/O operation on closed file"

I need to pass a file as an argument to a Celery task, but the passed file arrives at the task already closed. This only happens when I execute the task asynchronously. Is this expected behavior?

views:

from engine.tasks import s3_upload_handler

def myfunc():
    f = open('/app/uploads/pic.jpg', 'rb')
    s3_upload_handler.apply_async(kwargs={"uploaded_file": f, "file_name": "test.jpg"})

tasks:

from celery import shared_task

@shared_task
def s3_upload_handler(uploaded_file, file_name):
    ...
    # some code for uploading to S3

traceback:

Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/celery/app/trace.py", line 240, in trace_task
    R = retval = fun(*args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/celery/app/trace.py", line 437, in __protected_call__
    return self.run(*args, **kwargs)
  File "/app/photohosting/engine/tasks.py", line 34, in s3_upload_handler
    key.set_contents_from_file(uploaded_file)
  File "/usr/local/lib/python2.7/dist-packages/boto/s3/key.py", line 1217, in set_contents_from_file
    spos = fp.tell()
ValueError: I/O operation on closed file

flower logs:

kwargs  {
         'file_name': 'test.jpg', 
         'uploaded_file': <closed file '<uninitialized file>', 
          mode '<uninitialized file>' at 0x7f6ab9e75e40>
         }
asked Sep 01 '14 by Valentin Kantor

1 Answer

Yes, this is expected: the file arrives closed. Asynchronous Celery tasks run in a completely separate worker process (they may even run on a different machine), and task arguments have to be serialized to be sent over the broker, so there is no way to hand an open file object to a task.

Instead, close the file in the process that calls the task, pass the file's path (and, if you need it, the current position within the file) as a plain argument, and reopen the file inside the task, as sketched below.
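A minimal sketch of that approach, assuming boto 2 (as in the traceback) with credentials read from the usual boto configuration; the bucket name is a placeholder:

# engine/tasks.py
import boto
from celery import shared_task

@shared_task
def s3_upload_handler(file_path, file_name):
    # Only the path (a plain string) travels with the task message;
    # the file is reopened inside the worker process.
    conn = boto.connect_s3()
    bucket = conn.get_bucket('my-bucket')  # placeholder bucket name
    key = bucket.new_key(file_name)
    with open(file_path, 'rb') as fp:
        key.set_contents_from_file(fp)

# views.py
from engine.tasks import s3_upload_handler

def myfunc():
    # Pass the path, not the open file object.
    s3_upload_handler.apply_async(
        kwargs={"file_path": "/app/uploads/pic.jpg", "file_name": "test.jpg"})

If the task should start reading from a saved position, pass the offset as another plain argument and call fp.seek(offset) after reopening the file.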

answered Oct 13 '22 by Spc_555