
Django Ajax Upload Outside of a Form

Tags:

ajax

django

I am trying to use Valum's Ajax Upload to do file uploads on a Django-based site I am making. Currently I am avoiding a form, simply because Ajax Upload sends the file as the entirety of the POST data in its request. Right now I have a very naive approach to handling it:

upload = SimpleUploadedFile( filename, request.raw_post_data )

...then I loop through upload.chunks() to write the file to disk.

This works great... on small files. I've tested with PDFs, various other files, and the ~20MB Google Chrome deb package, and they all do fine. However, if I move up to something like a CD or DVD ISO it bombs horribly; often Django sends back an out-of-memory response. On the surface this makes sense, since SimpleUploadedFile is the in-memory member of the upload classes. I cannot see how to use TemporaryUploadedFile instead, because it does not take actual content in its constructor. As a side note: I would have thought that after using up the available RAM it would spill over to virtual memory, but whatever.

So, my question is, how do I get this to work? Is there a better way to read in the file? I tried reading raw_post_data directly via Python's io module (the system runs 2.6.5), but the ASCII encoder/decoder will obviously complain about non-ASCII characters when working with binary files, and I have not been able to find info on changing the encoder/decoder.
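(For what it's worth, the codec complaint comes from text-mode wrappers, not from FileIO itself: FileIO passes bytes through untouched, while a text wrapper such as io.TextIOWrapper is what decodes and chokes on non-ASCII bytes. A quick illustration in modern Python; the io module in 2.6 behaves the same way:)

```python
from io import BytesIO, TextIOWrapper

data = b"\xff\xfe some binary bytes"

# Binary read: no decoding happens, so no complaints.
assert BytesIO(data).read() == data

# A text wrapper with an ASCII codec is what raises on non-ASCII bytes.
try:
    TextIOWrapper(BytesIO(data), encoding="ascii").read()
except UnicodeDecodeError:
    print("text-mode decode failed, as expected")
```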

I wouldn't mind passing the data into a form and having Django do the work of picking the right upload class and so on, but I cannot figure out how to pass it, because something like

upload_form = UploadForm( request.POST, request.FILES )

will not work: request.POST contains the raw file rather than the usual form-encoded fields, and request.FILES is empty.

As I said, I'm not worried about the method of the solution, just that I get something that works! Thanks!

Alex Kuhl asked Nov 16 '10

1 Answer

Well, I found two solutions, in case anyone is interested.

The first is a pure-Python way of doing it that is moderately successful.

from io import BufferedReader, BufferedWriter, BytesIO, FileIO

with BufferedReader( BytesIO( request.raw_post_data ) ) as stream:
  with BufferedWriter( FileIO( "/tmp/foo.bar", "wb" ) ) as destination:
    foo = stream.read( 1024 )
    while foo:
      destination.write( foo )
      foo = stream.read( 1024 )

It worked in testing for small files (up to 20MB) but failed when I tried it with ISOs (~600MB) or larger files. I didn't try anything between 20MB and 600MB, so I'm not sure where the breaking point is. I've copied the bottom of the trace below; I'm not sure what the root problem is in this situation. There seemed to be a struggle with memory, but I had enough RAM plus swap to hold the file three times over, so I'm not sure why there was an issue. I don't know whether other forms of Python read/write, or skipping the buffers, would have helped here.

[error] [client 127.0.0.1]   File "/usr/local/lib/python2.6 /dist-packages/django/core/handlers/wsgi.py", line 69, in safe_copyfileobj, referer: http://localhost/project/
[error] [client 127.0.0.1]     buf = fsrc.read(min(length, size)), referer: http://localhost/project/
[error] [client 127.0.0.1] TemplateSyntaxError: Caught IOError while rendering: request data read error, referer: http://localhost/project/
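(Incidentally, the manual read/write loop is essentially what the standard library's shutil.copyfileobj does, copying in fixed-size chunks; a minimal sketch, using a stand-in BytesIO payload in place of request.raw_post_data. It presumably hits the same request-read limit in the WSGI handler, since the failure above is in reading the body, not in writing it out.)

```python
import shutil
from io import BytesIO

# Stand-in for request.raw_post_data; in the view this would be the request body.
payload = BytesIO(b"\x00\x01binary data" * 1000)

with open("/tmp/foo.bar", "wb") as destination:
    # Copies source to destination in fixed-size chunks, so the data
    # is never duplicated wholesale in memory during the copy itself.
    shutil.copyfileobj(payload, destination)
```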

The solution that has worked with everything I've thrown at it, up to 2GB files at least, requires Django 1.3. It adds file-like support for reading directly from the HttpRequest object, so I took advantage of that.

from io import BufferedWriter, FileIO

with BufferedWriter( FileIO( "/tmp/foo.bar", "wb" ) ) as destination:
  foo = request.read( 1024 )
  while foo:
    destination.write( foo )
    foo = request.read( 1024 )
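(The same loop, factored into a helper that streams any file-like source to disk; save_upload is a name I'm introducing for illustration. In the Django 1.3+ view you would pass the request object itself as the source, since HttpRequest now supports read().)

```python
from io import BufferedWriter, FileIO

def save_upload(source, path, chunk_size=1024):
    """Stream a file-like object to disk without loading it all into memory."""
    with BufferedWriter(FileIO(path, "wb")) as destination:
        chunk = source.read(chunk_size)
        while chunk:
            destination.write(chunk)
            chunk = source.read(chunk_size)

# In the view: save_upload(request, "/tmp/foo.bar")
```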
Alex Kuhl answered Nov 06 '22