How can I avoid OutOfMemoryErrors when using Commons FileUpload's DiskFileItem to upload large files?

I am getting OutOfMemoryErrors when uploading large (>300MB) files to a servlet utilizing Commons FileUpload 1.2.1. It seems odd, because the entire point of using DiskFileItem is to prevent the (possibly large) file from residing in memory. I am using the default size threshold of 10KB, so that's all that should ever be loaded into the heap, right? Here is the partial stack trace:

java.lang.OutOfMemoryError
       at java.io.FileInputStream.readBytes(Native Method)
       at java.io.FileInputStream.read(FileInputStream.java:177)
       at org.apache.commons.fileupload.disk.DiskFileItem.get(DiskFileItem.java:334)
       at org.springframework.web.multipart.commons.CommonsMultipartFile.getBytes(CommonsMultipartFile.java:114)

Why is this happening? Is there some configuration I'm missing? Any tips/tricks to avoid this situation besides increasing my heap size?

I really shouldn't have to increase my heap, because in theory the most that should be loaded into memory from this operation is a little over 10KB. Plus, my heap max (-Xmx) is already set to 1GB, which should be plenty.

asked Nov 07 '09 by Robert Campbell

1 Answer

When dealing with file uploads, especially big ones, you should process the files as streams: read each chunk into a medium-sized in-memory buffer and copy it directly into your output file. The wrong way to do it is to inhale the whole thing into memory before writing it out.

The Commons FileUpload user guide describes, about halfway down the page, how to "Process a file upload" as a stream. If you remember to copy from the InputStream to the OutputStream in reasonably sized chunks (say, 1 MB), you should have no problem.
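As a rough sketch of what that looks like with the Spring wrapper from your stack trace: MultipartFile.getInputStream() streams from DiskFileItem's temporary file, instead of materializing the whole upload the way getBytes() does. The class name, method name, and destination file below are just illustrative, and try-with-resources assumes Java 7+.

```java
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

import org.springframework.web.multipart.MultipartFile;

public class UploadStreamer {

    /**
     * Copies the uploaded file to disk in 1 MB chunks, so only the buffer
     * (never the whole upload) sits in the heap at any one time.
     */
    public static void saveUpload(MultipartFile upload, File destination) throws IOException {
        byte[] buffer = new byte[1024 * 1024]; // 1 MB chunk size

        // getInputStream() reads from the temp file backing DiskFileItem;
        // getBytes() would pull the entire file into memory at once.
        try (InputStream in = upload.getInputStream();
             OutputStream out = new FileOutputStream(destination)) {
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
        }
    }
}
```

Calling something like this from your controller instead of CommonsMultipartFile.getBytes() keeps the per-request memory footprint at roughly the size of the buffer, regardless of how large the upload is.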

answered Sep 28 '22 by Carl Smotricz