
Using ServletOutputStream to write very large files in a Java servlet without memory issues

I am using IBM WebSphere Application Server v6 and Java 1.4, and am trying to write large CSV files to the ServletOutputStream for a user to download. Files currently range from 50 to 750 MB.

The smaller files aren't causing much of a problem, but with the larger files it appears the data is being buffered on the heap, which then causes an OutOfMemoryError and brings down the entire server.

These files can only be served out to authenticated users over HTTPS which is why I am serving them through a Servlet instead of just sticking them in Apache.

The code I am using is (some fluff removed around this):

    resp.setHeader("Content-length", "" + fileLength);
    resp.setContentType("application/vnd.ms-excel");
    resp.setHeader("Content-Disposition","attachment; filename=\"export.csv\"");

    FileInputStream inputStream = null;

    try
    {
        inputStream = new FileInputStream(path);
        byte[] buffer = new byte[1024];
        int bytesRead = 0;

        do
        {
            bytesRead = inputStream.read(buffer, 0, buffer.length);
            resp.getOutputStream().write(buffer, 0, bytesRead);
        }
        while (bytesRead == buffer.length);

        resp.getOutputStream().flush();
    }
    finally
    {
        if(inputStream != null)
            inputStream.close();
    }

The FileInputStream doesn't seem to be the cause: if I write to another file, or remove the write completely, memory usage doesn't appear to be a problem.

What I am thinking is that the data passed to resp.getOutputStream().write is being stored in memory until it can be sent through to the client. So the entire file might be read and stored in resp.getOutputStream(), causing my memory issues and crashing!

I have tried buffering these streams and also tried using channels from java.nio, neither of which makes any difference to my memory issues. I have also flushed the OutputStream once per iteration of the loop and after the loop, which didn't help.
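
The channel-based attempt was along these lines (a reconstructed sketch, not the exact code):

    // Needs java.io.FileInputStream plus java.nio.channels.{FileChannel,
    // Channels, WritableByteChannel}; all available since Java 1.4.
    FileChannel in = new FileInputStream(path).getChannel();
    WritableByteChannel out = Channels.newChannel(resp.getOutputStream());
    try
    {
        long position = 0;
        long size = in.size();
        while (position < size)
        {
            // transferTo may move fewer bytes than requested, so loop:
            position += in.transferTo(position, size - position, out);
        }
    }
    finally
    {
        in.close();
    }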

asked Mar 26 '09 by Martin

4 Answers

The average decent servlet container itself flushes the stream by default every ~2KB. You should really not have the need to explicitly call flush() on the OutputStream of the HttpServletResponse at intervals when sequentially streaming data from one and the same source. In for example Tomcat (and WebSphere!) this is configurable as the bufferSize attribute of the HTTP connector.

The average decent servlet container also just streams the data in chunks if the content length is unknown beforehand (as per the Servlet API specification!) and if the client supports HTTP/1.1.

The problem symptoms at least indicate that the servlet container is buffering the entire stream in memory before flushing. This can mean that the content length header is not set and/or the servlet container does not support chunked encoding and/or the client side does not support chunked encoding (i.e. it is using HTTP/1.0).

To fix one or the other, just set the content length beforehand:

response.setContentLengthLong(new File(path).length());

Or when you're not on Servlet 3.1 yet:

response.setHeader("Content-Length", String.valueOf(new File(path).length()));
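
Put together with the copy loop from the question, a minimal sketch (assuming the file on disk is exactly the bytes to send):

    // Sketch: with the Content-Length known up front, the container can
    // stream the response instead of buffering it all to compute a length.
    File file = new File(path);
    resp.setContentType("application/vnd.ms-excel");
    resp.setHeader("Content-Disposition", "attachment; filename=\"export.csv\"");
    resp.setHeader("Content-Length", String.valueOf(file.length()));

    InputStream input = new FileInputStream(file);
    try
    {
        byte[] buffer = new byte[8192];
        int length;
        while ((length = input.read(buffer)) != -1)
        {
            resp.getOutputStream().write(buffer, 0, length);
        }
    }
    finally
    {
        input.close();
    }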
answered Nov 02 '22 by BalusC


Does flush() work on the output stream?

Really I wanted to comment that you should use the three-arg form of write, as the buffer is not necessarily fully read (particularly at the end of the file(!)). Also, a try/finally would be in order unless you want your server to die unexpectedly.
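
For illustration, a sketch of a copy loop that handles short reads and end-of-stream correctly (variable names are illustrative):

    int bytesRead;
    // read() may return fewer bytes than buffer.length at any point and
    // returns -1 at end of stream, so test the return value instead of
    // comparing against buffer.length:
    while ((bytesRead = inputStream.read(buffer)) != -1)
    {
        // three-arg write: only the bytes actually read this iteration
        resp.getOutputStream().write(buffer, 0, bytesRead);
    }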

answered Nov 02 '22 by Tom Hawtin - tackline


I'm also not sure if flush() on ServletOutputStream works in this case, but ServletResponse.flushBuffer() should send the response to the client (at least per the Servlet 2.3 spec).

ServletResponse.setBufferSize() sounds promising, too.
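
A sketch of how those two calls fit together (the 8 KB value is arbitrary):

    // setBufferSize must be called before any content is written:
    resp.setBufferSize(8192);

    // ... write a chunk of the file to resp.getOutputStream() ...

    // flushBuffer() commits the response and sends whatever is buffered:
    resp.flushBuffer();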

answered Nov 02 '22 by david a.


I have used a class that wraps the OutputStream to make it reusable in other contexts. It has worked well for me in getting data to the browser faster, but I haven't looked at the memory implications. (Please pardon my antiquated m_ variable naming.)

import java.io.IOException;
import java.io.OutputStream;

// Wraps another OutputStream and flushes it automatically after every
// m_limit bytes written (default 4096), so data is pushed out steadily
// instead of accumulating in the underlying buffer.
public class AutoFlushOutputStream extends OutputStream {

    protected long m_count = 0;    // bytes written since the last flush
    protected long m_limit = 4096; // flush threshold in bytes
    protected OutputStream m_out;  // the wrapped stream

    public AutoFlushOutputStream(OutputStream out) {
        m_out = out;
    }

    public AutoFlushOutputStream(OutputStream out, long limit) {
        m_out = out;
        m_limit = limit;
    }

    public void write(int b) throws IOException {
        if (m_out != null) {
            m_out.write(b);
            m_count++;
            if (m_limit > 0 && m_count >= m_limit) {
                m_out.flush();
                m_count = 0;
            }
        }
    }
}
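
Usage is along these lines (a sketch; resp is the HttpServletResponse):

    // Wrap the response stream; it then flushes itself every 4096 bytes:
    OutputStream out = new AutoFlushOutputStream(resp.getOutputStream());

One caveat: since only write(int) is overridden, the inherited bulk write(byte[], int, int) falls back to writing one byte at a time; overriding the array variants as well would keep bulk writes efficient.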
answered Nov 02 '22 by Kevin Hakanson