response.flushBuffer() is not working

I am trying to implement a servlet for streaming large objects:

    oracle.sql.BLOB blob = rs.getBLOB("obj");
    InputStream in = blob.getBinaryStream();

    int bufferSize = 1024;
    byte[] buffer = new byte[bufferSize];

    ServletOutputStream out = response.getOutputStream();

    int length;
    int counter = 0;
    while ((length = in.read(buffer)) != -1) {
        out.write(buffer, 0, length);
        counter++;
        if (counter % 10 == 0) {
            counter = 0;
            response.flushBuffer();
        }
    }

This code is supposed to send the data to the client chunk by chunk. What actually happens is that when I stream a large object (100 MB), memory usage goes up, and the server sometimes dies when there is more than one parallel download/stream.

Why isn't flushBuffer() sending the data to the client? The client only gets the open/save dialog after the response is closed.

Madhu asked May 11 '12

2 Answers

You have to set the Content-Length header before writing the data, or the server is forced to buffer all the data until the stream is closed, at which point it can calculate the value itself, write the header, and send all the data. As soon as you get the output stream, before you write any data at all, set the content length:

response.setHeader("Content-Length", String.valueOf(blob.length()));

Most servers are smart enough to flush the buffer themselves at this point, so you probably don't even need to call flushBuffer() -- although it doesn't hurt.
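For illustration, here is a minimal, self-contained sketch of the corrected copy loop. The `StreamCopy` class name, its parameters, and the in-memory streams are stand-ins for the BLOB and servlet streams (assumptions for the sake of a runnable example); in the real servlet you would call `response.setHeader("Content-Length", ...)` before this loop and pass `response.getOutputStream()` as the destination.

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class StreamCopy {
    // Copies everything from `in` to `out` in fixed-size chunks,
    // flushing every `flushEvery` chunks so data leaves the buffer
    // incrementally instead of accumulating in memory.
    // Returns the total number of bytes copied.
    public static long copy(InputStream in, OutputStream out,
                            int bufferSize, int flushEvery) throws IOException {
        byte[] buffer = new byte[bufferSize];
        long total = 0;
        int length;
        int counter = 0;
        while ((length = in.read(buffer)) != -1) {
            out.write(buffer, 0, length);
            total += length;
            counter++;
            if (counter % flushEvery == 0) {
                out.flush(); // push buffered bytes downstream periodically
            }
        }
        out.flush(); // final flush for any remaining bytes
        return total;
    }
}
```

With the Content-Length header set up front, a loop like this lets the container stream each flushed chunk to the client immediately rather than buffering the whole response.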

Ernest Friedman-Hill answered Oct 19 '22


First of all, you need to set a response header so that the container knows how many bytes the response contains:

response.setHeader("Content-Length", String.valueOf(blob.length()));
GingerHead answered Oct 19 '22