URLConnection getContentLength() is returning a negative value

Here is my code:

url = paths[0];
HttpURLConnection connection = (HttpURLConnection) url.openConnection();
int length = connection.getContentLength(); // I get a negative length here
InputStream is = (InputStream) url.getContent();
byte[] imageData = new byte[length]; 
int buffersize = (int) Math.ceil(length / (double) 100);
int downloaded = 0;
int read;
while (downloaded < length) {
    if (length < buffersize) {
        read = is.read(imageData, downloaded, length);
    } else if ((length - downloaded) <= buffersize) {
        read = is.read(imageData, downloaded, length - downloaded);
    } else {
        read = is.read(imageData, downloaded, buffersize);
    }
    downloaded += read;
    publishProgress((downloaded * 100) / length);
}
Bitmap bitmap = BitmapFactory.decodeByteArray(imageData, 0, length);
if (bitmap != null) {
    Log.i(TAG, "Bitmap created");
} else {
    Log.i(TAG, "Bitmap not created");
}
is.close();
return bitmap;

I looked this up in the Java documentation, and the length is negative for the following reason:

"the number of bytes of the content, or a negative number if unknown. If the content >length is known but exceeds Long.MAX_VALUE, a negative number is returned."

What might be the reason for this? I am trying to download an image. I would like to point out that this is the fourth method I have tried for downloading images; the other three are mentioned here.

Edit:

As requested, here is the full method I am using.

protected Bitmap getImage(String imgurl) {

    try {
        URL url = new URL(imgurl);
        HttpURLConnection connection = (HttpURLConnection) url.openConnection();
        int length = connection.getContentLength();
        InputStream is = (InputStream) url.getContent();
        byte[] imageData = new byte[length];
        int buffersize = (int) Math.ceil(length / (double) 100);
        int downloaded = 0;
        int read;
        while (downloaded < length) {
            if (length < buffersize) {
                read = is.read(imageData, downloaded, length);
            } else if ((length - downloaded) <= buffersize) {
                read = is.read(imageData, downloaded, length
                        - downloaded);
            } else {
                read = is.read(imageData, downloaded, buffersize);
            }
            downloaded += read;
        //  publishProgress((downloaded * 100) / length);
        }
        Bitmap bitmap = BitmapFactory.decodeByteArray(imageData, 0,length);
        if (bitmap != null) {
             System.out.println("Bitmap created");
        } else {
            System.out.println("Bitmap not created");
        }
        is.close();
        return bitmap;
    } catch (MalformedURLException e) {
        System.out.println(e);
    } catch (IOException e) {
        System.out.println(e);
    } catch (Exception e) {
        System.out.println(e);
    }
    return null;
}
asked Mar 25 '11 by pradeep


2 Answers

There are two common explanations for this:

  1. The content length is not known. Or more specifically, the server is not setting a "Content-Length" header in the response message.

  2. The content length is greater than Integer.MAX_VALUE. If that happens, getContentLength() returns -1. (The javadocs recommend that getContentLengthLong() be used instead of getContentLength() to avoid that problem; see the sketch below.)
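
For reference, a minimal sketch of the difference between the two calls (getContentLengthLong() is available on Java 7+ and on recent Android API levels; treat its availability on your target platform as an assumption):

long longLength = connection.getContentLengthLong(); // -1 only if the length is truly unknown
int  intLength  = connection.getContentLength();     // -1 also when the known length exceeds Integer.MAX_VALUE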

Either way, it is better NOT to preallocate a fixed-size byte array to hold the image.

  • One alternative is to create a local ByteArrayOutputStream and copy the bytes read from the socket into it, then call toByteArray() to get the full byte array (see the sketch after this list).

  • Another alternative is to save the data in a temporary file in the file system.
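
A minimal sketch of the first alternative, assuming the same HttpURLConnection as in the question (the 8 KB buffer size is an arbitrary choice):

InputStream is = connection.getInputStream();
ByteArrayOutputStream baos = new ByteArrayOutputStream();
byte[] buffer = new byte[8192];              // arbitrary chunk size
int read;
while ((read = is.read(buffer)) != -1) {     // read until the stream is exhausted
    baos.write(buffer, 0, read);
}
byte[] imageData = baos.toByteArray();       // the full image, whatever its length turned out to be
Bitmap bitmap = BitmapFactory.decodeByteArray(imageData, 0, imageData.length);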

Apparently a common underlying cause of this is that some implementations will, by default, request "gzip" encoding for the response data, which typically causes the server to omit the content length from the response. You can prevent that like this:

connection.setRequestProperty("Accept-Encoding", "identity");

... but that means that the response won't be compressed. So that is (IMO) a substandard solution.


Your existing client-side code is broken in another respect as well. If you get an IOException or some other exception, the code block will "exit abnormally" without closing the URLConnection. This leaks a file descriptor. Do this too many times and your application will fail due to exhaustion of file descriptors ... or local port numbers.

It is best practice to use a try / finally to ensure that URLConnections, Sockets, Streams and so on that tie down external resources are ALWAYS closed.
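
A minimal sketch of that pattern, reusing the connection and stream names from the question:

InputStream is = null;
try {
    is = connection.getInputStream();
    // ... read and decode the image data ...
} finally {
    if (is != null) {
        is.close();          // runs even if reading threw an exception
    }
    connection.disconnect(); // releases the underlying connection
}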


Preallocating a buffer based on the (purported) content length also sets you up for a denial-of-service attack. Imagine what happens if the bad guys send you lots of requests with dangerously large "Content-Length" headers and then slow-send the data. OOMEs or worse.

answered by Stephen C


By default, this implementation of HttpURLConnection requests that servers use gzip compression.
Since getContentLength() returns the number of bytes transmitted, you cannot use that method to predict how many bytes can be read from getInputStream().
Instead, read that stream until it is exhausted: when read() returns -1.
Gzip compression can be disabled by setting the acceptable encodings in the request header:

 urlConnection.setRequestProperty("Accept-Encoding", "identity");

So try this:

HttpURLConnection connection = (HttpURLConnection) url.openConnection();
connection.setRequestProperty("Accept-Encoding", "identity"); // <--- Add this line
int length = connection.getContentLength(); // should now return the actual content length
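
If you keep the fixed-length approach, here is a minimal sketch of the download loop assuming length now comes back positive; note that read() may return fewer bytes than requested, so the loop keeps going until the buffer is full or the stream ends:

InputStream is = connection.getInputStream();
byte[] imageData = new byte[length];
int downloaded = 0;
while (downloaded < length) {
    int read = is.read(imageData, downloaded, length - downloaded);
    if (read == -1) {
        break; // the stream ended early; the response was shorter than advertised
    }
    downloaded += read;
}
Bitmap bitmap = BitmapFactory.decodeByteArray(imageData, 0, downloaded);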

Source (Performance paragraph): http://developer.android.com/reference/java/net/HttpURLConnection.html

answered by Jérôme Teisseire