I am downloading a *.rar file from my webserver using HttpURLConnection.
I added some temporary lines of code to roughly measure the download speed when running the download via Java/HttpURLConnection (the measurement was admittedly imprecise). The speed fluctuated somewhere between roughly 400 KB/s and 2 MB/s.
Downloading the same file through my browser (Mozilla Firefox), I get the full 12.5 MB/s that my webserver supports. With Firefox the download finishes in about 6 seconds, whereas the Java code takes 12-30 seconds.
Here is a cleaned-up snippet of the code I am testing with, just to illustrate the approach of trying to download the file.
[...]
httpConn = (HttpURLConnection) (new URL(downloadURL)).openConnection();
httpConn.setRequestMethod("POST");
httpConn.setRequestProperty("Cookie", cookie);
[...]
try (InputStream is = httpConn.getInputStream();
     FileOutputStream fos = new FileOutputStream(targetFile)) {
    byte[] buffer = new byte[4096];
    int bytesRead;
    while ((bytesRead = is.read(buffer)) != -1) {
        fos.write(buffer, 0, bytesRead);
    }
}
[...]
I assume there is a bottleneck somewhere in this piece of code that I can't pin down.
How do I maximize the download speed?
The problem was the buffer size. Increasing the buffer from byte[4096] to byte[256000] improved the download speed significantly. I also reduced the work done inside the while ((bytesRead = is.read(buffer)) != -1) loop: it previously updated a progress bar on every iteration, which I moved to its own thread.
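A minimal sketch of the fixed copy loop with the larger 256 KB buffer. To keep it self-contained and runnable it copies an in-memory stream instead of the real HttpURLConnection stream; the helper name copy and the 1 MB test payload are illustrative, not from the original code.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class CopyDemo {

    // Copy with a 256 KB buffer, doing nothing else inside the loop;
    // progress reporting would happen on a separate thread.
    static long copy(InputStream in, OutputStream out) throws IOException {
        byte[] buffer = new byte[256 * 1024];
        long total = 0;
        int bytesRead;
        while ((bytesRead = in.read(buffer)) != -1) {
            out.write(buffer, 0, bytesRead);
            total += bytesRead;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        // Stand-in for httpConn.getInputStream(): a 1 MB in-memory payload
        byte[] data = new byte[1_000_000];
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        long copied = copy(new ByteArrayInputStream(data), sink);
        System.out.println(copied); // prints 1000000
    }
}
```

In the real download, the InputStream would come from httpConn.getInputStream() and the OutputStream would be the FileOutputStream for the target file; only the buffer size and the loop body change.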
Credit goes to @Joachim Isaksson, who pointed out the buffer bottleneck in the comments above.