I download a web page with HttpURLConnection.getInputStream(), and to get the content into a String I use the following method:
String content="";
isr = new InputStreamReader(pageContent);
br = new BufferedReader(isr);
try {
    do {
            line = br.readLine();
            content += line;
        } while (line != null);
        return content;
    } catch (Exception e) {
        System.out.println("Error: " + e);
        return null;
    }
Downloading the page is fast, but building the String from the content is very slow. Is there a faster way to get the content into a String?
I convert it to a String so I can insert it into the database.
Read into a buffer a fixed number of bytes at a time, not something arbitrary like lines. That alone should be a good start to speeding this up, as the reader will not have to scan for the end of each line.
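With the InputStreamReader from the question (here called isr), the key call is Reader.read(char[]), which fills a whole buffer in one go; the 8192-character chunk size below is just an arbitrary choice for this sketch:

char[] buf = new char[8192];   // arbitrary chunk size
int n;
while ((n = isr.read(buf)) != -1) {
    // n characters are now in buf[0..n-1]; hand them to whatever accumulates the result
}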
Use a StringBuffer instead.
Edit for an example:
StringBuffer buffer = new StringBuffer();
for (int i = 0; i < 20; ++i) {
    buffer.append(i);
}
String result = buffer.toString();
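Putting the two suggestions together for the original download problem, a rough sketch could look like the following; the method name readPage and the 8 KB buffer are illustrative, and pageContent stands for the stream returned by HttpURLConnection.getInputStream():

import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;

// Sketch: read the page in character chunks and accumulate them in a StringBuffer
static String readPage(InputStream pageContent) {
    StringBuffer content = new StringBuffer();
    char[] buf = new char[8192];                 // read up to 8192 characters per call
    try {
        InputStreamReader isr = new InputStreamReader(pageContent);
        int n;
        while ((n = isr.read(buf)) != -1) {
            content.append(buf, 0, n);           // append only the characters actually read
        }
        return content.toString();
    } catch (IOException e) {
        System.out.println("Error: " + e);
        return null;
    }
}

A single StringBuffer grows in place, so the loop avoids creating a new String on every iteration, which is what makes the += version slow.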