
URLConnection is not allowing me to access data on HTTP errors (404, 500, etc.)


I am making a crawler, and need to get the data from the stream regardless of whether it is a 200 or not. cURL does it, as does any standard browser.

The following will not actually get the content of the request even though there is some; instead, an exception is thrown with the HTTP error status code. I want the output regardless. Is there a way? I prefer to use this library because it does persistent connections, which is perfect for the type of crawling I am doing.

package test;

import java.net.*;
import java.io.*;

public class Test {

    public static void main(String[] args) {

        try {
            URL url = new URL("http://github.com/XXXXXXXXXXXXXX");
            URLConnection connection = url.openConnection();

            DataInputStream inStream = new DataInputStream(connection.getInputStream());
            String inputLine;

            while ((inputLine = inStream.readLine()) != null) {
                System.out.println(inputLine);
            }
            inStream.close();
        } catch (MalformedURLException me) {
            System.err.println("MalformedURLException: " + me);
        } catch (IOException ioe) {
            System.err.println("IOException: " + ioe);
        }
    }
}

Worked, thanks: Here is what I came up with - just as a rough proof of concept:

import java.net.*;
import java.io.*;

public class Test {

    public static void main(String[] args) {

        URL url = null;
        URLConnection connection = null;
        String inputLine = "";

        try {
            url = new URL("http://verelo.com/asdfrwdfgdg");
            connection = url.openConnection();

            DataInputStream inStream = new DataInputStream(connection.getInputStream());

            while ((inputLine = inStream.readLine()) != null) {
                System.out.println(inputLine);
            }
            inStream.close();
        } catch (MalformedURLException me) {
            System.err.println("MalformedURLException: " + me);
        } catch (IOException ioe) {
            System.err.println("IOException: " + ioe);

            // The request failed with an HTTP error status, so read the
            // response body from the error stream instead.
            InputStream error = ((HttpURLConnection) connection).getErrorStream();

            try {
                int data = error.read();
                while (data != -1) {
                    inputLine = inputLine + (char) data;
                    data = error.read();
                }
                error.close();
            } catch (Exception ex) {
                try {
                    if (error != null) {
                        error.close();
                    }
                } catch (Exception e) {
                }
            }
        }

        System.out.println(inputLine);
    }
}
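For comparison, here is a more compact sketch of the same proof of concept (my own illustration, not the original poster's code), using HttpURLConnection, BufferedReader and try-with-resources. It assumes the response body is UTF-8 text; the URL is the throwaway one from above and the class name is just a placeholder:

import java.io.*;
import java.net.*;
import java.nio.charset.StandardCharsets;

public class FetchBodyEvenOnError {

    public static void main(String[] args) throws IOException {
        URL url = new URL("http://verelo.com/asdfrwdfgdg");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();

        // getResponseCode() returns 404, 500, etc. without throwing.
        int status = conn.getResponseCode();

        // For 2xx responses read the normal body; otherwise read the error body.
        // getErrorStream() can return null if the server sent no error body.
        InputStream body = (status >= 200 && status < 300)
                ? conn.getInputStream()
                : conn.getErrorStream();

        StringBuilder sb = new StringBuilder();
        if (body != null) {
            try (BufferedReader reader = new BufferedReader(
                    new InputStreamReader(body, StandardCharsets.UTF_8))) {
                String line;
                while ((line = reader.readLine()) != null) {
                    sb.append(line).append('\n');
                }
            }
        }

        System.out.println("HTTP " + status);
        System.out.println(sb);
    }
}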
asked Feb 03 '12 by MichaelICE


People also ask

What is the difference between URLConnection and HttpURLConnection?

URLConnection is the base class. HttpURLConnection is a derived class which you can use when you need the extra API and you are dealing with HTTP or HTTPS only. HttpsURLConnection is a 'more derived' class which you can use when you need the 'more extra' API and you are dealing with HTTPS only.

What is HTTP URL connection?

A URLConnection with support for HTTP-specific features. See the spec for details. Each HttpURLConnection instance is used to make a single request but the underlying network connection to the HTTP server may be transparently shared by other instances.

What is URL and URLConnection in Java?

URLConnection Class in Java is an abstract class that represents a connection to a resource specified by the corresponding URL. It is part of the java.net package.
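To make that hierarchy concrete, here is a small sketch (not from the thread; example.com is just a placeholder). openConnection() returns a URLConnection whose concrete type depends on the URL scheme, and the HTTP-specific API lives on HttpURLConnection:

import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLConnection;
import javax.net.ssl.HttpsURLConnection;

public class ConnectionTypes {

    public static void main(String[] args) throws IOException {
        URLConnection plain = new URL("http://example.com/").openConnection();
        URLConnection tls   = new URL("https://example.com/").openConnection();

        // http:// URLs give an HttpURLConnection, https:// URLs an
        // HttpsURLConnection (itself a subclass of HttpURLConnection).
        System.out.println(plain instanceof HttpURLConnection);   // true
        System.out.println(tls instanceof HttpsURLConnection);    // true

        // HTTP-specific API, e.g. the status code, lives on HttpURLConnection.
        if (plain instanceof HttpURLConnection) {
            HttpURLConnection http = (HttpURLConnection) plain;
            System.out.println("Status: " + http.getResponseCode());
            http.disconnect();
        }
    }
}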


2 Answers

Simple:

URLConnection connection = url.openConnection();
// Note: getInputStream() itself throws an IOException for 4xx/5xx responses,
// which is why the try/catch variant below is the safer way to do this.
InputStream is = connection.getInputStream();
if (connection instanceof HttpURLConnection) {
    HttpURLConnection httpConn = (HttpURLConnection) connection;
    int statusCode = httpConn.getResponseCode();
    if (statusCode != 200 /* or statusCode >= 200 && statusCode < 300 */) {
        is = httpConn.getErrorStream();
    }
}

You can refer to the Javadoc for an explanation. The way I would handle this is as follows:

URLConnection connection = url.openConnection();
InputStream is = null;
try {
    is = connection.getInputStream();
} catch (IOException ioe) {
    if (connection instanceof HttpURLConnection) {
        HttpURLConnection httpConn = (HttpURLConnection) connection;
        int statusCode = httpConn.getResponseCode();
        if (statusCode != 200) {
            is = httpConn.getErrorStream();
        }
    }
}
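Whichever branch ends up supplying the stream, it still has to be read. A minimal, illustrative follow-on (not part of the original answer) that assumes the is variable from the snippet above, a UTF-8 response body, and the usual java.io / java.nio.charset imports; note that getErrorStream() can return null if the server sent no body at all:

// Illustrative continuation: 'is' comes from the snippet above (assumption).
StringBuilder body = new StringBuilder();
if (is != null) {                       // getErrorStream() may be null
    try (BufferedReader reader = new BufferedReader(
            new InputStreamReader(is, StandardCharsets.UTF_8))) {
        String line;
        while ((line = reader.readLine()) != null) {
            body.append(line).append('\n');
        }
    }
}
System.out.println(body);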
answered Oct 12 '22 by Buhake Sindi


You need to do the following after calling openConnection.

  1. Cast the URLConnection to HttpURLConnection

  2. Call getResponseCode

  3. If the response is a success, use getInputStream, otherwise use getErrorStream

(The test for success should be 200 <= code < 300, because there are valid HTTP success codes other than 200; see the sketch below.)
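A minimal sketch of those three steps (the URL is only a placeholder, and getErrorStream() may return null when the server sends no error body):

// Step 1: cast the URLConnection to HttpURLConnection (placeholder URL).
HttpURLConnection httpConn =
        (HttpURLConnection) new URL("http://example.com/some/page").openConnection();

// Step 2: get the response code.
int code = httpConn.getResponseCode();

// Step 3: any 2xx code counts as success, not just 200.
InputStream stream = (code >= 200 && code < 300)
        ? httpConn.getInputStream()
        : httpConn.getErrorStream();   // may be null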


I am making a crawler, and need to get the data from the stream regardless of whether it is a 200 or not.

Just be aware that if the code is a 4xx or 5xx, then the "data" is likely to be an error page of some kind.


The final point that should be made is that you should always respect the "robots.txt" file ... and read the Terms of Service before crawling / scraping the content of a site whose owners might care. Simply blatting off GET requests is likely to annoy site owners ... unless you've already come to some sort of "arrangement" with them.

answered Oct 12 '22 by Stephen C