Jsoup.connect throws 403 error while Apache HttpClient is able to fetch the content

I am trying to parse the HTML dump of a given page. I tried HTML Parser and also Jsoup for parsing.

Jsoup has the functions I need, but calling Document doc = Jsoup.connect(url).get(); fails with a 403 error.

I then tried Apache HttpClient to fetch the HTML dump, and it succeeded for the same URL.

Why does Jsoup return a 403 for the same URL that Apache HttpClient fetches successfully? Am I doing something wrong? Any thoughts?

asked Apr 12 '12 by instanceOfObject

1 Answer

The working solution is as follows (thanks to Angelo Neuschitzer for the reminder to post it as an answer):

Document doc = Jsoup.connect(url).userAgent("Mozilla").get();
Elements links = doc.getElementsByTag(HTML.Tag.CITE.toString());
for (Element link : links) {
    String linkText = link.text();
    System.out.println(linkText);
}

So, setting the userAgent does the trick :)
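The underlying issue is likely that the server rejects requests whose User-Agent header looks like a generic Java client, while a browser-like string gets through. The same idea applies outside Jsoup; as a minimal sketch using only the JDK's java.net.http API (the URL and User-Agent string here are placeholders, not from the original question), you can attach the header to any request yourself:

```java
import java.net.URI;
import java.net.http.HttpRequest;

public class UserAgentDemo {
    public static void main(String[] args) {
        // Hypothetical target URL. Many servers respond 403 when the
        // User-Agent is absent or identifies a bare Java client, so we
        // set a browser-like value explicitly, just as
        // Jsoup.connect(url).userAgent("Mozilla") does.
        HttpRequest request = HttpRequest.newBuilder(URI.create("https://example.com/"))
                .header("User-Agent", "Mozilla/5.0 (compatible; MyFetcher/1.0)")
                .GET()
                .build();

        // Inspect the header that would be sent on the wire.
        System.out.println(request.headers()
                .firstValue("User-Agent")
                .orElse("none"));
    }
}
```

The request is only built and inspected here, not sent; sending it with `HttpClient.newHttpClient().send(...)` would then carry the custom User-Agent.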

answered Nov 12 '22 by instanceOfObject