Pagination using scrapy

I'm trying to crawl this website: http://www.aido.com/eshop/cl_2-c_189-p_185/stationery/pens.html

I can get all the products on this page, but how do I issue a request for the "View More" link at the bottom of the page?

My code so far is:

rules = (
    Rule(SgmlLinkExtractor(restrict_xpaths='//li[@class="normalLeft"]/div/a', unique=True)),
    Rule(SgmlLinkExtractor(restrict_xpaths='//div[@id="topParentChilds"]/div/div[@class="clm2"]/a', unique=True)),
    Rule(SgmlLinkExtractor(restrict_xpaths='//p[@class="proHead"]/a', unique=True)),
    Rule(SgmlLinkExtractor(allow=('http://[^/]+/[^/]+/[^/]+/[^/]+$',),
                           deny=('/about-us/about-us/contact-us', './music.html'),
                           unique=True),
         callback='parse_item'),
)

Any help?

Vanddel asked Apr 21 '13


1 Answer

First of all, take a look at this thread on how to scrape dynamically loaded AJAX content: Can scrapy be used to scrape dynamic content from websites that are using AJAX?

Clicking the "View More" button fires an XHR request:

http://www.aido.com/eshop/faces/tiles/category.jsp?q=&categoryID=189&catalogueID=2&parentCategoryID=185&viewType=grid&bnm=&atmSize=&format=&gender=&ageRange=&actor=&director=&author=&region=&compProductType=&compOperatingSystem=&compScreenSize=&compCpuSpeed=&compRam=&compGraphicProcessor=&compDedicatedGraphicMemory=&mobProductType=&mobOperatingSystem=&mobCameraMegapixels=&mobScreenSize=&mobProcessor=&mobRam=&mobInternalStorage=&elecProductType=&elecFeature=&elecPlaybackFormat=&elecOutput=&elecPlatform=&elecMegaPixels=&elecOpticalZoom=&elecCapacity=&elecDisplaySize=&narrowage=&color=&prc=&k1=&k2=&k3=&k4=&k5=&k6=&k7=&k8=&k9=&k10=&k11=&k12=&startPrize=&endPrize=&newArrival=&entityType=&entityId=&brandId=&brandCmsFlag=&boutiqueID=&nmt=&disc=&rat=&cts=empty&isBoutiqueSoldOut=undefined&sort=12&isAjax=true&hstart=24&targetDIV=searchResultDisplay

which returns the HTML for the next 24 items. Note the hstart=24 parameter: the first time you click "View More" it equals 24, the second time 48, and so on. This parameter is your lifesaver.
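
For example, the URLs for successive batches differ only in hstart, so you can build them in a loop. A minimal sketch follows; the query string is trimmed for readability, and dropping the empty filter parameters is an assumption you would want to verify against the real request:

# Trimmed AJAX endpoint (assumption: the empty filter parameters can be dropped)
AJAX_URL = ('http://www.aido.com/eshop/faces/tiles/category.jsp'
            '?categoryID=189&catalogueID=2&parentCategoryID=185'
            '&viewType=grid&sort=12&isAjax=true'
            '&hstart=%d&targetDIV=searchResultDisplay')

# The first three "View More" clicks correspond to hstart = 24, 48, 72
for click in range(1, 4):
    print(AJAX_URL % (24 * click))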

Now you need to simulate these requests in your spider. The recommended way to do this is to instantiate Scrapy's Request object, providing a callback where you'll extract the data.
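
Here is a rough sketch of how that could look, plugged into your existing CrawlSpider. The trimmed AJAX_URL, the PAGE_SIZE constant, the parse_ajax callback name and the empty-response stop condition are all illustrative assumptions, and the imports match the same Scrapy version as your SgmlLinkExtractor-based code:

from scrapy.http import Request
from scrapy.contrib.spiders import CrawlSpider, Rule
from scrapy.contrib.linkextractors.sgml import SgmlLinkExtractor


class AidoSpider(CrawlSpider):
    name = 'aido'
    start_urls = ['http://www.aido.com/eshop/cl_2-c_189-p_185/stationery/pens.html']

    # "View More" loads 24 items per request
    PAGE_SIZE = 24

    # Same trimmed endpoint as in the snippet above
    AJAX_URL = ('http://www.aido.com/eshop/faces/tiles/category.jsp'
                '?categoryID=189&catalogueID=2&parentCategoryID=185'
                '&viewType=grid&sort=12&isAjax=true'
                '&hstart=%d&targetDIV=searchResultDisplay')

    rules = (
        # ... the rules from your question go here, with callback='parse_item' ...
    )

    def parse_item(self, response):
        # ... extract the products that are already on the page here ...

        # then ask the AJAX endpoint for the next batch of 24 items
        yield Request(self.AJAX_URL % self.PAGE_SIZE,
                      callback=self.parse_ajax,
                      meta={'hstart': self.PAGE_SIZE})

    def parse_ajax(self, response):
        # The endpoint returns an HTML fragment with the next 24 products;
        # extract them here with the same selectors you use for the regular page.

        # Stop once the fragment comes back empty; otherwise keep incrementing
        # hstart (24, 48, 72, ...) and request the next batch.
        if response.body.strip():
            hstart = response.meta['hstart'] + self.PAGE_SIZE
            yield Request(self.AJAX_URL % hstart,
                          callback=self.parse_ajax,
                          meta={'hstart': hstart})

Since the endpoint returns plain HTML, the same selectors you use on the regular listing page should work inside parse_ajax as well.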

Hope that helps.

alecxe answered Nov 15 '22