 

Python selenium screen capture not getting whole page

I am trying to create a generic web crawler that will go to a site and take a screenshot. I am using Python, Selenium, and PhantomJS. The problem is that the screenshot is not capturing all the images on a page. For example, if I go to YouTube, it doesn't capture images below the main page image. (I don't have high enough rep to post a screenshot.) I think this may have something to do with dynamic content, but I have tried the wait mechanisms such as the implicitly_wait and set_page_load_timeout methods. Because this is a generic crawler I can't wait for a specific event (I want to crawl hundreds of sites).

Is it possible to create a generic webcrawler that can do the screen capture I am trying to do? Code I am using is:

from selenium import webdriver

phantom = webdriver.PhantomJS()
phantom.set_page_load_timeout(30)
phantom.get(response.url)
img = phantom.get_screenshot_as_png()  # raw PNG bytes (not a base64 string)
phantom.quit()  # quit() must be called, not just referenced
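As an aside on the two screenshot methods: get_screenshot_as_png() returns the raw PNG bytes, while get_screenshot_as_base64() returns a base64-encoded string. A quick stand-in sketch of how the two relate, using the PNG magic number as dummy data so no browser is needed:

import base64

# Stand-in for what get_screenshot_as_png() would return:
# the 8-byte PNG file signature.
raw = b"\x89PNG\r\n\x1a\n"

# get_screenshot_as_base64() would hand back the same data in
# base64 form, suitable for embedding in a data: URL.
encoded = base64.b64encode(raw).decode("ascii")

# Decoding the base64 string recovers the original bytes, so either
# method can be written out to a .png file.
assert base64.b64decode(encoded) == raw

So either call works for saving a screenshot; the base64 form just needs decoding first.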

Here is the image

Malcolm asked Mar 09 '26 11:03


1 Answer

Your suggestion solved the problem. Used the following code (stolen in part from answer to another question):

from selenium import webdriver

driver = webdriver.PhantomJS()
driver.maximize_window()
driver.get('http://youtube.com')

# Scroll through the page in many small steps so lazy-loaded images
# are triggered before the screenshot is taken.
scheight = .1
while scheight < 9.9:
    driver.execute_script("window.scrollTo(0, document.body.scrollHeight/%s);" % scheight)
    scheight += .01

driver.save_screenshot('screenshot.png')
driver.quit()
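To see what that loop is actually asking the browser to do, here is a pure-Python sketch of the scroll positions it requests (no browser required; a scroll_height of 10000 is an arbitrary stand-in for document.body.scrollHeight). The early targets overshoot the bottom of the page, which the browser clamps to the real bottom, and the later, ever-smaller targets walk back up to about a tenth of the page height, so most of the page gets scrolled through:

# Pure-Python sketch of the scroll targets the loop above produces;
# scroll_height stands in for document.body.scrollHeight.
def scroll_targets(scroll_height, start=0.1, stop=9.9, step=0.01):
    targets = []
    scheight = start
    while scheight < stop:
        targets.append(scroll_height / scheight)
        scheight += step
    return targets

targets = scroll_targets(10000)
# Roughly 980 steps: the first request is about 10x the page height
# (clamped to the bottom), and each later one is smaller, ending
# near a tenth of the page height.
print(len(targets), targets[0], targets[-1])

Note the targets shrink monotonically, so the page is swept from the bottom upward; on very slow connections a short time.sleep() inside the loop may still be needed for the lazy-loaded content to finish rendering.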
Malcolm answered Mar 12 '26 01:03


