 

Adding a wait-for-element while performing a SplashRequest in python Scrapy

I am trying to scrape a few dynamic websites using Splash for Scrapy in Python. However, Splash sometimes fails to wait for the complete page to load. A brute-force workaround was to add a large wait time (e.g. 5 seconds in the snippet below), but this is extremely inefficient and still fails to load certain data (sometimes the content takes longer than 5 seconds to appear). Is there some sort of wait-for-element condition that can be attached to these requests?

yield SplashRequest(
          url,
          self.parse,
          args={'wait': 5},
          headers={'User-Agent': "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/51.0.2704.106 Safari/537.36"},
)
NightFury13 asked Dec 10 '16

1 Answer

Yes, you can write a Lua script to do that. Something like this:

function main(splash)
  splash:set_user_agent(splash.args.ua)
  assert(splash:go(splash.args.url))

  -- requires Splash 2.3  
  while not splash:select('.my-element') do
    splash:wait(0.1)
  end
  return {html=splash:html()}
end

Before Splash 2.3, you can use splash:evaljs('!document.querySelector(".my-element")') as the loop condition instead of not splash:select('.my-element').

Save this script to a variable (lua_script = """ ... """). Then you can send a request like this:

yield SplashRequest(
    url, 
    self.parse, 
    endpoint='execute',
    args={
        'lua_source': lua_script,
        'ua': "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/51.0.2704.106 Safari/537.36"
    }
)

See the Splash scripting tutorial and reference for more details on how to write Splash Lua scripts.
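
For reference, here is one way the pieces could fit together in a complete spider. This is a minimal sketch rather than the answer's exact code: it assumes scrapy-splash is installed and configured (SPLASH_URL plus the middleware settings from the scrapy-splash README), and the spider name, start URL and the '.my-element' selector are placeholders.

import scrapy
from scrapy_splash import SplashRequest

# Lua script from the answer above: poll until the element appears, then return the HTML.
lua_script = """
function main(splash)
  splash:set_user_agent(splash.args.ua)
  assert(splash:go(splash.args.url))

  -- requires Splash 2.3
  while not splash:select('.my-element') do
    splash:wait(0.1)
  end
  return {html=splash:html()}
end
"""


class WaitForElementSpider(scrapy.Spider):
    name = 'wait_for_element'             # placeholder name
    start_urls = ['https://example.com']  # placeholder URL

    def start_requests(self):
        for url in self.start_urls:
            yield SplashRequest(
                url,
                self.parse,
                endpoint='execute',
                args={
                    'lua_source': lua_script,
                    'ua': ("Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 "
                           "(KHTML, like Gecko) Chrome/51.0.2704.106 Safari/537.36"),
                },
            )

    def parse(self, response):
        # With scrapy-splash's default magic_response, the 'html' value returned by
        # the Lua script becomes the response body, so the usual selectors work here.
        self.logger.info('Loaded %s, element present: %s',
                         response.url, bool(response.css('.my-element')))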

Mikhail Korobov answered Sep 20 '22