 

python selenium wait for page to load

I've written a script that gets data from a page, but sometimes the page takes time to load, so when it pulls the HTML into a soup object it sometimes pulls nothing because the page hasn't finished loading.

I wrote the following code to wait for the page to finish loading.

def scrape_page(url):
    browser.get(url)
    try:
        WebDriverWait(browser, 10).until(EC.presence_of_element_located(browser.find_element_by_id("selection-box")))
        # Extract source code
        html = browser.page_source
        soup = BeautifulSoup(html)

But I'm getting the following error when I call the function:

TypeError: find_element() argument after * must be a sequence, not WebElement
asked May 20 '15 by Grant McKinnon


2 Answers

I think you should use presence_of_element_located like this:

from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

element = WebDriverWait(driver, 10).until(
    EC.presence_of_element_located((By.ID, "myDynamicElement"))
)

as described in the manual.
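The reason the locator tuple matters: WebDriverWait.until just polls a condition callable until it returns something truthy or the timeout expires, so it needs a locator it can re-evaluate on every poll, not an already-resolved WebElement (which is what triggered the TypeError in the question). A stdlib-only sketch of that polling loop, with illustrative names, not Selenium's actual implementation:

```python
import time

def wait_until(predicate, timeout=10.0, poll=0.5):
    """Poll `predicate` until it returns a truthy value or `timeout` elapses.

    Mirrors the shape of WebDriverWait.until: the condition is a callable
    re-evaluated on every poll, which is why Selenium wants a (By, value)
    locator tuple rather than an element found once up front.
    """
    end = time.monotonic() + timeout
    while True:
        value = predicate()
        if value:
            return value
        if time.monotonic() >= end:
            raise TimeoutError("condition not met within timeout")
        time.sleep(poll)

# Toy usage: the "element" only appears on the third poll.
state = {"polls": 0}

def fake_element_present():
    state["polls"] += 1
    return "element" if state["polls"] >= 3 else None

print(wait_until(fake_element_present, timeout=5, poll=0.01))  # prints "element"
```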

answered Oct 05 '22 by WKPlus


I apply this function to every WebElement I need to use.

import time

from selenium import webdriver
from selenium.common.exceptions import NoSuchElementException

def FindElem(Driver: webdriver, XPath: str, Timeout: int = 300):
    while Timeout > 0:
        try:
            return Driver.find_element_by_xpath(XPath)
        except NoSuchElementException:  # element isn't loaded yet or doesn't exist
            time.sleep(1)
            Timeout -= 1
    raise RuntimeError("Page loading timeout")  # or whatever the hell you want

Usage:

Driver = webdriver.Firefox()
Driver.get("http://somewhere.com/somepage.html")
MyWebElement = FindElem(Driver, "//input[@name='email']")  # raises an exception on timeout
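The retry-until-timeout pattern this answer uses (try, swallow the failure, sleep, decrement a counter) can be shown without Selenium so it runs standalone; `retry_until` and the fetch callable are illustrative names, not Selenium API, and the timeout counts attempts just like FindElem does:

```python
import time

def retry_until(fetch, timeout=300, interval=1.0):
    """Call `fetch` repeatedly, swallowing failures, until it succeeds or
    `timeout` attempts are exhausted (same shape as the FindElem helper)."""
    attempts_left = timeout
    while attempts_left > 0:
        try:
            return fetch()
        except Exception:  # element not loaded yet / doesn't exist
            time.sleep(interval)
            attempts_left -= 1
    raise RuntimeError("Page loading timeout")

# Toy usage: the fetch fails twice before the "page" is ready.
calls = {"n": 0}

def flaky_fetch():
    calls["n"] += 1
    if calls["n"] < 3:
        raise LookupError("not loaded yet")
    return "found"

print(retry_until(flaky_fetch, timeout=10, interval=0.01))  # prints "found"
```

Compared with WebDriverWait, this busy-waits at a fixed interval and hides every exception type, so prefer the explicit-wait approach from the first answer when you can.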
answered Oct 05 '22 by Emil Valeev