Try/except in Python/Selenium still throwing NoSuchElementException error

I am trying to capture some website elements using Selenium in Python, and I am using try/except in case a specific element cannot be found on a particular page. This is all well and good; however, the script still throws a NoSuchElementException even though I am expecting it and have told the script to handle it a certain way or pass.

The only thing I can think of that may be an issue is that this try/except is nested inside another try/except, so that it goes like this:

from selenium.common.exceptions import NoSuchElementException

for month in (month_start, month_end):

    for date in (date_start, date_end):

        try:  # this is just a general try in case there is a breakdown in finding one of the elements
            driver.get(url)
            results = driver.find_elements_by_xpath("""//*[@class="results"]/div""")

            for result in results:
                sample_element = result.find_element_by_xpath("blah").text

                # seems to be where the problem starts
                specific_element = ""
                try:
                    # this is a specific element that I know may not exist
                    specific_element = result.find_element_by_xpath(""".//*[@class="specific"]/div""").text
                except NoSuchElementException:
                    specific_element = ""
                    # I have tried pass instead as well with no luck
                # throws an error here and won't continue if the specific element is not found
        except:
            # pretty much a copy of the above try with some changes
            pass

I generally think that I have a decent understanding of try/except in Python, but this is doing my head in. The "result in results" loop will happily continue until it doesn't find specific_element, and then it just throws "selenium.common.exceptions.NoSuchElementException: Message: no such element: Unable to locate element:".

If being nested inside the try/except is the cause of the whole issue, could you please explain why that is the case, and recommend a possible solution that I could research or implement? Or maybe I have missed something fundamental.

Ryan N. asked Sep 13 '25


1 Answer

I don't do Python, but if it were me, I would remove all the try/catches, replace them with find_elements_*, and check for an empty list. For example,

replace

specific_element = result.find_element_by_xpath(""".//*[@class="specific"]/div""").text

with

elements = result.find_elements_by_xpath(".//*[@class='specific']/div")
if elements:
    specific_element = elements[0].text

This basically just makes sure that an element is found before accessing it, and it lets you avoid all the try/catches. I'm of the opinion that exceptions should be exceptional... rare. They shouldn't be expected or used as flow control, etc.
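
In the context of the question's loop, the replacement might look roughly like this (a sketch only: driver, url, and the XPaths are taken from the question, and defaulting specific_element to an empty string is an assumption carried over from the original code):

driver.get(url)
results = driver.find_elements_by_xpath('//*[@class="results"]/div')

for result in results:
    # find_elements_* returns a list that is simply empty when nothing
    # matches, so no exception is raised and no try/except is needed
    matches = result.find_elements_by_xpath(".//*[@class='specific']/div")
    specific_element = matches[0].text if matches else ""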

JeffC answered Sep 15 '25