from selenium import webdriver

driver = webdriver.Chrome()
driver.set_page_load_timeout(7)

def urlOpen(url):
    try:
        driver.get(url)
        print(driver.current_url)
    except:
        return
Then I have a list of URLs and call the method above on each one.
if __name__ == "__main__":
    urls = ['http://motahari.ir/', 'http://facebook.com', 'http://google.com']
    # This order doesn't print anything
    # urls = ['http://facebook.com', 'http://google.com', 'http://motahari.ir/']
    # This order prints https://www.facebook.com/ and https://www.google.co.kr/?gfe_rd=cr&dcr=0&ei=3bfdWdzWAYvR8geelrqQAw&gws_rd=ssl
    for url in urls:
        urlOpen(url)
The problem is that when 'http://motahari.ir/' throws a TimeoutException, the subsequent requests to 'http://facebook.com' and 'http://google.com' always throw a TimeoutException as well.
The browser keeps waiting for 'motahari.ir/' to load while the loop moves on, so it never actually opens 'facebook.com' and every following call times out too.
Initializing a WebDriver instance takes a long time, so I pulled it out of the method, and I think that is what caused the problem. So, should I reinitialize the WebDriver instance whenever there is a timeout exception? And how? (Since I initialized the driver outside the function, I can't reinitialize it in the except block.)
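One idea I had was to declare the driver global inside the function and rebind it there; a rough sketch of what I mean (assuming the driver stays a module-level global):

from selenium import webdriver
from selenium.common.exceptions import TimeoutException

driver = webdriver.Chrome()
driver.set_page_load_timeout(7)

def urlOpen(url):
    global driver  # allow rebinding the module-level driver
    try:
        driver.get(url)
        print(driver.current_url)
    except TimeoutException:
        # Discard the stuck browser and start a fresh one
        driver.quit()
        driver = webdriver.Chrome()
        driver.set_page_load_timeout(7)

But this seems wasteful, since every timeout pays the full browser startup cost again.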
You just need to clear the browser's cookies before continuing. (Sorry, I missed this in your previous code.)
from selenium import webdriver

driver = webdriver.Chrome()
driver.set_page_load_timeout(7)

def urlOpen(url):
    try:
        driver.get(url)
        print(driver.current_url)
    except:
        driver.delete_all_cookies()
        print("Failed")
        return

urls = ['http://motahari.ir/', 'https://facebook.com', 'https://google.com']
for url in urls:
    urlOpen(url)
Output:
Failed
https://www.facebook.com/
https://www.google.com/?gfe_rd=cr&dcr=0&ei=o73dWfnsO-vs8wfc5pZI
P.S. It is not wise to use a bare try...except without a specific exception type, as this might mask different, unexpected errors.
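For example, catching only the timeout keeps other failures visible. A minimal sketch (TimeoutException lives in selenium.common.exceptions):

from selenium.common.exceptions import TimeoutException

def urlOpen(url):
    try:
        driver.get(url)
        print(driver.current_url)
    except TimeoutException:
        # Only the page-load timeout is expected here; anything else should surface
        driver.delete_all_cookies()
        print("Failed")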