My Python application loads web pages using Selenium WebDriver, roughly 20,000 pages over several hours of work. My problem is that "something" is creating a lot of tmp files and filling my entire hard drive. For example, this morning the application generated 70 GB of tmp files in 6 hours of work :( After rebooting Ubuntu, all these files are gone. I think Firefox is responsible.
This happens on both Linux and OS X.
from selenium import webdriver
from selenium.common.exceptions import NoSuchElementException

def launchSelenium(url):
    profile = webdriver.FirefoxProfile()
    # Route traffic through a local proxy
    profile.set_preference("network.proxy.type", 1)
    profile.set_preference("network.proxy.http", "127.0.0.1")
    profile.set_preference("network.proxy.http_port", 8080)
    profile.set_preference("webdriver.load.strategy", "fast")
    # Block stylesheets and images to speed up page loads
    profile.set_preference("permissions.default.stylesheet", 2)
    profile.set_preference("permissions.default.images", 2)
    # Disable Flash, session restore and the disk cache
    # (boolean prefs, so pass False rather than the string "false")
    profile.set_preference("dom.ipc.plugins.enabled.libflashplayer.so", False)
    profile.set_preference("browser.sessionstore.enabled", False)
    profile.set_preference("browser.cache.disk.enable", False)
    profile.update_preferences()
    driver = webdriver.Firefox(firefox_profile=profile)
    driver.get(url)
    try:
        driver.find_element_by_xpath("//button[@title='Statistics']").click()
    except NoSuchElementException:
        print("Not available")
        driver.close()
        return 0
    driver.close()
    return 1
I added the last two preferences to the Firefox profile to try to solve this problem, but nothing changed.
Am I doing something wrong, or is there a bug in Selenium? Thanks
OK, the solution to the problem is to replace:
driver.close()
with:
driver.quit()
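For reference, here is a minimal sketch of how the fixed function could look, assuming the same old-style Selenium API and settings used in the question (find_element_by_xpath, the firefox_profile argument, the local proxy and the Statistics button XPath). Calling quit() in a finally block shuts down the browser and the driver and removes the temporary copy of the profile from /tmp, whereas close() only closes the current window and leaves those files behind.

from selenium import webdriver
from selenium.common.exceptions import NoSuchElementException

def launchSelenium(url):
    profile = webdriver.FirefoxProfile()
    profile.set_preference("network.proxy.type", 1)
    profile.set_preference("network.proxy.http", "127.0.0.1")
    profile.set_preference("network.proxy.http_port", 8080)
    profile.update_preferences()
    driver = webdriver.Firefox(firefox_profile=profile)
    try:
        driver.get(url)
        try:
            driver.find_element_by_xpath("//button[@title='Statistics']").click()
        except NoSuchElementException:
            print("Not available")
            return 0
        return 1
    finally:
        # quit() ends the whole browser session and cleans up the
        # temporary profile on disk; close() does not.
        driver.quit()

With 20,000 pages, each leaked temporary profile adds up quickly, which matches the tens of gigabytes observed.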