
Capybara/Selenium gets a Net::ReadTimeout randomly on location.reload()

I'm using Capybara, the selenium-webdriver gem, and chromedriver to drive my JavaScript-enabled tests.

The problem is that about 50% of our builds fail due to a Net::ReadTimeout error. At first this manifested as a 'could not find element' error, but after I raised Capybara's default max wait time to 30 seconds, I started seeing the timeout instead.
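For reference, raising the wait is the standard Capybara setting (a config sketch; 30 is the value mentioned above, and the file path is illustrative):

```ruby
# spec/spec_helper.rb (or rails_helper.rb) -- location is illustrative
Capybara.default_max_wait_time = 30 # Capybara's default is 2 seconds
```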

I examined the screenshots from when the timeout happens: the page is stuck on a 'Successfully logged in' modal that we show briefly before calling the JavaScript function location.reload() to reload the page.

I've run the test locally and can sometimes reproduce it, also seemingly at random. Sometimes it zips past this modal and does the reload so fast you can barely see it, and other times it just hangs forever.

I don't think it's an asset compilation issue, since the site has already loaded by that point for the user to reach the login form.

Wondering if anyone has seen this before and knows a solution.

The specific code:

    visit login_path

    page.within '#sign-in-pane__body' do
      fill_in 'Email', with: user.email
      click_button 'Submit'
    end

    expect(page).to have_content 'Enter Password'

    page.within '#sign-in-pane__body' do
      fill_in 'Password', with: user.password
      click_button 'Submit'
    end

    expect(page).to have_text 'Home page landing text'

The hang happens between the second click_button 'Submit' and the expectation of the home page text.

The flow causing the timeout: the user submits the login form, and we wait for the server to render a .js.erb template that fires a JS event on successful login. When that event fires we show a modal saying login was successful, then execute a location.reload().

Zachary Wright asked Apr 20 '17

3 Answers

It turned out this wasn't exclusive to doing a location.reload() in JS. It sometimes happened when just visiting a page.

The solution for me was to create an HTTP client for the selenium driver and specify a longer timeout:

Capybara.register_driver :chrome do |app|
  client = Selenium::WebDriver::Remote::Http::Default.new
  client.read_timeout = 120

  Capybara::Selenium::Driver.new(app, {browser: :chrome, http_client: client})
end
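To actually use the registered driver, Capybara also has to be pointed at it (a minimal sketch; the :chrome name matches the registration above):

```ruby
# Use the driver for JS-tagged tests only:
Capybara.javascript_driver = :chrome
# or make it the driver for everything:
Capybara.default_driver = :chrome
```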
Zachary Wright answered Oct 31 '22

I solved a similar problem by using my own version of the visit method:

def safe_visit(url)
  max_retries = 3
  times_retried = 0
  begin
    visit url
  rescue Net::ReadTimeout => error
    if times_retried < max_retries
      times_retried += 1
      puts "Failed to visit #{url}, retry #{times_retried}/#{max_retries}"
      retry
    else
      puts error.message
      puts error.backtrace.inspect
      exit(1)
    end
  end
end
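The same retry pattern can be factored into a generic helper so it isn't tied to visit; the with_retries name and its keyword arguments below are illustrative, not part of Capybara's API:

```ruby
# Generic retry helper: runs the block, retrying on the given error class
# up to max_retries total attempts, then re-raising the last error.
def with_retries(max_retries: 3, rescue_from: StandardError)
  attempts = 0
  begin
    yield
  rescue rescue_from => error
    attempts += 1
    retry if attempts < max_retries
    raise error
  end
end

# Usage in a Capybara suite (sketch):
#   with_retries(rescue_from: Net::ReadTimeout) { visit login_path }
```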
Leo Yarushin answered Oct 31 '22

Here is how to configure it for headless Chrome:

Capybara.register_driver :headless_chrome do |app|
  client = Selenium::WebDriver::Remote::Http::Default.new
  client.timeout = 120 # instead of the default 60
  options = Selenium::WebDriver::Chrome::Options.new
  options.headless!

  Capybara::Selenium::Driver.new(app, {
    browser: :chrome,
    http_client: client,
    options: options
  })
end

Capybara.default_driver = :headless_chrome
Capybara.javascript_driver = :headless_chrome

Passing the headless argument in capabilities was not working for me:

capabilities = Selenium::WebDriver::Remote::Capabilities.chrome(
   chromeOptions: { args: %w[headless disable-gpu] }
)

Here are more details about why headless in capabilities was not working.
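As an alternative to options.headless!, the same arguments can be passed straight into Chrome::Options at construction time (a sketch, assuming a selenium-webdriver 3.x-era API; not verified against the setup in the answer above):

```ruby
Capybara.register_driver :headless_chrome do |app|
  # args: accepts the same flags that were attempted via capabilities
  options = Selenium::WebDriver::Chrome::Options.new(args: %w[headless disable-gpu])
  Capybara::Selenium::Driver.new(app, browser: :chrome, options: options)
end
```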

Atif Saddique answered Oct 31 '22