I use RSpec 2.6.0 and Capybara 1.1.1 for acceptance testing.
With a view like the following:
<tr>
  <td>Team 3 Name</td>
  <td>true</td>
  <td><a href="/teams/3">Show</a></td>
  <td><a href="/teams/3/edit">Edit</a></td>
  <td><a href="/teams/3">Deactivate</a></td>
</tr>
<tr>
  <td>Team 4 Name</td>
  <td>true</td>
  <td><a href="/teams/4">Show</a></td>
  <td><a href="/teams/4/edit">Edit</a></td>
  <td><a href="/teams/4">Deactivate</a></td>
</tr>
I want to write an acceptance test that states: "Team 3 does NOT have the 'Deactivate' link." I expect the following to fail:
within('tr', :text => 'Team 3 Name') do |ref|
  page.should_not have_selector('a', :text => 'Deactivate')
end
But it passes. To further test what is going on, I wrote the absurd:
lock = false
within('tr', :text => 'Team 3 Name') do |ref|
  page.should have_selector('a', :text => 'Deactivate')
  page.should_not have_selector('a', :text => 'Deactivate')
  lock = true
end
lock.should be_true
It passes as well.
I am assuming from this that the scope of the have_selector() call is not limited by the within() block, but I am not sure why. The Capybara documentation uses this pattern and does not seem to mention any gotchas. What is the correct way to use within to limit the scope of my selector? Thank you. /Salernost
Still learning Capybara myself, but have you tried have_link instead of have_selector? Also, I don't think you need |ref|. For example:
lock = false
within('tr', :text => 'Team 3 Name') do # omit |ref|
  page.should have_link('Deactivate')
  page.should_not have_link('Deactivate')
  lock = true
end
lock.should be_true
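Not part of the original answer, but have_link also accepts an :href filter, which may let you pin the assertion to a specific row's link without scoping at all. A hedged sketch, assuming the paths from the question's markup:

# Hedged sketch: the :href filter restricts the match to Team 3's link,
# so no within block is needed for this particular assertion.
page.should_not have_link('Deactivate', :href => '/teams/3')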
Having come a little further with Capybara, I see several potential issues here:

- within may silently ignore the :text field. You'll notice that the documentation examples only show CSS or XPath finders without additional arguments.
- Even if within does use :text, it may not work here because you are asking it to look at the <tr>, but the text is in the <td>.
- It's quite possible that the page subject still targets the entire page even if you are in a within block. The within examples are mostly about using fill_in or click. The exception is the example under "Beware the XPath // trap."

As for creating a within block, you can either give your table rows unique ids and search for them using CSS, or you may be able to write a specific XPath targeting the first matching row.
The problem with the latter is that you want to use within on the <tr>, but the text you are using for targeting is inside a <td> subelement. So, for example, this XPath should find the table cell containing the text Team 3 Name, but then you are only working within that first cell, not the whole row.
within(:xpath, "//tr/td[normalize-space(text())='Team 3 Name']") do
There are ways to "back up" to a parent element using XPath (a sketch follows after the id example below), but I don't know how to do it myself, and I've read that it's not good practice. I think your best bet here might be to just generate ids so your rows start like this:
<tr id="team_3">
then target them with a simple
within("tr#team_3")
I would also recommend Mark Berry's final approach of adding ids to each of your table rows:
<tr id="team_3">
then target with
within("tr#team_3")
Capybara has given me issues when selecting by XPath; it doesn't seem to work consistently, especially on CI services.
I also want to comment on this section of the same answer:
It's quite possible that the page subject still targets the entire page even if you are in a within block. The within examples are mostly about using fill_in or click. The exception is the example under Beware the XPath // trap.
This may have been the case in an older version, but in the current version of Capybara, calling page inside a within block only inspects the targeted part of the page. So, using Mark's example above:
within("tr#team_3") do
expect(page).to have_content 'Team 3 Name'
# => true
expect(page).to have_content 'Team 4 Name'
# => false
end
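Applied back to the original question, the scoped negative assertion should then behave as expected in current Capybara. A minimal sketch, assuming the row ids suggested above:

within("tr#team_3") do
  # Scoped to the row, so this fails if Team 3 still has the link.
  expect(page).not_to have_link('Deactivate')
end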
have_selector seems to ignore the :text and :content options. I had to use something like this instead:
within 'a' do
  page.should have_content 'Deactivate'
end
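For what it's worth, recent Capybara versions do respect the text filter on have_selector, so the workaround above may no longer be needed. A hedged sketch in the modern expect syntax:

# Hedged sketch: in current Capybara the text filter is honored,
# so this matches only <a> elements whose text is 'Deactivate'.
expect(page).to have_selector('a', text: 'Deactivate')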