The documentation for jQuery's closest() says the following:
.closest( selector [, context ] )
...
context
Type: Element
A DOM element within which a matching element may be found. If no context is passed in then the context of the jQuery set will be used instead.
As I understand it, the last sentence (about the context of the jQuery set being used when none is passed) means that the two statements should be equivalent:
set.closest("a");
set.closest("a", set.context);
where set is some jQuery set.
However, this does not seem to be the case:
var context = $("#inner")[0];
var set = $("#el", context);
// the set's context is correctly the "inner" element
set.text("context: " + set.context.id);
// if the set's context is used, this closest should match nothing, but it matches and sets the color
set.closest("#outer").css("color", "red");
// with the context explicitly set, the "outer" is not found and no background color is set
set.closest("#outer", set.context).css("background-color", "blue");
#outer{
width: 100px;
height: 100px;
border: 1px solid black;
}
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.11.1/jquery.min.js"></script>
<div id="outer">
<div id="inner">
<div id="el"></div>
</div>
</div>
As you can see, when no context is explicitly passed, the set's context does not seem to be used, as the #outer element is still found by closest(). When the context is passed explicitly, #outer is correctly not found.
Is the documentation just incorrect or am I missing something?
This is clearly a bug, and not how it was intended to work.
The source for closest() is
function (selectors, context) {
var cur,
i = 0,
l = this.length,
matched = [],
pos = rneedsContext.test(selectors) || typeof selectors !== "string"
?
jQuery(selectors, context || this.context)
:
0;
for (; i < l; i++) {
for (cur = this[i]; cur && cur !== context; cur = cur.parentNode) {
// Always skip document fragments
if (cur.nodeType < 11 && (pos ? pos.index(cur) > -1 :
// Don't pass non-elements to Sizzle
cur.nodeType === 1 && jQuery.find.matchesSelector(cur, selectors))) {
matched.push(cur);
break;
}
}
}
return this.pushStack(matched.length > 1 ? jQuery.unique(matched) : matched);
}
What's notable is the way pos is defined: it is the collection to be searched for a closest parent element, and rneedsContext is the regex
/^[\x20\t\r\n\f]*[>+~]|:(even|odd|eq|gt|lt|nth|first|last)(?:\([\x20\t\r\n\f]*((?:-\d)?\d*)[\x20\t\r\n\f]*\)|)(?=[^-]|$)/i
If the passed-in selector doesn't match that regex, pos equals 0, no context is used whatsoever, and the check for cur in that collection is skipped altogether, which seems mighty strange.
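To make that gating concrete, here is a small sketch of the branch that decides whether the context is consulted at all. The regex is copied from the jQuery 1.11 source above; usesContext is my own illustrative helper name, not a jQuery API:

```javascript
// `rneedsContext` is copied verbatim from jQuery 1.11's Sizzle;
// `usesContext` is a hypothetical helper for illustration only.
var rneedsContext = /^[\x20\t\r\n\f]*[>+~]|:(even|odd|eq|gt|lt|nth|first|last)(?:\([\x20\t\r\n\f]*((?:-\d)?\d*)[\x20\t\r\n\f]*\)|)(?=[^-]|$)/i;

function usesContext(selector) {
  // In closest(), `pos` is only built as jQuery(selectors, context || this.context)
  // when this returns true; otherwise pos === 0 and the context is ignored.
  return rneedsContext.test(selector) || typeof selector !== "string";
}

console.log(usesContext("#outer"));       // false: plain selector, context ignored
console.log(usesContext("#outer:first")); // true: positional pseudo, context used
console.log(usesContext("> div"));        // true: leading combinator, context used
```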
A quick test shows
var reg = /^[\x20\t\r\n\f]*[>+~]|:(even|odd|eq|gt|lt|nth|first|last)(?:\([\x20\t\r\n\f]*((?:-\d)?\d*)[\x20\t\r\n\f]*\)|)(?=[^-]|$)/i;
reg.test('#outer'); // false, no context used
reg.test('#outer:first'); // true, context used
reg.test('#outer:eq(0)'); // true, context used
So if you add a pseudo-selector, it suddenly uses the context?
I doubt this is what was intended; it seems like a strange thing to do, and it certainly doesn't do what the documentation says it should.
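For what it's worth, the context-bounded behavior the documentation describes is straightforward to express in plain DOM code. The closestWithin helper below is my own sketch modeled on the loop in the source above, not part of jQuery:

```javascript
// Hedged sketch: a context-bounded closest() that walks up parentNode and
// stops at `context`, which is what the documentation says should happen.
// `closestWithin` is a made-up helper name, not a jQuery or DOM API.
function closestWithin(el, selector, context) {
  for (var cur = el; cur && cur !== context; cur = cur.parentNode) {
    // mirror the jQuery source: only test element nodes against the selector
    if (cur.nodeType === 1 && cur.matches(selector)) return cur;
  }
  return null; // context (or the document root) reached before a match
}
```

With the markup from the question, closestWithin(document.getElementById("el"), "#outer", document.getElementById("inner")) returns null, while passing no context walks all the way up and finds the #outer element.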