I noticed that on my site the Google Chrome developer tools “Performance Monitor” shows different numbers than those determined via getEventListeners() and document.querySelectorAll('*').
As you can see in the screenshot below, the number of event listeners calculated in the console is significantly higher than the one shown in the Chrome dev tools Performance Monitor, while the number of DOM nodes is significantly lower.
Any idea how this can be explained?
// Sums up all event listeners attached to the given elements.
// Note: getEventListeners() is only available in the Chrome DevTools console.
function countDomEventListeners (elements) {
  return Array.from(elements).reduce((count, node) => {
    const listeners = getEventListeners(node)
    for (const eventName in listeners) {
      count += listeners[eventName].length
    }
    return count
  }, 0)
}

const elements = document.querySelectorAll('*')
console.log('DOM Nodes:', elements.length)
console.log('DOM event listeners:', countDomEventListeners(elements))
I built a little example with around 1020 DOM nodes and 1000 event listeners on the <li/> elements. Each click removes its own event listener, and after 10 clicks all event listeners are removed.
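For reference, here is a minimal sketch of such a test page. The exact markup and counts are assumptions (10 <li> elements with 100 self-removing click listeners each, plus ~1000 filler <div> nodes), chosen so the totals roughly match the numbers above:

// Assumed test setup, not the original code:
// 10 <li> elements, each with 100 click listeners that remove themselves,
// plus ~1000 filler <div> nodes to reach roughly 1020 DOM nodes in total.
const list = document.createElement('ul')
document.body.appendChild(list)

for (let i = 0; i < 10; i++) {
  const li = document.createElement('li')
  li.textContent = `Item ${i}`
  list.appendChild(li)

  for (let j = 0; j < 100; j++) {
    // Each handler removes itself on the first click it receives.
    const handler = () => li.removeEventListener('click', handler)
    li.addEventListener('click', handler)
  }
}

// Filler nodes so the DOM node count lands around 1020.
for (let i = 0; i < 1000; i++) {
  document.body.appendChild(document.createElement('div'))
}

Clicking an <li> fires its 100 listeners, each of which removes itself, so one click per <li> (10 clicks in total) clears all 1000 listeners. Running countDomEventListeners(document.querySelectorAll('*')) in the console before and after the clicks confirms the expected counts.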
The Google Chrome performance monitor does indeed behave in unexpected ways. It sometimes shows around 35, 50, 150, or more event listeners and ~2000 or ~4000 DOM nodes, while the countDomEventListeners() function from the original question always calculates the expected numbers (~1020 DOM nodes and exactly 1000 event listeners).
When using the "Elements" tab in the Chrome dev tools and hovering over DOM nodes there, both the DOM node and event listener counts in the performance monitor fluctuate heavily and go up. This leads me to the conclusion that the dev tools themselves leak into the performance monitor statistics.
I've reported the problem here.