Are there standards for acceptable loss of beacon data when measuring the performance of web pages, and a standard way to account for these losses?
Many users don't have a browser that supports Navigator.sendBeacon, and even that API can't guarantee lossless reporting.
In some ways, the data that is most likely to be lost is also the most interesting -- very slow pages, bad internet connections, spotty uploads, etc. I'm wondering if there are known methods for accounting for this.
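For reference, the kind of reporting I have in mind is roughly this (the /analytics endpoint and the payload shape are just placeholders):

```js
// Rough sketch of beacon-style reporting with feature detection.
// The /analytics endpoint and the payload shape are placeholders.
function reportMetrics(payload) {
  var body = JSON.stringify(payload);
  if (navigator.sendBeacon) {
    // Queued by the browser, which tries to deliver it even during unload --
    // but delivery is still best-effort and can silently fail.
    navigator.sendBeacon('/analytics', body);
  } else if (window.fetch) {
    // keepalive lets the request outlive the page, where supported.
    fetch('/analytics', { method: 'POST', body: body, keepalive: true }).catch(function () {});
  }
  // Anything older falls through silently -- exactly the loss I'm asking about.
}

window.addEventListener('pagehide', function () {
  reportMetrics({ loadTime: performance.now() });
});
```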
This kind of measurement is also known as real-user monitoring (RUM), real-user metrics, or end-user experience monitoring (EUM). It's a form of passive monitoring that relies on Web-monitoring services continuously observing your system in action, tracking availability, functionality, and responsiveness.
While real-user monitoring tracks the level of satisfaction of your customers in real time, synthetic monitoring tests the performance of your website in a controlled environment by running predefined scripts against it to track service availability, benchmark performance against competitors, and measure response times.
"standards for acceptable loss", is that a joke :), if you refer to W3.org as they say it is a problem for all developers to ensure the data is submitted correctly without loss. but you will find some techniques that are used in hope that everything will work fine "without grantee" :D. also read this
The Beacon specification defines an interface that web developers can use to asynchronously transfer small HTTP data from the User Agent to a web server.
The specification addresses the needs of analytics and diagnostics code that typically attempt to send data to a web server prior to the unloading of the document. Sending the data any sooner may result in a missed opportunity to gather data. However, ensuring that the data has been sent during the unloading of a document is something that has traditionally been difficult for developers.
User agents will typically ignore asynchronous XMLHttpRequests made in an unload handler. To solve this problem, analytics and diagnostics code will typically make a synchronous XMLHttpRequest in an unload or beforeunload handler to submit the data. The synchronous XMLHttpRequest forces the User Agent to delay unloading the document, and makes the next navigation appear to be slower. There is nothing the next page can do to avoid this perception of poor page load performance.
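For illustration, that synchronous-XHR pattern looks roughly like this (the /analytics endpoint is a placeholder):

```js
// Legacy pattern: a synchronous XMLHttpRequest in an unload handler.
// It blocks unloading, and modern browsers increasingly refuse to run it.
window.addEventListener('unload', function () {
  var xhr = new XMLHttpRequest();
  xhr.open('POST', '/analytics', false); // third argument false = synchronous
  xhr.setRequestHeader('Content-Type', 'text/plain');
  xhr.send('event=unload&t=' + Date.now());
});
```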
There are other techniques used to ensure that data is submitted. One such technique is to delay the unload in order to submit data by creating an Image element and setting its src attribute within the unload handler. As most user agents will delay the unload to complete the pending image load, data can be submitted during the unload. Another technique is to create a no-op loop for several seconds within the unload handler to delay the unload and submit data to a server.
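The image trick it mentions is roughly this (again, the endpoint is a placeholder):

```js
// Image-beacon technique: setting an Image src during unload makes most
// browsers delay unloading until the pending image request completes.
window.addEventListener('unload', function () {
  var img = new Image();
  img.src = '/analytics.gif?event=unload&t=' + Date.now();
});
```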
Not only do these techniques represent poor coding patterns, some of them are unreliable and also result in the perception of poor page load performance for the next navigation.
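What the Beacon API gives you instead is a fire-and-forget call that doesn't delay unload. A sketch, with a placeholder endpoint (pagehide tends to fire more reliably than unload in modern browsers):

```js
// Beacon API: the browser queues the request and delivers it asynchronously,
// without blocking the unload of the page. Still best-effort, not guaranteed.
window.addEventListener('pagehide', function () {
  navigator.sendBeacon('/analytics', JSON.stringify({ event: 'pagehide' }));
});
```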