 

Measuring latency

I'm working on a multiplayer project in Java, and I'm trying to refine how I gather latency measurements.

My current setup is to send a batch of UDP packets at regular intervals; each packet gets timestamped by the server and returned, then the latency is calculated and recorded. I take a number of samples and then work out the average to get the latency.

Does this seem like a reasonable solution to work out the latency on the client side?
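For concreteness, the averaging step described above might look something like this (a minimal sketch; the class and method names are illustrative, not from any particular library):

```java
import java.util.List;

public class LatencyAverager {
    /** Average a batch of round-trip samples, in milliseconds. */
    static double averageLatency(List<Long> rttSamplesMs) {
        if (rttSamplesMs.isEmpty()) {
            throw new IllegalArgumentException("no samples");
        }
        long sum = 0;
        for (long rtt : rttSamplesMs) {
            sum += rtt;
        }
        return (double) sum / rttSamplesMs.size();
    }

    public static void main(String[] args) {
        List<Long> samples = List.of(40L, 45L, 38L, 52L, 41L);
        System.out.println(averageLatency(samples)); // 43.2
    }
}
```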

TechyAdam asked Mar 25 '11 16:03


2 Answers

I would have the client timestamp the outgoing packet, and have the response preserve the original timestamp. This way you can compute the roundtrip latency while side-stepping any issues caused by the server and client clocks not being exactly synchronized.
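A minimal sketch of that echo scheme, assuming the server simply copies the payload back unchanged (here a throwaway loopback thread stands in for the game server):

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.ByteBuffer;

public class PingClient {
    static long measureRttNanos() throws Exception {
        // Stand-in echo "server" on an ephemeral localhost port; a real
        // game server would just send the received payload back as-is.
        DatagramSocket server = new DatagramSocket(0);
        Thread echo = new Thread(() -> {
            try {
                byte[] buf = new byte[8];
                DatagramPacket p = new DatagramPacket(buf, buf.length);
                server.receive(p);
                server.send(p); // echo the original timestamp back
            } catch (Exception ignored) {
            }
        });
        echo.start();

        try (DatagramSocket client = new DatagramSocket()) {
            // The client stamps the outgoing packet with its own clock...
            byte[] out = ByteBuffer.allocate(8).putLong(System.nanoTime()).array();
            client.send(new DatagramPacket(out, out.length,
                    InetAddress.getLoopbackAddress(), server.getLocalPort()));

            // ...and compares the echoed stamp against the same clock on
            // receipt, so server/client clock offset never enters into it.
            byte[] in = new byte[8];
            DatagramPacket reply = new DatagramPacket(in, in.length);
            client.setSoTimeout(2000);
            client.receive(reply);
            long sentAt = ByteBuffer.wrap(in).getLong();
            return System.nanoTime() - sentAt;
        } finally {
            server.close();
            echo.join();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println("RTT: " + measureRttNanos() / 1_000_000.0 + " ms");
    }
}
```

Because both timestamps come from the client's own monotonic clock (`System.nanoTime()`), the result is a round-trip time that is immune to any offset between the two machines' clocks.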

NPE answered Oct 05 '22 23:10


You could also timestamp the packets already used by your game protocol, giving you more data points for your statistics. This also avoids the overhead of an extra burst of traffic: you simply reuse the data you are already exchanging.

You could also start using other metrics (for example, variance) to get a more accurate estimate of your connection quality.
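As a sketch of the variance idea (a hypothetical helper, not part of any library), jitter can be estimated from the same RTT samples used for the average:

```java
public class LatencyStats {
    /** Sample variance of RTT measurements (ms^2); high variance means a jittery link. */
    static double variance(long[] rttMs) {
        if (rttMs.length < 2) {
            throw new IllegalArgumentException("need at least two samples");
        }
        double mean = 0;
        for (long v : rttMs) mean += v;
        mean /= rttMs.length;

        double sumSq = 0;
        for (long v : rttMs) {
            double d = v - mean;
            sumSq += d * d;
        }
        return sumSq / (rttMs.length - 1); // unbiased (n - 1) estimator
    }

    public static void main(String[] args) {
        long[] steady = {40, 41, 40, 42, 41};
        long[] spiky  = {40, 95, 38, 120, 41};
        System.out.println(variance(steady)); // low: stable connection
        System.out.println(variance(spiky));  // high: lag spikes
    }
}
```

Two connections with the same average latency can feel very different in play; a low variance indicates a steady link, while a high one flags lag spikes that an average alone would hide.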

Heisenbug answered Oct 06 '22 00:10