Compare client device clock with server clock exactly, up to milliseconds

Tags: javascript, php

I am looking for a way to get the difference between the client clock and the server clock.

So far I have tried the following approach, collecting:

  1. client request time
  2. server time
  3. client response time

The problem is that there is an unknown delay between the request reaching the server and the response reaching the client.

Here's an implementation of this scheme using JavaScript and PHP:

time.js

var request = new XMLHttpRequest();
request.onreadystatechange = readystatechangehandler;
request.open("POST", "http://www.example.com/sync.php", true);
request.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
request.send("original=" + (new Date).getTime());

function readystatechangehandler() {
    var returned = (new Date).getTime();
    if (request.readyState === 4 && request.status === 200) {
        var timestamp = request.responseText.split('|');
        var original = +timestamp[0]; // client time when the request was sent
        var receive = +timestamp[1];  // server time when the request arrived
        var transmit = +timestamp[2]; // server time when the response was sent
        var sending = receive - original;    // apparent request transit time (includes the clock offset)
        var receiving = returned - transmit; // apparent response transit time (includes the clock offset)
        var roundtrip = sending + receiving; // the offsets cancel, leaving the true round-trip time
        var oneway = roundtrip / 2;          // assumes a symmetrical network
        var difference = sending - oneway;   // this is what you want
        // so the server time will be client time + difference
    }
}

Sync.php

<?php
// All timestamps in milliseconds, to match JavaScript's Date.getTime()
$receive = round(microtime(true) * 1000);
echo (int)$_POST["original"] . '|'; // echo back the client's send time
echo $receive . '|';                // server receive time
echo round(microtime(true) * 1000); // server transmit time
?>
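
This is essentially the standard NTP offset calculation: difference = ((receive - original) + (transmit - returned)) / 2. As a quick sanity check with illustrative numbers (not from the original post), suppose the client clock runs 1000 ms behind the server and each network leg takes 40 ms:

var original = 50000;  // client sends at client-time 50000
var receive  = 51040;  // server receives at server-time 51040 (50000 + 1000 + 40)
var transmit = 51050;  // server replies 10 ms later
var returned = 50090;  // client receives at client-time 50090 (51050 - 1000 + 40)

var sending    = receive - original;        // 1040
var receiving  = returned - transmit;       // -960
var oneway     = (sending + receiving) / 2; // 40: the true one-way delay
var difference = sending - oneway;          // 1000: the true clock offset

When the two legs really are equal, the offset comes out exactly; the 50-500 ms error below comes from the legs being unequal in practice.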

Even with this approach I get a 50-500 ms error; the higher the delay, the larger the error.

But I wonder how a company named "AdTruth" claims to be able to differentiate devices based on clock time. They call it "Time Differential Linking":

"The key to device recognition AdTruth-style is its patented technology called TDL, for time-differential linking. While in the billions of connected devices there may be thousands with the same configuration, no two will have their clocks set to the same time -- at least, not when you take it down to the millisecond." Says Ori Eisen, founder of 41st Parameter and AdTruth: "We take these disparate time stamps and compare them to the server master clock. If there is any doubt, the TDL is the tie-breaker."

http://www.admonsters.com/blog/adtruth-joins-w3c-qa-ori-eisen-founder-and-chief-innovation-officer

Here is the link to their "Time Differential Linking" patent:

http://www.google.com/patents/US7853533

Vaibhav Chiruguri asked Aug 07 '13 12:08

2 Answers

It's actually quite simple. First, have the client take a fixed reference time (2005, Jan 31, 18:34:20.050, in milliseconds). Then take the current time on the client machine and calculate the delta between that current time and the fixed time. Send the client time and the delta back to the server. On the server, add the same delta to the same fixed time to get what the server's "current" time would be (no longer current due to response-time lag etc.). The difference between the client's current time and this server current time gives you the time difference between client and server.
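
A minimal sketch of the client side of this scheme, assuming a hypothetical /delta.php endpoint that performs the server-side step (the endpoint name and payload format are illustrative, not from the answer):

// Fixed reference time shared by client and server (2005-01-31 18:34:20.050 UTC).
var FIXED = Date.UTC(2005, 0, 31, 18, 34, 20, 50);

var clientNow = Date.now();
var delta = clientNow - FIXED;

// Send both values to the server.
var xhr = new XMLHttpRequest();
xhr.open("POST", "/delta.php", true);
xhr.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
xhr.send("client=" + clientNow + "&delta=" + delta);

On the server, the same fixed timestamp plus the received delta reconstructs the client's clock reading; subtracting the server's own clock from that gives the offset described in the answer.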

Ron answered Oct 11 '22 13:10

var oneway = roundtrip / 2;

Why do you assume that the network is symmetrical? Actually, it's a fairly reasonable assumption. You could try to calibrate the connection by sending data in both directions to get an estimate of the throughput and latency (see boomerang's bw module for an example of the server-to-client measurement). However, a basic feature of TCP is that the congestion window adapts progressively, so even on a static connection the throughput changes markedly in the early stages of the connection (exactly the point at which you're likely to try to capture the client device identity).

Do try to make sure that the response is less than 1 KB including headers (so that it fits in a single packet) and that keep-alive is enabled. A GET request will be slightly smaller than a POST, although using WebSockets would give a more accurate figure.
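
A rough sketch of the WebSocket variant, assuming a hypothetical echo endpoint at wss://www.example.com/sync that replies with the same original|receive|transmit payload as the question's sync.php (the endpoint is an assumption, not from the answer):

// Once the connection is open there is no per-request HTTP header cost,
// so the round-trip measurement is tighter than with XMLHttpRequest.
var ws = new WebSocket("wss://www.example.com/sync");

ws.onopen = function () {
    ws.send(String(Date.now())); // client send time ("original")
};

ws.onmessage = function (event) {
    var returned = Date.now();
    var parts = event.data.split('|');
    var original = +parts[0];
    var receive = +parts[1];
    var transmit = +parts[2];
    // Same NTP-style offset calculation as in the question:
    var difference = ((receive - original) + (transmit - returned)) / 2;
};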

A more realistic approach would be to capture several samples at known intervals and then calculate the average, e.g.

var estimatedRtt = 300;
for (var x = 0; x < 10; x++) {
    // setTimeout takes the callback first, then the delay;
    // spread the samples out so they don't queue behind each other
    setTimeout(captureOffset, estimatedRtt * x * 1.3);
}
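
captureOffset isn't defined in the answer; here is a sketch of what it might look like, reusing the XMLHttpRequest/sync.php exchange from the question and averaging the collected offsets (the helper and its bookkeeping are assumptions):

var offsets = [];

function captureOffset() {
    var request = new XMLHttpRequest();
    var original = Date.now();
    request.onreadystatechange = function () {
        if (request.readyState === 4 && request.status === 200) {
            var returned = Date.now();
            var parts = request.responseText.split('|');
            var receive = +parts[1];
            var transmit = +parts[2];
            offsets.push(((receive - original) + (transmit - returned)) / 2);
            if (offsets.length === 10) {
                // Average the samples; keeping only the sample with the
                // smallest round trip instead would discard congested outliers.
                var sum = offsets.reduce(function (a, b) { return a + b; }, 0);
                var difference = sum / offsets.length;
            }
        }
    };
    request.open("POST", "http://www.example.com/sync.php", true);
    request.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
    request.send("original=" + original);
}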
symcbean answered Oct 11 '22 14:10