I'm stumped. This makes no sense to me. The following code:
import java.text.SimpleDateFormat;
import java.util.Date;

SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss.SSS");
long lFirst = 1383460981000L;
long lSecond = 1383464581000L;
System.out.println(lFirst);
System.out.println(lSecond);
java.util.Date first = new Date(lFirst);
java.util.Date second = new Date(lSecond);
System.out.println(sdf.format(first));
System.out.println(sdf.format(second));
System.out.println(first.getTime());
System.out.println(second.getTime());
System.out.println("Diff" + (first.getTime() - second.getTime()));
System.out.println("Hours diff: " + (((float)(second.getTime()-first.getTime()))/1000f/60f/60f));
produces the following output:
1383460981000
1383464581000
2013-11-03 01:43:01.000
2013-11-03 01:43:01.000
1383460981000
1383464581000
Diff-3600000
Hours diff: 1.0
How do these two different long values produce the exact same date? I ran into this while performing a data migration from one database type to another and validating the results. I couldn't comprehend the validation failures I was seeing, so I created this little snippet of code to compare the values, and sure enough, two different long values format to the same date. While I'm willing to accept that there's some timezone weirdness going on in my databases, that would not seem to be the issue in this code sample.
This is probably the moment of the daylight saving time change: at 2 AM the clocks are set back to 1 AM, so the hour from 1 AM to 2 AM occurs twice on that day. If you add an hour to 1:43:01, you land on 1:43:01 again, just with a different UTC offset. `SimpleDateFormat` without a zone or offset in its pattern cannot distinguish the two.
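A sketch of this using `java.time`, which keeps the UTC offset visible. The zone `America/Chicago` is an assumption about the JVM's default time zone in the original run (US Central fell back at 2 AM CDT on 2013-11-03, which fits the two instants exactly one hour apart both printing 01:43:01):

```java
import java.time.Instant;
import java.time.ZoneId;
import java.time.ZonedDateTime;

public class DstOverlapDemo {
    public static void main(String[] args) {
        // Assumed zone; the original post does not state it.
        ZoneId zone = ZoneId.of("America/Chicago");

        ZonedDateTime first = Instant.ofEpochMilli(1383460981000L).atZone(zone);
        ZonedDateTime second = Instant.ofEpochMilli(1383464581000L).atZone(zone);

        // Same wall-clock time, different offsets: -05:00 (CDT) vs -06:00 (CST)
        System.out.println(first);
        System.out.println(second);
    }
}
```

With the offset included in the output, the two instants are clearly distinct even though the local date and time fields are identical.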