I wrote a small Java program that performs division on a million pairs of random numbers and calculates the average time per division, running on the same machine under different operating systems. After running the program, I found that on Windows a division takes on average 1.6 * 10^-5 ms, whereas on Linux (Ubuntu 12.04) it takes about a factor of five less, i.e. 3.2 * 10^-6 ms. I'm not sure why the Java program would run so much faster on Ubuntu than on Windows. Is it only a driver issue? Or are arithmetic operations more optimized on Ubuntu?
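The original code isn't shown, but a minimal sketch of that kind of naive benchmark might look something like this (the class name and the use of System.nanoTime with double division are my assumptions):

```java
import java.util.Random;

public class NaiveDivideBenchmark {
    public static void main(String[] args) {
        final int PAIRS = 1_000_000;
        Random rnd = new Random();
        double[] a = new double[PAIRS];
        double[] b = new double[PAIRS];
        for (int i = 0; i < PAIRS; i++) {
            a[i] = rnd.nextDouble();
            b[i] = rnd.nextDouble() + 1e-9; // avoid division by zero
        }

        double sink = 0;                    // consume results so the JIT cannot drop the loop
        long start = System.nanoTime();
        for (int i = 0; i < PAIRS; i++) {
            sink += a[i] / b[i];
        }
        long elapsedNs = System.nanoTime() - start;

        System.out.printf("sink=%f, average per division: %.3e ms%n",
                sink, (elapsedNs / (double) PAIRS) / 1_000_000.0);
    }
}
```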
The actual calculation is done by the processor and it's completely independent of the operating system.
Besides, Java programs run in a Java virtual machine (JVM), not directly on the operating system.
Perhaps more threads were running on Windows, so your program got less processor time.
The times themselves are so small that the difference cannot (and should not) be measured the way you are doing it. Meaningful data is only obtained when the benchmark runs for longer periods, on the same hardware, and so on.
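As a rough sketch of a more careful measurement (the class name, warm-up count, and number of runs here are arbitrary choices of mine, not a prescribed methodology; a dedicated harness such as JMH would be better still), one could warm up the JIT first and then average over many passes:

```java
import java.util.Random;

public class WarmedDivideBenchmark {
    private static final int PAIRS = 1_000_000;

    // Time one pass over the arrays and return elapsed nanoseconds.
    private static long timeOnePass(double[] a, double[] b, double[] sink) {
        long start = System.nanoTime();
        double s = 0;
        for (int i = 0; i < PAIRS; i++) {
            s += a[i] / b[i];
        }
        sink[0] = s;                        // keep the result alive so the JIT cannot drop the loop
        return System.nanoTime() - start;
    }

    public static void main(String[] args) {
        Random rnd = new Random();
        double[] a = new double[PAIRS];
        double[] b = new double[PAIRS];
        for (int i = 0; i < PAIRS; i++) {
            a[i] = rnd.nextDouble();
            b[i] = rnd.nextDouble() + 1e-9; // avoid division by zero
        }
        double[] sink = new double[1];

        // Warm-up passes give the JIT time to compile the hot loop before measuring.
        for (int i = 0; i < 20; i++) {
            timeOnePass(a, b, sink);
        }

        // Measure many passes and report the average time per division.
        final int RUNS = 50;
        long totalNs = 0;
        for (int i = 0; i < RUNS; i++) {
            totalNs += timeOnePass(a, b, sink);
        }
        double avgNsPerDivision = totalNs / (double) RUNS / PAIRS;
        System.out.printf("average per division: %.3e ms (sink=%f)%n",
                avgNsPerDivision / 1_000_000.0, sink[0]);
    }
}
```

Even with this, comparing two operating systems only makes sense on identical hardware, with comparable background load on both machines.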