 

Arithmetical Operations Execution Speed on Linux and Windows

I wrote a small program in Java that performs division on a million pairs of random numbers and computes the average time per divide operation, run on the same machine under different operating systems. On Windows a divide takes on average 1.6 * 10^-5 ms, whereas on Linux (Ubuntu 12.04) it takes 3.2 * 10^-6 ms, i.e. about a factor of 5 less. I'm not sure why a Java program would run so much faster on Ubuntu than on Windows. Is it just a driver difference, or are arithmetic operations more optimized on Ubuntu?
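For reference, a minimal sketch of the kind of benchmark described (this is an assumed structure, not the asker's actual code): generate random pairs, time a tight division loop with `System.nanoTime()`, and report the average per operation.

```java
import java.util.Random;

public class DivideBenchmark {
    // Returns the average time in nanoseconds per division over n random pairs.
    static double avgDivideNanos(int n) {
        Random rng = new Random(42);
        double[] a = new double[n];
        double[] b = new double[n];
        for (int i = 0; i < n; i++) {
            a[i] = rng.nextDouble();
            b[i] = rng.nextDouble() + 1e-9; // keep the divisor away from zero
        }

        double sum = 0; // accumulate so the JIT cannot eliminate the loop as dead code
        long start = System.nanoTime();
        for (int i = 0; i < n; i++) {
            sum += a[i] / b[i];
        }
        long elapsed = System.nanoTime() - start;

        if (sum == Double.MAX_VALUE) System.out.println(sum); // keep 'sum' observably live
        return elapsed / (double) n;
    }

    public static void main(String[] args) {
        // Convert ns/op to ms/op to match the units used in the question.
        System.out.printf("avg per divide: %.3e ms%n", avgDivideNanos(1_000_000) / 1e6);
    }
}
```

Note that a single pass like this is exactly the kind of naive measurement the answers below caution against: it includes JIT warm-up and is sensitive to scheduling noise.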

asked Oct 26 '25 by Shivam


2 Answers

The actual calculation is done by the processor and is completely independent of the operating system.

Besides, Java programs run inside the Java Virtual Machine, which adds another layer between your code and the hardware.

Perhaps you had more threads or background processes running on Windows, so the program got less processor time.

answered Oct 28 '25 by Ionut Hulub


The times themselves are so small that a difference like this cannot (and should not) be measured the way you are doing it. Meaningful data is only obtained from longer runs, repeated trials, and identical hardware and JVM configuration.
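The point above can be sketched as follows (illustrative names, not a definitive harness): warm up the JIT first, then repeat the timed trial several times and look at the spread instead of trusting a single number. For serious work, a dedicated harness such as JMH is the usual choice.

```java
import java.util.Random;

public class SteadierBenchmark {
    // One pass of the workload; the returned checksum keeps the loop live.
    static double trial(double[] a, double[] b) {
        double sum = 0;
        for (int i = 0; i < a.length; i++) sum += a[i] / b[i];
        return sum;
    }

    public static void main(String[] args) {
        int n = 1_000_000;
        Random rng = new Random(1);
        double[] a = new double[n], b = new double[n];
        for (int i = 0; i < n; i++) {
            a[i] = rng.nextDouble();
            b[i] = rng.nextDouble() + 1e-9; // avoid division by zero
        }

        // Warm-up runs: give the JIT a chance to compile the hot loop.
        for (int i = 0; i < 10; i++) trial(a, b);

        // Several timed runs rather than one; compare their spread.
        for (int run = 0; run < 5; run++) {
            long t0 = System.nanoTime();
            double s = trial(a, b);
            long dt = System.nanoTime() - t0;
            System.out.printf("run %d: %.3e ms/op (checksum %f)%n",
                    run, dt / (double) n / 1e6, s);
        }
    }
}
```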

answered Oct 28 '25 by Dhaivat Pandya


