Suppose I have a "Heap Sort" method whose time complexity is O(n log n). When I measure the execution time of this method on 1,000,000 inputs, I get 0.375770669 seconds. How can I calculate the execution time of this method theoretically?
There is no way to calculate this theoretically. It depends on many factors, such as the CPU and memory hardware, the JIT compiler's optimizations, garbage collection, and whatever else the machine is doing at the time.
Even if you knew all of those, the calculation would essentially be a forensic simulation of the Java JIT compiler and hardware execution. It is far too complicated to contemplate.
The best you can reasonably expect to achieve in terms of a theoretical measure of "speed" is to count abstract operations at the source-code level. Even drilling down and counting bytecodes executed is probably too difficult to be practical.
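As an illustration of counting abstract operations, here is a hedged sketch (the class name `HeapSortCount` and the counter are my own, not from the question): a heap sort instrumented to count element comparisons, which you can then set against the textbook bound of roughly 2·n·log2(n) comparisons.

```java
import java.util.Random;

// Hypothetical sketch: heap sort instrumented to count element comparisons,
// giving an abstract, machine-independent operation count.
public class HeapSortCount {

    // Sorts the array in place and returns the number of comparisons made.
    public static long sort(int[] a) {
        long[] comparisons = {0};
        int n = a.length;
        // Build a max-heap bottom-up.
        for (int i = n / 2 - 1; i >= 0; i--) {
            siftDown(a, i, n, comparisons);
        }
        // Repeatedly move the maximum to the end and restore the heap.
        for (int end = n - 1; end > 0; end--) {
            int tmp = a[0]; a[0] = a[end]; a[end] = tmp;
            siftDown(a, 0, end, comparisons);
        }
        return comparisons[0];
    }

    private static void siftDown(int[] a, int i, int n, long[] comparisons) {
        while (2 * i + 1 < n) {
            int child = 2 * i + 1;
            if (child + 1 < n) {
                comparisons[0]++;
                if (a[child + 1] > a[child]) child++; // pick the larger child
            }
            comparisons[0]++;
            if (a[i] >= a[child]) return;            // heap property restored
            int tmp = a[i]; a[i] = a[child]; a[child] = tmp;
            i = child;
        }
    }

    public static void main(String[] args) {
        int n = 1_000_000;
        int[] a = new int[n];
        Random rnd = new Random(42);
        for (int i = 0; i < n; i++) a[i] = rnd.nextInt();

        long c = HeapSortCount.sort(a);
        double bound = 2 * n * (Math.log(n) / Math.log(2));
        System.out.printf("comparisons = %d, 2*n*log2(n) = %.0f%n", c, bound);
    }
}
```

The ratio of the measured comparison count to n·log2(n) is a theoretical quantity you can actually reason about, unlike wall-clock seconds.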
I want to compare between the measured one and the theoretical one.
Basically, you can't.
What you can do instead is run your code for various input sizes, such as 1,000, 10,000, 100,000, 1,000,000, and 10,000,000, and record the time spent sorting. Then plot those recorded times against the number of elements on an X-Y graph and see whether you get an O(n log n) complexity curve.
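A rough sketch of that experiment (class name `ScalingCheck` and the inlined heap sort are my own, not from the question): if the algorithm really is O(n log n), then time(n) divided by n·log2(n) should stay roughly constant as n grows.

```java
import java.util.Random;

// Hypothetical sketch: time a heap sort at several input sizes and check
// whether time(n) / (n * log2 n) stays roughly constant, the signature
// of O(n log n) growth. (Use a proper benchmark harness for real numbers.)
public class ScalingCheck {

    static void heapSort(int[] a) {
        int n = a.length;
        for (int i = n / 2 - 1; i >= 0; i--) siftDown(a, i, n);
        for (int end = n - 1; end > 0; end--) {
            int t = a[0]; a[0] = a[end]; a[end] = t;
            siftDown(a, 0, end);
        }
    }

    static void siftDown(int[] a, int i, int n) {
        while (2 * i + 1 < n) {
            int c = 2 * i + 1;
            if (c + 1 < n && a[c + 1] > a[c]) c++;
            if (a[i] >= a[c]) return;
            int t = a[i]; a[i] = a[c]; a[c] = t;
            i = c;
        }
    }

    public static void main(String[] args) {
        Random rnd = new Random(1);
        for (int n = 1_000; n <= 1_000_000; n *= 10) {
            int[] a = new int[n];
            for (int i = 0; i < n; i++) a[i] = rnd.nextInt();

            long t0 = System.nanoTime();
            heapSort(a);
            long t1 = System.nanoTime();

            double perOp = (t1 - t0) / (n * (Math.log(n) / Math.log(2)));
            System.out.printf("n=%9d  time=%8.3f ms  ns per n*log2(n)=%.3f%n",
                    n, (t1 - t0) / 1e6, perOp);
        }
    }
}
```

Expect noise at small n (JIT warm-up, cache effects), so repeat each measurement and look at the trend, not any single run.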
Also look at this document for heap sort analysis and demo: http://www.inf.fh-flensburg.de/lang/algorithmen/sortieren/heap/heapen.htm