Is there any way to know how many seconds it takes a loop to execute in Java?
For example:
for (int i = 0; i < 1000000; i++) {
    // Some difficult task goes in here
}
It does not have to be 100% accurate; I just want an idea of how long it could take. The algorithm inside is a kind of key generator that writes to a .txt file. I expect it to take a few minutes, so for my first test I want to count the seconds.
You can try this:
long startTime = System.currentTimeMillis();
for (int i = 0; i < 1000000; i++) {
    // Something
}
long endTime = System.currentTimeMillis();
long timeNeeded = (endTime - startTime) / 1000; // elapsed seconds
One way to time an operation is to take an average with nanoTime(). You may want to adjust the number of iterations, and you will get less variation with an average. nanoTime() is better than currentTimeMillis() in that it is more accurate and monotonically increasing (it won't go backwards while the application is running).
long start = System.nanoTime();
int runs = 1000 * 1000;
for (int i = 0; i < runs; i++) {
    // do test
}
long time = System.nanoTime() - start;
System.out.printf("The average time taken was %.1f ns%n", (double) time / runs);
Using printf allows you to format the result. You can divide by 1,000 to get microseconds, or by 1,000,000 to get milliseconds.
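To put the pieces together, here is a minimal, self-contained sketch of the averaging approach. The doWork() method is a hypothetical stand-in for the key-generation work; accumulating its result into a variable that is printed at the end is a simple way to discourage the JIT from eliminating the loop entirely:

```java
public class LoopTimer {
    // Hypothetical stand-in for the real per-iteration work.
    static long doWork(int i) {
        return (long) i * 31 + 7;
    }

    public static void main(String[] args) {
        int runs = 1_000_000;
        long sink = 0; // keep a result so the loop body isn't optimized away

        long start = System.nanoTime();
        for (int i = 0; i < runs; i++) {
            sink += doWork(i);
        }
        long elapsed = System.nanoTime() - start;

        // Report total time in milliseconds and the per-iteration average in nanoseconds.
        System.out.printf("Total: %.1f ms, average: %.1f ns per iteration (sink=%d)%n",
                elapsed / 1_000_000.0, (double) elapsed / runs, sink);
    }
}
```

Note that the first run includes JIT warm-up, so for a more stable average you may want to repeat the whole measurement a few times and keep the later results.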