I'm new to the D language and need to measure the execution time of an algorithm. What are my options? Is there already some built-in solution? I could not find anything conclusive on the web.
The basic approach: record a start time, run the algorithm, record an end time, and subtract. The difference between the end time and the start time is the execution time.

Execution time: the execution time (or CPU time) of a given task is the time the system spends executing that task; in other words, the time during which the program is actually running.

Using <time.h>: the function clock() returns the number of clock ticks since the program started executing. If you divide it by the constant CLOCKS_PER_SEC, you get how long the program has been running, in seconds.
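In D you can reach that same C routine through core.stdc.time. A minimal sketch (myAlgorithm is just a placeholder for the code being measured):

    import core.stdc.time : clock, CLOCKS_PER_SEC;
    import std.stdio : writefln;

    void myAlgorithm()
    {
        // placeholder work standing in for the algorithm under test
        long sum = 0;
        foreach (i; 0 .. 10_000_000)
            sum += i;
    }

    void main()
    {
        auto start = clock();   // clock ticks at start
        myAlgorithm();
        auto end = clock();     // clock ticks at end
        writefln("CPU time: %.3f s",
                 cast(double)(end - start) / CLOCKS_PER_SEC);
    }

Note that clock() measures CPU time rather than wall-clock time.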
One way is to use the -profile command line parameter. After you run the program, it will create a file trace.log where you can find the run time for each function. This will of course slow down your program, as the compiler inserts time-counting code into each of your functions. This method is used to find the relative speed of functions, to identify which ones you should optimize to improve app speed with minimum effort.
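For example (app.d is just a placeholder file name):

    dmd -profile app.d
    ./app

Running the instrumented binary writes trace.log into the working directory; it lists call counts and timings per function.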
A second option is to use the std.datetime.StopWatch class. See the example in the documentation for that class.
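A minimal sketch of that approach (note that in recent D releases StopWatch lives in std.datetime.stopwatch rather than std.datetime):

    import std.datetime.stopwatch : AutoStart, StopWatch;
    import std.stdio : writeln;

    void main()
    {
        auto sw = StopWatch(AutoStart.yes);

        // ... run the algorithm being measured here ...

        sw.stop();
        writeln("Elapsed: ", sw.peek.total!"msecs", " ms");
    }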
Or, even better suited, you can use the std.datetime.benchmark function directly.
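A sketch of how that might look (benchmark has likewise moved to std.datetime.stopwatch in recent releases; candidateA and candidateB are placeholders for whatever you want to compare):

    import std.datetime.stopwatch : benchmark;
    import std.stdio : writeln;

    void candidateA() { /* one version of the algorithm */ }
    void candidateB() { /* another version of the algorithm */ }

    void main()
    {
        // run each candidate 10_000 times; benchmark returns one total Duration per candidate
        auto results = benchmark!(candidateA, candidateB)(10_000);
        writeln("candidateA: ", results[0].total!"msecs", " ms");
        writeln("candidateB: ", results[1].total!"msecs", " ms");
    }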
Don't forget:
-release -O -inline -noboundscheck
Additionally, you may consider using the LDC or GDC compilers. Both of them provide better optimizations and run speed than DMD.
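For example, a DMD release build might look like this (app.d is a placeholder; newer DMD versions spell the last flag -boundscheck=off):

    dmd -release -O -inline -noboundscheck app.d
    ./app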
If your algorithm can be called from the command line, there's a nifty utility written in D that will run your program a number of times and print out the distribution of the times taken, the average, and all sorts of other useful numbers.
It's called avgtime and it's here: https://github.com/jmcabo/avgtime
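Usage is roughly as follows; the -r repetition flag is how I remember it from the project's README, so check the repository for the exact options:

    avgtime -r 30 ./myprogram arg1 arg2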