
Measuring execution time in the D language

I'm new to the D language and need to measure the execution time of an algorithm. What are my options? Is there already some built-in solution? I could not find anything conclusive on the web.

asked Jul 25 '13 by clstaudt

People also ask

How do you calculate Execution time?

Record the time when execution starts and the time when it ends; the execution time is the difference, i.e. the end time minus the start time.
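
For example, a minimal sketch of that start/end pattern in D, using core.time.MonoTime (a monotonic clock, so adjustments to the wall clock don't skew the result):

    import core.time : Duration, MonoTime;
    import std.stdio : writeln;

    void main()
    {
        // Record the start time.
        MonoTime start = MonoTime.currTime;

        foreach (i; 0 .. 1_000_000) {} // placeholder workload

        // Execution time = end time minus start time.
        Duration elapsed = MonoTime.currTime - start;
        writeln("took ", elapsed.total!"msecs", " ms");
    }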

What is execution time of a program?

The execution time (or CPU time) of a given task is the time the system spends executing that task; in other words, the time during which the program is running.

How do you time how long a function takes in C++?

Using <time.h>: the clock() function returns the number of clock ticks since the program started executing. Dividing that count by the constant CLOCKS_PER_SEC gives how long the program has been running, in seconds.
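
The question above is about C++, but the very same C runtime function is callable from D through core.stdc.time; a minimal sketch (the loop is only a placeholder workload):

    import core.stdc.time : clock, clock_t, CLOCKS_PER_SEC;
    import std.stdio : writeln;

    void main()
    {
        clock_t start = clock();

        foreach (i; 0 .. 1_000_000) {} // placeholder workload

        // clock() counts CPU clock ticks; dividing by CLOCKS_PER_SEC
        // converts the tick count to seconds.
        double seconds = cast(double)(clock() - start) / CLOCKS_PER_SEC;
        writeln("CPU time: ", seconds, " s");
    }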


2 Answers

One way is to use the -profile command-line flag. After you run the program, it creates a trace.log file where you can find the run time of each function. This will of course slow down your program, as the compiler inserts time-counting code into each of your functions. This method is used to find the relative speed of functions, to identify which ones you should optimize to improve app speed with minimum effort.
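
For example, a minimal sketch (app.d and work are placeholder names):

    // app.d -- compile with profiling instrumentation:
    //   dmd -profile app.d
    // Running the program then writes per-function timings to trace.log.
    import std.stdio : writeln;

    void work()
    {
        long sum;
        foreach (i; 0 .. 10_000_000)
            sum += i;
        writeln(sum); // use the result so the loop isn't optimized away
    }

    void main()
    {
        work();
    }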

A second option is to use std.datetime.StopWatch, as shown in the sketch below.
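
A minimal sketch, assuming a recent compiler where StopWatch lives in the std.datetime.stopwatch module (older releases had it directly in std.datetime):

    import std.datetime.stopwatch : AutoStart, StopWatch;
    import std.stdio : writeln;

    void main()
    {
        auto sw = StopWatch(AutoStart.yes); // start timing immediately

        foreach (i; 0 .. 1_000_000) {} // placeholder workload

        sw.stop();
        writeln("elapsed: ", sw.peek.total!"msecs", " ms");
    }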

An even better fit may be the std.datetime.benchmark function, which runs the given functions repeatedly for you.
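
A sketch, again assuming the current std.datetime.stopwatch location; algorithmA and algorithmB are placeholder names for the code under test:

    import std.datetime.stopwatch : benchmark;
    import std.stdio : writeln;

    void algorithmA() { foreach (i; 0 .. 1_000) {} }
    void algorithmB() { foreach (i; 0 .. 2_000) {} }

    void main()
    {
        // Run each function 10_000 times; benchmark returns one
        // total Duration per function.
        auto results = benchmark!(algorithmA, algorithmB)(10_000);
        writeln("A: ", results[0].total!"usecs", " us");
        writeln("B: ", results[1].total!"usecs", " us");
    }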

Don't forget:

  1. When benchmarking, use these DMD compiler flags to achieve maximum optimization: -release -O -inline -noboundscheck.
  2. Never benchmark debug builds.
  3. Make sure you don't call any library code inside the benchmarked functions; otherwise you would be benchmarking the performance of the library implementation instead of your own code.

Additionally, you may consider using the LDC or GDC compilers. Both of them provide better optimization, and thus faster executables, than DMD.

answered Oct 21 '22 by Michal Minich


If your algorithm can be called from the command line, there's a nifty utility written in D that will run your program a number of times and print out the distribution of the times taken, their average, and all sorts of other useful numbers.

It's called avgtime and it's here: https://github.com/jmcabo/avgtime

answered Oct 21 '22 by Sir.Rainbow