I am trying to measure the computation time of a function in R using system.time().
I want to run the function a few hundred times to get an average but I don't want
to copy and paste that many times. Is there an easier way to do that?
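For reference, this is roughly the single-call timing I'd otherwise be pasting over and over (myfunction is just a placeholder for my own function):

system.time(myfunction())   # one timing; I'd need hundreds of these by hand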
You can use the replicate() function to run a function n times. For example, you can get 3 sets of 5 numbers from a random normal distribution by setting n to 3 and expr to rnorm(5).
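A minimal sketch of that example:

replicate(3, rnorm(5))   # n = 3, expr = rnorm(5); returns a 5 x 3 matrix, one column per replication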
The microbenchmark package takes a times= option and has the added bonus of being a bit more accurate.
> library(microbenchmark)
> m <- microbenchmark( seq(10)^2, (1:10)^2, times=10000)
> m
Unit: nanoseconds
       expr   min    lq median    uq     max
1  (1:10)^2  2567  3423   3423  4278   41918
2 seq(10)^2 44484 46195  46195 47051 1804147
> plot(m)
And using the not-yet-released autoplot() method for ggplot2:
library(ggplot2)   # the autoplot() generic is provided by ggplot2
autoplot(m)
system.time(replicate( ...stuff... ))
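For example, with rnorm(5) standing in for your own function, dividing the elapsed time by the number of replications gives a rough per-call average:

n <- 1000                              # number of repetitions
system.time(replicate(n, rnorm(5)))    # elapsed / n is the average time per call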
Or: (hey, I'm not ashamed to have the same answer as Dirk.)
require(rbenchmark)
benchmark( stuff... ) # Nice for comparative work
You want to use the rbenchmark package and its function benchmark(), which does just about everything for you.
Here is the first example from its help page:
R> example(benchmark)
bnchmrR> # example 1
bnchmrR> # benchmark the allocation of one 10^6-element numeric vector,
bnchmrR> # replicated 100 times
bnchmrR> benchmark(1:10^6)
    test replications elapsed relative user.self sys.self user.child sys.child
1 1:10^6          100   0.327        1      0.33        0          0         0
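Since the relative column makes comparisons easy, here is a minimal comparative sketch, reusing the same two expressions as in the microbenchmark example above:

library(rbenchmark)
# compare two ways of squaring the integers 1..10; each expression is run 1000 times
benchmark(seq(10)^2, (1:10)^2, replications = 1000)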
For truly expression-level benchmarking, there is also the microbenchmark package.