Background:
This is the "microbenchmark" package for R:
https://cran.r-project.org/web/packages/microbenchmark/index.html
The first line of the reference manual describes the package as providing "Accurate Timing Functions".
One problem with timing alone is the intrinsic time-vs-memory trade-off: some solutions are memory intensive but fast on the CPU, while others are CPU intensive but have a very small memory footprint.
Question:
How can I benchmark/microbenchmark, simultaneously and with good resolution, not only the execution time but also the memory use during execution in R?
Better late than never: you can use bench::mark() to measure both the execution time and the memory usage of code (along with several other metrics). For example (taken from the help page, ?mark):
library(bench)

# Example data: x has 100 values; y is recycled from 10 values
dat <- data.frame(x = runif(100, 1, 1000), y = runif(10, 1, 1000))

# Compare three equivalent ways of selecting the rows where x > 500
mark(
  dat[dat$x > 500, ],
  dat[which(dat$x > 500), ],
  subset(dat, x > 500)
)
#> # A tibble: 3 x 6
#> expression min median `itr/sec` mem_alloc `gc/sec`
#> <bch:expr> <bch:tm> <bch:tm> <dbl> <bch:byt> <dbl>
#> 1 dat[dat$x > 500, ] 21.7µs 23.6µs 40663. 4.15KB 89.7
#> 2 dat[which(dat$x > 500), ] 22.2µs 24.1µs 40228. 2.77KB 92.7
#> 3 subset(dat, x > 500) 36µs 39.2µs 23867. 20.12KB 86.2
Created on 2020-03-02 by the reprex package (v0.3.0)
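If you also want to see how the time/memory trade-off shifts with problem size, the same package offers bench::press(), which runs a bench::mark() comparison over a grid of parameters. The following is a minimal sketch, not taken from the original answer; it reuses the filtering expressions above, and the rows parameter name is just illustrative:

library(bench)

# Run the same comparison at several data sizes so execution time and
# allocated memory can be compared as the problem grows. 'rows' is a
# parameter of the press() grid, referenced inside the expression.
results <- bench::press(
  rows = c(1e3, 1e4, 1e5),
  {
    dat <- data.frame(x = runif(rows, 1, 1000), y = runif(rows, 1, 1000))
    bench::mark(
      dat[dat$x > 500, ],
      dat[which(dat$x > 500), ],
      subset(dat, x > 500)
    )
  }
)

# Each row of 'results' reports min/median time, itr/sec, mem_alloc and
# gc/sec for one expression at one value of 'rows'.
results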