I'm running a large number of iterations in parallel. Certain iterations take much longer (say 100x) than others. I want to time these out, but I'd rather not have to dig into the C code behind the function doing the heavy lifting (call it fun.c). I'm hoping there is something similar to try() but with a time.out option. Then I could do something like:
for (i in 1:1000) { to.return[i] <- try(fun.c(args), time.out = 60) }
So if fun.c took longer than 60 seconds on a given iteration, the revamped try() would just kill it and return a warning or something along those lines.
Anybody have any advice? Thanks in advance.
See this thread: http://r.789695.n4.nabble.com/Time-out-for-a-R-Function-td3075686.html
and ?evalWithTimeout in the R.utils package.
Here's an example:
require(R.utils)

## function that can take a long time
fn1 <- function(x) {
  for (i in 1:x^x) {
    rep(x, 1000)
  }
  return("finished")
}

## test timeout
evalWithTimeout(fn1(3), timeout = 1, onTimeout = "error")  # should be fine
evalWithTimeout(fn1(8), timeout = 1, onTimeout = "error")  # should time out
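Applied to the loop in the question, this could look roughly like the sketch below. Here fun.c, args, and the choice of returning NA for a timed-out iteration are placeholders taken from the question, not real objects; the TimeoutException handler follows the pattern shown in the R.utils documentation (newer versions of R.utils also expose the same functionality as withTimeout()).

require(R.utils)

to.return <- vector("list", 1000)
for (i in 1:1000) {
  to.return[[i]] <- tryCatch(
    ## evaluate the call, signalling a TimeoutException after 60 seconds
    evalWithTimeout(fun.c(args), timeout = 60, onTimeout = "error"),
    TimeoutException = function(ex) {
      warning(sprintf("iteration %d timed out", i))
      NA  # placeholder result for iterations that ran too long
    }
  )
}

One caveat: this relies on R's time-limit/interrupt machinery, so it can only stop compiled code that periodically yields control back to R (e.g. via R_CheckUserInterrupt); a C routine that never does so may run past the timeout.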