 

When is R's `ByteCompile` counter-productive?

Tags:

r

The R docs describe the ByteCompile field in the "DESCRIPTION file" section as:

The ‘ByteCompile’ logical field controls if the package code is to be byte-compiled on installation: the default is currently not to, so this may be useful for a package known to benefit particularly from byte-compilation (which can take quite a long time and increases the installed size of the package)

I infer that the only detrimental side-effects of byte-compiling are (a) time-to-install and (b) installed size. I haven't found a package that takes too long during installation/byte-compiling, and the general consensus is that GBs are cheap (for storage).

Q: When should I choose to not byte-compile packages I write? (Does anybody have anecdotal or empirical limits beyond which they choose against it?)

Edit: As noted in the comments of an older question, the rationale that debugging is not possible with byte-compiled code has been debunked. Other related questions on SO have discussed how to do it (either manually with R CMD INSTALL --byte-compile ... or with install.packages(..., type="source", INSTALL_opts="--byte-compile")), but have not discussed the ramifications of or arguments against doing so.
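For reference, the two installation routes mentioned above can be spelled out as follows (a sketch; the package name `mypkg` is a placeholder):

```r
# From a shell: install a source package with byte-compilation forced on
#   R CMD INSTALL --byte-compile mypkg_1.0.tar.gz

# From within R: pass the same flag through install.packages()
install.packages("mypkg", type = "source", INSTALL_opts = "--byte-compile")

# Or opt in permanently by adding this line to the package's DESCRIPTION:
#   ByteCompile: true
```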

asked Jun 20 '16 by r2evans

1 Answer

I have yet to find a downside for byte-compiling, other than the ones you mention: slightly increased file size and installation time.

In the past, byte-compiling certain code could cause slow-downs, but in recent versions of R (>= 3.3.0) this no longer appears to be a problem.
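A quick way to check whether a function has been byte-compiled (not part of the original answer, just a diagnostic sketch): printing a compiled closure shows a `<bytecode: ...>` line after its body.

```r
# Define a plain function, then byte-compile it explicitly using the
# base 'compiler' package (shipped with R).
f <- function(x) x + 1
f <- compiler::cmpfun(f)

# Printing f now shows a "<bytecode: 0x...>" line after the body,
# indicating a byte-compiled version exists.
print(f)

# Functions from a package that was byte-compiled on installation
# print the same marker, e.g. print(stats::lm).
```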

answered Oct 20 '22 by csgillespie