I've moved on to a new server and installed R version 3.0 on it (the gplots library was no longer available for 2.14).
Using a script that worked under version 2.14, I now encounter a problem generating a heatmap.
In R version 3 I get these errors:
Error in lapply(args, is.character) : node stack overflow
Error in dev.flush() : node stack overflow
Error in par(op) : node stack overflow
In R version 2.14 I get an error:
Error: evaluation nested too deeply: infinite recursion / options(expressions=)?
This I can resolve by increasing options(expressions=500000).
In R version 3, increasing this option does not resolve the issue, and I'm still stuck with the same error.
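For reference, this is the call I use to raise the limit (500000 is the maximum value R accepts for this option):

options(expressions=500000)
getOption("expressions")  ## confirm the setting took effect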
The script is the same for both:
y=read.table("test", row.names=1, sep="\t", header=TRUE)
hr <- hclust(dist(as.matrix(y)))
hc <- hclust(dist(as.matrix(t(y))))
mycl <- cutree(hr, k=7)
mycolhc <- rainbow(length(unique(mycl)), start=0.1, end=0.9)
mycolhc <- mycolhc[as.vector(mycl)]
install.packages("gplots")
library("gplots", character.only=TRUE)
myheatcol <- redgreen(75)
pdf("heatmap.pdf")
heatmap.2(as.matrix(y), Rowv=as.dendrogram(hr), Colv=as.dendrogram(hc),
          col=myheatcol, scale="none", density.info="none", trace="none",
          RowSideColors=mycolhc, labRow=FALSE)
dev.off()
Where "test" is a tdl file with headers and row names and a 40*5000 0/1 matrix
Any help would be appreciated
PS: When I reduce my data set to 2000 lines I no longer get the error.
PPS: Increasing the data set to 2500 lines resulted in the same error; however, removing all non-informative lines (all 1s) left me with 3700 lines, and using this data set did not result in the error.
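In case it helps, a data set of the same shape can be simulated and written out like this (synthetic 0/1 data, not the actual file, so it may or may not reproduce the error):

set.seed(1)
m <- matrix(sample(0:1, 5000 * 40, replace=TRUE), nrow=5000,
            dimnames=list(paste0("r", 1:5000), paste0("c", 1:40)))
## written so it can be read back by the read.table() call above
write.table(m, file="test", sep="\t", quote=FALSE, col.names=NA)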
I'm the author of the gplots package. The 'node stack overflow' error occurs when a byte-compiled function recurses too deeply.
In this case, it occurs because the function that plots dendrogram objects (stats:::plotNode) is implemented using a recursive algorithm and the dendrogram object is deeply nested.
Ultimately, the correct solution is to modify plotNode to use an iterative algorithm, which will prevent the recursion depth error from occurring.
In the short term, it is possible to force stats:::plotNode to be run as interpreted code rather than byte-compiled code via a nasty hack.
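You can see that plotNode is indeed byte-compiled by printing it; a byte-compiled closure ends with a <bytecode: ...> line (the address varies from session to session):

stats:::plotNode
## ...function source...
## <bytecode: 0x...>
## <environment: namespace:stats>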
Here's the recipe:
## Convert a byte-compiled function to an interpreted-code function
unByteCode <- function(fun)
{
    FUN <- eval(parse(text=deparse(fun)))
    environment(FUN) <- environment(fun)
    FUN
}

## Replace a function definition inside a locked environment **HACK**
assignEdgewise <- function(name, env, value)
{
    unlockBinding(name, env=env)
    assign(name, envir=env, value=value)
    lockBinding(name, env=env)
    invisible(value)
}

## Replace a byte-compiled function in a locked environment with an
## interpreted-code function
unByteCodeAssign <- function(fun)
{
    name <- gsub('^.*::+', '', deparse(substitute(fun)))
    FUN <- unByteCode(fun)
    retval <- assignEdgewise(name=name,
                             env=environment(FUN),
                             value=FUN)
    invisible(retval)
}
## Use the above functions to convert stats:::plotNode to interpreted code:
unByteCodeAssign(stats:::plotNode)

## Now raise the interpreted-code recursion limit (you may need to adjust this,
## decreasing it if it uses too much memory, increasing it if you still get a
## recursion depth error).
options(expressions=5e4)

## heatmap.2 should now work properly:
heatmap.2( ... )
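For example, applied to the script from the question (same y, hr, hc, myheatcol, and mycolhc objects as above; this is a sketch of how the pieces fit together, not a change to the original call):

unByteCodeAssign(stats:::plotNode)  ## force plotNode to run interpreted
options(expressions=5e4)            ## raise the interpreted recursion limit
pdf("heatmap.pdf")
heatmap.2(as.matrix(y), Rowv=as.dendrogram(hr), Colv=as.dendrogram(hc),
          col=myheatcol, scale="none", density.info="none", trace="none",
          RowSideColors=mycolhc, labRow=FALSE)
dev.off()

The deparse()/parse() round trip in unByteCode works because the deparsed source text carries no byte-code, so evaluating it again yields a plain interpreted closure, whose recursion depth is governed by options(expressions=) rather than by the byte-code interpreter's fixed node stack.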
From another answer: the error comes from stats:::midcache.dendrogram's function setmid. setmid calls itself recursively, and this recursion might simply be too deep (probably the dendrogram is too dense to make any sense visually?). You can see where the error occurs by looking at the last few lines of traceback() after the error occurs.
To make further progress with this, you need to provide a minimal reproducible example (using heatmap rather than heatmap.2, or something even more refined based on your interpretation of traceback()), perhaps by making the data file available, or by providing a recipe to simulate the data (m <- matrix(runif(1000), 40)?) in a way that reliably reproduces the error.
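A sketch of such a minimal example, under the (unverified) assumption that random 0/1 data of the reported shape produces a similarly deep dendrogram:

set.seed(42)
m <- matrix(sample(0:1, 5000 * 40, replace=TRUE), nrow=5000)
heatmap(m, scale="none")   ## plain stats::heatmap, no gplots dependency
traceback()                ## run after the error to see the failing call chain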