
r - Error in heatmap.2 (gplots)

I've moved to a new server and installed R version 3.0 on it (the gplots library was no longer available for 2.14).

Using a script that worked in version 2.14, I now encounter a problem generating a heatmap.

In R version 3 I get an error:

Error in lapply(args, is.character) : node stack overflow
Error in dev.flush() : node stack overflow
Error in par(op) : node stack overflow

In R version 2.14 I get an error:

Error: evaluation nested too deeply: infinite recursion / options(expressions=)?

I can resolve this by increasing options(expressions=500000).

In R version 3, increasing this option does not resolve the issue, and I'm still stuck with the same error.

The script is the same for both:

## Read the 0/1 matrix (whitespace/tab-delimited, with header and row names)
y <- read.table("test", row.names=1, sep="", header=TRUE)

## Cluster rows and columns
hr <- hclust(dist(as.matrix(y)))
hc <- hclust(dist(as.matrix(t(y))))

## Cut the row dendrogram into 7 clusters and map each cluster to a colour
mycl <- cutree(hr, k=7)
mycolhc <- rainbow(length(unique(mycl)), start=0.1, end=0.9)
mycolhc <- mycolhc[as.vector(mycl)]

install.packages("gplots")  # only needed once
library("gplots", character.only=TRUE)
myheatcol <- redgreen(75)

pdf("heatmap.pdf")
heatmap.2(as.matrix(y), Rowv=as.dendrogram(hr), Colv=as.dendrogram(hc),
          col=myheatcol, scale="none", density.info="none", trace="none",
          RowSideColors=mycolhc, labRow=FALSE)
dev.off()

Where "test" is a tdl file with headers and row names and a 40*5000 0/1 matrix

Any help would be appreciated.

PS: When I reduce my data set to 2000 lines, I no longer get the error.

PPS: Increasing the data set to 2500 lines resulted in the same error; however, removing all non-informative lines (rows that are all 1s) left me with 3700 lines, and using that data set did not result in the error.
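
For reference, a minimal sketch of that filtering step, assuming y is the 0/1 data frame read in by the script above (rows consisting entirely of 1s carry no information for the clustering):

## Sketch only: drop rows that are all 1s before clustering
## (assumes y is the 0/1 data frame produced by read.table above)
keep <- rowSums(as.matrix(y)) < ncol(y)
y <- y[keep, ]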

See Question&Answers more detail:os

与恶龙缠斗过久,自身亦成为恶龙;凝视深渊过久,深渊将回以凝视…
Welcome To Ask or Share your Answers For Others

1 Answer


I'm the author of the gplots package. The 'node stack overflow' error occurs when a byte-compiled function has too many recursive calls.

In this case, it occurs because the function that plots dendrogram objects (stats:::plotNode) is implemented using a recursive algorithm and the dendrogram object is deeply nested.
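
To see how deep that nesting gets, a small helper (an illustration only, not part of stats or gplots) can measure the depth of the dendrogram, which roughly corresponds to the recursion depth stats:::plotNode needs:

## Illustrative helper (not part of stats/gplots): depth of a dendrogram.
## Note that this helper is itself recursive, so it is only for inspection.
dendDepth <- function(d) {
    if (is.leaf(d)) return(1L)
    1L + max(vapply(d, dendDepth, integer(1)))
}
dendDepth(as.dendrogram(hr))  # large values are what trigger the node stack overflow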

Ultimately, the correct solution is to modify plotNode to use an iterative algorithm, which will prevent the recursion depth error from occurring. A sketch of that idea follows.
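
As a sketch of the recursive-to-iterative idea only (not a drop-in plotNode replacement), a recursive dendrogram walk can be rewritten with an explicit stack so that nesting depth no longer consumes call frames:

## Sketch of the idea (not a plotNode replacement): walk a dendrogram with an
## explicit stack instead of function recursion.
countLeavesIteratively <- function(d) {
    stack <- list(d)
    n <- 0L
    while (length(stack) > 0L) {
        node <- stack[[length(stack)]]   # pop the last node
        stack[[length(stack)]] <- NULL
        if (is.leaf(node)) {
            n <- n + 1L
        } else {
            for (child in node) stack[[length(stack) + 1L]] <- child  # push children
        }
    }
    n
}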

In the short term, it is possible to force stats:::plotNode to be run as interpreted code rather than byte-compiled code via a nasty hack.

Here's the recipe:

## Convert a byte-compiled function to an interpreted-code function 
unByteCode <- function(fun)
    {
        FUN <- eval(parse(text=deparse(fun)))
        environment(FUN) <- environment(fun)
        FUN
    }

## Replace function definition inside of a locked environment **HACK** 
assignEdgewise <- function(name, env, value)
    {
        unlockBinding(name, env=env)
        assign( name, envir=env, value=value)
        lockBinding(name, env=env)
        invisible(value)
    }

## Replace byte-compiled function in a locked environment with an interpreted-code
## function
unByteCodeAssign <- function(fun)
    {
        name <- gsub('^.*::+','', deparse(substitute(fun)))
        FUN <- unByteCode(fun)
        retval <- assignEdgewise(name=name,
                                 env=environment(FUN),
                                 value=FUN
                                 )
        invisible(retval)
    }

## Use the above functions to convert stats:::plotNode to interpreted-code:
unByteCodeAssign(stats:::plotNode)

## Now raise the interpreted-code recursion limit (you may need to adjust this,
## decreasing it if it uses too much memory, increasing it if you get a recursion depth error).
options(expressions=5e4)

## heatmap.2 should now work properly 
heatmap.2( ... )
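
For the script in the question, that means running unByteCodeAssign(stats:::plotNode) and options(expressions=5e4) after library(gplots) but before the pdf()/heatmap.2()/dev.off() block; the heatmap.2 call itself does not need to change.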
