How do I create a progress bar for data loading in R?

Is it possible to create a progress bar for data loaded into R using load()?

For a data analysis project, large matrices are being loaded into R from .RData files that take several minutes to load. I would like a progress bar to monitor how much longer it will be before the data is loaded. R already has nice progress bar functionality integrated (see the sketch below), but load() has no hooks for monitoring how much data has been read. If I can't use load() directly, is there an indirect way to create such a progress bar? Perhaps by loading the .RData file in chunks and reassembling it in R? Does anyone have any thoughts or suggestions on this?
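(For context, the built-in progress bar I'm referring to works roughly like this -- a minimal, stand-alone sketch with a dummy workload, not tied to load() in any way:)

pb <- txtProgressBar(min = 0, max = 100, style = 3)
for (i in 1:100) {
    Sys.sleep(0.01)           # stand-in for one unit of work
    setTxtProgressBar(pb, i)
}
close(pb)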


1 Answer


I came up with the following solution, which will work for files smaller than R's maximum vector length of 2^31 - 1 bytes, since the whole file ends up in a single raw vector.

The R object needs to be serialized and saved to a file, as done by the following code.

saveObj <- function(object, file.name){
    outfile <- file(file.name, "wb")
    serialize(object, outfile)
    close(outfile)
}
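(A note on the design choice: a file written by save() is a compressed workspace image rather than a bare serialization stream, which is why unserialize() on raw bytes needs a file produced by serialize() as above. Assuming "temp.RData" was written with saveObj() as in the example further down, the stream can also be read straight from the connection -- a minimal sanity check:)

infile <- file("temp.RData", "rb")
obj <- unserialize(infile)   # reads the bare serialize() stream directly
close(infile)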

Then we read the binary data in chunks, keeping track of how much is read and updating the progress bar accordingly.

loadObj <- function(file.name){
    library(foreach)
    library(iterators)  # provides icount()
    filesize <- file.info(file.name)$size
    chunksize <- ceiling(filesize / 100)
    pb <- txtProgressBar(min = 0, max = 100, style = 3)
    infile <- file(file.name, "rb")
    # Read the file in 100 chunks, updating the progress bar after each one,
    # then concatenate the raw chunks back into a single vector.
    data <- foreach(it = icount(100), .combine = c) %do% {
        setTxtProgressBar(pb, it)
        readBin(infile, "raw", chunksize)
    }
    close(infile)
    close(pb)
    return(unserialize(data))
}

The code can be run as follows:

> a <- 1:100000000
> saveObj(a, "temp.RData")
> b <- loadObj("temp.RData")
  |======================================================================| 100%
> all.equal(b, a)
[1] TRUE

If we benchmark the progress-bar method against reading the file in a single chunk, the progress-bar version is slightly slower, but not enough to worry about.

> infile <- file("temp.RData", "rb")
> system.time(unserialize(readBin(infile, "raw", file.info("temp.RData")$size)))
   user  system elapsed
  2.710   0.340   3.062
> system.time(b <- loadObj("temp.RData"))
  |======================================================================| 100%
   user  system elapsed
  3.750   0.400   4.154

So while the above method works, I feel it is of little practical use because of the file-size restriction: progress bars matter precisely for the large files that take a long time to read in.

It would be great if someone could come up with something better than this solution!
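(Not a real fix for the size limit, but for what it's worth, here is a variant sketch of the same idea that drops the foreach/iterators dependency and advances the bar by bytes actually read. It still buffers the whole file into one raw vector, so the same restriction applies. The name loadObj2 is just for illustration.)

loadObj2 <- function(file.name){
    filesize  <- file.info(file.name)$size
    chunksize <- ceiling(filesize / 100)
    pb <- txtProgressBar(min = 0, max = filesize, style = 3)
    infile <- file(file.name, "rb")
    chunks <- list()
    bytes.read <- 0
    repeat {
        chunk <- readBin(infile, "raw", chunksize)
        if (length(chunk) == 0) break      # end of file
        chunks[[length(chunks) + 1]] <- chunk
        bytes.read <- bytes.read + length(chunk)
        setTxtProgressBar(pb, bytes.read)
    }
    close(infile)
    close(pb)
    unserialize(unlist(chunks, use.names = FALSE))
}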

