I use read.csv.ffdf from the ff package to load an 830 MB CSV file, which has about 8,800,000 rows and 19 columns:
library(ff)
library(ffbase)
green_2018_ff <- read.csv.ffdf(file = "green_2018.csv", header = TRUE)
But when I check the size of green_2018_ff using object_size from the pryr package, the object takes about 1.13 GB in memory:
library(pryr)
object_size(green_2018_ff) # 1.13 GB
I thought an ffdf was only a memory-mapping object, so it should be very small in memory, much smaller than the original CSV. Is there anything wrong with my code or data? Thanks.
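To illustrate the kind of check involved, here is a minimal, self-contained sketch (assuming ff, ffbase, and pryr are installed; it builds a small throwaway CSV rather than using the real green_2018.csv). It contrasts base::object.size, which reports only the shallow size of the R object, with pryr::object_size, which also follows the environments in which ff keeps its metadata and caches, and it lists the disk files that actually hold the data:

```r
library(ff)
library(ffbase)
library(pryr)

# Build a small throwaway CSV so the sketch is self-contained.
csv_path <- tempfile(fileext = ".csv")
write.csv(data.frame(x = 1:1000, y = runif(1000)),
          csv_path, row.names = FALSE)

# Load it as an ffdf: the column data lives in disk-backed files.
small_ff <- read.csv.ffdf(file = csv_path, header = TRUE)

# Shallow size: does not follow environments.
print(object.size(small_ff))

# Deep size: pryr follows environments, so ff's internal metadata
# (and any pages currently held in RAM) are counted as well.
print(object_size(small_ff))

# The actual data sits in these backing files on disk:
print(sapply(physical(small_ff), filename))
```

The point of the comparison is that the two measurements answer different questions: object.size approximates the footprint of the wrapper object itself, while object_size walks everything reachable from it, which for an ffdf can include cached chunks and metadata environments.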
question from:
https://stackoverflow.com/questions/65929763/why-the-ffdf-object-is-so-large