
performance - How to speed up GLM estimation?

I am using RStudio 0.97.320 (R 2.15.3) on Amazon EC2. My data frame has 200k rows and 12 columns.

I am trying to fit a logistic regression with approximately 1500 parameters.

R is using only about 7% of the CPU on a machine with 60+ GB of memory, and the fit is still taking a very long time.

Here is the code:

glm.1.2 <- glm(formula = Y ~ factor(X1) * log(X2) * (X3 + X4 * (X5 + I(X5^2)) * (X8 + I(X8^2)) + ((X6 + I(X6^2)) * factor(X7))), 
  family = binomial(logit), data = df[1:150000,])

Any suggestions to speed this up by a significant amount?
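
For context, one way to see how many coefficients that formula actually expands to is to build the design matrix on a slice of the data (a sketch assuming the question's data frame df; not part of the original post):

# Expand the right-hand side of the formula into a design matrix on a subset
# of rows and count its columns (one column per model coefficient).
X <- model.matrix(~ factor(X1) * log(X2) * (X3 + X4 * (X5 + I(X5^2)) * (X8 + I(X8^2)) + ((X6 + I(X6^2)) * factor(X7))),
  data = df[1:10000,])
ncol(X)  # roughly the ~1500 parameters mentioned above
dim(X)

At 150,000 rows and about 1,500 columns, the dense design matrix alone is on the order of 1.8 GB, which goes a long way toward explaining the memory use and run time.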


1 Answer


You could try the speedglm function from the speedglm package. I haven't used it on problems as large as the one you describe, but especially if you install an optimized BLAS library (as @Ben Bolker suggested in the comments) it should be easy to use and give you a nice speed bump.
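
A minimal sketch of how that might look with the question's data and formula (speedglm takes the same formula, family and data arguments as glm; df is the data frame from the question):

library(speedglm)

# Sketch only: the same model as the question, fitted with speedglm, which
# runs the iteratively reweighted least squares updates with faster linear algebra.
fit <- speedglm(formula = Y ~ factor(X1) * log(X2) * (X3 + X4 * (X5 + I(X5^2)) * (X8 + I(X8^2)) + ((X6 + I(X6^2)) * factor(X7))),
  family = binomial(logit), data = df[1:150000,])

summary(fit)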

I remember seeing a table benchmarking glm and speedglm, with and without a performance-tuned BLAS library, but I can't find it now. It convinced me that I would want both.
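
If you do switch to a tuned BLAS, it is worth confirming that R is actually linked against it. In recent R versions (3.4 and later; the question's R 2.15.3 predates this) sessionInfo() reports the BLAS and LAPACK libraries in use, and a large matrix multiply is a quick before/after timing check:

# Reports the BLAS/LAPACK libraries R is linked against (R >= 3.4).
sessionInfo()

# Rough benchmark: time a dense matrix multiply before and after switching BLAS.
m <- matrix(rnorm(2000 * 2000), 2000, 2000)
system.time(m %*% m)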

