statistics - R-squared in lm() for zero-intercept model

I ran lm() in R and this is the result of summary():

Multiple R-squared:  0.8918,    Adjusted R-squared:  0.8917 
F-statistic:  9416 on 9 and 10283 DF,  p-value: < 2.2e-16

It looks like a good model, but if I calculate the R^2 manually I get this:

library(miscTools)  # provides rSquared()

model <- lm(S ~ 0 + C + HA + L1 + L2, data = train)
pred  <- predict(model, train)

# Centered R^2, computed by hand
rss <- sum((model$fitted.values - train$S)^2)
tss <- sum((train$S - mean(train$S))^2)
1 - rss/tss
## [1] 0.247238

rSquared(train$S, (train$S - model$fitted.values))
##          [,1]
## [1,] 0.247238

What's wrong?

str(train[,c('S','C','HA','L1','L2')])
Classes ‘tbl_df’, ‘tbl’ and 'data.frame':   10292 obs. of  5 variables:
 $ S         : num  19 18 9 12 12 8 21 24 9 8 ...
 $ C         : Factor w/ 6 levels "D","E","F","I",..: 4 4 4 4 4 4 4 4 4 4 ...
 $ HA        : Factor w/ 2 levels "A","H": 1 2 1 1 2 1 2 2 1 2 ...
 $ L1        : num  0.99 1.41 1.46 1.43 1.12 1.08 1.4 1.45 0.85 1.44 ...
 $ L2        : num  1.31 0.63 1.16 1.15 1.29 1.31 0.7 0.65 1.35 0.59 ...
Question from: https://stackoverflow.com/questions/65926198/rsquared-in-linear-regresion-using-r


1 Answer


You are fitting a model without an intercept (the ~0 on the right-hand side of your formula). For no-intercept models, summary.lm() computes R^2 against an uncentered total sum of squares, i.e. it uses sum(y^2) instead of sum((y - mean(y))^2), so the reported value is not comparable to the centered R^2 you computed by hand and is usually misleadingly high. This post explains it very well: https://stats.stackexchange.com/a/26205/99681
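
To see where the 0.8918 comes from, here is a minimal sketch reusing the model and train data from the question (the exact values are assumed, not re-run here), comparing the two definitions of R^2:

rss <- sum(residuals(model)^2)

# Uncentered TSS: what summary.lm() uses when the model has no intercept
tss_uncentered <- sum(train$S^2)
1 - rss/tss_uncentered        # ~0.8918, the value reported by summary()

# Centered TSS: the usual definition, and what you computed by hand
tss_centered <- sum((train$S - mean(train$S))^2)
1 - rss/tss_centered          # ~0.247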

