
machine learning - How to interpret MSE in Keras Regressor

I am trying to build a model to predict house prices.

I have some features X (number of bathrooms, etc.) and a target Y (ranging from roughly $300,000 to $800,000).

I have used sklearn's StandardScaler to standardize Y before fitting the model.
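Roughly, the scaling step looks like this (variable names are just for illustration; sc is the StandardScaler instance I refer to further down):

from sklearn.preprocessing import StandardScaler
import numpy as np

# Y is a 1-D array of raw house prices (~$300,000 to $800,000)
sc = StandardScaler()
Y_scaled = sc.fit_transform(Y.reshape(-1, 1))  # StandardScaler expects a 2-D array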

Here is my Keras model:

from keras.models import Sequential
from keras.layers import Dense

def build_model():
    model = Sequential()
    model.add(Dense(36, input_dim=36, activation='relu'))
    model.add(Dense(18, activation='relu'))
    model.add(Dense(1, activation='sigmoid'))
    model.compile(loss='mse', optimizer='sgd', metrics=['mae', 'mse'])
    return model
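Training then happens on the scaled target; roughly something like this (epochs and batch size are placeholder values):

# X is the feature matrix, Y_scaled the standardized target from above
model = build_model()
model.fit(X, Y_scaled, epochs=100, batch_size=32, verbose=0)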

I am having trouble interpreting the results -- what does an MSE of 0.617454319755 mean?

Do I have to inverse-transform this number and take the square root, getting an error of 741.55 dollars?

math.sqrt(sc.inverse_transform([mse]))

I apologise for sounding silly as I am starting out!


1 Answer


"I apologise for sounding silly as I am starting out!"

Do not; this is a subtle issue of great importance, which is usually (and regrettably) omitted in tutorials and introductory expositions.

Unfortunately, it is not as simple as taking the square root of the inverse-transformed MSE, but it is not that complicated either; essentially what you have to do is:

  1. Transform back your predictions to the initial scale of the original data
  2. Get the MSE between these inverse-transformed predictions and the original data
  3. Take the square root of the result

in order to get a performance indicator of your model that will be meaningful in the business context of your problem (e.g. US dollars here).
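Wrapped up as a small helper, the recipe might look like the following sketch (scaler is assumed to be the StandardScaler that was fitted on the original target; the toy example below then performs the same steps by hand):

import numpy as np
from sklearn.metrics import mean_squared_error

def rmse_in_original_units(y_true, y_pred_scaled, scaler):
    # step 1: bring the predictions back to the original scale of the data
    y_pred = scaler.inverse_transform(np.asarray(y_pred_scaled).reshape(-1, 1))
    # step 2: MSE between the original data and the inverse-transformed predictions
    mse = mean_squared_error(y_true, y_pred)
    # step 3: square root, to get back to the original units (e.g. dollars)
    return np.sqrt(mse)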

Let's see a quick example with toy data, omitting the model itself (which is irrelevant here, and in fact can be any regression model - not only a Keras one):

from sklearn.preprocessing import StandardScaler
from sklearn.metrics import mean_squared_error
import numpy as np

# toy data
X = np.array([[1,2], [3,4], [5,6], [7,8], [9,10]])
Y = np.array([3, 4, 5, 6, 7])

# feature scaling
sc_X = StandardScaler()
X_train = sc_X.fit_transform(X)

# outcome scaling:
sc_Y = StandardScaler()
Y_train = sc_Y.fit_transform(Y.reshape(-1, 1))
Y_train
# array([[-1.41421356],
#        [-0.70710678],
#        [ 0.        ],
#        [ 0.70710678],
#        [ 1.41421356]])

Now, let's say that we fit our Keras model (not shown here) using the scaled sets X_train and Y_train, and get predictions on the training set:

prediction = model.predict(X_train) # scaled inputs here
print(prediction)
# [-1.4687586  -0.6596055   0.14954728  0.95870024  1.001172  ]

The MSE reported by Keras is actually the scaled MSE, i.e.:

MSE_scaled = mean_squared_error(Y_train, prediction)
MSE_scaled
# 0.052299712818541934

while the 3 steps I have described above are simply:

MSE = mean_squared_error(Y, sc_Y.inverse_transform(prediction))  # first 2 steps, combined
MSE
# 0.10459946572909758
np.sqrt(MSE)  # 3rd step
# 0.323418406602187

So, in our case, if our initial Y were US dollars, the actual error of the model in those same units would be about 0.32 dollars.

Notice how the naive approach of inverse-transforming the scaled MSE would give a very different (and incorrect) result:

np.sqrt(sc_Y.inverse_transform([MSE_scaled]))
# array([2.25254588])
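The reason is that inverse_transform simply multiplies by the standard deviation of Y and adds back its mean -- operations that are meaningful for raw target values, but not for a squared error. You can verify that the figure above is exactly that quantity:

# inverse_transform(MSE_scaled) is just MSE_scaled * std(Y) + mean(Y),
# which is not an error measure in any meaningful sense
naive = MSE_scaled * sc_Y.scale_[0] + sc_Y.mean_[0]
np.sqrt(naive)
# 2.2525... (the same meaningless number as above)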
