RMSE vs. standard error

RMSE is the square root of the mean squared error. The standard error of estimate is SEE = sqrt(SSE / (n - k - 1)). So does RMSE = SEE?

Way to confuse. Throw in a quant question and stare at the blank faces of candidates. By the way, I'd think the answer to your question is NO. SEE is the standard deviation of the error terms: SEE = sqrt(variance of error) = sqrt(SSE / (n - k - 1)), whereas MSE = SSE / (n - k - 1).
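For anyone who wants to see it with numbers, here is a minimal Python sketch (the toy data and variable names are made up for illustration) that fits a one-predictor regression and computes SEE as sqrt(SSE / (n - k - 1)), i.e. the square root of the MSE as defined above:

```python
import numpy as np

# Toy data with one predictor (k = 1), so df = n - k - 1 = n - 2.
# The numbers and names are illustrative, not from the thread.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8, 12.2])
n, k = len(y), 1

# Fit y = b0 + b1*x by ordinary least squares.
b1, b0 = np.polyfit(x, y, 1)
residuals = y - (b0 + b1 * x)

sse = np.sum(residuals ** 2)   # sum of squared errors
mse = sse / (n - k - 1)        # MSE with the n - k - 1 denominator
see = np.sqrt(mse)             # SEE = sqrt(SSE / (n - k - 1)) = sqrt(MSE)

print(see)
```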

They are not the same thing, but they are closely related. RMSE is based on the MEAN of the squared errors, not their total; it is an average error.

RMSE is sqrt(MSE). Same thing as far as I can tell. It's a tool used to gauge in-sample and out-of-sample forecasting accuracy. A low RMSE relative to another model = better forecasting.
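To make the forecasting-accuracy point concrete, here is a small sketch (the actuals and the two competing model forecasts are invented) using the plain 1/n convention for RMSE:

```python
import numpy as np

def rmse(actual, forecast):
    # Root mean squared error with a plain 1/n denominator,
    # the usual convention when scoring forecasts.
    actual, forecast = np.asarray(actual), np.asarray(forecast)
    return np.sqrt(np.mean((actual - forecast) ** 2))

# Hypothetical out-of-sample actuals and two competing forecasts.
actual  = [10.2, 11.1, 9.8, 10.5]
model_a = [10.0, 11.3, 9.9, 10.4]
model_b = [ 9.1, 12.0, 8.7, 11.3]

print(rmse(actual, model_a))  # the lower RMSE is the better forecaster
print(rmse(actual, model_b))
```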

As is SEE.

So it boils down to whether MSE = SSE / n or MSE = SSE / (n - k - 1). On an ANOVA table you will find the mean sum of squares, and the associated degrees of freedom are n - k - 1. I think the denominator for MSE is n, the denominator in the SEE is n - k - 1, and that's my story.
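A quick sketch of that last point, using made-up residuals: the only difference between the two conventions is the denominator, so the two numbers converge as n grows but are not equal in small samples:

```python
import numpy as np

# Made-up residuals from some regression with one predictor (k = 1).
residuals = np.array([0.1, -0.2, 0.3, -0.1, -0.2, 0.1])
n, k = len(residuals), 1
sse = np.sum(residuals ** 2)

rmse_n = np.sqrt(sse / n)            # denominator n
see_df = np.sqrt(sse / (n - k - 1))  # denominator n - k - 1

# The two agree as n grows large relative to k, but differ in small samples.
print(rmse_n, see_df)
```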