
Standard Error of Estimate vs Standard Error of Forecast

Does anyone know why we have to use the standard error of forecast (sf) rather than the standard error of estimate (SEE) in a confidence interval calculation?


Pretty sure those two things are not the same, but I can't remember the details right now.


Yeap :D. I think they must be different.

Normally we use the standard error of a parameter to calculate its confidence interval, e.g. Xbar +/- sigma/sqrt(n) for the sample mean.
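That mean-style interval can be sketched quickly in Python. This is just an illustration with made-up numbers; the 1.96 multiplier assumes a normal approximation (a t-critical value would be used for small samples on the exam):

```python
import numpy as np

# Hypothetical sample (illustrative numbers only)
sample = np.array([9.8, 10.2, 10.1, 9.9, 10.4, 10.0])
n = len(sample)

xbar = sample.mean()
s = sample.std(ddof=1)        # sample standard deviation
se_mean = s / np.sqrt(n)      # standard error of the sample mean

z = 1.96                      # ~95% interval, normal approximation
lo, hi = xbar - z * se_mean, xbar + z * se_mean
print(lo, hi)
```

The key point is that sigma/sqrt(n) is the standard error *of the parameter estimate* (here, the mean), which is exactly the role sf plays when the thing being interval-estimated is a forecast of Y.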

As I understand it, SEE is the standard error of the estimated Y, yet we don't use SEE in the confidence interval.
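The difference can be shown numerically. SEE measures the scatter of the residuals around the fitted line, while sf also adds the uncertainty in the fitted line itself and grows as the forecast point x0 moves away from the sample mean of X; that is why the forecast interval uses sf. A minimal sketch with hypothetical data (the standard textbook formula sf = SEE * sqrt(1 + 1/n + (x0 - xbar)^2 / sum((x - xbar)^2))):

```python
import numpy as np

# Hypothetical sample data (illustrative only)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8, 12.3])
n = len(x)

# OLS slope and intercept
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()

# SEE: standard deviation of the residuals (n - 2 degrees of freedom)
resid = y - (b0 + b1 * x)
see = np.sqrt(np.sum(resid ** 2) / (n - 2))

# Standard error of forecast at a new point x0: inflates SEE for
# (a) sampling error in the estimated line and
# (b) the distance of x0 from the sample mean of x
x0 = 7.0
sf = see * np.sqrt(1 + 1 / n + (x0 - x.mean()) ** 2 / np.sum((x - x.mean()) ** 2))

print(see, sf)  # sf is always larger than SEE
```

Because the term under the square root is always greater than 1, sf > SEE, so a prediction interval built with SEE alone would be too narrow.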

I am so confused :))

Can anyone help me, please?