# Conceptually, what is difference between standard error of estimate and standard error of forecast?

I know that SEE is used to calculate SEF, which, in turn, is used to construct confidence intervals for the predicted value.  I’m looking for more of a conceptual explanation of the difference between what is measured by the two.


SEE = Standard error of estimate (the standard deviation of the error terms). The more variation there is, the less precise your model is overall.

Together, they cover the two types of uncertainty in your regression model: the scatter of the observations around the regression line (captured by SEE), and the uncertainty in the estimated coefficients (which SEF adds on top of SEE).
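As a rough numerical sketch of the distinction (plain Python, simple linear regression; the data below are made up for illustration): SEE measures the spread of residuals around the fitted line, while SEF at a chosen x₀ inflates SEE to also account for coefficient uncertainty, growing as x₀ moves away from the sample mean of x.

```python
import math

# Made-up illustrative data (roughly y = 2x)
x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1, 11.9]
n = len(x)

x_bar = sum(x) / n
y_bar = sum(y) / n
s_xx = sum((xi - x_bar) ** 2 for xi in x)
b1 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / s_xx
b0 = y_bar - b1 * x_bar

# SEE: standard deviation of the residuals (overall model precision),
# with n - 2 degrees of freedom for a one-regressor model
sse = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))
see = math.sqrt(sse / (n - 2))

# SEF at a particular x0: SEE inflated for coefficient uncertainty;
# it is smallest at x_bar and grows as x0 moves away from the mean
def sef(x0):
    return see * math.sqrt(1 + 1 / n + (x0 - x_bar) ** 2 / s_xx)

print(round(see, 4))
print(round(sef(x_bar), 4))  # smallest possible SEF: at the mean of x
print(round(sef(10.0), 4))   # larger: far from the mean of x
```

Note that SEF is always at least as large as SEE (the factor under the square root is ≥ 1), which is the conceptual point: a forecast interval must account for both the noise in y and the imprecision of the estimated line.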

cfageist wrote:

When you say “correlation coefficients,” do you mean regression coefficients (i.e., b0, b1, etc.)?  If so, I thought that the standard errors of the regression coefficients are not the same as the standard error of forecast?

cfageist wrote:
SEE = Standard error (standard deviation of errors terms). The more variation there, the less precise is your model overall.

SEF = Addresses the variability in your ~~correlation~~ regression coefficients (~~betas~~ slopes and intercept).

There: that’s better.

Simplify the complicated side; don't complify the simplicated side.

Financial Exam Help 123: The place to get help for the CFA® exams
http://financialexamhelp123.com/

S2000magician wrote:

cfageist wrote:
SEE = Standard error (standard deviation of errors terms). The more variation there, the less precise is your model overall.

SEF = Addresses the variability in your ~~correlation~~ regression coefficients (~~betas~~ slopes and intercept).

There: that’s better.

Thank you for correcting!

My pleasure.
