# Conceptually, what is the difference between standard error of estimate and standard error of forecast?

I know that SEE is used to calculate SEF, which, in turn, is used to construct confidence intervals for the predicted value.  I’m looking for more of a conceptual explanation of the difference between what is measured by the two.


SEE = Standard error (the standard deviation of the error terms). The more variation there is, the less precise your model is overall.

SEF = Addresses the variability in your correlation coefficients (betas).

Together they cover the two types of uncertainty you have in your regression model.
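To make the distinction concrete, here is a minimal numeric sketch for a simple linear regression (toy data and variable names are my own, not from the thread): the SEE is the standard deviation of the residuals, while the SEF at a new x-value inflates the SEE to also account for the sampling variability of the estimated coefficients.

```python
import numpy as np

# Toy data for a simple linear regression y = b0 + b1*x + e
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 2.9, 4.2, 4.8, 6.1, 6.9])
n = len(x)

# OLS estimates of intercept and slope
b1 = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
b0 = y.mean() - b1 * x.mean()
resid = y - (b0 + b1 * x)

# SEE: dispersion of the residuals (n - 2 degrees of freedom
# for a simple regression, since two coefficients were estimated)
see = np.sqrt(np.sum(resid**2) / (n - 2))

# SEF at a new observation x_new: the SEE inflated by the
# uncertainty in the estimated intercept and slope
x_new = 7.0
sef = see * np.sqrt(
    1 + 1 / n + (x_new - x.mean()) ** 2 / ((n - 1) * np.var(x, ddof=1))
)

print(see, sef)  # the SEF is always larger than the SEE
```

Note that the SEF grows as `x_new` moves away from the sample mean of x: forecasts far from the data are less precise because coefficient-estimation error is magnified there.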

cfageist wrote:

When you say “correlation coefficients”, do you mean regression coefficients (i.e., b0, b1, etc.)? If so, I thought that the standard errors of the regression coefficients were not the same as the standard error of forecast?

cfageist wrote:
SEE = Standard error (the standard deviation of the error terms). The more variation there is, the less precise your model is overall.

SEF = Addresses the variability in your ~~correlation~~ regression coefficients (~~betas~~ slopes and intercept).

There: that’s better.
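Putting the two pieces together: for a simple linear regression, the forecast variance at a new value $X_f$ is the standard textbook result (notation mine, not from the thread):

$$
s_f^2 = \mathrm{SEE}^2 \left[ 1 + \frac{1}{n} + \frac{(X_f - \bar{X})^2}{(n-1)\,s_X^2} \right]
$$

The leading $1$ inside the brackets is the error-term variance (what the SEE measures on its own); the $\frac{1}{n}$ and $\frac{(X_f - \bar{X})^2}{(n-1)s_X^2}$ terms are the extra variance contributed by estimating the intercept and slope from a finite sample, which is why the SEF always exceeds the SEE.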

Simplify the complicated side; don't complify the simplicated side.

Financial Exam Help 123: The place to get help for the CFA® exams
http://financialexamhelp123.com/

S2000magician wrote:

cfageist wrote:
SEE = Standard error (the standard deviation of the error terms). The more variation there is, the less precise your model is overall.

SEF = Addresses the variability in your ~~correlation~~ regression coefficients (~~betas~~ slopes and intercept).

There: that’s better.

Thank you for correcting!

My pleasure.
