Conceptually, what is the difference between the standard error of estimate (SEE) and the standard error of forecast (SEF)?
I know that SEE is used to calculate SEF, which, in turn, is used to construct confidence intervals for the predicted value. I'm looking for a more conceptual explanation of the difference between what the two measure.
SEE = standard error of the estimate (the standard deviation of the error terms). The more variation there is, the less precise your model is overall.
SEF = Addresses the variability in your correlation coefficients (betas).
Together they cover the two types of uncertainty you have in your regression model.
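To make the distinction concrete, here's a minimal sketch for simple linear regression (the data below is made up purely for illustration): SEE comes straight from the residuals, while SEF inflates SEE to account for the sampling error in the estimated coefficients.

```python
import math

# Hypothetical illustrative data
X = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
Y = [2.1, 2.9, 4.2, 4.8, 6.1, 6.9]

n = len(X)
x_bar = sum(X) / n
y_bar = sum(Y) / n

# OLS slope and intercept for a simple linear regression
s_xx = sum((x - x_bar) ** 2 for x in X)
b1 = sum((x - x_bar) * (y - y_bar) for x, y in zip(X, Y)) / s_xx
b0 = y_bar - b1 * x_bar

# SEE: standard deviation of the residuals
# (divide by n - 2: two parameters were estimated)
residuals = [y - (b0 + b1 * x) for x, y in zip(X, Y)]
see = math.sqrt(sum(e ** 2 for e in residuals) / (n - 2))

# SEF at a chosen X value: SEE scaled up for coefficient-sampling
# uncertainty. The further x_new is from x_bar, the larger the inflation.
x_new = 8.0
sef = see * math.sqrt(1 + 1 / n + (x_new - x_bar) ** 2 / s_xx)

print(f"SEE = {see:.4f}, SEF at X = {x_new} = {sef:.4f}")
```

Note that SEF always exceeds SEE: even if you knew the error-term variance exactly, your forecast would still carry extra uncertainty because b0 and b1 are themselves estimates.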
When you say "correlation coefficients", do you mean regression coefficients (i.e., b0, b1, etc.)? If so, I thought that the standard errors of the regression coefficients are not the same as the standard error of forecast?
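As I understand it, the standard simple-regression formula shows why the two are related but not the same: SEF starts from SEE and adds terms for the sampling error in the estimated coefficients,

$$ s_f^2 = \text{SEE}^2 \left[ 1 + \frac{1}{n} + \frac{(X - \bar{X})^2}{(n-1)\,s_X^2} \right] $$

where the $1/n$ and $(X - \bar{X})^2$ terms reflect the uncertainty in the estimated intercept and slope, not the coefficients' own standard errors directly.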
There: that’s better.
Simplify the complicated side; don't complify the simplicated side.
Financial Exam Help 123: The place to get help for the CFA® exams
http://financialexamhelp123.com/
Thank you for correcting!
My pleasure.