Sum of Squared Errors (SSE) vs Standard Error of Estimate (SEE)
What’s the difference?
The Schweser notes seem to define them similarly. I'm sure I'm overlooking something.
Thanks!
SEE is the square root of SSE. Correct me if I'm wrong.
Actually, SEE = square root of MSE.
MSE = SSE / (n - k - 1), where n is the number of observations and k is the number of independent variables.
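For concreteness, here's a minimal sketch in Python of how the three quantities relate (the residuals and k are made up, just for illustration):

```python
import math

# Hypothetical residuals from a fitted regression (actual - predicted)
residuals = [0.8, -1.2, 0.5, -0.3, 1.1, -0.9, 0.4, -0.4]
n = len(residuals)   # number of observations
k = 1                # number of independent variables

# SSE: sum of the squared errors
sse = sum(e ** 2 for e in residuals)

# MSE: SSE divided by its degrees of freedom, n - k - 1
mse = sse / (n - k - 1)

# SEE: the square root of MSE
see = math.sqrt(mse)

print(f"SSE = {sse:.4f}, MSE = {mse:.4f}, SEE = {see:.4f}")
```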
Ah, true.
The way I think about it: SSE is a total error amount, i.e., the sum of all squared errors. It grows with the number of observations and isn't comparable across samples, much like variance. But if you "standardize" it, as you do when going from variance to standard deviation, you get a measure that is comparable regardless of the sample: by dividing the sum by its degrees of freedom and taking the square root, you put the errors back on the same scale as the data, which makes SEE more useful.
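To see that comparability point in numbers, a quick sketch (the residuals here are made up): repeating the same-sized errors over four times as many observations roughly quadruples SSE, but SEE stays on about the same scale.

```python
import math

def sse_and_see(residuals, k=1):
    """Return (SSE, SEE) for residuals from a regression with k independent variables."""
    sse = sum(e ** 2 for e in residuals)
    return sse, math.sqrt(sse / (len(residuals) - k - 1))

small = [0.8, -1.2, 0.5, -0.3, 1.1, -0.9]   # made-up residuals
large = small * 4                            # same-sized errors, 4x the observations

for name, r in [("small", small), ("large", large)]:
    sse, see = sse_and_see(r)
    print(f"{name}: n={len(r)}, SSE={sse:.2f}, SEE={see:.3f}")
```

Running this, SSE goes from about 4.44 to 17.76 while SEE stays near 1, which is exactly why SEE is the more useful number for comparing fit.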