Sum of Square Errors (SSE) vs Standard Error of Estimate (SEE)

What’s the difference?

The Schweser notes seem to define them similarly. I'm sure I'm overlooking something.

Thanks!


SEE is the sqrt of SSE.

Correct me if I'm wrong

MrSmart wrote:

SEE is the sqrt of SSE.

Actually, SEE = Square root of MSE.

MSE = SSE / (n-k-1).
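Spelled out in the usual regression notation (this is just the standard textbook formulation, with n observations and k independent variables):

```latex
% Y_i is the observed value, \hat{Y}_i the fitted value from the regression.
\mathrm{SSE} = \sum_{i=1}^{n} \left( Y_i - \hat{Y}_i \right)^2,
\qquad
\mathrm{MSE} = \frac{\mathrm{SSE}}{n - k - 1},
\qquad
\mathrm{SEE} = \sqrt{\mathrm{MSE}} = \sqrt{\frac{\mathrm{SSE}}{n - k - 1}}
```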

Gurifissu wrote:

MrSmart wrote:

SEE is the sqrt of SSE.

Actually, SEE = Square root of MSE.

MSE = SSE / (n-k-1).

Ah, true.

Correct me if I'm wrong

The way I think about it is that SSE is an error amount in a total sense, i.e. it is the sum of all the squared errors. It grows with the size of your sample and isn't comparable across data sets, much like variance. But if you "standardize" it, the way you do when going from variance to standard deviation, you get a measure that is comparable regardless of the sample: by dividing the sum by its degrees of freedom and taking the square root, you put the errors back on the same scale as the dependent variable, and it is much more useful that way.
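To make that concrete, here's a minimal Python sketch (the data is made up purely for illustration) showing how SSE, MSE, and SEE relate for a simple regression:

```python
import math

# Observed values and fitted values from a regression (hypothetical data)
y     = [3.0, 5.0, 7.0, 9.0, 11.0, 13.0]
y_hat = [3.2, 4.7, 7.1, 9.3, 10.6, 13.1]

n = len(y)  # number of observations
k = 1       # number of independent variables (simple regression)

# SSE: sum of squared residuals -- a total, scale-dependent quantity
sse = sum((yi - yhi) ** 2 for yi, yhi in zip(y, y_hat))

# MSE: SSE divided by its degrees of freedom, n - k - 1
mse = sse / (n - k - 1)

# SEE: square root of MSE -- the standard deviation of the residuals,
# in the same units as the dependent variable
see = math.sqrt(mse)

print(f"SSE = {sse:.4f}, MSE = {mse:.4f}, SEE = {see:.4f}")
```

Running this gives SSE = 0.4000, MSE = 0.1000, SEE = 0.3162, which shows why SEE is the one you can actually interpret against the scale of your dependent variable.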