Sum of Squared Errors (SSE) vs Standard Error of Estimate (SEE)

What’s the difference?

The Schweser notes seem to define them similarly. I'm sure I'm overlooking something.

Thanks!

SEE is the sqrt of SSE.

Actually, SEE = square root of MSE.

MSE = SSE / (n-k-1), where n is the number of observations and k is the number of independent variables.
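For anyone who wants to see it in numbers, here's a quick Python sketch (the data and variable names are made up, not from Schweser):

```python
import numpy as np

# Hypothetical data: one independent variable, so k = 1
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1, 11.9])
n, k = len(y), 1

# Fit y = b0 + b1*x by ordinary least squares
b1, b0 = np.polyfit(x, y, 1)
residuals = y - (b0 + b1 * x)

sse = np.sum(residuals ** 2)  # sum of squared errors (a total)
mse = sse / (n - k - 1)       # standardized by degrees of freedom
see = np.sqrt(mse)            # back on the scale of y

print(f"SSE = {sse:.4f}, MSE = {mse:.4f}, SEE = {see:.4f}")
```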

Ah, true.

The way I think about it: SSE is a total error amount, i.e., the sum of all the squared errors. It grows with the sample, so it has no comparability across datasets, much like variance. But if you "standardize" it, the way standard deviation standardizes variance, you get a measure that is comparable regardless of the sample: by dividing the sum by the degrees of freedom and taking the square root, you put the errors back on the same scale as the dependent variable, and it is much more useful this way. (See the sketch below.)
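A quick illustration of that point, assuming hypothetical data with a fixed noise level: SSE balloons as the sample grows, while SEE stays close to the true error scale (1.5 here).

```python
import numpy as np

rng = np.random.default_rng(0)
for n in (20, 200, 2000):
    x = rng.uniform(0, 10, n)
    y = 2.0 * x + 1.0 + rng.normal(0, 1.5, n)  # true noise sd = 1.5
    b1, b0 = np.polyfit(x, y, 1)
    sse = np.sum((y - (b0 + b1 * x)) ** 2)
    see = np.sqrt(sse / (n - 1 - 1))           # k = 1 regressor
    print(f"n={n:5d}  SSE={sse:10.2f}  SEE={see:.3f}")
```

SSE roughly scales with n, but SEE hovers near 1.5 in every run, which is exactly why it's the comparable measure.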