Is there a mathematical link between the standard error of estimate and the sum of square error?

SEE = sqrt(SSE / (n - 2))


So SSE = Sum (yi - yi')^2 = Sum (yi - b0 - b1*xi)^2 ?

Correct. Note that the degrees of freedom aren't always (n - 2). If you have more independent variables and still fit an intercept, the DF are (n - k - 1), where n is the sample size, k is the number of estimated parameters aside from the intercept, and the 1 accounts for the intercept. Simple linear regression is just the special case k = 1, which gives (n - 2).
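To make the link concrete, here is a minimal sketch in Python with made-up data: fit a simple regression, compute SSE as the sum of squared residuals, then take SEE = sqrt(SSE / (n - k - 1)) with k = 1 slope parameter.

```python
import numpy as np

# Hypothetical illustrative data (not from the original discussion)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Fit y_hat = b0 + b1*x; polyfit returns [b1, b0] (highest degree first)
b1, b0 = np.polyfit(x, y, 1)
y_hat = b0 + b1 * x

# Sum of squared errors: Sum (yi - yi')^2
sse = np.sum((y - y_hat) ** 2)

# Standard error of estimate: sqrt(SSE / (n - k - 1)), here k = 1 slope
n, k = len(x), 1
see = np.sqrt(sse / (n - k - 1))

print(f"SSE = {sse:.6f}, SEE = {see:.6f}")
```

With k = 1 the denominator n - k - 1 reduces to n - 2, matching the formula above.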

Keep in mind the various names used for SSE and SEE: SSE is also called the residual sum of squares (RSS), and SEE is often called the standard error of the regression or residual standard error. The terminology is the most confusing part of the whole thing.