- 1. One of the assumptions of the multiple regression model is that the variance of the residual term is constant across all observations. What does ‘variance’ mean here? Can I equate it with ‘squared residuals’? Since one observation only has one residual term, am I right? 2. In time-series analysis, autocorrelation analysis is often conducted on AR models, and a list of lags is presented. For example, the autocorrelation at lag 12 is 0.012334. What does lag 12 mean??? I’m so confused…
- 1. Yes, variance is the squared residual: (x_t − x̄)², where x̄ = average of x_t. 2. Lag 12 -> Observation 12 − Observation 11. Say it is monthly returns data: Lag 12 = Return of Month 12 − Return of Month 11.
Hey CP, that really helps, thanks! A small question though: by Lag 12 = Return of Month 12 − Return of Month 11, do you mean ‘the error term of month 12 − the error term of month 11’?
I believe the lag refers to the original observations. Remember, Error(1) = Observation 2 − Observation 1 … this is the lagged error. Schweser does tend to confuse you a bit by not presenting the entire problem; read the textbook. Your autocorrelation is on the error terms, but the error terms themselves are derived from the original data series.
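To make that concrete, here is a rough sketch in Python (the AR(1) setup, the simulated data, and the use of statsmodels are just my illustration, not anything from Schweser): you fit the AR model to the original series, and the residuals it leaves behind are the error terms that the autocorrelation analysis is then run on.

```python
# Rough sketch only: fit an AR(1) model to a simulated series and pull out
# the residuals (error terms). The data and coefficients are invented.
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(0)

# Simulate a simple AR(1) series: x_t = 0.5 * x_{t-1} + noise
x = np.zeros(200)
for t in range(1, 200):
    x[t] = 0.5 * x[t - 1] + rng.normal()

# Fit AR(1); the residuals are the error terms derived from the original series
fit = AutoReg(x, lags=1).fit()
residuals = fit.resid  # e_t = x_t - (b0 + b1 * x_{t-1})
print(residuals[:5])
```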
That cleared things up, many thanks!
Lag = number of periods back. Lag 12 pairs an observation with the one 12 periods earlier, not with the immediately preceding observation (that would be lag 1). Autocorrelation is measured on the residuals. This is to check that the model has captured the whole time-series pattern; any pattern still left in the residuals tells you that the model has failed to adequately describe the time series.
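If it helps, here is a tiny illustration of what “lag 12” means mechanically (my own example, with a made-up residual series and plain numpy): the lag-12 autocorrelation is just the correlation between each residual and the residual from 12 periods earlier.

```python
# Illustration only: lag-k autocorrelation of a residual series with numpy.
# The residuals are simulated white noise, just to have numbers to work with.
import numpy as np

rng = np.random.default_rng(42)
resid = rng.normal(size=120)  # pretend these are 120 monthly residuals

def lag_autocorr(e, k):
    """Correlation between e_t and e_{t-k}."""
    return np.corrcoef(e[k:], e[:-k])[0, 1]

print("lag 1 :", lag_autocorr(resid, 1))   # each residual vs. the previous month
print("lag 12:", lag_autocorr(resid, 12))  # each residual vs. 12 months earlier
# For pure white noise both values should be near zero; a clearly nonzero
# autocorrelation would suggest the model left a pattern in the residuals.
```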
- Actually, you can still run a multiple-variable linear regression even if the variance is not constant; you can adjust for the variance by some factor (a scalar). Make sure you say linear, because in a non-linear setting this treatment of the variance may not apply.
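One way to picture that “adjust by some factor” idea is weighted least squares; the sketch below is just my own example (invented data, an assumed variance pattern proportional to x², and statsmodels), not something from the curriculum. Observations with noisier residuals get smaller weights, so the constant-variance assumption roughly holds on the weighted problem.

```python
# Sketch of the "adjust the variance by some factor" idea via weighted least
# squares (WLS). The data and the assumed variance pattern are invented.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200
x = rng.uniform(1, 10, n)
# Heteroskedastic errors: the noise standard deviation grows with x
y = 2.0 + 3.0 * x + rng.normal(scale=x)

X = sm.add_constant(x)
ols = sm.OLS(y, X).fit()
# If we believe Var(e_i) is proportional to x_i^2, weight each obs by 1/x_i^2
wls = sm.WLS(y, X, weights=1.0 / x**2).fit()

print("OLS coefficients:", ols.params)
print("WLS coefficients:", wls.params)
```

Both fits estimate roughly the same coefficients here; the point of the weighting is that the standard errors and inference are no longer distorted by the non-constant residual variance.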