In Reading 11, section 5.2 (Unit Root Test), the 3rd paragraph says that for an AR(1) time series the absolute value of the lag coefficient, b1, must be less than 1, and that the time series would not be covariance stationary if b1 were greater than or equal to 1.

I understand that b1 cannot be equal to 1, because the mean-reverting level b0/(1 - b1) would be undefined: the denominator would be 0. However, I don't understand why b1 being greater than 1 would result in the time series not being covariance stationary. Could someone please explain this to me?

I know that the mean, variance and covariance of a time series in all periods must be finite and constant for a time series to be covariance stationary. I just don’t see how any of these are violated with b1 > 1.
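One way to see which condition fails (my own sketch, not from the reading): if the error term has constant variance sigma^2, the variance of an AR(1) obeys the recursion Var(x_t) = b1^2 * Var(x_{t-1}) + sigma^2. When |b1| < 1 this converges to sigma^2 / (1 - b1^2), but when |b1| > 1 it grows without bound, so the "finite, constant variance" requirement is violated. A quick Python check (the function name `variance_path` is just something I made up):

```python
# Sketch (mine, not from the reading): iterate the AR(1) variance recursion
# Var(x_t) = b1^2 * Var(x_{t-1}) + sigma^2, starting from Var(x_0) = 0.
def variance_path(b1, sigma2=1.0, periods=20, var0=0.0):
    """Return the sequence Var(x_1), ..., Var(x_periods)."""
    var, path = var0, []
    for _ in range(periods):
        var = b1 ** 2 * var + sigma2
        path.append(var)
    return path

# |b1| < 1: variance settles near sigma^2 / (1 - b1^2) = 1 / 0.75 ≈ 1.33
stable = variance_path(0.5)
# |b1| > 1: variance keeps growing, period after period
explosive = variance_path(1.1)
```

So even before looking at the mean, the variance alone already rules out covariance stationarity when b1 > 1.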

Another question from the same section 5.2: in the Dickey-Fuller test, the null hypothesis is H0: g1 = 0 and the alternative is Ha: g1 < 0, where g1 = b1 - 1. Why is Ha not "g1 not equal to 0"? They are omitting the possibility of b1 being greater than 1. Also, in all the hypothesis-testing examples I have seen, the null and alternative hypotheses together cover all scenarios, but this one omits the scenario g1 > 0.
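To make the g1 = b1 - 1 transformation concrete, here's a rough sketch (my own code, simplified: no intercept term, and plain OLS rather than the actual Dickey-Fuller critical values; `estimate_g1` is a name I invented). It regresses the first difference on the lagged level for a simulated random walk (b1 = 1, so true g1 = 0) and a stationary AR(1) with b1 = 0.5 (true g1 = -0.5):

```python
import numpy as np

# Sketch of the Dickey-Fuller regression form: x_t - x_{t-1} = g1 * x_{t-1} + eps_t,
# where g1 = b1 - 1. Estimated here with plain OLS through the origin.
def estimate_g1(x):
    x = np.asarray(x, dtype=float)
    dx = np.diff(x)   # Δx_t = x_t - x_{t-1}
    lag = x[:-1]      # x_{t-1}
    return float(lag @ dx / (lag @ lag))  # OLS slope, no intercept

rng = np.random.default_rng(42)
n = 1000
eps = rng.normal(size=n)

random_walk = np.cumsum(eps)   # b1 = 1, so the true g1 is 0
ar_half = np.zeros(n)          # b1 = 0.5, so the true g1 is -0.5
for t in range(1, n):
    ar_half[t] = 0.5 * ar_half[t - 1] + eps[t]
```

The estimated g1 comes out near -0.5 for the stationary series and near 0 for the random walk, which is why the test is framed as "is g1 significantly below 0?".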

I did what you suggested, and I can clearly see that there is no mean-reverting level: in the case of b1 > 1, the x's just keep increasing. So the mean is neither finite nor constant, and the series is therefore not covariance stationary.
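For anyone reading this later, a minimal version of that experiment might look like this (my own sketch; the b0, b1, and seed values are arbitrary choices, not from the reading):

```python
import numpy as np

# Simulate x_t = b0 + b1 * x_{t-1} + eps_t with standard normal shocks.
rng = np.random.default_rng(0)

def simulate_ar1(b0, b1, n=50, x0=0.0):
    x = [x0]
    for _ in range(n):
        x.append(b0 + b1 * x[-1] + rng.normal())
    return x

explosive = simulate_ar1(b0=1.0, b1=1.05)  # b1 > 1: no finite mean-reverting level
reverting = simulate_ar1(b0=1.0, b1=0.5)   # b1 < 1: reverts toward b0/(1-b1) = 2
```

Plotting the two series makes the difference obvious: the b1 = 0.5 path hovers around 2, while the b1 = 1.05 path runs away.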