Why must the absolute value of the lag coefficient be less than 1 in order to have a mean-reverting series?

Xt = b0 / (1 - b1)

If b1 = 1 => Xt = b0 / 0, which is undefined

If b1 = 2 => Xt = b0 / -1 = - b0
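That formula is the long-run (mean-reverting) level you get by setting Xt = Xt-1 = X in the AR(1) recursion Xt = b0 + b1*Xt-1 and solving for X. A quick Python sketch (coefficient values b0 = 2, b1 = 0.5 are just illustrative) simulates the series with noise and checks that its sample mean settles near b0 / (1 - b1):

```python
import random

def simulate_ar1(b0, b1, x0=0.0, n=5000, sigma=1.0, seed=42):
    """Simulate Xt = b0 + b1 * X(t-1) + noise and return the path."""
    random.seed(seed)
    x = x0
    path = []
    for _ in range(n):
        x = b0 + b1 * x + random.gauss(0.0, sigma)
        path.append(x)
    return path

b0, b1 = 2.0, 0.5
path = simulate_ar1(b0, b1)
# Drop the first 500 observations as burn-in, then average
sample_mean = sum(path[500:]) / len(path[500:])
# With |b1| < 1 the sample mean should sit near b0 / (1 - b1) = 4
print(sample_mean, b0 / (1 - b1))
```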

Because if b1 = 1, you have the unit root problem. A time series with a unit root is a random walk: shocks are permanent, its variance grows without bound, and there is no finite level for it to revert to, so it has no mean reversion.

Thanks a lot

I encourage you to open Excel and see what happens when |b1| > 1.

Let us know what you discover.
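If Excel isn't handy, the same experiment takes a few lines of Python: iterate Xt = b0 + b1*Xt-1 with no noise (the b0 = 2 and starting value are arbitrary choices for illustration) and compare |b1| < 1 against b1 > 1.

```python
def ar1_path(b0, b1, x0=1.0, n=30):
    """Iterate the deterministic AR(1) recursion Xt = b0 + b1 * X(t-1)."""
    xs = [x0]
    for _ in range(n):
        xs.append(b0 + b1 * xs[-1])
    return xs

# |b1| < 1: the path converges toward b0 / (1 - b1) = 4
converging = ar1_path(2.0, 0.5)
print(converging[-1])

# b1 = 2: the path explodes -- there is no finite mean to revert to
exploding = ar1_path(2.0, 2.0)
print(exploding[-1])
```

After 30 steps the first path is indistinguishable from 4, while the second has blown past a billion, which is exactly the |b1| > 1 behavior the exercise is meant to reveal.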