When there is no difference between the first and second prices in a time series, u_0 is zero. Since v_1 = u_0*u_0, v_1 is zero as well, which makes the first term of the likelihood function, -log(v_1) - u_1*u_1/v_1, undefined (log of zero). How do people solve this problem? Do they just start the time series at a point where the first two prices differ, or is there another technique to circumvent this? Thanks a lot!

I think I get it now. v_1 shouldn't appear in the likelihood function because it doesn't contain any of the parameters to be estimated. Is that right?

Is this an L2 question? I hope not.

What exactly are you trying to do here (and what are your v_1 and u_0)?

ymc, you can read more about GARCH here: http://www.wilmott.com/messageview.cfm?catid=19&threadid=4392

Hi Joey,

Suppose the prices are S_0, S_1, …, S_n. Then

u_i = (S_i - S_(i-1)) / S_(i-1)
v_1 = u_0*u_0
v_i = omega + alpha*u_(i-1)*u_(i-1) + beta*v_(i-1), for i >= 2

The log-likelihood function is

sum_{i=1}^{n} [ -log(v_i) - u_i*u_i/v_i ]

The problem is to find omega, alpha and beta that maximize this log-likelihood.
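For concreteness, here is a minimal sketch of that recursion and log-likelihood in Python/NumPy. The function name and the example parameter values are my own; the math is exactly the recursion above, with v_1 seeded as u_0*u_0.

```python
import numpy as np

def garch_loglik(params, u):
    """Gaussian log-likelihood for GARCH(1,1) as written above:
    v_1 = u_0^2, v_i = omega + alpha*u_{i-1}^2 + beta*v_{i-1},
    loglik = sum_{i=1}^{n} [ -log(v_i) - u_i^2 / v_i ].
    `u` holds the returns u_0, u_1, ..., u_n."""
    omega, alpha, beta = params
    n = len(u) - 1
    v = np.empty(n + 1)          # v[0] is unused; indices match the post
    v[1] = u[0] ** 2             # seed: v_1 = u_0^2
    for i in range(2, n + 1):
        v[i] = omega + alpha * u[i - 1] ** 2 + beta * v[i - 1]
    return np.sum(-np.log(v[1:]) - u[1:] ** 2 / v[1:])
```

In practice you would hand the negative of this function to a numerical optimizer (e.g. scipy.optimize.minimize with positivity constraints on omega, alpha, beta) to get the quasi-MLE. Note the sketch still blows up when u_0 = 0, which is exactly the problem raised in this thread.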

So then following everyone else you do the Gaussian quasi-MLE and what’s the issue?

The issue is that when u_0 = 0, v_1 = 0 as well, so the i = 1 term, -log(0) - u_1*u_1/0, is not defined. But if I understand correctly, the sum should start from i = 2 rather than from i = 1 as my book suggests.
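That fix can be sketched as follows: start the sum at i = 2, so v_1 = u_0^2 = 0 is only used to seed the recursion and never passed to log(). As long as omega > 0, every v_i for i >= 2 is strictly positive. Again, the function name is mine; the likelihood is the one defined earlier in the thread.

```python
import numpy as np

def garch_loglik_from_2(params, u):
    """Same GARCH(1,1) log-likelihood, but summing from i = 2 so that
    a zero first return (u_0 = 0, hence v_1 = 0) never hits log(0)."""
    omega, alpha, beta = params
    n = len(u) - 1
    v = np.empty(n + 1)
    v[1] = u[0] ** 2             # may be zero; only seeds the recursion
    for i in range(2, n + 1):
        v[i] = omega + alpha * u[i - 1] ** 2 + beta * v[i - 1]
    return np.sum(-np.log(v[2:]) - u[2:] ** 2 / v[2:])
```

With u_0 = 0 this now returns a finite value, because the v_1 term (which contains no parameters anyway) is simply dropped from the objective.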

I’ll help you out with this tomorrow. Too much wine for dinner…