Having a bit of a mental lapse… could someone explain why the following is considered a nonlinear relationship? y = (3x - 6)^2. If x = 2, then y = 0; if x = 3, then y = 9. Conversely, if x = -2, then y = 0; if x = -3, then y = 225. Either way, the correlation coefficient comes out at approximately one, suggesting a linear relationship. It’s not clicking for me today. TIA.
correction…if x=-2, then y=144
Suggest you get out Excel and graph it (it will look like a parabola opening up). It won’t look like a line. Further, what’s the correlation of the points (2, 0), (-1, 81), (5, 81)?
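To see the contrast numerically, here’s a quick numpy sketch (my own code, not from the thread) computing the correlation for the four points in the original post and for the three symmetric points suggested above. Same curve, wildly different correlations — which is exactly why a high |r| on a handful of points doesn’t make the relationship linear.

```python
import numpy as np

# Points from the original post: y = (3x - 6)^2 at x = 2, 3, -2, -3
x1 = np.array([2.0, 3.0, -2.0, -3.0])
y1 = (3 * x1 - 6) ** 2          # [0, 9, 144, 225]
r1 = np.corrcoef(x1, y1)[0, 1]  # |r1| comes out near 0.97, "approximately one"

# Points suggested above, symmetric about the parabola's vertex at x = 2
x2 = np.array([2.0, -1.0, 5.0])
y2 = (3 * x2 - 6) ** 2          # [0, 81, 81]
r2 = np.corrcoef(x2, y2)[0, 1]  # symmetry about the vertex forces r2 = 0
```

Correlation only measures how well a straight line fits the sampled points; sample on one side of the vertex and |r| looks high, straddle it symmetrically and r collapses to zero.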
Thx Joey. I did battle with the CFAI text and that recharged the brain cells. As usual, the text supports your conclusion.
You can still do a linear regression on that; the regression only has to be linear in the coefficients. The relationship between x and y is still nonlinear.
Say what? You can do polynomial regression, if that’s what you mean…
From the text: “Even if the dependent variable is nonlinear, linear regression can be used as long as the regression is linear in the parameters. So, for example, linear regression can be used to estimate the equation Y = b0 + b1X^2 + E” (p. 234). It only makes sense that this is the case, given that in many situations we must take a linear regression of log transformations in order to get standard distributions.
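To make the quoted point concrete, here’s a minimal numpy sketch (the data and parameter values are invented for illustration) of fitting Y = b0 + b1*X^2 + E by ordinary least squares: the model is nonlinear in X but linear in the parameters b0 and b1, so plain OLS works with X^2 as the regressor.

```python
import numpy as np

rng = np.random.default_rng(42)
b0_true, b1_true = 1.5, 0.8

# Simulate y = b0 + b1 * x^2 + noise: nonlinear in x, linear in (b0, b1)
x = rng.uniform(-5, 5, 500)
y = b0_true + b1_true * x**2 + rng.normal(0, 0.5, 500)

# Design matrix has a constant column and an X^2 column; since the model is
# linear in the parameters, ordinary least squares estimates them directly.
X = np.column_stack([np.ones_like(x), x**2])
b0_hat, b1_hat = np.linalg.lstsq(X, y, rcond=None)[0]
```

The same trick covers any regressor transformation — X^2, ln X, 1/X — as long as the parameters themselves enter linearly.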
WestCoastCFA Wrote:
-------------------------------------------------------
> From the text:
>
> “Even if the dependent variable is nonlinear,
> linear regression can be used as long as the
> regression is linear in the parameters. So, for
> example, linear regression can be used to estimate
> the equation Y=b0+b1X^2+E”

Lol. So if you don’t know how to do polynomial regression, just bag that linear term and do linear regression. Of course, when someone asks you what happened to the linear term, you’re going to have to think fast.

> Pg 234
>
> It only makes sense that this is the case given
> that in many situations we must take a linear
> regression of log transformations in order to get
> standard distributions.

Not in as many situations as you might think, and you shouldn’t ever do it without checking that it makes the error terms right. Suppose you believe Y = a*Exp(b*X + error), which means your error terms are multiplicative (e.g., lognormal). Then you can take logs and get ln Y = ln a + b*X + error, and everything is fine. But what if your errors are due to measurement or something, so you have Y = a*Exp(b*X + error1) + error2? Now taking logs messes up everything.
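A simulation sketch of that point (my own notation and parameter values, just to illustrate): with multiplicative lognormal errors, regressing ln Y on X recovers b cleanly; with additive measurement error, the log-transformed residuals are no longer homoskedastic — their spread shrinks as Y grows, violating the OLS assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)
a, b = 2.0, 0.3
x = np.linspace(1.0, 10.0, 400)

def ols(x, z):
    """OLS of z on [1, x]; returns (intercept, slope) and residuals."""
    X = np.column_stack([np.ones_like(x), x])
    coef = np.linalg.lstsq(X, z, rcond=None)[0]
    return coef, z - X @ coef

# Case 1: Y = a * Exp(b*X + error) -- multiplicative (lognormal) errors.
# Taking logs gives ln Y = ln a + b*X + error, so OLS assumptions hold.
y_mult = a * np.exp(b * x + rng.normal(0, 0.1, x.size))
(lna_hat, b_hat), _ = ols(x, np.log(y_mult))

# Case 2: Y = a * Exp(b*X) + error -- additive measurement error.
# Logs no longer separate the error term; the residual spread now
# depends on the level of Y (roughly sd/Y), i.e. heteroskedasticity.
y_add = a * np.exp(b * x) + rng.normal(0, 0.5, x.size)
_, resid = ols(x, np.log(y_add))
low_x_spread = resid[:200].std()   # residual spread where Y is small
high_x_spread = resid[200:].std()  # residual spread where Y is large
```

In case 1 the fitted slope lands on b; in case 2 the residual spread at low x is several times the spread at high x, which is the sort of thing you should check before trusting a log-transformed fit.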
I don’t see what you’re on about. I said nothing about the uses of said transformations, just that you don’t have to throw regression out the window because of a nonlinear relationship.