Proof of the intercept

What is the y-intercept term, b0? A) 47.6712. B) 34.7400. C) 92.2840. D) 512.3600. Your answer: C was incorrect. The correct answer was A) 47.6712. The mean of the aggregate revenue (Y) is 3,645/10 = 364.50 and of the advertising expenditure (X) is 91.2/10 = 9.12. The y-intercept, b0 = MeanY – Slope * MeanX = 364.50 – 34.74 * 9.12 = 47.6712. Anybody feel like explaining the intercept formula to me?
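For anyone who wants to double-check the arithmetic, here's a quick sketch. The totals (3,645 and 91.2 over n = 10) and the slope (34.74) are taken straight from the question above:

```python
# Verify the intercept calculation from the question above.
sum_y = 3645.0   # total aggregate revenue (given)
sum_x = 91.2     # total advertising expenditure (given)
n = 10
b1 = 34.74       # slope, given in the question

mean_y = sum_y / n   # 364.50
mean_x = sum_x / n   # 9.12

# b0 = MeanY - Slope * MeanX
b0 = mean_y - b1 * mean_x
print(round(b0, 4))  # 47.6712, answer A
```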

MeanY = b0 + Slope(MeanX) (Equation for linear regression)

Rearranging, we get b0 = MeanY - Slope(MeanX)

Edit: Take a look at p. 233 Book 1 CFAI text

Oh no. I can’t be this dumb! What the hell is happening to me? I can’t believe that I could study all of this regression stuff and get most of the problems right and not see that. I am toast on this exam. The conceptual stuff is going to kill me. Thanks for the proof ruhi. You seem to be well along in your studies. How are you feeling about it?

ruhi22 Wrote:
-------------------------------------------------------
> MeanY = b0 + Slope(MeanX) (Equation for linear regression)
>
> Rearranging, we get b0 = MeanY - Slope(MeanX)
>
> Edit: Take a look at p. 233 Book 1 CFAI text

I don’t even need to look now. It is so clear…makes perfect sense. This is the essence of regression. Everything else just has to do with the residual errors pretty much.

mwvt9, happens to everyone. Take a break from your favorite subject. I have been asking crap loads of dumb questions in Ethics too. I’m not feeling very confident. I had revised everything in FSA just a week back, and I’ve already forgotten a lot of it. :-\ I don’t know what to do or how to go about this curriculum. There is just too much to be done.

I don’t think you have anything to worry about based on what I have seen. You were tops in the class for level I and I’m sure you will be again at level II. I bet your last round of revision right before the exam will make you feel better. I know it did for me. I felt like I couldn’t remember anything a couple weeks before the exam, but after that last revision (which didn’t take long) it all started to come into focus. Anyway, they have to let some people pass right? I think I am going to call it a night.

Thank you for the kind words. Hopefully, it will get better (or worse). We’ll see! Staying at AF and discussing questions definitely makes me feel more motivated. I get too bored studying alone. This way, I know that there are lots of other fellow sufferers who are burning the midnight oil with me.

ruhi22 Wrote:
-------------------------------------------------------
> MeanY = b0 + Slope(MeanX) (Equation for linear regression)
>
> Rearranging, we get b0 = MeanY - Slope(MeanX)
>
> Edit: Take a look at p. 233 Book 1 CFAI text

Better take a look at that again. The linear regression equation says E(Y|X) = b0 + b1*X and is valid for some infinite number of (X,Y) pairs. The cool thing is that least squares guarantees that the regression line goes through the point (X-bar, Y-bar). That shouldn’t be obvious to you (or at least it wasn’t to me) but is not hard to prove. Then you get that ruhi’s eqn is true at that point.

Yeah. I got that. By definition, the line that minimizes the squared errors has to go through X-bar and Y-bar. I was picturing the graph in my head that showed the line. My mind goes to this picture every time I am trying to remember what SST, SSE and RSS are.

JoeyDVivre Wrote:
-------------------------------------------------------
> ruhi22 Wrote:
> -------------------------------------------------------
> > MeanY = b0 + Slope(MeanX) (Equation for linear regression)
> >
> > Rearranging, we get b0 = MeanY - Slope(MeanX)
> >
> > Edit: Take a look at p. 233 Book 1 CFAI text
>
> Better take a look at that again. The linear regression equation says E(Y|X) = b0 + b1*X and is valid for some infinite number of (X,Y) pairs. The cool thing is that least squares guarantees that the regression line goes through the point (X-bar, Y-bar). That shouldn’t be obvious to you (or at least it wasn’t to me) but is not hard to prove. Then you get that ruhi’s eqn is true at that point.

Joey, I like what you said. Here is something easy too:

Y_i = b0 + b1*X_i + epsilon_i

Average both sides:

Mean(Y) = b0 + b1*Mean(X) + Mean(epsilon)

Since Mean(epsilon) = 0 (due to the least-squares method):

Mean(Y) = b0 + b1*Mean(X)
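Both facts from this exchange are easy to check numerically. Here's a sketch with made-up (x, y) pairs (hypothetical data, not from the exam question) that fits an OLS line and confirms that the residuals average to zero and that the fitted line passes through (X-bar, Y-bar):

```python
# Hypothetical data, just for illustration.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 8.1, 9.8]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Ordinary least-squares slope and intercept.
b1 = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
      / sum((x - mean_x) ** 2 for x in xs))
b0 = mean_y - b1 * mean_x

residuals = [y - (b0 + b1 * x) for x, y in zip(xs, ys)]

print(abs(sum(residuals) / n) < 1e-9)            # True: Mean(epsilon) = 0
print(abs((b0 + b1 * mean_x) - mean_y) < 1e-9)   # True: line passes through the means
```

Note the second check is true by construction once you write b0 = mean_y - b1 * mean_x, which is exactly ruhi's rearrangement.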

Dude, how did you fail this exam? Were you drunk?

to clarify – he passed L1 in Dec, and will smash L2 in June… CP

Oh sorry. I am confusing him with someone else. My bad maratikus. Maybe I am drunk…no…I would feel better.

I am so glad I was wrong about maratikus. If I was right I would have no chance of passing and that would be a bit depressing.

maratikus Wrote:
-------------------------------------------------------
> JoeyDVivre Wrote:
> -------------------------------------------------------
> > ruhi22 Wrote:
> > -------------------------------------------------------
> > > MeanY = b0 + Slope(MeanX) (Equation for linear regression)
> > >
> > > Rearranging, we get b0 = MeanY - Slope(MeanX)
> > >
> > > Edit: Take a look at p. 233 Book 1 CFAI text
> >
> > Better take a look at that again. The linear regression equation says E(Y|X) = b0 + b1*X and is valid for some infinite number of (X,Y) pairs. The cool thing is that least squares guarantees that the regression line goes through the point (X-bar, Y-bar). That shouldn’t be obvious to you (or at least it wasn’t to me) but is not hard to prove. Then you get that ruhi’s eqn is true at that point.
>
> Joey, I like what you said. Here is something easy too:
>
> Y_i = b0 + b1*X_i + epsilon_i
>
> Average both sides:
>
> Mean(Y) = b0 + b1*Mean(X) + Mean(epsilon)
>
> Since Mean(epsilon) = 0 (due to the least-squares method):
>
> Mean(Y) = b0 + b1*Mean(X)

That’s a good way of looking at it. Easy to remember too.

no worries, mwvt9. I enjoy learning from you, cpk123 and others. Because of our teamwork we have a much better chance of being well-prepared and passing the exam.

cpk123 Wrote:
-------------------------------------------------------
> to clarify – he passed L1 in Dec, and will smash L2 in June…
>
> CP

yeah, I think the smart money is on maratikus for June.