T test and F test

I am still confused about when to use the t-test and the F-test. Can anyone shed some light on when I have to use the t-test and when the F-test? Thanks!

Let’s assume we are regressing this linear model: y = B0 + B1x1 + B2x2 + e. We run least squares and get an equation for the estimates of the parameters, the B-hats. Can’t really type it out, but it’s the same equation as above, except y and the B’s get a hat on them ("^"), and the error term disappears since its expected value is zero.

The t-test looks at the significance of one B (with a hat). That is, the null hypothesis is H0: B1 = 0. If we reject it, that means B1 is significantly different from zero. Therefore, we would keep that factor in our model; it’s helping to explain the quantity we want to estimate, y.

The F-test looks at the significance of ALL the B’s at once. The null hypothesis is H0: B1 = B2 = 0, i.e., ALL of them are equal to zero. If we reject it, the predictors jointly provide a significant (non-zero) explanation of the dependent variable y.

In short: the t-test is for the significance of one parameter, Beta; the F-test is for the entire set of B’s. Sorry if I overexplained, I tend to do that.
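To make that concrete, here is a minimal sketch in Python (numpy only, no stats library) of the model above. It simulates data where x1 truly matters (B1 = 3) and x2 does not (B2 = 0), fits least squares, then computes a t statistic per coefficient and the overall F statistic against the intercept-only model. The variable names and simulated numbers are my own illustration, not anything from the thread.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
# True model: B0 = 2, B1 = 3, B2 = 0 (so x2 is an irrelevant predictor)
y = 2 + 3 * x1 + 0 * x2 + rng.normal(size=n)

X = np.column_stack([np.ones(n), x1, x2])   # design matrix with intercept
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

resid = y - X @ beta_hat
p = X.shape[1]                               # number of estimated parameters (3)
dof = n - p
s2 = resid @ resid / dof                     # residual variance estimate
cov = s2 * np.linalg.inv(X.T @ X)            # covariance matrix of beta_hat
se = np.sqrt(np.diag(cov))
t_stats = beta_hat / se                      # one t statistic per B-hat (H0: Bj = 0)

# F statistic for H0: B1 = B2 = 0 (all slopes zero at once)
ss_res = resid @ resid
ss_tot = np.sum((y - y.mean()) ** 2)
F = ((ss_tot - ss_res) / (p - 1)) / (ss_res / dof)

print("t statistics:", t_stats)
print("F statistic:", F)
```

With this setup you should see a large |t| for x1, a small |t| for x2 (so you would drop x2 but keep x1), and a large F (the model as a whole explains y).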

T-test - for testing individual variables. F-test - for testing the model as a whole (whether it is significant or not).

That’s a shorter way of saying it! To be precise, though, the t-test tests parameter estimates, not variables. You use it to test equal means, M1 = M2, or B = 0 in regression. The F-test is for testing equal variances, or B1 = B2 = … = Bn = 0 in regression.
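For the non-regression uses mentioned here, a short sketch with scipy (my own example data, assuming scipy is available): `scipy.stats.ttest_ind` runs the two-sample t-test for equal means, and the classic equal-variance F-test is just the ratio of the two sample variances referred to an F distribution.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
a = rng.normal(loc=0.0, scale=1.0, size=50)   # sample 1
b = rng.normal(loc=0.5, scale=1.0, size=50)   # sample 2, shifted mean

# t-test for equal means (H0: M1 = M2)
t_stat, t_p = stats.ttest_ind(a, b)

# F-test for equal variances (H0: var1 = var2): ratio of sample variances
F = np.var(a, ddof=1) / np.var(b, ddof=1)
dfn, dfd = len(a) - 1, len(b) - 1
# two-sided p-value from the F distribution
f_p = 2 * min(stats.f.sf(F, dfn, dfd), stats.f.cdf(F, dfn, dfd))

print("t =", t_stat, "p =", t_p)
print("F =", F, "p =", f_p)
```

One caveat worth knowing: this variance-ratio F-test is quite sensitive to non-normality, which is why Levene's test is often preferred in practice.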

The equal means / equal variance bit was new :slight_smile: Thanks.

Thanks to all who answered my question, the answer was crystal clear!