Construct a 90 percent confidence interval for the starting salaries of 100 recently hired employees with an average starting salary of $50,000 and a standard deviation of $3,000, assuming the population has a normal distribution.

A) 50000 +/- 1.65(3000)
B) 50000 +/- 1.65(300)
C) 30000 +/- 1.65(5000)
D) 50000 +/- 1.65(30000)

The correct answer was A. The 90% confidence interval is X ± 1.65s = 50000 ± 1.65(3000) = $45,050 to $54,950.

------------------------

I'm confused because my original answer was B. I divided the standard deviation by the square root of 100 (i.e. 3000/10 = 300). Any comments on this question?
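A quick sketch of the arithmetic behind answer A (using the rounded 90% two-tailed critical value z = 1.65 from the question; variable names are mine):

```python
# 90% interval for an individual starting salary: X ± z*s
x_bar = 50_000   # average starting salary
s = 3_000        # standard deviation
z = 1.65         # rounded 90% two-tailed critical value

lower = x_bar - z * s
upper = x_bar + z * s
print(f"${lower:,.0f} to ${upper:,.0f}")  # $45,050 to $54,950
```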

Assumption: you can treat the standard deviation as the relevant spread here because the population size = the sample size; the 100 employees are the whole group, so the interval is for individual salaries, not for a sample mean.

Thanks for your answer. It's awesome, but why can we make that assumption? How do you know the population size equals the sample size? Thanks.

One unofficial trick I use is to look for the word 'sample'. A sampling distribution arises when you repeatedly take a smaller subset from a large population. If you read the question, you can pretty much tell that's not what it's implying: there are 100 employees, and you have the data for all 100 of them. If the question had said that a sample of 100 employees was taken out of 1,000 employees, then I think your calculation would be the right one.
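The distinction above can be sketched side by side: the interval the question wants (built on s) versus the interval for a sample mean (built on the standard error s/√n, which is what answer B computed). Variable names are mine:

```python
import math

x_bar, s, n, z = 50_000, 3_000, 100, 1.65

# Interval for an individual salary (what the question asks): use s directly.
individual = (x_bar - z * s, x_bar + z * s)

# CI for the *mean* of a sample of 100 drawn from a larger population:
# use the standard error s / sqrt(n) instead.
se = s / math.sqrt(n)                      # 3000 / 10 = 300
mean_ci = (x_bar - z * se, x_bar + z * se)

print(individual)  # roughly (45050, 54950) -> answer A
print(mean_ci)     # roughly (49505, 50495) -> answer B's calculation
```

The mean's interval is much narrower because averaging 100 salaries washes out individual variation; that narrowing only applies when you are estimating the mean from a sample.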

Thanks guys. Finished Quant. Haha.