Quantitative Question

The least accurate statement about measures of dispersion for a distribution is that the:

A. range provides no information about the shape of the data distribution.
B. mean absolute deviation will always be smaller than the standard deviation.
C. arithmetic average of the deviations around the mean will always be equal to one.
D. variance of a sample with 50 observations is computed by dividing the sum of the squared deviations from the mean by 49.

C is the least accurate statement: the arithmetic average of the deviations around the mean is always 0, not 1. The sum of the deviations is sum(x) - n·x̄, and since x̄ = sum(x)/n, that sum cancels to 0, so the average is 0 as well.
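Writing that cancellation out in full:

```latex
\sum_{i=1}^{n} (x_i - \bar{x})
  = \sum_{i=1}^{n} x_i - n\bar{x}
  = \sum_{i=1}^{n} x_i - n \cdot \frac{\sum_{i=1}^{n} x_i}{n}
  = 0,
\qquad \text{so} \quad \frac{1}{n}\sum_{i=1}^{n} (x_i - \bar{x}) = 0.
```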

I think it may be B. If you had a data set of decimal values less than one, squaring them makes the numbers smaller… so the standard deviation would end up smaller than the mean absolute deviation. However, as you can see from my other post, I am not exactly a statistical whiz…

and cp proves me wrong.

"Mean absolute deviation will always be smaller than the standard deviation." Can you explain more about MAD and standard deviation? I still struggle with the relationship.

"Arithmetic average of the deviations around the mean will always be equal to one." This should be zero. Imagine a normal distribution centered on a mean of 0: the observations fall on either side of the bell curve, so the positive and negative deviations cancel, creating an average deviation of 0.

MAD averages the absolute values of the deviations, while the standard deviation squares the deviations before averaging and taking the square root. Squaring gives extra weight to the larger deviations, so MAD ≤ St. Dev ALWAYS (they are equal only if every absolute deviation is identical)! The answer is C, simply because the arithmetic average of the deviations around the mean is always 0, never 1.
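A quick numeric check of all three claims (zero average deviation, MAD never exceeding the standard deviation, and the n − 1 divisor from option D). This is just an illustrative sketch on a made-up sample, not data from the question:

```python
import math

# Hypothetical sample of n = 10 observations, for illustration only
x = [2.0, 4.5, 7.1, 3.3, 9.8, 5.5, 6.2, 1.9, 8.4, 4.0]
n = len(x)
mean = sum(x) / n

# 1) Average of deviations around the mean: sum(x) - n*mean = 0, so this is always 0
avg_dev = sum(xi - mean for xi in x) / n

# 2) Mean absolute deviation: average of |deviation|
mad = sum(abs(xi - mean) for xi in x) / n

# 3) Sample variance divides the sum of squared deviations by n - 1
#    (option D: divide by 49 when n = 50)
sample_var = sum((xi - mean) ** 2 for xi in x) / (n - 1)
sample_sd = math.sqrt(sample_var)

print(f"average deviation = {avg_dev:.2e}")   # ~0, up to floating-point rounding
print(f"MAD               = {mad:.4f}")
print(f"sample std. dev.  = {sample_sd:.4f}")  # MAD never exceeds this
```

Because squaring weights large deviations more heavily than taking absolute values does, the printed MAD will always come out at or below the standard deviation, whatever sample you plug in.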