can someone explain the difference between the standard error of the sample mean and the standard deviation of the sample? thanks!

The sample mean is the arithmetic average. Std deviation = sqrt(variance), and variance = sum of (mean deviations)^2 / (number of observations - 1). I'd like to know what standard error is … cuz I haven't done this yet.
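To make those formulas concrete, here's a quick sketch in Python (the data values are made up for illustration), checked against the standard library's `statistics` module:

```python
import statistics

# hypothetical sample data
data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]

mean = sum(data) / len(data)                                 # arithmetic average
var = sum((x - mean) ** 2 for x in data) / (len(data) - 1)   # sample variance uses n - 1
sd = var ** 0.5                                              # std dev = sqrt(variance)

# these agree with the library's implementations
assert abs(mean - statistics.mean(data)) < 1e-12
assert abs(var - statistics.variance(data)) < 1e-12
assert abs(sd - statistics.stdev(data)) < 1e-12
```

Note the n - 1 divisor (the "sample" variance); dividing by n instead gives the population variance.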

pepp Wrote:
> sample mean is arithmetic average.
> std deviation = variance / number of observations - 1
> i'd like to know what standard error is … cuz i haven't done this yet.

Yikes. s.d. = sqrt(Var), and variance = sum of (mean deviations)^2.

anyone else?

SE (standard error of the sample mean) = sigma / sqrt(n)

ok, conceptually, what is the difference between the standard error of the sample mean and the sample standard deviation? to me they seem like quite the same thing. anyone?

Pepps, you use the standard error when you calculate a confidence interval: standard error = sigma / n^(1/2). I hope this helps.
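For example, a rough sketch of that confidence-interval use in Python (the sample mean, sigma, and n here are made-up numbers, and sigma is assumed known so the normal critical value applies):

```python
import math

# hypothetical values: sample mean 5.0, known population sigma 2.0, n = 25
xbar, sigma, n = 5.0, 2.0, 25

se = sigma / math.sqrt(n)          # standard error of the sample mean = 0.4
z = 1.96                           # approximate 95% two-sided normal critical value
ci = (xbar - z * se, xbar + z * se)

print(ci)                          # (4.216, 5.784)
```

If sigma were unknown you'd plug in the sample std dev and use a t critical value instead, but the role of the standard error is the same.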

standard deviation is for the population
standard error is for the sample
the sample mean is an estimate of the population mean
the standard error is an estimate of the standard deviation
hope that helps

^No. When you calculate a statistic like the sample mean, the data that go into the calculation are random, so the statistic must be random. If it's random, it has a distribution and thus a std. dev. (not exactly true, but close enough). The distribution of a statistic is called its sampling distribution. The standard error is the standard deviation of the sampling distribution. In the case of X-bar, this s.e. is sigma/sqrt(n), where sigma is the std. dev. of the observations.
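You can see this with a quick simulation (a sketch with made-up parameters): draw many samples of size n, compute each sample mean, and check that the std. dev. of those means comes out near sigma/sqrt(n).

```python
import math
import random
import statistics

random.seed(0)
sigma, n, trials = 2.0, 50, 20000

# draw many samples of size n and record each sample mean
means = [statistics.mean(random.gauss(0.0, sigma) for _ in range(n))
         for _ in range(trials)]

# empirical std. dev. of the sampling distribution of X-bar
empirical_se = statistics.stdev(means)
theoretical_se = sigma / math.sqrt(n)   # about 0.283

# the two agree closely
assert abs(empirical_se - theoretical_se) < 0.02
```

Each individual observation still has std. dev. sigma = 2.0; it's the sample *means* that cluster much more tightly, by the factor sqrt(n).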