Ask NSDL Archive

http://ask.nsdl.org | nsdl@nsdl.org

Question

1. We would like to estimate the average amount of time that an undergraduate business student studies outside of class each week for the business statistics course. A random sample of 100 students is selected. The sample mean is 2.2 hours and s2 = 2.25 hours. How large a sample would be needed if we wanted to estimate the true mean with a maximum error of estimation of 15 minutes with 95% confidence?

Answer

Hello again, Farrah. Your notation is a little confusing, so I'll give you the answer in a couple of different ways. You say that the sample mean is 2.2 hours, with s2 = 2.25 hours. That could be s^2 (the sample variance), but then the units would have to be hours^2, not hours. (The ^ indicates exponentiation, also written **.)

You want the radius of the confidence interval, the maximum error of estimation, to be 15 minutes, or 0.25 hours. A 95% confidence interval has a radius of 1.96 standard errors, so you want the standard error of the mean to be 0.25/1.96, or about 0.128 hours. The standard error of the mean equals the standard deviation divided by the square root of n, so the required sample size is n = (1.96*s/0.25)^2, rounded up to the next whole number.

If you mean that s^2 = 2.25 hours^2, then s = 1.5 hours, and you would need a sample size of (1.5*1.96/0.25)^2, which is about 138.3, so 139. If you mean that s = 2.25 hours, then you would need (2.25*1.96/0.25)^2, which is about 311.2, so 312.

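If it helps to check the arithmetic, here is a minimal Python sketch of the same calculation. The function name required_n and its parameter names are my own choices for illustration, not part of the original answer; the value z = 1.96 for 95% confidence comes from the answer above.

    import math

    def required_n(s, max_error, z=1.96):
        """Sample size so the confidence-interval radius is at most max_error.

        Uses n = (z * s / max_error)^2, rounded up to the next whole observation.
        """
        return math.ceil((z * s / max_error) ** 2)

    # Interpretation 1: s^2 = 2.25 hours^2, so s = 1.5 hours
    print(required_n(s=1.5, max_error=0.25))   # 139

    # Interpretation 2: s = 2.25 hours
    print(required_n(s=2.25, max_error=0.25))  # 312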
