
# Confidence Intervals and Normality Assumption


Can you help me with two word problems on confidence intervals and formulas?

*A sample of 20 pages was taken without replacement from the 1,591-page phone directory Ameritech Pages Plus Yellow Pages. On each page, the mean area devoted to display ads was measured (a display ad is a large block of multicolored maps, text, etc.). The data (in square millimeters) are shown below:

0 260 356 403 536 0 268 369 428 536
268 396 469 536 162 338 403 536 536 130

(a) Construct a 95 percent confidence interval for the true mean.

(b) Why might normality be an issue?

(c) What sample size would be needed to obtain an error of ±10 square millimeters with 99 percent confidence?

(d) If this is not a reasonable requirement, what would be reasonable?
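As a sketch of how parts (a) and (c) could be worked in Python (standard library only; the critical values t₀.₀₂₅,₁₉ ≈ 2.093 and z₀.₀₀₅ ≈ 2.576 come from standard t and z tables, and the sample standard deviation is used as a planning estimate of σ in part (c)):

```python
import math
import statistics

# Display-ad areas (square millimeters) from the 20 sampled pages
areas = [0, 260, 356, 403, 536, 0, 268, 369, 428, 536,
         268, 396, 469, 536, 162, 338, 403, 536, 536, 130]

n = len(areas)
xbar = statistics.mean(areas)   # sample mean
s = statistics.stdev(areas)     # sample standard deviation

# (a) 95% t-interval for the mean: xbar +/- t * s / sqrt(n),
#     with t_{0.025, 19} ~= 2.093 from a t table
t_crit = 2.093
margin = t_crit * s / math.sqrt(n)
print(f"95% CI for mean: {xbar - margin:.1f} to {xbar + margin:.1f}")

# (c) sample size for a +/-10 mm^2 error at 99% confidence:
#     n = (z * sigma / E)^2, using z_{0.005} ~= 2.576 and s as a
#     planning estimate of sigma, rounded up
z = 2.576
E = 10
n_needed = math.ceil((z * s / E) ** 2)
print(f"Required n for E = +/-10 at 99%: {n_needed}")
```

Because the required n in (c) far exceeds the 1,591 pages in the directory, part (d) asks what error bound would be achievable instead.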

*Biting an unpopped kernel of popcorn hurts, so a connoisseur of popcorn counted 773 kernels and put them in a popper. After popping, the unpopped kernels were counted; there were 86.

(a) Construct a 90 percent confidence interval for the proportion of all kernels that would not pop.

(b) Check the normality assumption.

(c) Does the Very Quick Rule work well in this scenario? Why or why not?

(d) Why might this sample not be typical?
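Parts (a) through (c) can be sketched similarly (standard library only; z₀.₀₅ ≈ 1.645 is from a z table, and the Very Quick Rule is taken here to be p̂ ± 1/√n, the conservative shortcut from the Doane/Seward text this problem appears to come from):

```python
import math

n = 773        # kernels placed in the popper
x = 86         # kernels that failed to pop
p_hat = x / n  # sample proportion of unpopped kernels

# (a) 90% z-interval for the proportion: p_hat +/- z * sqrt(p_hat(1-p_hat)/n),
#     with z_{0.05} ~= 1.645
z = 1.645
se = math.sqrt(p_hat * (1 - p_hat) / n)
lo, hi = p_hat - z * se, p_hat + z * se
print(f"90% CI for proportion: {lo:.4f} to {hi:.4f}")

# (b) normality check: both n*p_hat and n*(1-p_hat) should be at least 10
print("n*p_hat =", n * p_hat, " n*(1-p_hat) =", n * (1 - p_hat))

# (c) Very Quick Rule: p_hat +/- 1/sqrt(n), which targets roughly 95%
#     coverage, so compare its width to the interval in (a)
vq = 1 / math.sqrt(n)
print(f"Very Quick Rule interval: {p_hat - vq:.4f} to {p_hat + vq:.4f}")
```

Since n·p̂ = 86 and n·(1 − p̂) = 687 are both well above 10, the normal approximation in (a) is reasonable here.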