Pulse rate is an important measure of the fitness of a person's cardiovascular system. The mean pulse rate for all U.S. adult males is approximately 72 heart beats per minute. A random sample of 21 U.S. male adults who jog at least 15 miles per week had a mean pulse rate of 52.6 beats per minute and a standard deviation of 3.22 beats per minute.
What is the 95% confidence interval for the mean pulse rate of all U.S. adult males who jog at least 15 miles per week?
How do I interpret the interval found above? And does it appear that jogging at least 15 miles per week reduces the mean pulse rate for adult males?
What assumptions are required for the validity of the CI?
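As a sketch of the first calculation, assuming the standard one-sample t-interval (appropriate here because the population standard deviation is unknown and n = 21 is small), with df = n − 1 = 20 and the usual table value t₀.₀₂₅,₂₀ ≈ 2.086:

```python
import math

# Sample values given in the problem
n = 21        # joggers sampled
xbar = 52.6   # sample mean pulse rate (beats per minute)
s = 3.22      # sample standard deviation

# Critical value for a 95% CI with df = 20, taken from a t table
t_crit = 2.086

# Standard error of the mean, then the margin of error
se = s / math.sqrt(n)
margin = t_crit * se

lower = xbar - margin
upper = xbar + margin
print(f"95% CI: ({lower:.2f}, {upper:.2f})")  # roughly (51.13, 54.07)
```

Since the population mean of 72 beats per minute for all U.S. adult males lies well above this interval, the data are consistent with the idea that joggers have a lower mean pulse rate, under the assumption that the sample is random and pulse rates are approximately normally distributed.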