1. A large discount chain compares the performance of its credit managers in Ohio and Illinois by comparing the mean dollar amounts owed by customers with delinquent charge accounts in these two states. Two random samples of 10 delinquent accounts each are selected from all delinquent accounts in Ohio and Illinois. The sample from Ohio gives a mean dollar amount of $524 with a standard deviation of $68. The sample from Illinois yields a mean dollar amount of $473 with a standard deviation of $22. (i) Calculate a 95% confidence interval for the difference between the mean dollar amounts owed in Ohio and Illinois; (ii) Can we conclude at a 5% level of significance that there is a difference between the mean dollar amounts owed by customers with delinquent charge accounts in Ohio and Illinois?
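The calculation for both parts can be sketched from the summary statistics alone. The sketch below uses the pooled-variance two-sample t procedure, which is an assumption (the common textbook convention for this problem); with sample standard deviations as different as $68 and $22, Welch's unequal-variance test could also be defended.

```python
import math
from scipy import stats

# Summary statistics from the problem
n1, xbar1, s1 = 10, 524, 68   # Ohio
n2, xbar2, s2 = 10, 473, 22   # Illinois

# (i) 95% confidence interval for mu1 - mu2, pooled-variance approach
# (assumes equal population variances)
sp2 = ((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)  # pooled variance
se = math.sqrt(sp2 * (1 / n1 + 1 / n2))                       # std. error of the difference
t_crit = stats.t.ppf(0.975, df=n1 + n2 - 2)                   # two-sided 95% critical value
diff = xbar1 - xbar2
ci = (diff - t_crit * se, diff + t_crit * se)
print(f"95% CI for mu1 - mu2: ({ci[0]:.2f}, {ci[1]:.2f})")

# (ii) Two-sided test of H0: mu1 = mu2 at alpha = 0.05,
# directly from the summary statistics
t_stat, p_value = stats.ttest_ind_from_stats(xbar1, s1, n1, xbar2, s2, n2,
                                             equal_var=True)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
print("Reject H0" if p_value < 0.05 else "Fail to reject H0")
```

Because the 95% interval excludes zero exactly when the 5% two-sided test rejects, parts (i) and (ii) must agree.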
2. Professor Brunner had students in his Marketing class rate his performance as Excellent, Good, Fair, or Poor. A graduate student collected the ratings and assured the students that Professor Brunner would not receive them until after the course grades had been submitted. The rating (i.e., the treatment) a student gave the professor was matched with his/her course grade, which could range from 0 to 100. The data are as follows:
Excellent   Good   Fair   Poor
   94        75     70     68
   90        68     73     70
   85        77     76     72
   80        83     78     65
             88     80     74
Is there a difference in the mean scores of the students in each of the four rating categories at the 1% level of significance (99% confidence level)?
A step-by-step solution using the five-step hypothesis-testing approach is discussed here for each problem. An Excel template for each problem is also included; the templates can be used to obtain answers to similar problems.