
Waiting time in a queue

Use the following data for the next question. Gerry Putz is a graduate assistant at Hoosier State College. As part of his duties, he holds regular office hours each week. The class he helps teach is very large (a requirement for all incoming freshmen), so demand for his services during office hours is fairly heavy and consistent. For a project in his advanced service management class, he had a classmate observe the arrivals of students over a two-week period, and found the time between student arrivals to be exponentially distributed, with a mean of 10 minutes. He also found that it took him 8 minutes on average to help a student, and these times were also exponentially distributed. He feels he is overworked and wants to apply for a grant to buy a computer, which he expects will decrease the average service time. Before he applies, he wants to answer a few questions.
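As a baseline for comparison (not part of the posted solution), the original one-server setup is an M/M/1 queue with arrival rate λ = 60/10 = 6 per hour and service rate μ = 60/8 = 7.5 per hour. A minimal sketch of the standard M/M/1 waiting-time formula, Wq = λ / (μ(μ − λ)):

```python
def mm1_wq(lam, mu):
    """Average time in queue Wq = lam / (mu * (mu - lam)) for an M/M/1 queue.

    lam: arrival rate, mu: service rate (same time units). Requires lam < mu.
    """
    if lam >= mu:
        raise ValueError("queue is unstable: lam must be < mu")
    return lam / (mu * (mu - lam))

# Gerry alone: lam = 60/10 = 6 per hour, mu = 60/8 = 7.5 per hour
print(round(mm1_wq(6, 7.5) * 60, 1))  # average wait in minutes -> 32.0
```

So under the original single-server scenario, students wait about 32 minutes on average, which is why the two-server question below is of interest.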

1. The college has heard Gerry's cries for help, but rather than buy him a computer, it has provided a second graduate assistant to help students. The new assistant works alongside (but independent of) Gerry and is just as well qualified as Gerry; however, now that there are two of them, they can take more time while helping students. So while the arrival rate of students has not changed from the prior problem, the average time each G.A. takes to help a student has increased to 12 minutes. Students still wait in a single line outside the office and simply take the first available G.A. Given this new scenario, how long will students have to wait in line, on average?


Solution Preview

See attached file for complete answer.

Time between student arrivals = 10 minutes
λ = arrival rate per hour = 60/10 = 6

Service time = 12 minutes
μ = service rate per server per hour = 60/12 = 5

ρ = λ/μ = 6/5 = 1.2 (offered load; per-server utilization = ρ/c = 0.6)

c = number of servers = number of G.A.s = 2
Lq = expected number of students ...
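The truncated steps above can be carried through with the standard M/M/c formulas: first P0 (the probability the system is empty), then Lq via the Erlang C expression, then Wq = Lq/λ by Little's law. A sketch using the values defined above (λ = 6, μ = 5, c = 2); the function name is my own, not from the attached file:

```python
import math

def mmc_wq(lam, mu, c):
    """Average wait in line (Wq) for an M/M/c queue.

    lam: arrival rate, mu: service rate per server, c: number of servers.
    Requires lam < c * mu for a stable queue.
    """
    a = lam / mu          # offered load, rho = lam/mu
    rho = a / c           # per-server utilization
    if rho >= 1:
        raise ValueError("queue is unstable: lam must be < c * mu")
    # P0: probability that no students are in the system
    p0 = 1.0 / (sum(a**n / math.factorial(n) for n in range(c))
                + a**c / (math.factorial(c) * (1 - rho)))
    # Lq: expected number of students waiting in line
    lq = p0 * a**c * rho / (math.factorial(c) * (1 - rho) ** 2)
    return lq / lam       # Little's law: Wq = Lq / lam

# Two G.A.s, each taking 12 minutes per student on average
print(round(mmc_wq(6, 5, 2) * 60, 2))  # average wait in minutes -> 6.75
```

With λ = 6 and μ = 5, P0 = 0.25 and Lq = 0.675 students, so the average wait in line comes out to 0.1125 hours, or 6.75 minutes.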

Solution Summary

The solution provides the answer to a queueing problem.
