A taco truck is parked at a local lunch site and customers queue up to buy tacos at a rate of one per minute. The arrivals of customers are completely independent of one another. It takes 40 seconds on average to serve a customer (using a single server), with a standard deviation of 20 seconds. Determine the average time (in seconds) it takes a customer from when they arrive at the truck until they receive their taco. (Enter an integer answer.)


1 Answer


Answer: 90 seconds

Step-by-step explanation:

Given

Arrival rate
\lambda = 1 per minute = \frac{1}{60} per second

Service time
E[S] = 40 seconds on average, with standard deviation \sigma = 20 seconds, so the service rate is \mu = \frac{1}{40} per second

Utilization
\rho = \frac{\lambda}{\mu} = \frac{40}{60} = \frac{2}{3}

Because the standard deviation of the service time (20 s) differs from its mean (40 s), the service times are not exponential, so this is an M/G/1 queue and the average wait in line comes from the Pollaczek-Khintchine formula:

W_q = \frac{\lambda\, E[S^2]}{2(1-\rho)} = \frac{\lambda\left(\sigma^2 + E[S]^2\right)}{2(1-\rho)}

W_q = \frac{\frac{1}{60}\left(20^2 + 40^2\right)}{2\left(1-\frac{2}{3}\right)} = \frac{2000/60}{2/3} = 50 seconds

The average time in the system is the wait in line plus the service time:

W_s = W_q + E[S] = 50 + 40 = 90 seconds
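
For anyone who wants to check the arithmetic, here is a minimal Python sketch of the Pollaczek-Khintchine calculation above; the function name mg1_time_in_system and its parameters are illustrative, not taken from any particular library.

```python
def mg1_time_in_system(arrival_rate, mean_service, sd_service):
    """Return (mean wait in queue, mean time in system) for an M/G/1 queue.

    arrival_rate  -- lambda, customers per second
    mean_service  -- E[S], mean service time in seconds
    sd_service    -- standard deviation of the service time in seconds
    """
    rho = arrival_rate * mean_service                     # utilization, must be < 1
    second_moment = sd_service**2 + mean_service**2       # E[S^2] = Var(S) + E[S]^2
    wq = arrival_rate * second_moment / (2 * (1 - rho))   # Pollaczek-Khintchine wait in queue
    return wq, wq + mean_service                          # add service time for time in system

if __name__ == "__main__":
    wq, ws = mg1_time_in_system(arrival_rate=1/60, mean_service=40, sd_service=20)
    print(f"Wait in line: {wq:.0f} s, total time in system: {ws:.0f} s")
    # Prints: Wait in line: 50 s, total time in system: 90 s
```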
