Little’s Law, or Little’s Formula, relates the average length of a queue to the average waiting time in that queue. It was proved by John Little in 1960 while he was at the Case Institute of Technology, and the paper proving the result is one of the most cited in the field of Operations Research.
Let W be the average time a customer spends waiting in the queue, let L be the average length of the queue, and let λ be the arrival rate to the queue, that is, the average number of customers that arrive per unit time. Then the three quantities are related by the simple formula L = λW.
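As a quick numeric illustration (the figures here are invented for the example, not taken from Little's paper): if customers arrive at a rate of 2 per minute and each spends 3 minutes waiting on average, the formula predicts an average queue length of 6.

```python
# Hypothetical numbers chosen for illustration only.
arrival_rate = 2.0   # lambda: customers arriving per minute
avg_wait = 3.0       # W: average minutes spent waiting in the queue

# Little's Law: L = lambda * W
avg_queue_length = arrival_rate * avg_wait
print(avg_queue_length)  # -> 6.0
```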
This result holds regardless of the arrival process (Poisson, deterministic, etc.), the service process (Poisson, deterministic, etc.), the queueing discipline (first come first served, last in first out, etc.), and the number of servers. It also holds for the total time in the system: simply reinterpret L as the average number of people in the system (combining those in queue and those in service) and W as the average time in the system.
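The generality of the law can be seen on a simulated sample path. The sketch below (the function name and parameter choices are mine, not from the article) simulates a single-server FIFO queue with exponential interarrival and service times and measures L, λ, and W for the total time in system directly from the simulated path. Over a horizon in which every arriving customer has also departed, the identity L = λW holds exactly on the sample path, whatever the process parameters.

```python
import random

def simulate_mm1(arrival_rate, service_rate, n_customers, seed=0):
    """Simulate a single-server FIFO queue with exponential
    interarrival and service times; return (L, lam, W) measured
    over the horizon ending at the last departure."""
    rng = random.Random(seed)
    arrivals, departures = [], []
    t_arrive = 0.0
    prev_departure = 0.0
    for _ in range(n_customers):
        t_arrive += rng.expovariate(arrival_rate)
        service = rng.expovariate(service_rate)
        start = max(t_arrive, prev_departure)  # FIFO: wait for the server to free up
        depart = start + service
        arrivals.append(t_arrive)
        departures.append(depart)
        prev_departure = depart
    horizon = departures[-1]  # in FIFO, the last departure is the latest
    # W: average time in system (queueing + service) per customer
    W = sum(d - a for a, d in zip(arrivals, departures)) / n_customers
    # L: time-average number in system = integral of the count / horizon,
    # which equals the total sojourn time divided by the horizon
    L = sum(d - a for a, d in zip(arrivals, departures)) / horizon
    lam = n_customers / horizon  # observed arrival rate
    return L, lam, W

L, lam, W = simulate_mm1(arrival_rate=1.0, service_rate=1.5, n_customers=10_000)
print(L, lam * W)  # the two values coincide
```

Note the design choice: because the measurement horizon ends when the last customer departs, the total sojourn time appears in both L and λW, so the relation is an exact accounting identity on the path rather than merely an approximation in the long-run limit.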