

  • Gabriel Zayas-Cabán (a1) and Hyun-Soo Ahn (a2)


From health care to maintenance shops, many systems must contend with allocating resources to customers or jobs whose initial service requirements or costs change when they wait too long. We present a new queueing model for this scenario and use a Markov decision process formulation to analyze assignment policies that minimize holding costs. We show that the classic cμ rule is generally not optimal when service or cost requirements can change. Even for a two-class customer model in which a class 1 task becomes a class 2 task upon waiting, we show that additional orderings of the service rates are needed to ensure the optimality of simple priority rules. We then show that seemingly intuitive switching-curve structures are also not optimal in general. We examine these scenarios and provide conditions under which such structures are optimal. Lastly, we show that results from the two-class model do not extend to the case of n ≥ 3 customer classes. More broadly, simple priority rules are generally not optimal in this setting, and we provide sufficient conditions under which they are. In short, allowing service and/or cost requirements to change fundamentally alters the structure of the optimal policy for resource allocation in queueing systems.
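To make the two-class setting concrete, the following is a minimal simulation sketch, not the authors' model or analysis: a single server with class-1 arrivals, where each waiting class-1 job turns into a class-2 job at some rate, served under a cμ-style priority rule. All rates, costs, and the aging rate `gamma` are illustrative assumptions, and the policy function `c_mu` is a hypothetical helper.

```python
import random

def simulate(policy, lam=0.6, mu=(1.0, 0.7), cost=(3.0, 1.0),
             gamma=0.4, horizon=5_000.0, seed=1):
    """Gillespie-style simulation of one server and two job classes.

    Jobs arrive as class 1 (rate lam); each *waiting* class-1 job becomes
    a class-2 job at rate gamma (the job in service does not age, an
    assumption made here for simplicity).  policy(n1, n2) returns the
    class (1 or 2) to serve.  Returns the time-average holding cost.
    """
    rng = random.Random(seed)
    n = [0, 0]            # queue lengths per class
    t, acc = 0.0, 0.0     # clock, accumulated holding cost
    while t < horizon:
        served = policy(n[0], n[1]) if (n[0] or n[1]) else 0
        rates = [lam,                                   # class-1 arrival
                 mu[served - 1] if served else 0.0,     # service completion
                 gamma * (n[0] - (1 if served == 1 else 0))]  # aging
        total = sum(rates)
        dt = rng.expovariate(total)
        acc += dt * (cost[0] * n[0] + cost[1] * n[1])
        t += dt
        u = rng.random() * total
        if u < rates[0]:
            n[0] += 1                  # new class-1 job
        elif u < rates[0] + rates[1]:
            n[served - 1] -= 1         # job in service completes
        else:
            n[0] -= 1; n[1] += 1       # a waiting class-1 job ages into class 2

    return acc / t

def c_mu(n1, n2, cmu=(3.0 * 1.0, 1.0 * 0.7)):
    """Serve the nonempty class with the largest c_k * mu_k index."""
    avail = [k for k, nk in ((1, n1), (2, n2)) if nk]
    return max(avail, key=lambda k: cmu[k - 1])
```

Running `simulate(c_mu)` estimates the long-run average holding cost under the cμ rule; swapping in other policy functions lets one compare it against, for example, strict class-2 priority, which is how the suboptimality of the cμ rule in the presence of aging can be observed numerically.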




