Structural Properties of Optimal Scheduling Policies for Wireless Data Transmission


We analyze a cell with a fixed number of users in a time-slotted network. In each slot, the base station schedules at most one user for service, based on the available data rates and other parameters of all users in the cell. Assuming infinitely backlogged queues, we model the system as a Markov Decision Process (MDP) and prove that the optimal policy is monotone in both the "starvation age" and the available data rate, under the discounted as well as the long-run average criterion. The monotonicity proofs serve as good illustrations of how to analyze the structure of optimal policies for MDPs.
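The kind of monotonicity the abstract describes can be seen in a minimal toy sketch, not the paper's exact model: one "tagged" user with a starvation age and an i.i.d. channel rate, versus a fixed reward for serving a competing user. All parameter values below (age cap, rate set, rewards, penalty) are illustrative assumptions. Discounted value iteration then yields a policy that, once it serves the tagged user at some age or rate, keeps serving at every higher age or rate:

```python
# Toy scheduling MDP (illustrative assumptions, not the paper's model).
# State: (a, r) = (starvation age of the tagged user, current data rate).
# Actions: serve the tagged user (age resets to 0, reward r) or serve a
# competing user (fixed reward B, age grows). A starvation penalty H*a is
# incurred every slot regardless of the action.

A_MAX = 10                # truncation of the starvation age (assumption)
RATES = [1.0, 2.0, 4.0]   # possible data rates, i.i.d. across slots (assumption)
BETA = 0.9                # discount factor
B = 2.5                   # reward for serving the competing user (assumption)
H = 0.3                   # per-slot starvation penalty coefficient (assumption)

def ev(V, a):
    """Expected next-slot value when the next age is a (rate is i.i.d.)."""
    return sum(V[a, r] for r in RATES) / len(RATES)

# Discounted value iteration over states (age, rate); the comprehension
# builds the new value function from the old one (synchronous updates).
V = {(a, r): 0.0 for a in range(A_MAX + 1) for r in RATES}
for _ in range(500):
    V = {(a, r): -H * a + max(r + BETA * ev(V, 0),                  # serve tagged
                              B + BETA * ev(V, min(a + 1, A_MAX)))  # serve other
         for a in range(A_MAX + 1) for r in RATES}

# Optimal action: serve the tagged user iff its side of the max is larger.
serve = {(a, r): r + BETA * ev(V, 0) >= B + BETA * ev(V, min(a + 1, A_MAX))
         for a in range(A_MAX + 1) for r in RATES}

# Monotonicity check: serving stays optimal at every higher age and rate.
mono_age = all(serve[a + 1, r] for a in range(A_MAX) for r in RATES
               if serve[a, r])
mono_rate = all(serve[a, hi] for a in range(A_MAX + 1)
                for lo, hi in zip(RATES, RATES[1:]) if serve[a, lo])
print(mono_age, mono_rate)
```

In this toy model the future-value difference `ev(V, 0) - ev(V, a + 1)` is nonnegative and grows with the age, so the advantage of serving is nondecreasing in both the age and the rate, giving the threshold structure the paper establishes rigorously for its model.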

Share and Cite:

N. Bolia and V. Kulkarni, "Structural Properties of Optimal Scheduling Policies for Wireless Data Transmission," International Journal of Communications, Network and System Sciences, Vol. 5, No. 10, 2012, pp. 671-677. doi: 10.4236/ijcns.2012.510069.

Conflicts of Interest

The authors declare no conflicts of interest.



Copyright © 2021 by authors and Scientific Research Publishing Inc.

This work and the related PDF file are licensed under a Creative Commons Attribution 4.0 International License.