THE STRONG LIMIT THEOREM FOR RELATIVE ENTROPY DENSITY RATES BETWEEN TWO ASYMPTOTICALLY CIRCULAR MARKOV CHAINS

Ying Tang, Weiguo Yang and Yue Zhang

Abstract

In this paper, we study the strong limit theorem for the relative entropy density rates between two finite asymptotically circular Markov chains. First, we prove several lemmas on which the main result is based. We then establish two strong limit theorems for non-homogeneous Markov chains. Finally, we obtain the main result of this paper. As corollaries, we obtain the strong limit theorem for the relative entropy density rates between two finite non-homogeneous Markov chains. We also prove that the relative entropy density rates between two finite non-homogeneous Markov chains are uniformly integrable under certain conditions.
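The quantity studied in the abstract can be illustrated numerically in the simpler homogeneous case. The sketch below is ours, not from the paper: the transition matrices P and Q and all function names are illustrative choices. It computes the sample relative entropy density, (1/n) log(P(x_1, ..., x_n) / Q(x_1, ..., x_n)), along a path generated by the P-chain, and compares it with the relative entropy rate to which such densities converge in the homogeneous ergodic case.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two transition matrices on a two-state space (illustrative values only).
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])
Q = np.array([[0.5, 0.5],
              [0.2, 0.8]])

def sample_path(T, n, x0=0):
    """Simulate n steps of a homogeneous Markov chain with transition matrix T."""
    x = [x0]
    for _ in range(n - 1):
        x.append(rng.choice(2, p=T[x[-1]]))
    return x

def relative_entropy_density(x, P, Q):
    """(1/n) * log(P(x_1..x_n) / Q(x_1..x_n)), ignoring the initial law."""
    s = sum(np.log(P[a, b] / Q[a, b]) for a, b in zip(x, x[1:]))
    return s / len(x)

# Stationary distribution of P: the left eigenvector for eigenvalue 1.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi /= pi.sum()

# Relative entropy rate: sum_i pi_i * sum_j P_ij * log(P_ij / Q_ij).
rate = sum(pi[i] * P[i, j] * np.log(P[i, j] / Q[i, j])
           for i in range(2) for j in range(2))

n = 100_000
h_n = relative_entropy_density(sample_path(P, n), P, Q)
print(h_n, rate)  # the empirical density is close to the limiting rate
```

In the asymptotically circular and non-homogeneous settings treated in the paper, the transition matrices vary with time and the limit is taken along Cesàro-type averages; this homogeneous sketch only shows the kind of almost-sure convergence the strong limit theorem asserts.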
