
Costa’s concavity inequality for dependent variables based on the multivariate Gaussian copula

Published online by Cambridge University Press:  12 April 2023

Fatemeh Asgari*
Affiliation:
University of Isfahan
Mohammad Hossein Alamatsaz*
Affiliation:
University of Isfahan
*Postal address: Department of Statistics, Faculty of Mathematics and Statistics, University of Isfahan, Isfahan 81746-73441, Iran.

Abstract

In 1985, Costa extended Shannon’s entropy power inequality to the case in which one of the summands is Gaussian; this result is known as Costa’s concavity inequality. We consider the additive Gaussian noise channel under the more realistic assumption that the input and noise components are not independent, with their dependence structure following the well-known multivariate Gaussian copula. Two generalizations of the first- and second-order derivatives of the differential entropy of the output signal are derived for dependent multivariate random variables. It is shown that several previous results in the literature are particular cases of ours. Using these derivatives, concavity of the entropy power is proved under certain mild conditions. Finally, special one-dimensional versions of our general results are described, which extend the one-dimensional case of Costa’s concavity inequality to the dependent setting. An illustrative example is also presented.
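For the classical independent one-dimensional case discussed above, Costa's concavity can be checked in closed form: if X ~ N(0, σ²) and Z ~ N(0, 1) are independent, then Y_t = X + √t Z is Gaussian, and its entropy power N(Y_t) = exp(2h(Y_t))/(2πe) equals σ² + t, which is linear, hence (weakly) concave, in t. The following minimal sketch is not from the paper; the function names and the grid of t values are illustrative.

```python
import math

def gaussian_entropy(var):
    """Differential entropy h(Y) of a one-dimensional N(0, var)."""
    return 0.5 * math.log(2 * math.pi * math.e * var)

def entropy_power(h):
    """One-dimensional entropy power N = exp(2h) / (2*pi*e)."""
    return math.exp(2 * h) / (2 * math.pi * math.e)

sigma2 = 2.0              # variance of the Gaussian input X (illustrative)
ts = [0.5, 1.0, 1.5]      # equally spaced noise parameters t

# Entropy power of Y_t = X + sqrt(t) Z for independent X and Z:
# Var(Y_t) = sigma^2 + t, so N(Y_t) recovers sigma^2 + t exactly.
N = [entropy_power(gaussian_entropy(sigma2 + t)) for t in ts]

# Second-order finite difference; concavity in t means it is <= 0.
d2 = N[0] - 2 * N[1] + N[2]
print(N)    # [2.5, 3.0, 3.5]: N(Y_t) = sigma^2 + t
print(d2)   # 0.0 (up to rounding): linear in t, the boundary case of concavity
```

The dependent case treated in the paper replaces the independence assumption with a multivariate Gaussian copula, and the concavity statement then holds only under the mild conditions described in the article.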

Type
Original Article
Copyright
© The Author(s), 2023. Published by Cambridge University Press on behalf of Applied Probability Trust


References

Amazigo, J. and Rubenfeld, L. (1980). Advanced Calculus and its Applications to the Engineering and Physical Sciences. Wiley, New York.
Arias-Nicolás, J. P., Fernández-Ponce, J. M., Luque-Calvo, P. and Suárez-Llorens, A. (2005). Multivariate dispersion order and the notion of copula applied to the multivariate t-distribution. Prob. Eng. Inf. Sci. 19, 363–375.
Asgari, F. and Alamatsaz, M. H. (2022). An extension of entropy power inequality for dependent random variables. Commun. Statist. Theory Meth. 51, 4358–4369.
Asgari, F., Alamatsaz, M. H. and Khoolenjani, N. B. (2019). Inequalities for the dependent Gaussian noise channels based on Fisher information and copulas. Prob. Eng. Inf. Sci. 33, 618–657.
Blachman, N. (1965). The convolution inequality for entropy powers. IEEE Trans. Inf. Theory 11, 267–271.
Bergmans, P. (1974). A simple converse for broadcast channels with additive white Gaussian noise. IEEE Trans. Inf. Theory 20, 279–280.
Costa, M. (1985). A new entropy power inequality. IEEE Trans. Inf. Theory 31, 751–760.
Dembo, A. (1989). Simple proof of the concavity of the entropy power with respect to added Gaussian noise. IEEE Trans. Inf. Theory 35, 887–888.
Joe, H. (1997). Multivariate Models and Dependence Concepts (Monographs Statist. Appl. Prob. 73). Chapman & Hall, London.
Johnson, O. (2004). A conditional entropy power inequality for dependent variables. IEEE Trans. Inf. Theory 50, 1581–1583.
Kay, S. (2009). Waveform design for multistatic radar detection. IEEE Trans. Aerosp. Electron. Systems 45, 1153–1166.
Khoolenjani, N. B. and Alamatsaz, M. H. (2016). Extension of de Bruijn’s identity to dependent non-Gaussian noise channels. J. Appl. Prob. 53, 360–368.
Shannon, C. E. (1948). A mathematical theory of communication. Bell Systems Tech. J. 27, 623–656.
Sklar, A. (1959). Fonctions de répartition à n dimensions et leurs marges. Publ. Inst. Statist. Univ. Paris 8, 229–231.
Stam, A. J. (1959). Some inequalities satisfied by the quantities of information of Fisher and Shannon. Inf. Control 2, 101–112.
Takano, S., Watanabe, S., Fukushima, M., Prohorov, Y. and Shiryaev, A. (1995). The inequalities of Fisher information and entropy power for dependent variables. In Proc. 7th Japan–Russia Symp. Prob. Theory Math. Statist., pp. 460–470.
Villani, C. (2000). A short proof of the concavity of entropy power. IEEE Trans. Inf. Theory 46, 1695–1696.
Weingarten, H., Steinberg, Y. and Shamai, S. (2006). The capacity region of the Gaussian multiple-input multiple-output broadcast channel. IEEE Trans. Inf. Theory 52, 3936–3964.