The fundamental quantities of information theory, such as entropy, Fisher information, and moments, are basic information measures used to answer many practical questions. Their important properties and the relations among them are often formulated as information theoretic inequalities. Connections between information theoretic inequalities and geometric inequalities were discovered by Lieb (1978), Dembo-Cover-Thomas (1991), Lutwak-Yang-Zhang (2004, 2005, 2007), Lutwak-Lv-Yang-Zhang (2012, 2013), and others. The Shannon entropy power inequality in information theory and the Brunn-Minkowski inequality in convex geometry are related; see Lieb (1978) and Dembo-Cover-Thomas (1991). The inequalities between moments and Rényi entropy contain deep geometry in the form of affine isoperimetric inequalities (Lutwak-Yang-Zhang, 2004).
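For orientation, the two parallel inequalities mentioned above can be stated side by side. These are the standard textbook formulations, included here for reference rather than quoted from this work:

```latex
% Shannon entropy power inequality: for independent random vectors X, Y in R^n
% with densities, where N(X) = \frac{1}{2\pi e}\, e^{2h(X)/n} is the entropy power,
N(X + Y) \;\ge\; N(X) + N(Y).

% Brunn-Minkowski inequality: for nonempty compact sets A, B in R^n,
% with A + B = \{a + b : a \in A,\ b \in B\} and |\cdot| denoting volume,
|A + B|^{1/n} \;\ge\; |A|^{1/n} + |B|^{1/n}.
```

The formal analogy is visible in the shared shape of the two statements: entropy power plays the role of volume raised to the power $2/n$, and independent addition of random vectors plays the role of Minkowski addition of sets.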
This work presents a novel approach to information theoretic inequalities that establishes new links among information theory, convex geometry, and functional analysis. The work introduces new notions in information theory, mixed Fisher information and mixed moment, that unify classical Euclidean concepts and affine concepts of Lutwak-Yang-Zhang. These new notions provide information theoretic measures not only for single random vectors but also for pairs of random vectors. They are closely related to the fundamental notions of mixed volume and dual mixed volume in convex geometry. A series of sharp information inequalities are established involving mixed Fisher information, mixed moment, and Shannon and Rényi entropies. These new information inequalities are intimately connected with reverse isoperimetric inequalities in convex geometry and Sobolev type inequalities in analysis.
The classical moment-entropy inequality shows that a continuous random variable with given second moment and maximal Shannon entropy must be Gaussian. Stam's entropy-Fisher information inequality shows that a continuous random variable with given Fisher information and minimal Shannon entropy must also be Gaussian. These results were extended to Rényi entropy and p-Fisher information of random vectors in Lutwak-Yang-Zhang (2004, 2005, 2007) and Lutwak-Lv-Yang-Zhang (2012, 2013), where an extremal family of generalized Gaussian densities was introduced. In this work, generalizations of these results and completely new information theoretic inequalities involving both continuous and arbitrary random vectors are established. An enlarged extremal family of generalized Gaussian densities is defined whose contours are not only Euclidean balls but also dilates of all l_{p}^{n}-balls. Moreover, discrete cross measures are also extremal probability measures in the new mixed information inequalities. The new results and phenomena uncovered significantly expand the understanding of information theoretic inequalities.
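In standard form (common textbook statements, included only to fix notation, not quoted from the dissertation), the two classical extremal results read:

```latex
% Moment-entropy inequality: a random vector X in R^n with finite second
% moment satisfies
h(X) \;\le\; \frac{n}{2} \log\!\left( \frac{2\pi e}{n}\, \mathbb{E}|X|^{2} \right),
% with equality if and only if X is Gaussian with covariance proportional
% to the identity.

% Stam's inequality: the entropy power N(X) and the Fisher information J(X)
% of a continuous random vector X in R^n satisfy
N(X)\, J(X) \;\ge\; n,
% again with equality if and only if X is Gaussian with covariance
% proportional to the identity.
```

In both statements the Gaussian is the unique extremizer, which is the sense in which the text says that maximal entropy at fixed second moment, or minimal entropy at fixed Fisher information, characterizes the Gaussian.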
In order to establish the new information theoretic inequalities, connections to convex geometry and functional analysis are extended, and new analytic and volume inequalities are developed. Important convex bodies in the L_{p} Brunn-Minkowski theory are L_{p} zonoids, whose polars are L_{p} balls; these are precisely the unit balls of finite dimensional subspaces of L_{p} space. New volume inequalities for L_{p} zonoids and L_{p} balls are proved. These results enable the establishment of sharp analytic inequalities for functions, called logarithmic Gagliardo-Nirenberg inequalities, which are closely related to reverse isoperimetric inequalities that characterize L_{p} balls, and whose extremal functions are not radially symmetric. This is in contrast to the familiar Sobolev-type inequalities, which are associated with the Euclidean isoperimetric inequality that characterizes the Euclidean ball; it also differs from the affine Sobolev inequalities of Lutwak-Yang-Zhang (2002), Zhang (1999), and others, which are connected to affine isoperimetric inequalities that characterize ellipsoids.
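For comparison, the familiar Sobolev-type inequality referred to here is, in its sharp Euclidean form (a standard statement, not taken from this work):

```latex
% Sharp Sobolev inequality in R^n, for 1 <= p < n and p* = np/(n-p):
\|f\|_{p^{*}} \;\le\; C_{n,p}\, \|\nabla f\|_{p},
% where C_{n,p} is the optimal constant determined by Talenti.
% Its extremal functions are radially symmetric, reflecting the Euclidean
% isoperimetric inequality; by contrast, the logarithmic
% Gagliardo-Nirenberg inequalities of this work have extremal functions
% that are not radially symmetric.
```

The asymmetry of the extremals is the notable feature: the new inequalities are tied to reverse isoperimetric inequalities characterizing L_{p} balls rather than to the Euclidean isoperimetric inequality characterizing the round ball.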
In Chapter 2, basic notions and notation, as well as all relevant concepts in information theory and convex geometry, are reviewed.
In Chapter 3, volume inequalities for L_{p} zonoids and L_{p} balls associated with isotropic probability measures are established. These inequalities are applied to derive logarithmic projection and centroid inequalities.
In Chapter 4, with the volume inequalities established in Chapter 3, Euclidean, affine, and logarithmic Gagliardo-Nirenberg-Sobolev inequalities are obtained. The logarithmic Gagliardo-Nirenberg inequalities are completely new, while the Euclidean and affine cases are generalizations of known results.
Chapter 5, which builds on all of the tools previously established, is devoted to information theory. The notions of mixed Fisher information and mixed moment are introduced, and mixed information inequalities for pairs of random vectors are established. These include mixed entropy-Fisher information inequalities, affine entropy-Fisher information inequalities, and logarithmic entropy-Fisher information inequalities, as well as mixed moment-entropy inequalities, affine moment-entropy inequalities, and logarithmic moment-entropy inequalities. Most of these inequalities are completely new. The affine inequalities generalize Lutwak-Yang-Zhang's affine information inequalities.
Advisor: | Lutwak, Erwin |
Committee: | Yang, Deane; Zhang, Gaoyong |
School: | New York University Tandon School of Engineering |
Department: | Mathematics |
School Location: | United States -- New York |
Source: | DAI-B 79/10(E), Dissertation Abstracts International |
Source Type: | DISSERTATION |
Subjects: | Mathematics |
Keywords: | Convex geometry, Functional analysis, Information theory |
Publication Number: | 10813161 |
ISBN: | 978-0-355-99173-4 |