Information Theory, Dimension Reduction and Density Estimation
- Saha, Sujayam
- Advisor(s): Yu, Bin; Guntuboyina, Aditya
Abstract
This thesis documents three contributions to statistical learning theory, developed with careful attention to the demands that modern, large-scale datasets place on statistical analysis. The contributions concern information theory, dimension reduction and density estimation: three foundational topics in statistical theory with a wealth of applications, both to practical problems and to the development of other statistical methodology.
One chapter studies $f$-divergences, a general class of divergences between probability measures that includes many well-known divergences used in probability, mathematical statistics and information theory, such as the Kullback-Leibler divergence, the chi-squared divergence, the squared Hellinger distance and the total variation distance. In contrast with previous research in this area, we study the problem of obtaining sharp inequalities between $f$-divergences in full generality: we bound an arbitrary $f$-divergence $D_f$ in terms of an arbitrary finite collection of $f$-divergences $D_{f_1}, \dots, D_{f_m}$, where $m$ may be any positive integer. We show that the underlying optimization problems can be reduced to low-dimensional optimization problems, and we outline methods for solving them. We also show that many of the existing results on inequalities between $f$-divergences can be obtained as special cases of our results, and we improve on some existing non-sharp inequalities.
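For reference, and using the standard textbook definition rather than any notation specific to the thesis: for a convex function $f : (0, \infty) \to \mathbb{R}$ with $f(1) = 0$, the $f$-divergence between probability measures $P$ and $Q$ having densities $p$ and $q$ with respect to a common dominating measure $\mu$ is
$$D_f(P, Q) = \int f\!\left(\frac{p}{q}\right) q \, d\mu.$$
The choices $f(x) = x \log x$, $f(x) = (x - 1)^2$, $f(x) = (\sqrt{x} - 1)^2$ and $f(x) = \tfrac{1}{2}|x - 1|$ recover, respectively, the Kullback-Leibler divergence, the chi-squared divergence, the squared Hellinger distance and the total variation distance.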
Two further chapters develop the thesis's contributions to dimension reduction and density estimation.