UC Berkeley Electronic Theses and Dissertations

New Information Inequalities with Applications to Statistics

Abstract

We introduce, under a parametric framework, a family of inequalities between mutual information and Fisher information. These inequalities are indexed by reference measures satisfying a log-Sobolev inequality (LSI), and they reveal previously unknown connections between LSIs and statistical inequalities. One such connection concerns the celebrated van Trees inequality: under a Gaussian reference measure, we recover a stronger entropic inequality due to Efroimovich. We further present two new inequalities for log-concave priors that do not depend on the Fisher information of the prior and that apply in certain scenarios where the van Trees and Efroimovich inequalities do not. We illustrate a procedure for establishing lower bounds on risk under general loss functions and apply it in several statistical settings, including the Generalized Linear Model and a general pairwise comparison framework.
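For orientation, the two objects named above have standard forms, sketched here in their simplest settings; the dissertation's own conventions, constants, and multivariate generalizations may differ. A probability measure \(\mu\) on \(\mathbb{R}^d\) satisfies a logarithmic Sobolev inequality with constant \(C > 0\) if, for all sufficiently smooth \(f\),

\[
\operatorname{Ent}_{\mu}(f^2) \;:=\; \int f^2 \log f^2 \, d\mu \;-\; \left(\int f^2 \, d\mu\right) \log\!\left(\int f^2 \, d\mu\right) \;\le\; 2C \int \|\nabla f\|^2 \, d\mu .
\]

The van Trees inequality is the classical Bayesian Cramér–Rao bound: for a one-dimensional parameter \(\theta\) with smooth prior \(\pi\) and any estimator \(\hat{\theta}(X)\),

\[
\mathbb{E}\bigl[(\hat{\theta}(X) - \theta)^2\bigr] \;\ge\; \frac{1}{\mathbb{E}_{\pi}[I(\theta)] + I(\pi)},
\]

where \(I(\theta)\) is the Fisher information of the model at \(\theta\) and \(I(\pi) = \int \pi'(\theta)^2 / \pi(\theta)\, d\theta\) is the Fisher information of the prior.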
