Statistical Guarantees of Tuning-Free Methods for Gaussian Graphical Models
UC Santa Barbara Electronic Theses and Dissertations

Abstract

The majority of methods for sparse precision matrix estimation rely on computationally expensive procedures, such as cross-validation, to determine the proper level of regularization. Recently, a special case of precision matrix estimation based on a distributionally robust optimization (DRO) framework was shown to be equivalent to the graphical lasso. From this formulation, a method was proposed for choosing the regularization term, i.e., for graphical model selection, without tuning. In Chapter 2 of this thesis, we establish a theoretical connection between the confidence level of graphical model selection via the DRO formulation and the asymptotic family-wise error rate of estimating false edges. Simulation experiments and real data analyses illustrate the utility of this asymptotic family-wise error rate control even in finite samples.

Next, in Chapter 3, we propose a completely tuning-free approach to estimating sparse precision matrices based on linear regression. Theoretically, the proposed estimator is minimax optimal under various norms. In addition, we propose a second-stage enhancement with non-convex penalties, which possesses strong oracle properties. We assess the proposed methods through comprehensive simulations and a real data application to human gene network analysis.
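
For context, the following is a minimal illustrative sketch (not the thesis's method) of the estimation problem described above: fitting a sparse precision matrix with the graphical lasso, contrasting cross-validated selection of the regularization level with a fixed level. The simulated data and the fixed alpha value are arbitrary placeholders, not recommended choices.

```python
import numpy as np
from sklearn.covariance import GraphicalLasso, GraphicalLassoCV

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))  # n = 200 samples, p = 10 variables (placeholder data)

# Baseline: regularization level chosen by cross-validation (computationally expensive).
cv_model = GraphicalLassoCV().fit(X)
print("CV-selected alpha:", cv_model.alpha_)

# Graphical lasso at a fixed regularization level (hypothetical value).
fixed_model = GraphicalLasso(alpha=0.1).fit(X)
precision = fixed_model.precision_

# Count estimated edges: nonzero off-diagonal entries of the precision matrix.
off_diag = precision - np.diag(np.diag(precision))
print("Estimated edges (nonzero off-diagonal entries):", int((np.abs(off_diag) > 1e-8).sum()) // 2)
```

A tuning-free procedure, such as the DRO-based selection studied in Chapter 2, replaces the cross-validation step with a regularization level determined directly from the data and a chosen confidence level.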
