High-dimensional Feature Selection Using Hierarchical Bayesian Logistic Regression with Heavy-tailed Priors

Abstract

The problem of selecting the most useful features from a great many (e.g., thousands of) candidates arises in many areas of modern science. An interesting problem from genomic research is that, from thousands of genes that are active (expressed) in certain tissue cells, we want to find the genes that can be used to separate tissues of different classes (e.g., cancer and normal). In this paper, we report our empirical experience with Bayesian logistic regression based on heavy-tailed priors with moderately small degrees of freedom (such as 1) and a very small scale, using Hamiltonian Monte Carlo for computation. We discuss the advantages and limitations of this method and illustrate the difficulties that remain unsolved. The method is applied to a real microarray data set related to prostate cancer. It identifies only 3 non-redundant genes out of 6,033 candidates yet achieves better leave-one-out cross-validated prediction accuracy than many other methods.
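As a rough illustration of the kind of model the abstract describes (not the authors' original implementation), the sketch below fits a logistic regression whose coefficients carry heavy-tailed Student-t priors with 1 degree of freedom and a very small scale, sampled with Hamiltonian Monte Carlo (NUTS) via PyMC. The synthetic data, dimensions, and the scale value 0.05 are illustrative assumptions, not values taken from the paper.

```python
import numpy as np
import pymc as pm

# Hypothetical toy data: n tissue samples, p candidate genes (features).
# In the paper's application, p would be 6,033 expressed genes.
rng = np.random.default_rng(0)
n, p = 50, 200
X = rng.normal(size=(n, p))
y = (X[:, 0] - X[:, 1] + rng.normal(scale=0.5, size=n) > 0).astype(int)

with pm.Model() as model:
    # Heavy-tailed prior: Student-t with small degrees of freedom (nu = 1, i.e. Cauchy)
    # and a very small scale, so most coefficients are shrunk toward zero while a few
    # can escape the shrinkage -- the feature-selection effect described in the abstract.
    beta0 = pm.StudentT("beta0", nu=1, mu=0, sigma=10)          # weakly informative intercept prior
    beta = pm.StudentT("beta", nu=1, mu=0, sigma=0.05, shape=p)  # small-scale, heavy-tailed coefficients

    logits = beta0 + pm.math.dot(X, beta)
    pm.Bernoulli("obs", logit_p=logits, observed=y)

    # Posterior sampling with Hamiltonian Monte Carlo (the NUTS variant).
    trace = pm.sample(1000, tune=1000, target_accept=0.9)
```

Genes whose posterior coefficient distributions are clearly separated from zero would then be the candidates retained for prediction; the remainder are effectively shrunk away by the small prior scale.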
