eScholarship
Open Access Publications from the University of California

UC Berkeley Electronic Theses and Dissertations

Topics in Conditional Inference

Abstract

The modern data analysis process is rarely one-step; instead, it is paved with iterative exploratory analyses and choices. Data analysts are often tempted to peek at the data before choosing the hypotheses to be tested. At other times, vast amounts of data are screened and not all information is accessible to the analyst. In either case, inference must be carried out post-selection, as a consequence of even the most innocuous exploratory data analysis. One particular method for conducting post-selection inference is conditional inference, several instances of which are detailed in this work.

Chapter 2 — based on Hung and Fithian (2019a) — explores a scenario where the choice of null hypothesis depends on the very same data used in the test. Using conditional inference, we provide a test that adapts to the data, testing whichever hypothesis is most sensible. As a consequence of this adaptivity, our test is also much more powerful than classical approaches.

Chapter 3 — based on Hung and Fithian (2019b) — describes a meta-analysis setting where the data itself has been selected, but meaningful inference is nonetheless desired. Through conditional inference, we modify classical methods to provide valid post-selection inference.

Finally, in Chapter 4, I present unpublished work investigating an optimal method of combining information from an original experiment subject to selection and a replication experiment, a common current concern in experimental psychology.
