Open Access Publications from the University of California

UC San Diego

UC San Diego Electronic Theses and Dissertations

Information Theoretic Measures and Estimators of Specific Causal Influences

  • Author(s): Schamberg, Gabriel
  • Advisor(s): Coleman, Todd P.; Kim, Young-Han

The need to measure causal influences between random variables or processes in complex networks arises throughout academic disciplines. In four parts, we develop techniques for measuring and estimating causal influences using tools from information theory, with the explicit goal of situating information theoretic perspectives on causal influence within the vast and interdisciplinary body of work studying causality. Throughout the dissertation, we demonstrate the utility of the proposed methods with applications to physiologic, economic, and climatological datasets.

Beginning with a focus on time series, we present a modularized approach to finding the maximum a posteriori estimate of a latent time series that obeys a dynamic stochastic model and is observed through noisy measurements. We specifically consider modern signal processing problems with non-Markov signal dynamics (e.g., group sparsity) and/or non-Gaussian measurement models (e.g., point process observation models used in neuroscience). Importantly, this framework can be leveraged in the estimation of the latent parameters specifying the probability distribution of a time series, which is a fundamental step in the estimation of causal influences between time series.
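As a toy illustration of this kind of MAP smoothing problem (the modularized framework in the dissertation is considerably more general), the sketch below computes the MAP estimate of a Gaussian random-walk latent state observed through Poisson counts, a simple stand-in for the point-process observation models mentioned above. All variable names and parameter values are illustrative assumptions, not the dissertation's notation.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Latent state: Gaussian random walk x_t = x_{t-1} + w_t, w_t ~ N(0, q)
T, q = 100, 0.05
x_true = np.cumsum(rng.normal(0.0, np.sqrt(q), T))

# Point-process-style observations: y_t ~ Poisson(exp(x_t))
y = rng.poisson(np.exp(x_true))

def neg_log_posterior(x):
    # Poisson log-likelihood up to constants: sum_t (y_t * x_t - exp(x_t))
    ll = np.sum(y * x - np.exp(x))
    # Gaussian random-walk prior on the increments of x
    lp = -np.sum(np.diff(x) ** 2) / (2.0 * q)
    return -(ll + lp)

# The negative log posterior is convex in x, so a local minimum is global.
res = minimize(neg_log_posterior, np.zeros(T), method="L-BFGS-B")
x_map = res.x
```

Because both the Poisson log-likelihood (in the log-rate) and the Gaussian prior are log-concave, generic smooth optimization suffices here; the appeal of a modularized approach is that the dynamics model (e.g., group sparsity) and observation model can be swapped independently.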

Second, we study the conditions under which directed information, a popular information theoretic notion of causal influence between time series, can be estimated without bias. While the assumptions made by estimators of directed information are often stated explicitly, a characterization of when we can expect these assumptions to hold has been lacking. Using the concept of d-separation from Bayesian networks, we present sufficient and almost everywhere necessary conditions under which proposed estimators can be implemented without bias. We further introduce a notion of partial directed information, which can be used to bound the bias under a milder set of assumptions.
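To make the object of study concrete, the sketch below is a plug-in estimator of the directed information rate between two binary time series under a first-order Markov assumption, where the rate reduces to the conditional mutual information I(X_{t-1}; Y_t | Y_{t-1}). This is a standard toy construction, not the dissertation's estimator; the dissertation's contribution is characterizing when such Markov-style assumptions can be expected to hold.

```python
import numpy as np

def plug_in_di_rate(x, y):
    """Plug-in estimate of the directed information rate I(X -> Y) for
    binary series under a first-order Markov assumption, i.e. the
    conditional mutual information I(X_{t-1}; Y_t | Y_{t-1})."""
    # Empirical joint distribution of (X_{t-1}, Y_{t-1}, Y_t)
    p = np.zeros((2, 2, 2))
    for a, b, c in zip(x[:-1], y[:-1], y[1:]):
        p[a, b, c] += 1.0
    p /= p.sum()
    pa_b = p.sum(axis=2)      # p(x_{t-1}, y_{t-1})
    pb_c = p.sum(axis=0)      # p(y_{t-1}, y_t)
    pb = p.sum(axis=(0, 2))   # p(y_{t-1})
    di = 0.0
    for a in range(2):
        for b in range(2):
            for c in range(2):
                if p[a, b, c] > 0:
                    di += p[a, b, c] * np.log2(
                        p[a, b, c] * pb[b] / (pa_b[a, b] * pb_c[b, c]))
    return di

# Toy data: Y copies X's previous value with probability 0.9
rng = np.random.default_rng(1)
x = rng.integers(0, 2, 10000)
y = np.empty_like(x)
y[0] = 0
for t in range(1, len(x)):
    y[t] = x[t - 1] if rng.random() < 0.9 else 1 - x[t - 1]

di_hat = plug_in_di_rate(x, y)  # close to 1 - H(0.9) ≈ 0.53 bits
```

In this toy channel the true rate is 1 − H(0.9) ≈ 0.53 bits per symbol; when the Markov assumption fails (e.g., the true dependence has longer memory, or a confounder is unobserved), the same plug-in computation silently estimates a different quantity, which is precisely the kind of bias the d-separation conditions diagnose.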

Third, we present a sample path dependent measure of causal influence between time series. The proposed measure is a random sequence, a realization of which enables identification of specific patterns that give rise to high levels of causal influence. We demonstrate how sequential prediction theory may be leveraged to estimate the proposed causal measure, introduce a notion of regret for assessing the performance of such estimators, and bound this regret.
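The flavor of a sequential-prediction estimate of a sample-path influence sequence can be sketched as follows: at each time step, compare the log-loss of a predictor of Y that sees only Y's past against one that also sees X's past. The per-step log-loss reduction is one realization of a sample-path influence sequence. The Krichevsky–Trofimov predictor and the specific restricted/full comparison below are illustrative assumptions, not the dissertation's exact construction.

```python
import numpy as np

class KTPredictor:
    """Krichevsky-Trofimov sequential predictor of a binary symbol
    given a discrete context (add-1/2 smoothed counts)."""
    def __init__(self):
        self.counts = {}

    def prob(self, context, symbol):
        c0, c1 = self.counts.get(context, (0, 0))
        p1 = (c1 + 0.5) / (c0 + c1 + 1.0)
        return p1 if symbol == 1 else 1.0 - p1

    def update(self, context, symbol):
        c0, c1 = self.counts.get(context, (0, 0))
        self.counts[context] = (c0 + 1, c1) if symbol == 0 else (c0, c1 + 1)

def sample_path_influence(x, y):
    """Per-step log-loss reduction (bits) from including X's past when
    sequentially predicting Y: a realization of an influence sequence."""
    restricted, full = KTPredictor(), KTPredictor()
    seq = []
    for t in range(1, len(y)):
        ctx_r = y[t - 1]                # Y's own past only
        ctx_f = (x[t - 1], y[t - 1])    # both pasts
        seq.append(np.log2(full.prob(ctx_f, y[t]))
                   - np.log2(restricted.prob(ctx_r, y[t])))
        restricted.update(ctx_r, y[t])
        full.update(ctx_f, y[t])
    return np.array(seq)

# Toy data: Y copies X's previous value with probability 0.9
rng = np.random.default_rng(2)
x = rng.integers(0, 2, 5000)
y = np.array([0] + [x[t - 1] if rng.random() < 0.9 else 1 - x[t - 1]
                    for t in range(1, 5000)])
infl = sample_path_influence(x, y)  # mean is positive: X's past helps
```

Inspecting the realized sequence, rather than its average, is what lets one identify the specific sample-path patterns that carry high influence.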

Finally, we extend our focus to general causal graphs and show that information theoretic measures of causal influence are fundamentally different from mainstream (e.g., statistical) notions in that they (1) compare distributions over the effect rather than values of the effect and (2) are defined with respect to random variables representing a cause rather than specific values of a cause. We leverage perspectives from the statistical causality literature to present a novel information theoretic framework for measuring direct, indirect, and total causal effects in natural complex networks. In addition to endowing information theoretic approaches with an enhanced "resolution," the proposed framework uniquely elucidates the relationship between the information theoretic and statistical perspectives on causality.
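The distinction in point (1) can be illustrated with a toy structural model in which the interventional distributions P(Y | do(X=x)) are given directly. A statistical notion such as the average causal effect compares expected values of the effect, while an information theoretic notion compares the distributions themselves (here, via a KL divergence; this simple comparison is an illustrative simplification of the framework, not its definition).

```python
import numpy as np

# Toy interventional distributions over a binary effect Y (assumed values):
# P(Y | do(X=0)) and P(Y | do(X=1)), no confounding.
p_y_do = {
    0: np.array([0.7, 0.3]),  # P(Y=0), P(Y=1) under do(X=0)
    1: np.array([0.3, 0.7]),  # P(Y=0), P(Y=1) under do(X=1)
}

def kl_bits(p, q):
    """KL divergence D(p || q) in bits for strictly positive p, q."""
    return float(np.sum(p * np.log2(p / q)))

# Statistical notion: compares expected *values* of the effect
ace = p_y_do[1][1] - p_y_do[0][1]         # average causal effect = 0.4

# Information theoretic notion: compares *distributions* over the effect
info_effect = kl_bits(p_y_do[1], p_y_do[0])   # ≈ 0.489 bits
```

Two pairs of interventional distributions can share the same mean difference yet differ sharply in distribution (e.g., a shift in variance leaves the ACE at zero while the KL divergence is positive), which is why comparing distributions yields a finer-grained notion of influence.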
