eScholarship
Open Access Publications from the University of California

UC Santa Barbara

UC Santa Barbara Electronic Theses and Dissertations

Regression and optimal transport models for functional and surface-valued data

  • Author(s): Liu, Xi
  • Advisor(s): Petersen, Alexander
Abstract

There are various types of information, such as shapes and constrained curves, that cannot be represented by a scalar variable or a simple Euclidean vector. For these nonstandard data types, their inherent constraints and geometric features can often be exploited to inform model development and data analysis. In analyzing such data, the usual Euclidean norm implicitly used in standard multivariate analyses must be replaced by suitable functional norms or metrics. In this dissertation, statistical models and computational tools are developed for the analysis of functional and surface-valued data.

In Chapter 1, the effect of a smooth curve on a binary response is analyzed through a functional generalized linear model. The proposed method develops a novel approach under the assumption that the coefficient function $\beta(t)$ is truncated, i.e., the curve predictor is expected to lose its influence after some timepoint in its domain. To obtain an estimate of $\beta(t)$ that is simultaneously smooth and truncated, a structured variable selection method and a localized B-spline expansion of $\beta(t)$ are leveraged to formulate a penalized log-likelihood function, where a nested group lasso penalty guarantees the sequential entering of B-splines and hence induces truncation in $\beta(t)$. Computationally, an optimization scheme is developed to compute the entire solution path efficiently as the truncation tuning parameter varies from $\infty$ to 0. Unlike previous methods, which either directly penalized the value of the truncation point or resulted in a nonconvex optimization problem, the nested group lasso penalty leads to a convex optimization problem. By expressing the nonsmooth lasso penalty in its dual formulation, it can be smoothed so that the objective function can be optimized by an accelerated gradient descent algorithm. Theoretically, the convergence rate of the estimate and the consistency of the truncation point estimator are derived under suitable smoothness assumptions. The proposed method is demonstrated with an application involving the effects of blood pressure curves in patients who suffered a spontaneous intracerebral hemorrhage.
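The truncation mechanism can be illustrated with a small sketch (an illustrative simplification, not the dissertation's implementation; the function name `nested_penalty` is hypothetical). With tail groups $G_k = \{k, \ldots, K\}$ over the B-spline coefficients, zeroing group $G_k$ forces every later coefficient to zero, so the penalty favors coefficient vectors whose support stops early, i.e., a truncated $\beta(t)$:

```python
import numpy as np

def nested_penalty(b, lam=1.0):
    """Nested group lasso penalty with tail groups G_k = {k, ..., K}.

    Coefficient b[j] belongs to every group G_k with k <= j, so later
    B-spline coefficients are penalized more heavily; zeroing a group
    zeros the whole tail, which truncates beta(t).
    """
    b = np.asarray(b, dtype=float)
    return lam * sum(np.linalg.norm(b[k:]) for k in range(len(b)))

# Two coefficient vectors with the same Euclidean norm:
dense = np.array([0.5, 0.5, 0.5, 0.5])   # beta active over the whole domain
trunc = np.array([1.0, 0.0, 0.0, 0.0])   # beta truncated after the first spline

# The truncated vector incurs a strictly smaller penalty, so as the
# tuning parameter grows, tail coefficients are driven to zero first.
print(nested_penalty(trunc), nested_penalty(dense))
```

Because each group is a Euclidean norm of a coefficient subvector, the penalty is convex, consistent with the convex formulation described above.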

In Chapter 2, a set of computational tools is developed to perform inference for a regression model in which density curves appear as functional response objects with vector predictors. For such models, inference is key to understanding the importance of density-predictor relationships and the uncertainty associated with the estimated conditional mean densities, defined as conditional Fréchet means under a suitable metric. Since density curves are nonnegative and integrate to one, the usual $L_p$ metrics are not suitable for modeling them. Instead, using the Wasserstein geometry of optimal transport, we consider Fréchet regression for density curve responses and develop tests for global and partial effects, as well as simultaneous confidence bands for the estimated conditional mean densities. This dissertation focuses on the computational aspects of the proposed statistical inference methods, and an R package was developed to promote the use of Fréchet regression with density curve responses. The accuracy of these methods, including nominal size, power, and coverage, is assessed through simulations. Furthermore, the utility of the methodology is demonstrated via regression analysis of post-intracerebral hemorrhage hematoma densities and their associations with a set of clinical and radiological covariates.
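A useful fact behind the Wasserstein approach to one-dimensional densities is that the 2-Wasserstein distance has a closed form as the $L^2$ distance between quantile functions, and the Wasserstein-Fréchet mean is obtained by averaging quantile functions pointwise. A minimal sketch on toy uniform distributions (an assumption-laden illustration, not the dissertation's R package):

```python
import numpy as np

def w2(q1, q2, p):
    # 1-D closed form: W2(F, G)^2 = integral over (0,1) of
    # (F^{-1}(p) - G^{-1}(p))^2 dp, here via the trapezoid rule.
    d2 = (q1 - q2) ** 2
    return np.sqrt(np.sum(0.5 * (d2[1:] + d2[:-1]) * np.diff(p)))

p = np.linspace(0.0, 1.0, 1001)   # probability grid

qa = p          # quantile function of Uniform(0, 1)
qb = 2.0 + p    # quantile function of Uniform(2, 3)

# The distributions differ by a shift of 2, so W2 = 2 exactly.
print(w2(qa, qb, p))

# Wasserstein-Frechet mean: average the quantile functions pointwise;
# here it is the quantile function of Uniform(1, 2).
qbar = 0.5 * (qa + qb)
```

Conditional Fréchet means in the regression setting are computed in the same quantile domain, which is what makes the Wasserstein metric computationally convenient for density responses.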

In Chapter 3, the Wasserstein metric is applied in a different manner to assist the analysis of surface and contour data. The motivation comes from cosmetics, where it is desirable for each individual to have a personalized contour map tailored to their unique face shape when applying makeup to enhance or change the shape of the face. Two main questions are of interest. First, given a face outline, how can it be represented in a common coordinate frame with the standard face shape templates (oval, square, rectangle, heart, and round)? Second, where is the optimal location to apply contour in order to sculpt and add dimension to one's face with makeup? To address these two problems, a face shape is represented as a 2D discrete uniform distribution supported on the face outline, and the magnitude of difference between two face shapes is quantified by the 2-Wasserstein distance. Given the five standard face templates, the first question is addressed by modeling the given shape with a length-five weight vector, where each element measures the similarity between the given face and the corresponding template. Formally, this weight vector is the Wasserstein barycentric coordinate of the input face outline. To accelerate the computation, an entropy-regularized 2-Wasserstein metric is utilized, thereby transforming the linear programming task into an iterative Bregman projection problem; all algorithms can therefore be parallelized across multiple GPUs. The second question resembles a regression model, where the conditioning information is the face type and the response is the optimal contour location. The weight vector obtained for the first question is reused to compute a contour barycenter, which thus depends on the standard contour templates as well as the given face type.
