eScholarship
Open Access Publications from the University of California


UC Santa Barbara Electronic Theses and Dissertations

Investigating Cellular Response to Compressive Injury with a Microfluidic MEMS Device

(2021)

Traumatic Brain Injury (TBI) is a leading cause of death and disability worldwide, making it both a global health and economic problem. Despite extensive studies utilizing tissue-level injury models, there is still no effective neural therapeutic available to counteract the neurodegenerative cascade, or secondary injury mechanism, of TBI. In part, this is due to limited understanding of cell-level response to mechanical injury. Prior research has examined the effects of mechanical strain on individual cells, but these studies have involved low strains and low strain rates (ε < 10%, ε̇ < 100 s⁻¹), leaving a largely unexplored injury regime. Furthermore, many of these tools are low throughput (100s of cells per study), which limits the statistical significance of their findings. To more thoroughly explore the effects of cellular injury, a microfluidic and electromagnetically actuated MEMS device (the ‘µHammer’) was developed to apply high strains and high strain rates (ε > 40%, ε̇ = 200,000 s⁻¹) to individual cells in a high-throughput manner (36,000 cells per minute). With this device, compressive strain was applied to human Neural Progenitor Cells (NPCs), which were then monitored for changes in viability and gene expression. Compression studies revealed TBI secondary injury mechanisms (cell death and apoptosis), mechanically sensitive neuroinflammation signaling elements, and a previously unexplored global expression signature. These results suggest that the µHammer device can be an invaluable tool for furthering the understanding of cellular response to mechanical injury.

A computational framework for social valuation inference

(2021)

Organisms in a social species constantly need to make trade-offs between their own welfare and that of conspecifics. An emerging body of research suggests that the regulation of such trade-offs is an important function of social cognition. In particular, the mind has mechanisms designed to regulate trade-offs between the welfare of the self and that of specific others, and in consequence, the mind also contains mechanisms designed to construct representations of the degree to which another individual values the welfare of the self.

Existing evidence suggests that such representations of “social valuation” play an important role in various cognitive processes such as reciprocity, partner choice, categorization and emotion. However, little is known about how people construct these representations. Because of its adaptive importance, I hypothesize that the process by which we infer social valuation is approximately consistent with normative standards of inference under uncertainty.

To test this hypothesis, I construct a Bayesian ideal observer for a simple task in which the observer, having seen the decisions made by a partner in a simple welfare-tradeoff game, needs to predict the decisions made by that partner in other rounds of the game. In a first set of studies, I find that people make predictions that closely track the predictions made by the ideal observer in that task. Additionally, participants’ reports of anger toward the partner are well-predicted by the social valuation inferences made by the ideal observer, even when the different partners inflict the same opportunity cost on the participant. I also find tentative evidence that anger ratings in that task are independently driven by deviations from expectations: individual differences in the amount by which the decisions of a partner deviated from the participant’s expectations track individual differences in anger toward that partner.
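
To make the setup concrete, the following is a minimal sketch of such a Bayesian ideal observer, assuming a one-dimensional welfare-tradeoff ratio, a flat prior, and a softmax choice rule; the grid, the noise parameter, and the example dilemmas are illustrative assumptions rather than the dissertation's exact model.

```python
# Hedged sketch: a Bayesian ideal observer over a partner's welfare-tradeoff
# ratio (WTR) w. Grid range, flat prior, softmax noise, and the dilemmas are
# illustrative assumptions, not the dissertation's exact specification.
import numpy as np

def choice_loglik(w, cost_self, benefit_other, chose_generous, beta=3.0):
    """Log-probability of the observed choice under a softmax rule:
    the partner prefers the generous option when w*benefit - cost > 0."""
    utility_diff = w * benefit_other - cost_self
    p_generous = 1.0 / (1.0 + np.exp(-beta * utility_diff))
    return np.log(p_generous if chose_generous else 1.0 - p_generous)

def posterior_over_wtr(observed_choices, grid=np.linspace(-1.0, 2.0, 301)):
    """Posterior over w given (cost, benefit, choice) triples, flat prior."""
    log_post = np.zeros_like(grid)
    for cost, benefit, generous in observed_choices:
        log_post += np.array([choice_loglik(w, cost, benefit, generous) for w in grid])
    log_post -= log_post.max()
    post = np.exp(log_post)
    return grid, post / post.sum()

def predict_choice(grid, post, cost_self, benefit_other, beta=3.0):
    """Posterior predictive probability that the partner acts generously
    in a new dilemma."""
    p = 1.0 / (1.0 + np.exp(-beta * (grid * benefit_other - cost_self)))
    return float(np.sum(post * p))

# Example: the partner twice gave up 2 units to deliver 5 to the participant,
# and once refused to give up 4 to deliver 5.
grid, post = posterior_over_wtr([(2, 5, True), (2, 5, True), (4, 5, False)])
print(predict_choice(grid, post, cost_self=3, benefit_other=5))
```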

In a second set of studies, I study whether people are spontaneously curious about the situations which potentially contain the most information about another person’s valuation of the self. I present participants with pairs of dilemmas that another individual faced in a simple welfare-tradeoff game; for each pair, I ask them to choose the dilemma for which they would most like to see the decision that the individual had made. I find that on average, people spontaneously select the choices that have the potential to reveal the most information about the individual’s valuation of the participant, in the sense of allowing the ideal observer model to draw the richest inferences.
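
A minimal sketch of how such "most informative dilemma" comparisons can be scored is shown below, using expected information gain over the same hypothetical welfare-tradeoff ratio; the prior, noise parameter, and payoffs are illustrative assumptions, not the study's materials.

```python
# Hedged sketch: ranking dilemmas by expected information gain about the
# partner's welfare-tradeoff ratio w, mirroring the "which decision would you
# most like to see?" task. All numerical settings are illustrative.
import numpy as np

def entropy(p):
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

def expected_info_gain(prior, grid, cost_self, benefit_other, beta=3.0):
    """Expected reduction in entropy over w from seeing the partner's choice."""
    p_gen_given_w = 1.0 / (1.0 + np.exp(-beta * (grid * benefit_other - cost_self)))
    p_gen = np.sum(prior * p_gen_given_w)                  # marginal P(generous)
    post_gen = prior * p_gen_given_w / p_gen               # posterior if generous
    post_self = prior * (1 - p_gen_given_w) / (1 - p_gen)  # posterior if selfish
    expected_post_entropy = p_gen * entropy(post_gen) + (1 - p_gen) * entropy(post_self)
    return entropy(prior) - expected_post_entropy

grid = np.linspace(-1.0, 2.0, 301)
prior = np.ones_like(grid) / grid.size
# Which dilemma is more diagnostic: give up 1 to deliver 5, or give up 4 to deliver 5?
for cost in (1, 4):
    print(cost, expected_info_gain(prior, grid, cost_self=cost, benefit_other=5))
```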

These results strengthen the thesis that representations of social valuation are a core component of the conceptual architecture of human social cognition.

Regression and optimal transport models for functional and surface-valued data

(2021)

There are various types of information, such as shapes and constrained curves, that cannot be represented by a scalar variable or a simple Euclidean vector. For these nonstandard data types, their inherent constraints and geometric features can often be exploited to inform model development and data analysis. In analyzing these data, the usual Euclidean norm that is implicitly used for standard multivariate analyses must be replaced by suitable functional norms or metrics. In this dissertation, statistical models and computational tools are developed in order to analyze information in functional and surface-valued data.

In Chapter 1, the effect of a smooth curve on a binary response is analyzed through a functional generalized linear model. The proposed method develops a novel approach under the assumption that the coefficient function $\beta(t)$ is truncated, i.e., one can expect that the curve predictor loses its influence after a timepoint in its domain. To achieve an estimate of $\beta(t)$ that is simultaneously smooth and truncated, a structured variable selection method and a localized B-spline expansion of $\beta(t)$ are leveraged to formulate a penalized log-likelihood function, where the nested group lasso penalty guarantees the sequential entering of B-splines and hence induces truncation in $\beta(t)$. Computationally, an optimization scheme is developed to compute the entire solution path effectively when varying the truncation tuning parameter from $\infty$ to 0. Unlike previous methods, which either directly penalized the value of the truncation point or resulted in a nonconvex optimization problem, the proposed approach utilizes a nested group lasso penalty and leads to a convex optimization problem. By expressing the nonsmooth lasso penalty in its dual formulation, it can be subsequently smoothed so that the objective function can be optimized by an accelerated gradient descent algorithm. Theoretically, the convergence rate of the estimate and consistency of the truncation point estimation are derived under suitable smoothness assumptions. The proposed method is demonstrated with an application involving the effects of blood pressure curves in patients who suffered a spontaneous intracerebral hemorrhage.
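
As a rough illustration of the penalty structure described above, the sketch below writes out a penalized logistic log-likelihood with a nested group lasso over B-spline coefficients; the basis scores, tuning parameter, and toy data are placeholder assumptions, and the chapter's actual solution-path algorithm is not reproduced.

```python
# Hedged sketch: penalized logistic log-likelihood with a nested group lasso
# penalty over B-spline coefficients, which zeroes out trailing basis functions
# first and thereby truncates beta(t). All data here are illustrative.
import numpy as np

def nested_group_lasso_penalty(b, lam):
    """Sum of l2 norms of the nested tail groups (b_j, ..., b_p), j = 1..p.
    Trailing coefficients appear in every group, so they are shrunk to zero
    first, which induces truncation of beta(t)."""
    return lam * sum(np.linalg.norm(b[j:]) for j in range(len(b)))

def penalized_neg_loglik(b, Z, y, lam):
    """Z[i, j] ~ integral of X_i(t) * B_j(t) dt (localized B-spline scores),
    y in {0, 1}; logistic log-likelihood plus the truncation-inducing penalty."""
    eta = Z @ b
    loglik = np.sum(y * eta - np.logaddexp(0.0, eta))
    return -loglik + nested_group_lasso_penalty(b, lam)

# Toy data: 12 basis functions, true coefficient truncated after the third one.
rng = np.random.default_rng(0)
Z = rng.normal(size=(100, 12))                   # stand-in for basis scores
b_true = np.r_[1.0, -0.8, 0.5, np.zeros(9)]
y = (rng.uniform(size=100) < 1 / (1 + np.exp(-Z @ b_true))).astype(float)
print(penalized_neg_loglik(b_true, Z, y, lam=0.5))
```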

In Chapter 2, a set of computational tools is developed to perform inference for a regression model where density curves appear as functional response objects with vector predictors. For such models, inference is key to understanding the importance of density-predictor relationships and the uncertainty associated with the estimated conditional mean densities, defined as conditional Fréchet means under a suitable metric. Since density curves are nonnegative and integrate to one, the usual $L_p$ metric is not suitable for modeling them. Instead, using the Wasserstein geometry of optimal transport, we consider the Fréchet regression of density curve responses and develop tests for global and partial effects, as well as simultaneous confidence bands for estimated conditional mean densities. This dissertation focuses on the computational aspects of the proposed statistical inference methods. An R package was developed to promote the usage of Fréchet regression of density curve responses. The accuracy of these methods, including nominal size, power, and coverage, is assessed through simulations. Furthermore, the utility of the methodology is demonstrated via regression analysis of post-intracerebral hemorrhage hematoma densities and their associations with a set of clinical and radiological covariates.
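
For intuition, the sketch below uses the standard one-dimensional facts that the 2-Wasserstein distance is the $L_2$ distance between quantile functions and that the Wasserstein Fréchet mean averages quantile functions; the toy samples and grid size are illustrative, and the chapter's R package and inference procedures are not reproduced.

```python
# Hedged sketch: 2-Wasserstein distance and Frechet (Wasserstein barycenter)
# mean for one-dimensional distributions via quantile functions. Toy data only.
import numpy as np

def quantile_curve(sample, probs):
    return np.quantile(sample, probs)

def wasserstein2(sample_a, sample_b, n_grid=200):
    probs = (np.arange(n_grid) + 0.5) / n_grid
    qa, qb = quantile_curve(sample_a, probs), quantile_curve(sample_b, probs)
    return np.sqrt(np.mean((qa - qb) ** 2))

def frechet_mean_quantiles(samples, n_grid=200):
    """Quantile function of the Wasserstein Frechet mean of several densities."""
    probs = (np.arange(n_grid) + 0.5) / n_grid
    return probs, np.mean([quantile_curve(s, probs) for s in samples], axis=0)

rng = np.random.default_rng(1)
densities = [rng.normal(loc=m, scale=1.0, size=500) for m in (0.0, 1.0, 2.0)]
print(wasserstein2(densities[0], densities[2]))   # approx. 2 for these shifted normals
probs, qbar = frechet_mean_quantiles(densities)   # barycenter close to N(1, 1)
```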

In Chapter 3, the Wasserstein metric is applied in a different manner to assist the analysis of surface and contour data. The motivation for this analysis comes from cosmetics, where it is desirable for each individual to have a personalized contour map that is tailored to their unique face shape when applying makeup to enhance or change the shape of the face. Two main questions are of interest. First, given a face outline, how can it be represented in a common coordinate frame with the standard face shape templates -- oval, square, rectangle, heart, and round? Second, where is the optimal location to apply contour in order to sculpt and add dimension to one's face with makeup? To address these two problems, the face shape is represented as a 2D discrete uniform distribution with support given by the face outline, and the magnitude of difference between two face shapes is quantified by the 2-Wasserstein distance. Given the five standard face templates, the first question can be addressed by modeling the given shape with a length-five weight vector, where each element measures the similarity between the given face and the corresponding standard face template. Formally, this weight vector is the Wasserstein barycentric coordinate of the input face outline. To accelerate the computation, an entropy-regularized 2-Wasserstein metric is utilized, thereby transforming the linear programming task into an iterative Bregman projection problem; hence all algorithms can be parallelized across multiple GPUs. The second question resembles a regression problem, where the conditioning information is the face type and the response is the optimal contour location. After obtaining the Wasserstein barycentric coordinates in the first question, this same weight vector is used to compute the contour barycenter. Thus, this contour barycenter depends on the standard contour templates as well as the given face type.
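
A minimal sketch of the entropic-regularization idea is given below: Sinkhorn iterations (a special case of iterative Bregman projections) approximating the 2-Wasserstein distance between two discrete uniform "outlines"; the support points, weights, and regularization level are illustrative assumptions, and the chapter's GPU-parallel barycentric-coordinate solver is not reproduced.

```python
# Hedged sketch: entropy-regularized 2-Wasserstein distance between two
# discrete distributions via Sinkhorn iterations. Toy 2D point sets stand in
# for face outlines; eps and the iteration count are illustrative.
import numpy as np

def sinkhorn_w2(x_a, w_a, x_b, w_b, eps=0.05, n_iter=500):
    """x_*: support points with shape (n, 2); w_*: weights summing to 1.
    eps is the entropic regularization, relative to the largest pairwise cost."""
    cost = np.sum((x_a[:, None, :] - x_b[None, :, :]) ** 2, axis=-1)  # squared distances
    K = np.exp(-cost / (eps * cost.max()))
    u = np.ones(len(w_a))
    for _ in range(n_iter):                 # iterative Bregman projections (Sinkhorn)
        v = w_b / (K.T @ u)
        u = w_a / (K @ v)
    plan = u[:, None] * K * v[None, :]      # approximate optimal transport plan
    return np.sqrt(np.sum(plan * cost))     # regularized W2 estimate

rng = np.random.default_rng(2)
outline_a = rng.normal(size=(60, 2))                 # stand-in for one face outline
outline_b = rng.normal(size=(60, 2)) + [0.5, 0.0]    # a shifted outline
w = np.full(60, 1.0 / 60)                            # discrete uniform on the outline
print(sinkhorn_w2(outline_a, w, outline_b, w))
```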

Essays in Behavioral Labor and Health Economics

(2021)

This dissertation consists of three essays that analyze the impact of behavioral biases on labor market and health outcomes. The essays use tools from both experimental economics and applied econometrics. The common thread that runs through this research agenda is the goal of understanding how biases and suboptimal behaviors impact long-term outcomes crucial to well-being: labor market and health outcomes.

The first essay asks whether dismissal threats are more motivating than other types of incentives. In a laboratory experiment, workers earn a fixed wage per period and complete real-effort tasks to reduce their chance of being fired at the end of each period. Behavioral motivators are purposefully activated in addition to monetary incentives. The design innovates on previous literature by implementing dismissal threats in a quantifiable way and by collapsing preference elicitation over incentives and random assignment to incentives into the same round. The experiment produces two main results. First, workers produce significantly more output under dismissal threats than they do under piece rates, even though the marginal benefit of output is lower. Second, the productivity gains from marginally strengthening dismissal threats have a large self-selection component alongside significant heterogeneity in pure incentive effects. Workers who prefer higher pay with steeper dismissal threats appear to respond positively to this environment, but these high-pressure incentives backfire among workers who want to avoid them.

The second essay implements a lab experiment to investigate the effects of self-image concerns on search behavior. Subjects play a simple sequential search game in which they decide how many times to search for a wage offer before giving up. Feedback from search contains both instrumental information about search prospects and signals about subjects' relative performance on an intelligence test taken earlier in the experiment. Treatments isolate and shut down two mechanisms: biased belief updating and information avoidance. Although the experiment replicates results from the literature on overconfidence in incentivized reporting of initial beliefs, search behavior does not differ between treatments with and without self-image concerns during search. These results suggest that people are more likely to state overconfident beliefs when those beliefs are directly elicited, but that they act much closer to the rational Bayesian benchmark when actions only indirectly reveal self-relevant beliefs.

The third essay is joint work with Michael F. Pesko. We estimate the effect of county-level e-cigarette indoor vaping restrictions on adult prenatal smoking and birth outcomes using United States birth record data for 7 million pregnant women living in places already comprehensively banning the indoor use of traditional cigarettes. We use both cross-sectional and panel data to estimate our difference-in-differences models. Our panel model results suggest that adoption of a comprehensive indoor vaping restriction increased prenatal smoking by 2.0 percentage points, which is double the estimate obtained from a cross-sectional model. We also document heterogeneity in effect sizes along lines of age, education, and type of insurance.
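
As a hedged illustration only (the abstract does not state the essay's exact specification), a panel difference-in-differences design of this kind is commonly estimated with a two-way fixed-effects linear probability model of the form

$$\text{Smoked}_{ict} = \beta\,\text{VapingBan}_{ct} + X_{ict}'\gamma + \alpha_c + \tau_t + \varepsilon_{ict},$$

where $i$ indexes mothers, $c$ counties, and $t$ time; $\alpha_c$ and $\tau_t$ are county and time fixed effects; and $\beta$ captures the effect of adopting a comprehensive indoor vaping restriction on the probability of prenatal smoking (the panel estimate reported above would correspond to $\beta \approx 0.02$ in such a specification). The variable names are placeholders, not those used in the essay.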

Neural Question Answering Models with Broader Knowledge Scope and Deeper Reasoning Power

(2021)

Natural language has long been the most prominent tool for humans to disseminate, learn, and create knowledge. However, in an era where new information is generated at an unprecedented rate and people’s craving for knowledge becomes broader and deeper, efficiently extracting the desired knowledge from the vast amount of text becomes a significant challenge. Even with the aid of modern search engines, which sometimes directly return a text snippet along with the ranked list of pages, the accuracy of the extracted knowledge is often insufficient, and users frequently need to manually inspect each of the retrieved pages. This is especially the case when queries become more complex and less common.

In this dissertation, we investigate the problem of knowledge extraction centering around a simple and generic task formulation: we aim to build a system that takes natural language questions as input, processes the underlying knowledge source (usually a text corpus or a structured knowledge source), and finally returns a short piece of text that adequately answers the question. In a nutshell, the goal of the proposed approaches in this dissertation is to enable AI systems to accurately answer broader and harder questions. We begin by studying the traditional structured QA system, which uses a structured knowledge base (KB) as the underlying knowledge source. Specifically, we propose reasoning methods to automatically populate missing knowledge in a KB, and a hybrid neural model that combines both KB and text to answer questions. Next, we utilize strong and large pretrained models to build QA systems that directly answer questions from text corpora. We introduce a knowledge-enhanced pretraining strategy which explicitly injects more entity-centric knowledge into pretrained models. Finally, we present a multi-hop QA model that can efficiently navigate a large text corpus (millions of documents) and reason over multiple pieces of text evidence to derive the answer. Altogether, these techniques allow users to ask questions from broader domains and with increased complexity.
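
For a flavor of the multi-hop idea, the sketch below shows only the control flow of an iterative retrieve-and-read loop over a toy corpus; the word-overlap retriever and two-document corpus are illustrative stand-ins, not the dissertation's neural retriever or reader.

```python
# Hedged sketch: multi-hop retrieval, where evidence from one hop reformulates
# the query for the next hop. The retriever and corpus are toy stand-ins.
import re

corpus = {
    "d1": "Alan Turing was born in London.",
    "d2": "London is the capital city of the United Kingdom.",
}

def tokens(text):
    return set(re.findall(r"[a-z]+", text.lower()))

def retrieve(query, k=1):
    """Toy retriever: rank documents by word overlap with the query."""
    q = tokens(query)
    return sorted(corpus.values(), key=lambda d: len(q & tokens(d)), reverse=True)[:k]

def multi_hop_evidence(question, hops=2):
    query, evidence = question, []
    for _ in range(hops):
        ranked = retrieve(query, k=len(corpus))
        doc = next(d for d in ranked if d not in evidence)  # avoid reusing evidence
        evidence.append(doc)
        query = question + " " + doc  # fold new evidence into the next-hop query
    return evidence  # a neural reader would extract the answer span from this evidence

print(multi_hop_evidence("Which country was Alan Turing born in?"))
```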

Medical Street Wisdom: A Community-based Study on the Precarity and Utility of Unapproved Opioid Overdose Reversal Techniques among Syringe Exchange Clientele

(2021)

In the United States, people who inject drugs intravenously (IVUs) often respond to opioid overdoses using intervention techniques that are not medically approved, due to the unique legal, embodied, social, and environmental risks that IVUs, but not professionals, must navigate. Despite an unprecedented legislative shift toward harm reduction-informed overdose drug policy in the past decade, which helps mitigate several risks complicating IVUs’ ability to respond by medically approved means, lay methods remain a common practice. While expert critiques of these methods are prevalent in medical and public health educational discourse, the unique cultural meanings and uses that unapproved methods hold among IVUs remain underacknowledged in professional medicine and underexamined in social scientific literature. Because medical and public health discourse discounts these methods without adequately addressing the unique risks that IVUs, unlike professionals, contend with, IVUs are unable—not merely unwilling—to follow recommended medical practices. I argue that this medical lay knowledge must be recognized as an enduring cultural feature and an essential survival method among those who hold and apply it, and that the precarity of unapproved methods can be mitigated by policies and research practices that engage and validate, rather than erase and discredit, these alternative ways of knowing.

Trans-Pacific Values: The United States and the Regional Economy of the Pacific, 1900-1941

(2021)

This dissertation finds that between World War I and World War II, the United States’ trade interdependence with Asia forged a vibrant Pacific World of Commerce. As a consequence of World War I, U.S. demand for resources from Asia facilitated greater entanglements with the economies of Asia through trade interdependence. This project locates and traces out the empirical development of a deeply embedded network of global trade between U.S. firms and intra-Asian trade networks that traversed the Pacific. These trade relationships strengthened and expanded during World War I across commodity trades, shipping, labor, and business networks. The War increased access to these networks and cemented their presence as a sustained and significant condition of trade interdependence in the War’s aftermath. Infrastructure changes brought by the war also established new and lasting trade connections that returned in the post-World War II economy.

This dissertation also argues that U.S. globalization in the Pacific developed as a consequence of World War I. Scholars have traditionally located the United States’ Open Door agenda as central to understanding U.S. foreign policy goals in Asia, which prioritized the nation’s free-market access to China and equal most-favored-nation clauses with Europe for the objective of expanding U.S. export trade. During this period the center of the global economy was organized through the United States’ trans-Atlantic trade relationship with Europe. By contrast, this dissertation demonstrates how the Pacific gradually displaced the Atlantic within the global order.

The Long and Short of Labor Supply Changes

(2021)

The study of the dynamics, causes, and consequences of changes in labor supply is central to understanding modern economies and identifying candidate policies to improve welfare. Each chapter of my dissertation contributes to one or more components of this broad theme by combining applied econometrics techniques with insights from quantitative theoretical models. Using these tools, I aim to address three questions: How do aggregate hours worked change over the business cycle? Can the rise in female labor force participation account for household migration trends? What role do individual labor supply choices play in driving aggregate growth as populations age?

In Chapter 1 of my dissertation, I study the finite-sample properties of a novel approach to identifying macroeconomic shocks with long-run restrictions. In contrast to past studies, this approach constructs and constrains the long-run impact of shocks directly using local projections rather than inferring them from vector autoregressions. Through a series of Monte Carlo simulations, I show that the local projections approach can yield substantial reductions in both bias and mean squared error, while also boasting decreased sensitivity to the choice of included lag length and the assumed order of integration of the endogenous variables. I then use data from the Bureau of Labor Statistics to revisit a long-standing debate on the response of aggregate hours worked to positive productivity shocks. I find that labor hours rise in response to positive productivity shocks and follow a hump-shaped profile thereafter. This result is robust to a number of specification choices and provides new evidence in support of the standard real business cycle model.
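
For readers unfamiliar with local projections, the sketch below estimates an impulse response by running one regression of the outcome h periods ahead on an identified shock, for each horizon h; the simulated AR(1) data and lag controls are illustrative, and the chapter's long-run identification step is not reproduced.

```python
# Hedged sketch: Jorda-style local projections of an impulse response.
# The data-generating process and controls are toy assumptions.
import numpy as np

def local_projection_irf(y, shock, horizons=8, n_lags=2):
    irf = []
    for h in range(horizons + 1):
        rows = range(n_lags, len(y) - h)
        Y = np.array([y[t + h] for t in rows])
        X = np.array([[1.0, shock[t]] + [y[t - l] for l in range(1, n_lags + 1)]
                      for t in rows])
        beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
        irf.append(beta[1])                       # response of y at horizon h
    return np.array(irf)

rng = np.random.default_rng(3)
T, rho = 500, 0.7
shock = rng.normal(size=T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = rho * y[t - 1] + shock[t]              # true impulse response: rho**h
print(np.round(local_projection_irf(y, shock), 2))
```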

My joint work with Christine Braun and Peter Rupert constitutes Chapter 2 of my dissertation, and studies the relationship between the historical rise in female labor force participation and the contemporaneous decline in household migration rates. Between 1964 and 2000, the inter-county migration rate of married couples declined by 15%. Concurrently, female labor force participation among married women and the relative wages of women increased by 39 and 14 percentage points, respectively. Using a two-location household-level search model of the labor market, we show that both the increase in dual-earner households and the rise in women’s wages contributed significantly to the decline in the migration rate of married households, explaining 53% and 20% of the decline, respectively. We further show that this co-location problem has important implications for structural models designed to estimate lifetime earnings inequality.

Finally, I conclude my dissertation in Chapter 3 with joint work with Thomas Cooley and Espen Henriksen, wherein we study the growth effects of aging populations in Europe’s four largest economies – France, Germany, Italy, and the United Kingdom. Since the early 1990s, GDP per-capita growth in these economies has slowed while, at the same time, a combination of longer individual life expectancy and declining fertility has led to gradually aging populations. Using a general equilibrium overlapping generations model, we show that demographic change such as this affects economic growth directly through aggregate savings and labor supply decisions. These decisions are further affected indirectly through additional distortions caused by rising tax rates needed to fund pension systems. We find that the net effect of these forces can account for a significant fraction of the historical growth slowdown and that evolving demographics will continue to drag down growth over the next 20 years. By studying several reforms aimed at increasing late-life labor supply, we highlight that the degree to which gains in life expectancy change labor supply decisions is the most important margin through which demographic change affects growth.

Examining Science Identity Work and Scientific Literacy in Non-STEM Majors

(2021)

Scientific literacy is vital for 21st century citizens to have the skills to make informed decisions based on sound science, and seeing oneself as a science person is important for citizens to feel capable of making such decisions based on science. Science educators are in the best position to encourage these attributes in their students. In particular, undergraduate non-STEM majors make up approximately 55% of college graduates, and college and university faculty are in a particularly important position to offer opportunities for STEM identity work in this population of students to encourage the development of a science person identity, which is intertwined with learning the skills important for scientific literacy, like evaluation of science in the news. This mixed-methods study focused on students in a biology course developed primarily for non-STEM majors, and examined their STEM identity work along with evidence of their scientific literacy. Data collected include surveys, focus groups, observations of group work and final presentations, and final written reflections. Using Wilcoxon Rank Sum tests to compare pre-class and post-class survey responses, student participants reported feeling more like science people following their experiences in this non-STEM majors’ biology course, and felt that their biology professor saw them as science people at the end of the course compared to the beginning of the term. Qualitative analysis supported these findings, with participants sharing that they felt like science people when they were successful in school science, had the opportunity to engage with science content in lab or real-life case studies, were recognized by instructors and peers as competent in science, and recognized qualities of scientists that they felt they also shared. In terms of scientific literacy, following their experience in non-STEM majors’ biology, students’ beliefs and attitudes surrounding science became more positive, and their self-efficacy and ability to communicate and apply science content to contexts outside of the classroom were also positively impacted. Students were also more confident in their ability to evaluate science as presented in the popular media following the course. While this identity work was taking place, the students also had opportunities to demonstrate and further develop scientific literacy skills, such as the application of content knowledge and communication of scientific ideas, along with highlighting the importance of their evolving beliefs, attitudes, and interests in science. Non-science majors are frequently overlooked when considering research in science education, as research centered around undergraduates tends to focus on STEM major retention or interest, rather than the STEM identity work of those students who have chosen a major in a different field. Therefore, this research contributes to the body of literature on undergraduates in STEM courses, along with the literature centered on identity work, which tends to focus on younger age groups (grades K-12).
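
For reference, a minimal sketch of the rank-sum comparison described above is shown below; the Likert-style values are made-up illustrations, not the study's data.

```python
# Hedged sketch: Wilcoxon rank-sum test comparing pre-course and post-course
# survey responses. The responses and the 0.05 threshold are illustrative.
from scipy.stats import ranksums

pre  = [2, 3, 3, 2, 4, 3, 2, 3, 3, 2]   # e.g., "I see myself as a science person" (1-5)
post = [3, 4, 4, 3, 5, 4, 3, 4, 4, 3]

stat, p_value = ranksums(pre, post)
print(f"rank-sum statistic = {stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("post-course responses differ significantly from pre-course responses")
```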

Polypeptoid Chain Conformation and Its Role in Block Copolymer Self-Assembly

(2021)

Polymer chain conformation underlies polymer physical properties and impacts many polymer functionalities. Understanding chain conformation is critical for predicting and controlling the structures and properties of polymeric systems. In block copolymers, the chain conformation of the constituent blocks closely impacts the thermodynamics of microphase separation and the resultant structures, which are key to block copolymers as functional materials. However, the understanding of chain conformation effects beyond coil–coil block copolymers is still nascent, partially due to the challenge of precisely controlling chain conformation without introducing other complicating factors.

This dissertation utilizes sequence-defined polypeptoids to install precise chain conformation control into traditional polymer systems, in order to examine the role of chain conformation in block copolymer self-assembly. First, the polypeptoid chain conformation is examined in terms of local stiffness, overall chain size, and response to solvent quality by comparing chemically identical helical and coil polypeptoids in dilute solution. The detailed understanding at the molecular length scale reveals that the helical secondary structure, driven by steric hindrance from side chains, makes the polypeptoid chains locally stiffer but overall more compact than the coil analogues. Further, we show these helical chains are relatively insensitive to solvent conditions due to their sterically defined nature. Then, through the design of model polypeptoid-containing block copolymer systems, we are able to study the effects of the helical chain conformation, which has distinct space-filling characteristics from chemically analogous coil chains, on the melt self-assembly of block copolymers. In the lamellae-forming system, the helical chain conformation is shown to decrease the order–disorder transition temperature through a combination of decreasing the enthalpic interaction between dissimilar blocks and amplifying chain stretching. Further, polypeptoids are used as a conformation-tuning handle and are shown to modulate network phase formation and stability as a conformationally tunable interfacial segment, which demonstrates the importance of chain conformation in the vicinity of the interface in determining the morphology of block copolymers. The findings in this dissertation highlight the importance of chain conformation in the self-assembly thermodynamics of block copolymers, and the value of polypeptoids as highly controlled, precise polymers for advancing the fundamental understanding of polymeric materials.