Mapping the Idiographic Dynamics of Emotion
UC Berkeley Electronic Theses and Dissertations

Abstract

Background. Emotions are both idiographic (i.e., idiosyncratic, experienced within the individual) and dynamic (i.e., they change over time). However, studies of emotion often rely on cross-sectional measurement and nomothetic (group-aggregated) analysis, which preclude a precise understanding of within-person emotion dynamics. Recently, ecological momentary assessment (EMA) has become a popular method for studying emotion by measuring individuals at multiple points in time during daily life. While EMA time series hold great potential for understanding emotions as idiographic, dynamic phenomena, a key barrier remains: how to model individual differences in emotional experience while also obtaining generalizable information about the nature of emotions at the population level. To address this problem, we present an approach that is innovative in both data collection and statistical modeling.

Methods. We collected EMA data on six discrete emotions (anxious, irritable, sad, joyful, content, excited) from 115 undergraduates. These data represent the most intensively sampled emotion time series in the literature to date, with observations collected by smartphone surveys every 30 minutes (24 times per day) over a 14-day sampling window. More than 34,000 observations were obtained across the sample (M = 302 per person), which is vital for the precise detection of rapidly varying emotion dynamics. We then applied finite mixture modeling (FMM; also known as latent profile analysis) to these data in a "nested" fashion. First, aggregating across time points within each individual, FMM was applied to each emotion time series to classify every individual's unique emotion profiles as blends of the six discrete emotions. Next, a between-persons classification step was conducted by aggregating across the individual emotion profiles.

Results. Individual-level models revealed 795 unique latent states of emotional experience across the sample, which we termed affect profiles. By then aggregating across the person-specific affect profiles, we identified 7 distinct types of affect profiles (i.e., meta-classes) across the sample. At this group level, we recovered three 'negative affect', two 'positive affect', and two 'mixed affect' meta-classes.

Conclusion. Affect profiles are discussed with an eye toward their potential clinical implications and utility. Future analyses will examine the temporal dynamics of these categories and investigate their relation to psychopathology; two supplementary sections present additional analyses that begin to address these aims.

Impact. Crucially, the present approach offers a way to distill high-dimensional EMA time series into a manageable set of discrete affect states. In the two-stage modeling approach presented here, we first categorize the moments of each person's life into that person's idiosyncratic affect profiles; from the set of all idiosyncratic profiles, we then distill a set of common affect profiles across the group. Unlike most analytic approaches to date, this allows affect to be considered at the idiographic and nomothetic levels simultaneously. Comparing the universality versus idiosyncrasy of the identified affect profiles may sharpen our understanding of affect in general. Further, examining which individuals exhibit which types of affect profiles, and the temporal dynamics of those profiles as they unfold, may benefit both researchers and clinicians. Because this approach lets momentary affect states be dichotomized (i.e., coded as occurring in a given moment or not), it facilitates the application of prediction modeling to determine when a person's affect states will occur. If affect profiles are reliably associated with relevant behavior patterns, as some preliminary evidence suggests, clinicians could use information about affect states to inform behavior change.
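The two-stage "nested" mixture approach described above can be sketched in code. This is a minimal illustration on simulated data only: it assumes scikit-learn's GaussianMixture as a stand-in for whatever FMM/LPA software was actually used, assumes BIC-based selection of the number of per-person profiles, and uses invented simulation parameters (20 people, two latent states per person); none of these specifics come from the dissertation itself.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

EMOTIONS = ["anxious", "irritable", "sad", "joyful", "content", "excited"]
rng = np.random.default_rng(0)

def fit_person_mixture(x, max_k=4):
    """Stage 1: fit mixtures with 1..max_k profiles to one person's
    EMA series and keep the model the BIC prefers."""
    fits = [GaussianMixture(n_components=k, random_state=0).fit(x)
            for k in range(1, max_k + 1)]
    return min(fits, key=lambda m: m.bic(x))

# Simulate EMA data: 20 people x 300 moments x 6 emotion ratings, each
# person alternating between a negative-leaning and a positive-leaning
# latent state (purely illustrative numbers, not the study's data).
people = [np.vstack([rng.normal([4, 4, 4, 1, 1, 1], 0.5, (150, 6)),
                     rng.normal([1, 1, 1, 4, 4, 4], 0.5, (150, 6))])
          for _ in range(20)]

# Stage 1: person-specific affect profiles (mixture-component means).
models = [fit_person_mixture(x) for x in people]
profiles = np.vstack([m.means_ for m in models])

# Stage 2: pool all person-specific profiles and classify them into
# group-level meta-classes with a second mixture model.
meta = GaussianMixture(n_components=2, covariance_type="diag",
                       random_state=0).fit(profiles)
meta_of_profile = meta.predict(profiles)

# Dichotomization: code each of one person's moments as belonging to a
# given meta-class (1) or not (0), enabling downstream prediction models.
m0, x0 = models[0], people[0]
offset = 0  # person 0's profiles sit at the start of the pooled array
moment_meta = meta_of_profile[offset + m0.predict(x0)]
in_class_0 = (moment_meta == 0).astype(int)
print(profiles.shape, np.bincount(meta_of_profile), in_class_0.mean())
```

The last step shows the dichotomization idea from the Impact section: each moment is first assigned to one of the person's own profiles, and that profile's group-level meta-class then yields a binary occurred/did-not-occur code for the moment.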
