Behavioral Context Recognition In the Wild
The ability to automatically recognize a person's behavioral context (where they are, what they are doing, who they are with, and so on) is greatly beneficial in health monitoring, aging care, personal assistants, smart homes, customized entertainment, and many other domains. For these applications to succeed at scale, the context-recognition component must be unobtrusive and work smoothly, without requiring people to adjust their behavior. It is therefore important for research to validate context-recognition systems in the real world - under the same conditions in which such applications will eventually be deployed. In this thesis, I promote context recognition in-the-wild: capturing people's authentic behavior in their natural environments using everyday devices.
In Chapter 1, I introduce the field of behavioral context recognition and describe three components of research in the field: defining the problem (what the inputs and outputs are), collecting data, and developing artificial intelligence (AI) / machine learning (ML) methods.
In Chapter 2, I present the problem of behavioral context recognition and the challenges of addressing behavior in-the-wild. I introduce the ExtraSensory Dataset, collected from 60 participants in-the-wild and publicly available at http://extrasensory.ucsd.edu. I describe simple machine learning methods and demonstrate that smartphones and smartwatches can successfully recognize diverse contexts in everyday life (e.g., walking, sleeping, at school, on a bus, cooking, showering, phone in pocket).
In Chapter 3, I focus on machine learning solutions that facilitate training classifiers with irregular data from the wild - data that is highly unbalanced, has missing labels or missing sensor readings, and may be collected in phases that address different sets of labels.
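To give a flavor of how such irregularities can be handled, the sketch below trains a simple binary classifier per context label, using class-balanced instance weights to counter label imbalance and skipping examples whose label is missing. This is a minimal illustration written for this summary, not the dissertation's actual implementation; the function name and the NaN encoding of missing labels are assumptions for the example.

```python
import numpy as np

def train_balanced_logreg(X, y, lr=0.1, epochs=500):
    """Logistic regression for one context label.

    Handles two in-the-wild irregularities from the text:
    - missing labels: examples with y == NaN are skipped, not imputed
    - class imbalance: each class contributes equally to the loss
      via instance weights (a common "balanced" weighting scheme)
    """
    mask = ~np.isnan(y)              # keep only labeled examples
    X, y = X[mask], y[mask]
    n_pos, n_neg = y.sum(), (1 - y).sum()
    # balanced instance weights: positives and negatives each sum to 0.5
    w = np.where(y == 1, 0.5 / n_pos, 0.5 / n_neg)
    theta, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ theta + b)))   # sigmoid
        grad = w * (p - y)           # weighted gradient of the log-loss
        theta -= lr * (X.T @ grad)
        b -= lr * grad.sum()
    return theta, b

# Toy data: rare positive class driven by feature 0, some labels missing
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + 0.3 * rng.normal(size=200) > 1.0).astype(float)
y[::10] = np.nan                     # simulate unreported labels
theta, b = train_balanced_logreg(X, y)
```

In a multi-label setting like ExtraSensory, one such classifier can be trained per label, so a missing label for one context does not discard the example for the other contexts.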
In Chapter 4, I address the challenge of collecting labeled data in-the-wild and describe our self-reporting solution - the ExtraSensory App. I analyze the collected data and subjective feedback from the participants to gain insights about user-interface design that engages users in contributing labels about their own behavior. A revised version of the app, with improvements based on this dissertation, is publicly available at http://extrasensory.ucsd.edu/ExtraSensoryApp.
In Chapter 5, I discuss the progress this work makes in the field of behavioral context recognition, and suggest directions for future improvements.