A Deep Gaze into Social and Referential Interaction

Creative Commons 'BY' version 4.0 license
Abstract

In this study, we explicitly code and analyse the social, referential, and pragmatic features of gaze in spontaneous human-human dyadic interaction, providing novel observations that can be implemented in a machine to improve multimodal human-agent dialogue. Gaze is an important non-verbal social signal: it carries attentional cues about where to look and provides information about others' intentions and future actions. In this work, various types of gaze behaviour are annotated in detail alongside speech to explore the meaning of temporal patterns in gaze cues and their correlations. Given that roughly 80% of the stimuli perceived by the brain are visual, gaze behaviour is complex and challenging to model; hence, implementing human-human gaze cues in an avatar or robot could improve human-agent interaction and make it more natural.

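To make the annotation idea concrete, the following is a minimal sketch (not the authors' actual pipeline) of how gaze and speech annotations could be represented as timed intervals and how their temporal co-occurrence could be tallied. All labels, field names, and sample values are hypothetical and chosen only for illustration.

```python
# Minimal, hypothetical sketch: gaze and speech annotations as timed intervals,
# with a simple measure of how long each gaze cue overlaps each speech label.
from dataclasses import dataclass


@dataclass
class Interval:
    start: float   # seconds from session start
    end: float     # seconds from session start
    label: str     # e.g. "gaze:partner", "gaze:referent", "speech:deictic"


def overlap(a: Interval, b: Interval) -> float:
    """Duration (in seconds) for which two annotated intervals co-occur."""
    return max(0.0, min(a.end, b.end) - max(a.start, b.start))


def cooccurrence(gaze: list[Interval], speech: list[Interval]) -> dict[tuple[str, str], float]:
    """Total overlap time for each (gaze label, speech label) pair."""
    totals: dict[tuple[str, str], float] = {}
    for g in gaze:
        for s in speech:
            d = overlap(g, s)
            if d > 0:
                key = (g.label, s.label)
                totals[key] = totals.get(key, 0.0) + d
    return totals


if __name__ == "__main__":
    # Hypothetical annotations from one dyadic session.
    gaze_tier = [
        Interval(0.0, 1.2, "gaze:partner"),
        Interval(1.2, 2.5, "gaze:referent"),
        Interval(2.5, 3.0, "gaze:partner"),
    ]
    speech_tier = [
        Interval(0.5, 2.0, "speech:deictic"),
        Interval(2.0, 3.0, "speech:other"),
    ]
    for (g_label, s_label), secs in cooccurrence(gaze_tier, speech_tier).items():
        print(f"{g_label} overlaps {s_label} for {secs:.2f} s")
```

Such pairwise overlap counts are one simple way to surface temporal patterns between gaze cues and speech; the paper's own analysis may use different tiers, categories, or statistics.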