Assessing a Bayesian account of human gaze perception

Abstract

Although gaze can be directed at any location, locations in the visual environment vary in how likely they are to draw another person’s attention. One could therefore weigh incoming perceptual signals (e.g., eye cues) against this prior knowledge (the relative visual saliency of locations in the scene) to infer the true target of another person’s gaze. This Bayesian approach to modeling gaze perception has informed computer vision techniques; here we assess whether it is also a good model of human performance. We present subjects with a “gazer” fixating his eyes on various locations on a two-dimensional surface and project an arbitrary photographic image onto that surface. Subjects judge where in the image the gazer is looking. A full Bayesian model, which takes image saliency into account, fits subjects’ gaze judgments better than a reduced model that considers only the perceived direction of the gazer’s eyes.
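
One way to make the comparison between the two models concrete, as a sketch only (the symbols and the exact form of the likelihood and prior are assumptions introduced here, not taken from the abstract): write x for a candidate gaze location in the image, e for the observed eye cues, and S(x) for the visual saliency of the image at x. The full Bayesian model then infers the gaze target from the posterior

\[
p(x \mid e) \;\propto\; p(e \mid x)\,\pi(x), \qquad \pi(x) \propto S(x),
\]

where p(e | x) is the likelihood of the observed eye cues given that the gazer is fixating x, and the normalized saliency map plays the role of the prior. The reduced model corresponds to a uniform prior, \pi(x) \propto 1, so its inferred target depends only on the perceived direction of the gazer’s eyes.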
