
Inferring Human Interaction from Motion Trajectories in Aerial Videos

Abstract

People are adept at perceiving interactions from movements of simple shapes, but the underlying mechanism remains unknown. Previous studies have often used object movements defined by experimenters. The present study used aerial videos recorded by drones in a real-life environment to generate decontextualized motion stimuli. Motion trajectories of displayed elements were the only visual input. We measured human judgments of interactiveness between two moving elements, and the dynamic change of such judgments over time. A hierarchical model was developed to account for human performance in this task; it represents interactivity using latent variables and learns the distribution of critical movement features that signal potential interactivity. The model provides a good fit to human judgments and can also be generalized to the original Heider-Simmel (1944) animations. The model can also synthesize decontextualized animations with a controlled degree of interactiveness, providing a viable tool for studying animacy and social perception.
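
To make the latent-variable idea concrete, the sketch below is a minimal illustration, not the authors' hierarchical model: interactivity is treated as a single latent binary state, simple motion features (inter-agent distance and relative speed) are computed from two trajectories, and the posterior probability of interaction is updated frame by frame. The feature choices, Gaussian parameters, and the example "following" trajectories are illustrative assumptions; in the paper the feature distributions are learned from data.

```python
"""Minimal sketch (assumptions noted in comments): online inference of a
latent binary interactivity state from two 2D motion trajectories."""
import numpy as np

def motion_features(traj_a, traj_b):
    """Per-frame features from two T x 2 trajectories:
    inter-agent distance and magnitude of relative velocity."""
    dist = np.linalg.norm(traj_a - traj_b, axis=1)        # shape (T,)
    vel_a = np.gradient(traj_a, axis=0)                   # shape (T, 2)
    vel_b = np.gradient(traj_b, axis=0)
    rel_speed = np.linalg.norm(vel_a - vel_b, axis=1)     # shape (T,)
    return np.stack([dist, rel_speed], axis=1)            # shape (T, 2)

def log_gaussian(x, mean, std):
    """Independent-Gaussian log-likelihood of one feature vector."""
    return np.sum(-0.5 * ((x - mean) / std) ** 2
                  - np.log(std * np.sqrt(2 * np.pi)))

def online_interactivity(features, params, prior=0.5):
    """P(interactive | features up to frame t) for every t, assuming frames
    are conditionally independent given the latent state."""
    log_odds = np.log(prior) - np.log(1 - prior)
    posteriors = []
    for f in features:
        log_odds += (log_gaussian(f, *params["interactive"])
                     - log_gaussian(f, *params["independent"]))
        posteriors.append(1.0 / (1.0 + np.exp(-log_odds)))
    return np.array(posteriors)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.arange(100)
    # Hypothetical "following" pair: element B trails element A with noise.
    a = np.stack([t * 0.5, np.sin(t * 0.2) * 5], axis=1)
    b = a + rng.normal(0, 0.5, a.shape) + np.array([2.0, 0.0])
    # Illustrative (mean, std) of [distance, relative speed] under each
    # latent state; these are placeholder values, not learned from data.
    params = {
        "interactive": (np.array([3.0, 0.5]), np.array([2.0, 0.5])),
        "independent": (np.array([20.0, 1.5]), np.array([10.0, 1.0])),
    }
    p = online_interactivity(motion_features(a, b), params)
    print(f"P(interactive) after {len(p)} frames: {p[-1]:.3f}")
```

The frame-by-frame posterior mirrors the paper's interest in how judgments of interactiveness evolve over time, but the hierarchical structure and learned feature distributions of the actual model are not reproduced here.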
