
Query-guided visual search

Abstract

How do we seek information from our environment to find solutions to the questions facing us? We pose an open-ended visual search problem to adult participants, asking them to identify targets of questions in scenes guided by only an incomplete question prefix (e.g. Why is..., Where will...). Participants converged on visual targets and question completions given just these function words, but the preferred targets and completions for a given scene varied dramatically depending on the query. We account for this systematic query-guided behavior with a model linking conventions of linguistic reference to abstract representations of scene events. The ability to predict and find probable targets of incomplete queries may be just one example of a more general ability to pay attention to what problems require of their solutions, and to use those requirements as a helpful guide in searching for solutions.
