Automatic computation of navigational affordances explains selective processing of geometry in scene perception: behavioral and computational evidence
One of the more surprising findings in visual cognition is the apparent sparsity of our scene percepts. Yet scene perception also enables planning and navigation, which require a detailed, structured analysis of scene geometry, including exit locations and the obstacles along the way. Here, we hypothesize that the computation of navigational affordances (e.g., paths to an exit) is a “default” task in the mind, and that this task induces selective analysis of the scene geometry most relevant to computing these affordances. In an indoor scene setting, we show that observers more readily detect changes when those changes affect the shortest paths to visible exits. We further show that behavioral detection rates are explained by a new model of attention that makes heterogeneous-precision inferences about scene geometry, weighting each region by how strongly it impacts navigational affordance computation. This work provides a formal window into the contents of our scene percepts.