The perception of shape, lighting, and material properties in images
Three scene properties determine the image of a 3D object: the material reflectance, the illumination, and the object's shape. Because all three properties jointly determine the image, one cannot solve for any one property without knowing the other two. Nevertheless, people are often able to perceive these properties consistently and relatively accurately. We explore the relationship between these properties, the sources of image information the visual system can use to recover them, and the assumptions the visual system tends to make. We first conducted a shape perception experiment in which we investigated whether the visual system assumes a particular angle between the lighting direction and the viewing direction. Observer errors were minimized when the light was 20-30 degrees above the viewing direction, confirming the light-from-above prior. In a second study, we conducted two psychophysical experiments to determine how viewers use shape information to estimate the lighting direction from shaded images. We found that observers can accurately determine the lighting direction when a host of shape cues specify the objects. When shading is the only cue, observers always set the lighting direction to be from above. We modeled the results in a Bayesian framework that includes a prior distribution describing the assumed lighting direction. Finally, we explore how disparity and defocus information may be useful in material perception for distinguishing glossy from matte surfaces. We describe the types of images needed to investigate this question and introduce a method for rendering them.
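The Bayesian framework mentioned above can be illustrated with a minimal sketch: a posterior over lighting elevation is formed by combining a shading-based likelihood with a light-from-above prior. All numerical values below (the prior mean of roughly 25 degrees above the viewing direction, the spreads, and the likelihood center) are illustrative assumptions, not the fitted parameters from the experiments.

```python
import numpy as np

def gaussian(x, mu, sigma):
    """Unnormalized Gaussian profile over a grid of angles."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

# Lighting elevation relative to the viewing direction, in degrees.
elevations = np.linspace(-90, 90, 361)

# Broad likelihood: shading alone constrains lighting direction weakly
# (center and spread are hypothetical).
likelihood = gaussian(elevations, mu=-10.0, sigma=60.0)

# Light-from-above prior, centered ~25 degrees above the viewing direction.
prior = gaussian(elevations, mu=25.0, sigma=15.0)

# Posterior is proportional to likelihood times prior.
posterior = likelihood * prior
posterior /= posterior.sum()

# With a weak likelihood, the posterior peak stays near the prior mean,
# mirroring observers who report light from above when shading is the only cue.
estimate = elevations[np.argmax(posterior)]
```

Under these assumed parameters, the maximum-a-posteriori estimate lands near the prior mean, which is the qualitative behavior the abstract describes for shading-only displays.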