Language comprehension in grounded contexts involves integrating visual and linguistic information through decisions about visual fixation. But when the visual signal also contains information about the language source – as in the case of written text or sign language – how do we decide where to look? Here, we hypothesize that eye movements during language comprehension represent an adaptive response. Using two case studies, we show that, compared to English learners, young signers delayed their gaze shifts away from a language source, were more accurate with these shifts, and produced a smaller proportion of nonlanguage-driven shifts (E1). Next, we present a well-controlled, confirmatory experiment, showing that English-speaking adults produced fewer nonlanguage-driven shifts when processing printed text compared to spoken language (E2). Together, these data suggest that people adapt to the value of seeking different information in order to increase the chance of rapid and accurate language understanding.