The ability to process social information is a critical component of children's early language and cognitive development. However, as children reach their first birthday, they begin to locomote themselves, dramatically affecting their visual access to this information. How do these postural and locomotor changes affect children's access to the social information relevant for word learning? Here, we explore this question by using head-mounted cameras to record 36 infants' (8-16 months of age) egocentric visual perspective and applying computer vision algorithms to estimate the proportion of faces and hands in infants' environments. We find that infants' posture and orientation to their caregiver modulate their access to social information, confirming previous work suggesting that motoric developments play a significant role in the emergence of children's linguistic and social capacities. We suggest that the combined use of head-mounted cameras and new computer vision techniques is a promising avenue for understanding the statistics of infants' visual and linguistic experience.
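
To make the detection step concrete, the following is a minimal sketch of how one might estimate the proportion of egocentric video frames containing at least one detected face, using OpenCV's bundled Haar-cascade detector. This is an illustrative example under assumed tooling, not the pipeline used in the paper; the video path and sampling rate are hypothetical.

```python
# Illustrative sketch (assumed OpenCV-based detector, not the paper's pipeline):
# estimate the fraction of sampled head-camera frames that contain >= 1 face.
import cv2

def face_frame_proportion(video_path, sample_every=10):
    """Return the proportion of sampled frames with at least one detected face."""
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(video_path)
    sampled, with_face, frame_idx = 0, 0, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if frame_idx % sample_every == 0:  # subsample frames for speed
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
            sampled += 1
            if len(faces) > 0:
                with_face += 1
        frame_idx += 1
    cap.release()
    return with_face / sampled if sampled else 0.0

# Hypothetical usage on a single recording session:
# print(face_frame_proportion("infant_headcam_session01.mp4"))
```

An analogous per-frame proportion could be computed for hands by swapping in a hand detector; the key design choice is simply to convert raw detections into a per-session rate that can be compared across postures and orientations.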