Studies show that people can recognize their own movements, such as their own walking (presented in silhouette using point lights), their own drawing (presented as a moving point light), their own clapping, and their own piano playing. We extend this result to proprioceptive control, showing that people can recognize their own eye movements when presented as just a point moving against a black background. Eye movements were recorded using wearable eye-tracking glasses while participants performed four tasks. A week later, participants were shown these videos, alongside another person's videos for each task, and asked to recognize their own movements.
Male participants recognized their own eye movements significantly above chance, but only for tasks involving large and familiar body movements. Female participants performed below chance on these tasks.
We argue that the standard common coding/motor simulation model does not account for this result, and we propose an extension in which eye movements and body movements are strongly coupled. In this model, eye movements automatically trigger covert motor activation and thus participate directly in motor planning, simulation, and the sense of agency.