Most recent research on human planning attempts to adjudicate between a small set of hypothesized models based on their ability to predict participants' choices, using carefully designed experiments and/or model comparison. Here, we propose an alternative approach. We designed a task in which gaze is highly indicative of participants' planning operations, allowing us to discover properties of human planning from eye-tracking data in a data-driven way. Our results reveal that people's planning strategies have both similarities to and differences from classical planning algorithms such as best-first search and Monte Carlo tree search. They also provide a more nuanced perspective on previously proposed properties of human planning, such as pruning and depth limits. We conclude that planning research would benefit greatly from increased use of rich sources of data that provide more direct evidence about the internal processes underlying sequential decision-making.