The tiny brains of insects presumably impose significant computational limitations on the algorithms controlling their behavior. Nevertheless, insects perform fast and sophisticated visual maneuvers. These include tracking features defined by second-order motion, in which the feature is carried by higher-order image statistics rather than by simple correlations in luminance. Flies can track the true direction even of theta motion, in which the first-order (luminance) motion is directed opposite to the second-order moving feature. We exploited this paradoxical feature-tracking response to dissect the particular image properties that flies use to track moving objects. We find that theta motion detection is not simply a result of steering toward any spatially restricted flicker. Rather, our results show that fly higher-order feature-tracking responses can be broken down into positional and velocity components; in other words, the responses can be modeled as a superposition of two independent steering efforts. We isolate these elements to show that each has a distinct influence on the phase and amplitude of steering responses, and that together they explain the time course of second-order motion tracking responses during flight. These observations are relevant to natural scenes, where moving features can be much more complex.
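The superposition model described above can be sketched as a simple linear combination of position- and velocity-driven steering terms. This is an illustrative toy model only, not the paper's fitted model: the gains `k_p` and `k_v` and the function `steering_response` are hypothetical, chosen to show how the two components jointly shape the phase and amplitude of the response to an oscillating feature.

```python
import numpy as np

def steering_response(t, position, k_p=1.0, k_v=0.5):
    """Toy superposition model: steering effort as the sum of an
    independent positional term, k_p * position(t), and a velocity
    term, k_v * d(position)/dt. Gains k_p and k_v are hypothetical."""
    velocity = np.gradient(position, t)  # numerical time derivative
    return k_p * position + k_v * velocity

# Example: a feature oscillating sinusoidally across the visual field.
t = np.linspace(0.0, 2.0 * np.pi, 1000)
pos = np.sin(t)                       # feature azimuth over time
resp = steering_response(t, pos)
```

For a sinusoidal stimulus, `resp` is itself a sinusoid of amplitude `sqrt(k_p**2 + k_v**2)` whose phase is advanced relative to the stimulus by `atan2(k_v, k_p)`: the positional term tracks the feature directly, while the velocity term shifts the phase, which is one way such a decomposition can account for the phase and amplitude of measured steering responses.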