Many animals, including humans, navigate their surroundings using visual input, yet we understand little about how visual information is transformed and integrated by the navigation system. In Drosophila melanogaster, compass neurons in the donut-shaped ellipsoid body of the central complex generate a sense of direction by integrating visual input from ring neurons, which are part of the anterior visual pathway (AVP). Here, we densely reconstruct all neurons in the AVP using FlyWire, an AI-assisted tool for analyzing electron-microscopy data. The AVP comprises four neuropils, sequentially linked by three major classes of neurons: MeTu neurons, which connect the medulla in the optic lobe to the small unit of the anterior optic tubercle (AOTUsu) in the central brain; TuBu neurons, which connect the anterior optic tubercle to the bulb neuropil; and ring neurons, which connect the bulb to the ellipsoid body. Based on neuronal morphologies, connectivity between different neural classes, and the locations of synapses, we identified non-overlapping channels originating from four types of MeTu neurons, which we further divided into ten subtypes based on their presynaptic connections in the medulla and postsynaptic connections in the AOTUsu. To obtain an objective measure of the natural variation within the pathway, we quantified the differences between the anterior visual pathways of the two hemispheres and between two electron-microscopy datasets. Furthermore, we inferred the potential visual features and the visual area from which any given ring neuron receives input by combining the connectivity of the entire AVP, the MeTu neurons’ dendritic fields, and presynaptic connectivity in the optic lobes. These results provide a strong foundation for understanding how distinct visual features are extracted and transformed across multiple processing stages to provide critical information for computing the fly’s sense of direction.
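
The final inference step described above, tracing which part of visual space can drive a given ring neuron, can be illustrated by composing per-stage connectivity matrices. The following is a minimal sketch only, not the paper's analysis pipeline: all neuron counts, synapse-count matrices, and retinotopic coordinates are invented placeholders, and the actual inference additionally uses the MeTu neurons' dendritic fields and identified presynaptic cell types in the optic lobes.

```python
import numpy as np

# Minimal sketch: compose hypothetical synapse-count matrices across the AVP
# stages (medulla -> MeTu -> TuBu -> ring) to estimate the region of visual
# space that feeds each ring neuron. All numbers below are placeholders.

rng = np.random.default_rng(0)

n_columns = 50   # medulla columns, each mapped to a point in visual space
n_metu    = 20   # MeTu neurons
n_tubu    = 10   # TuBu neurons
n_ring    = 8    # ring neurons

# Hypothetical synapse-count matrices (rows = postsynaptic, cols = presynaptic).
W_metu_from_columns = rng.poisson(1.0, size=(n_metu, n_columns))  # medulla -> MeTu
W_tubu_from_metu    = rng.poisson(2.0, size=(n_tubu, n_metu))     # MeTu    -> TuBu
W_ring_from_tubu    = rng.poisson(3.0, size=(n_ring, n_tubu))     # TuBu    -> ring

# Composing the stages gives the effective weight of each medulla column
# onto each ring neuron (shape: n_ring x n_columns).
W_ring_from_columns = W_ring_from_tubu @ W_tubu_from_metu @ W_metu_from_columns

# Hypothetical retinotopic coordinates (azimuth, elevation in degrees) of each column.
column_positions = np.stack([rng.uniform(-90, 90, n_columns),
                             rng.uniform(-45, 45, n_columns)], axis=1)

# Weighted centroid of each ring neuron's inferred visual-field input.
row_sums  = W_ring_from_columns.sum(axis=1, keepdims=True)
weights   = W_ring_from_columns / np.maximum(row_sums, 1)
centroids = weights @ column_positions

for i, (az, el) in enumerate(centroids):
    print(f"ring neuron {i}: estimated visual-field centre ~ ({az:+.1f} deg, {el:+.1f} deg)")
```

In this toy version, the matrix product collapses the three synaptic stages into a single ring-neuron-by-medulla-column weight matrix, and the weighted centroid summarizes each ring neuron's inferred input region in visual coordinates; the real pathway is, of course, neither random nor fully feedforward.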