Cetaceans (whales and dolphins) use acoustic cues to determine the locations and identities of environmental stimuli within their underwater habitats. Dolphins evolved unique auditory systems for spatially differentiating ultrasonic signals, whereas the larger baleen whales appear to have evolved different mechanisms for localizing lower-frequency sound sources. Many of the cues that terrestrial mammals use to localize sounds in air are less well suited for localizing sounds underwater. Nevertheless, cetaceans can localize sounds as well as or better than most terrestrial mammals. Position-dependent spectral filtering likely plays an important role in sound localization by toothed whales, whereas phase differences between the ears may be important for baleen whales. However, it is exceedingly difficult to determine how filtering and phase differences contribute to spatial hearing in whales and dolphins because, in contrast to terrestrial mammals, the structures through which cetaceans receive sounds are completely internalized (and thus not externally visible). Computational models of cetacean auditory processing provide one viable approach to generating testable predictions about the mechanisms cetaceans use to localize and identify sound sources.
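To make the claim about in-air cues transferring poorly to water concrete, the sketch below compares the timing cues available in air versus seawater. It is a minimal illustration under a simple free-field assumption (two point receivers, plane-wave arrival, maximum interaural time difference ITD = d / c), not a model from this work; the receiver separations are hypothetical round numbers chosen only for scale.

```python
import math

# Approximate speeds of sound: air ~343 m/s, seawater ~1500 m/s.
SPEED_AIR = 343.0
SPEED_WATER = 1500.0


def max_itd(receiver_separation_m: float, sound_speed_m_s: float) -> float:
    """Maximum interaural time difference (s) for a plane wave arriving along
    the axis joining two point receivers (simple free-field model, ignoring
    acoustic shadowing and any internal sound-conduction paths)."""
    return receiver_separation_m / sound_speed_m_s


def ipd(frequency_hz: float, itd_s: float) -> float:
    """Interaural phase difference (radians) implied by a given ITD."""
    return 2.0 * math.pi * frequency_hz * itd_s


if __name__ == "__main__":
    # Hypothetical separations: ~0.15 m for a human-sized head,
    # ~0.25 m for a dolphin-sized receiver spacing (illustrative only).
    for label, d in [("human-sized head", 0.15),
                     ("dolphin-sized receiver spacing", 0.25)]:
        itd_air = max_itd(d, SPEED_AIR)
        itd_water = max_itd(d, SPEED_WATER)
        print(f"{label} (d = {d} m):")
        print(f"  max ITD in air:   {itd_air * 1e6:8.1f} microseconds")
        print(f"  max ITD in water: {itd_water * 1e6:8.1f} microseconds")
        # In this simple model, interaural phase becomes ambiguous once the
        # IPD exceeds pi radians, i.e. above roughly c / (2 d).
        print(f"  phase-ambiguity frequency in water: "
              f"{SPEED_WATER / (2 * d):7.0f} Hz")
```

Because sound travels roughly 4.4 times faster in seawater than in air, the same receiver separation yields interaural time differences that are smaller by about the same factor, which is one reason air-adapted timing cues are less useful underwater; how the internalized receiving structures of cetaceans compensate for this is precisely the kind of question the computational models discussed here are intended to address.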