We present a case study demonstrating the usefulness of Bayesian hierarchical mixture modelling for investigating cognitive processes. In sentence comprehension, it is widely assumed that the distance between linguistic co-dependents affects the latency of dependency resolution: the longer the distance, the longer the retrieval time (the distance-based account). An alternative theory, the direct-access account, assumes that retrieval times are a mixture of two distributions: one distribution represents successful retrievals (these are independent of dependency distance), and the other represents an initial failure to retrieve the correct dependent, followed by a reanalysis that leads to successful retrieval. We implement both models as Bayesian hierarchical models and show that the direct-access model explains Chinese relative clause reading time data better than the distance-based account.
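As an illustrative sketch only (not the paper's exact specification), the two-component structure implied by the direct-access account can be written as a mixture over reading times, here assuming lognormal components; the symbols are hypothetical: $\theta$ is the probability of an initial retrieval failure followed by reanalysis, $\delta > 0$ is the additional latency that reanalysis incurs, and $\mu$, $\sigma$ are the location and scale of the successful-retrieval distribution (which, in a hierarchical model, would themselves vary by participant and item):

\[
\mathrm{RT} \;\sim\; \theta \,\mathrm{LogNormal}\!\left(\mu + \delta,\ \sigma^{2}\right) \;+\; \left(1 - \theta\right)\mathrm{LogNormal}\!\left(\mu,\ \sigma^{2}\right), \qquad \delta > 0 .
\]

Under this sketch, the distance-based account corresponds to the special case in which there is a single distribution whose location shifts with dependency distance, whereas the mixture allows distance to be irrelevant for the successful-retrieval component.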