In natural languages, closed-class items predict open-class items but not the other way around. For example, in English, if there is a determiner there will be a noun, but nouns can occur with or without determiners. Here, we asked whether language learners' computations are also asymmetrical. In three experiments we exposed adults to a miniature language with the one-way dependency "if X then Y": if X was present, Y was also present, but Y could occur without X. We created different versions of the language in order to ask whether learning depended on which of these categories was an open or closed class. In one condition, X was a closed class and Y was an open class; in a contrasting condition, X was an open class and Y was a closed class. Learning was significantly better with closed-class X, even though learners' exposure was otherwise identical. Additional experiments demonstrated that the perceptual distinctiveness of closed-class items drives learners to analyze them differently; and, crucially, that the primary determinant of learning is the mathematical relationship between closed- and open-class items and not their linear order. These results suggest that learners privilege computations in which closed-class items are predictive of, rather than predicted by, open-class items. We suggest that the distributional asymmetries of closed-class items in natural languages may arise in part from this learning bias.