Most statistical learning studies focus on the learning of transitional probabilities between adjacent elements in a sequence; however, other statistical regularities may underpin different aspects of processing language and regularities in other domains. Here, we investigate how conjunctive statistical regularities (of the form A and B together predict C) can be learned, and how this learning is impacted by similarity in representations analogous to that in unambiguous words, homonyms with multiple unrelated meanings, and polysemes with multiple related meanings. We observed that, provided the stimulus structure is relatively simple, participants are readily able to learn conjunctive probabilities and display sensitivity to relatedness among representations. These results open new theoretical possibilities for exploring the domain-generality of how the learning and processing systems merge conjunctive information in simple laboratory tasks and in natural language.