Human understanding of relations between objects depends on the ability to code meaningful role bindings. Computational models of relational reasoning have proposed that neural oscillations provide a basic mechanism enabling working memory to code the bindings of objects into relational roles. We adapted a behavioral oscillation paradigm to investigate moment-to-moment changes in representations of semantic roles. On each trial, a picture was presented showing an action (chasing) relating two animals, one animal playing an agent role (chaser) and the other playing a patient role (chased). After the picture disappeared, the inter-stimulus interval (ISI) was varied in densely sampled increments and was followed by a verbal probe indicating an animal in a role. Reaction time (RT) to decide the validity of the verbal probe was recorded. We found that RTs varied systematically with ISI in an oscillatory fashion. A task that required memory for a relational role evoked stronger theta- and alpha-band oscillations than did a memory task not involving relational roles. The behavioral oscillation patterns in the role-identification task revealed a phase shift between the two semantic roles in the alpha band.
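
To make the analysis logic concrete, the sketch below illustrates how a densely sampled RT-by-ISI time course could be detrended and spectrally decomposed to estimate theta- and alpha-band behavioral oscillations and an alpha-band phase. It is a minimal illustration, not the authors' analysis code: the ISI step, band edges, and simulated data are assumptions introduced here for demonstration.

```python
# Illustrative sketch of a behavioral-oscillation analysis of RT vs. ISI.
# The sampling step, band definitions, and simulated data are assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Assumed design: ISIs sampled every 40 ms from 0.2 to 1.16 s (25 steps).
isi = np.arange(0.2, 1.2, 0.04)                 # seconds
fs = 1.0 / 0.04                                 # effective sampling rate, 25 Hz

# Simulated mean RTs: slow baseline plus a 10 Hz (alpha-band) ripple and noise.
rt = (0.55
      + 0.02 * np.cos(2 * np.pi * 10 * isi)
      + 0.005 * rng.standard_normal(isi.size))

# 1) Remove the slow trend so only fluctuations around it remain.
trend = np.polyval(np.polyfit(isi, rt, 2), isi)
detrended = rt - trend

# 2) Window and Fourier-transform the detrended RT time course.
windowed = detrended * np.hanning(detrended.size)
spectrum = np.fft.rfft(windowed)
freqs = np.fft.rfftfreq(detrended.size, d=1.0 / fs)
power = np.abs(spectrum) ** 2

# 3) Summarize power in conventional theta (4-7 Hz) and alpha (8-12 Hz) bands.
theta_power = power[(freqs >= 4) & (freqs <= 7)].mean()
alpha_power = power[(freqs >= 8) & (freqs <= 12)].mean()

# 4) The phase of the alpha-band peak; comparing this angle across conditions
#    (e.g., agent vs. patient probes) indexes a phase shift between them.
alpha_bins = np.where((freqs >= 8) & (freqs <= 12))[0]
peak_bin = alpha_bins[np.argmax(power[alpha_bins])]
alpha_phase = np.angle(spectrum[peak_bin])

print(f"theta power: {theta_power:.4g}, alpha power: {alpha_power:.4g}")
print(f"alpha phase (rad): {alpha_phase:.3f}")
```

In practice, each condition (agent probe vs. patient probe) would yield its own RT-by-ISI curve; running the same decomposition on both and comparing the alpha-band angles is one straightforward way to quantify the phase shift described above.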