The posterior parietal cortex of human and non-human primates has been implicated in sensorimotor transformations, whereby sensory input (vision, touch, sound) is converted into motor commands. For instance, visual information about a target can be used to guide reaching movements towards the relevant part of space. These sensorimotor and multisensory representations suggest that the role of the posterior parietal lobe is to implement perception for action, rather than to create a passive, purely perceptual representation of space. This dissertation consists of three studies that map such multimodal and sensorimotor representations in the human brain using functional magnetic resonance imaging (fMRI). The first study shows that observation, imagery, and execution of reaching rely on similar neural substrates in the posterior parietal lobe, specifically involving more superior (dorsal) areas. These 'mirror neuron' activations are in agreement with the macaque parieto-frontal circuits underlying reaching movements, and differ from the more ventral (inferior) activations observed for grasping or for hand-object interactions. This suggests that (visuomotor) mirror neurons are specific to the hand action that is executed (reaching versus grasping). The second study attempts to dissociate visual and proprioceptive feedback from the reaching hand, in order to identify visuomotor and proprioceptive-motor areas involved in the control of reaching. Reaching without visual feedback from the moving hand involves more anterior and medial parietal areas than reaching with visual feedback. This indicates a posterior-to-anterior organization within posterior parietal cortex for sensorimotor representations that use visual versus somatosensory input, respectively. Finally, the third study investigates tactile and visual representations of target location. Identifying target location using exploratory hand movements in the absence of visual input activates similar intraparietal and superior parietal areas as does visual identification of spatial location. The common activation by visual and tactile input suggests multisensory processing in these areas.