Action Perception Laboratory


Multimodal action perception

We not only watch human actions but also make sense of them by listening to the sounds they make. Our earlier neurophysiology research showed that neurons in the non-human primate Superior Temporal Sulcus integrate the sound of an action with the sight of that action: we observed supra-additive increases in the visual response when the matching action sound was added, an integration that could help these cells code multimodal actions more effectively. At the same time, collaborators in Parma were finding that mirror neurons, which fire both during the execution of an action and during its visual observation, also respond to the sound of that action. Together, these results suggested that actions may be coded by a multisensory neural network linking the temporal cortex, through the inferior parietal cortex, to premotor areas traditionally thought to be involved in speech production.
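Here "supra-additive" carries its standard meaning in the multisensory literature: a response is supra-additive when the bimodal response exceeds the sum of the two unimodal responses,

R(AV) > R(A) + R(V),

where R(A), R(V), and R(AV) denote the neural responses to the sound alone, the sight alone, and the combined audiovisual stimulus.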

We are currently testing how humans integrate the sight and sound of actions, using psychophysical action-sound detection paradigms, crossmodal adaptation paradigms, and Transcranial Magnetic Stimulation (TMS) techniques. Questions under investigation include:

1. Are audiovisual actions integrated in the same way as audiovisual speech?
2. Does the Mirror Neuron System play a role in audiovisual action integration?
3. Which characteristics of action sights and sounds determine whether they are integrated and perceived as a unitary experience?
