15th International Congress of Phonetic Sciences (ICPhS-15)
Although the impact that visual information can have on speech perception is well known, we do not yet have an adequate description of the neural mechanisms involved. We asked subjects to identify consonants produced by a speaker they both saw and heard while we acquired functional magnetic resonance imaging (fMRI) scans of their brains. In one experimental condition the acoustics were synchronous with the visual image of the speaker's face movements; in another they were delayed by 250 ms. Relative to unimodal control conditions, we found more extensive enhanced activity in the superior temporal gyrus and sulcus (STG/STS), bilaterally, when the audiovisual stimuli were synchronous than when the sound was delayed. When we directly compared these two experimental conditions, we found more activity in the right premotor cortex and inferior parietal lobule (IPL) when the acoustics were delayed. The results indicate that polymodal regions of the STS and IPL play important but different roles in audiovisual speech perception.
Bibliographic reference. Jones, Jeffery A. / Callan, Daniel E. (2003): "Brain activation when acoustic information is delayed during an audiovisual speech task", in ICPhS-15, 2209-2212.