Sensitivity of occipito-temporal cortex, premotor and Broca’s areas to visible speech gestures in a familiar language

Vincenzo Maffei, Iole Indovina, Elisabetta Mazzarella, Maria Assunta Giusti, Emiliano Macaluso, Francesco Lacquaniti, Paolo Viviani

Research output: Contribution to journal › Article › peer-review


When looking at a speaking person, the analysis of facial kinematics contributes to language discrimination and to the decoding of the time flow of visual speech. To disentangle these two factors, we investigated behavioural and fMRI responses to familiar and unfamiliar languages when observing speech gestures with natural or reversed kinematics. Twenty Italian volunteers viewed silent video-clips of speech shown as recorded (Forward, biological motion) or reversed in time (Backward, non-biological motion), in Italian (familiar language) or Arabic (non-familiar language). fMRI revealed that language (Italian/Arabic) and time-rendering (Forward/Backward) modulated distinct areas in the ventral occipito-temporal cortex, suggesting that visual speech analysis begins in this region, earlier than previously thought. Left ventral premotor (superior subdivision) and dorsal premotor areas were preferentially activated by the familiar language independently of time-rendering, challenging the view that the role of these regions in speech processing is purely articulatory. The left ventral premotor region in the frontal operculum, thought to include part of Broca’s area, responded to the natural familiar language, consistent with the hypothesis of motor simulation of speech gestures.

Original language: English
Article number: e0234695
Journal: PLoS One
Issue number: 6
Publication status: Published - Jun 2020

ASJC Scopus subject areas

  • Biochemistry, Genetics and Molecular Biology (all)
  • Agricultural and Biological Sciences (all)
  • General
