Human brain responses to non-verbal audiovisual dynamic stimuli

  • Aina Puce

Abstract

When interacting with others we focus on the spoken word, but we also ‘read’ non-verbal cues from the face and voice. Our recordings of the electrical activity of the human brain (event-related potentials, or ERPs) indicate that these audiovisual cues are integrated as early as 140 msec post-stimulus (sensory ERP components: auditory N140 and visual N170). When multisensory inputs were congruent, and potentially redundant, N170 lost its usual category sensitivity, whereas auditory N140 showed selectivity to primate and human vocalizations. A late ERP, P400, was significantly larger when a human face was paired with an incongruent sound. This cross-modal incongruity effect is similar to a previously described auditory (physical) incongruity potential (McCallum et al. 1984). When audiovisual and unisensory stimulation were contrasted, auditory N140 and visual N170 exhibited underadditivity, whereas later ERPs showed more complex effects. Our data indicate that the human brain possesses specialized circuitry for rapidly processing information from the face and voice.
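
On the standard additive-model reading used in multisensory ERP studies (an interpretive assumption, since the abstract does not define the term), ‘underadditivity’ means that the response evoked by the combined audiovisual stimulus is smaller than the sum of the responses evoked by the auditory and visual stimuli presented alone:

\[ \mathrm{ERP}_{AV}(t) < \mathrm{ERP}_{A}(t) + \mathrm{ERP}_{V}(t) \]

evaluated over the latency windows of the N140 and N170 components.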