AN ANALYSIS OF PERCEPTUAL DEPENDENCIES IN AUDIOVISUAL SPEECH PERCEPTION: A NEW APPROACH

  • Nicholas Altieri
  • Noah Silbert
  • James T. Townsend

Abstract

Ecological speech signals consist of both auditory and visual information. Determining whether the dimensions of perception, including the auditory and visual components of speech, are combined independently (e.g., Garner & Morton, 1969) is an important problem in cognitive psychology. To investigate whether perceptual dependencies occur in congruent and incongruent audiovisual speech, we implemented the statistical methodology of General Recognition Theory (GRT; Ashby & Townsend, 1986), a multidimensional extension of signal detection theory. We carried out an identification experiment where the auditorily and visually articulated syllables /be/ and /ge/ were combined in a 2 x 2 design to yield four stimulus categories: (A_V) /be_be/, /be_ge/, /ge_be/, and /ge_ge/. The stimuli /be_ge/ and /ge_be/ elicit the classic McGurk fusions of de and bge, respectively. Results obtained from model fitting suggest that the auditory and visual components of speech are generally perceived independently, although dependencies can arise with the presentation of incongruent stimuli. Marginal d′s and decision criteria also differ as a function of stimulus level.
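To make the GRT framework concrete, the sketch below is a minimal illustration, not the fitted model reported in the paper: it assumes unit-variance bivariate Gaussian perceptual distributions with hypothetical means for the four A_V categories, applies independent decision criteria on the auditory and visual dimensions, and tabulates the 4 x 4 identification confusion matrix that GRT analyses are fit to. Perceptual independence for a stimulus corresponds here to a within-stimulus correlation of zero, so its joint density factors into the product of its marginals.

```python
import numpy as np

# Hedged GRT-style sketch: Gaussian perceptual distributions with
# illustrative (hypothetical) means, not parameters estimated from data.
rng = np.random.default_rng(0)

# Mean vectors for the four stimulus categories on the (auditory, visual)
# perceptual plane: /be_be/, /be_ge/, /ge_be/, /ge_ge/.
means = {
    "be_be": (0.0, 0.0),
    "be_ge": (0.0, 1.5),
    "ge_be": (1.5, 0.0),
    "ge_ge": (1.5, 1.5),
}
rho = 0.0  # perceptual independence <=> within-stimulus correlation of 0

def simulate_percepts(stim, n=10_000):
    """Draw bivariate Gaussian percepts for one stimulus category."""
    mu = np.array(means[stim])
    cov = np.array([[1.0, rho], [rho, 1.0]])
    return rng.multivariate_normal(mu, cov, size=n)

# Marginal d' on the auditory dimension (visual level fixed at /be/):
# with unit variances this is just the difference of the auditory means.
d_prime_auditory = means["ge_be"][0] - means["be_be"][0]

# Simple decision rule: one criterion per dimension, carving the plane
# into four response regions corresponding to the four categories.
crit_a, crit_v = 0.75, 0.75

def classify(percepts):
    a, v = percepts[:, 0], percepts[:, 1]
    return np.array(["%s_%s" % ("ge" if ai > crit_a else "be",
                                "ge" if vi > crit_v else "be")
                     for ai, vi in zip(a, v)])

# Build the 4 x 4 stimulus-by-response confusion matrix GRT models are fit to.
labels = list(means)
confusion = np.zeros((4, 4))
for i, stim in enumerate(labels):
    resp = classify(simulate_percepts(stim))
    for j, lab in enumerate(labels):
        confusion[i, j] = np.mean(resp == lab)

print("marginal auditory d' (visual = /be/):", d_prime_auditory)
print("rows = stimuli, cols = responses:", labels)
print(np.round(confusion, 3))
```

Under these assumptions, setting rho to a nonzero value for the incongruent stimuli (/be_ge/ and /ge_be/) is one way to mimic the kind of perceptual dependency the abstract describes, while leaving the congruent stimuli independent.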