Neural Correlates of Auditory-Visual Speech Perception in Noise
Gilbert, Jaimie
Permalink
https://hdl.handle.net/2142/85225
Description
Title
Neural Correlates of Auditory-Visual Speech Perception in Noise
Author(s)
Gilbert, Jaimie
Issue Date
2009
Doctoral Committee Chair(s)
Charissa Lansing
Department of Study
Speech and Hearing Science
Discipline
Speech and Hearing Science
Degree Granting Institution
University of Illinois at Urbana-Champaign
Degree Name
Ph.D.
Degree Level
Dissertation
Keyword(s)
Health Sciences, Speech Pathology
Language
eng
Abstract
Speech perception in noise may be facilitated by presenting the concurrent optic stimulus of observable speech gestures. Objective measures such as event-related potentials (ERPs) are crucial to understanding the processes underlying facilitation of auditory-visual speech perception. Previous research has demonstrated that in quiet acoustic conditions auditory-visual speech perception occurs faster (decreased latency) and with less neural activity (decreased amplitude) than auditory-only speech perception. These empirical observations support the construct of auditory-visual neural facilitation. Auditory-visual facilitation was quantified with response time and accuracy measures and with the N1/P2 ERP waveform response as a function of changes in audibility (manipulation of the acoustic environment by testing a range of signal-to-noise ratios) and content of the optic cue (manipulation of the types of cues available, e.g., speech, non-speech static, or non-speech dynamic cues). Experiment 1 (Response Time Measures) evaluated participant responses in a speeded-response task investigating the effects of both audibility and type of optic cue. Results revealed better accuracy and faster response times with visible speech gestures than with any non-speech cue. Experiment 2 (Audibility) investigated the influence of audibility on auditory-visual facilitation in response time measures and the N1/P2 response. ERP measures showed effects of reduced audibility (increased latency, decreased amplitude) for both types of facial motion, i.e., speech and non-speech dynamic facial optic cues, compared to measures in quiet conditions. Experiment 3 (Optic Cues) evaluated the influence of the type of optic cue on auditory-visual facilitation with response time measures and the N1/P2 response. N1 latency was shorter with both types of facial motion tested in this experiment, but N1 amplitude was decreased only with concurrent presentation of auditory and visual speech.
The N1 ERP results of these experiments reveal that audibility alone does not explain auditory-visual facilitation in noise. The decreased N1 amplitude associated with the visible speech gesture and concurrent auditory speech suggests that processing of the visible speech gesture either stimulates N1 generators or interacts with processing in those generators. A likely generator of the N1 response is the auditory cortex, which matures differently without auditory stimulation during a critical period. The impact of auditory-visual integration deprivation on neural development and on the ability to make use of optic cues must also be investigated. Further scientific understanding of any maturational differences, or of differences in processing due to auditory-visual integration deprivation, is needed to promote utilization of auditory-visual facilitation of speech perception for individuals with auditory impairment. Research and (re)habilitation therapies for speech perception in noise should continue to emphasize the benefit of associating and integrating auditory and visual speech cues.