The Top-Down and Bottom-Up Processing of Ongoing Speech by Aphasic Subjects
Beck, Ann Elizabeth Richardson
Permalink
https://hdl.handle.net/2142/67540
Description
Title
The Top-Down and Bottom-Up Processing of Ongoing Speech by Aphasic Subjects
Author(s)
Beck, Ann Elizabeth Richardson
Issue Date
1981
Department of Study
Speech and Hearing Science
Discipline
Speech and Hearing Science
Degree Granting Institution
University of Illinois at Urbana-Champaign
Degree Name
Ph.D.
Degree Level
Dissertation
Keyword(s)
Health Sciences, Speech Pathology
Language
eng
Abstract
The primary purpose of this study was to examine the manner in which aphasic subjects utilized top-down and bottom-up levels of processing when listening to ongoing speech. In individual sessions, 30 subjects (10 aphasic, 10 normal, and 10 brain-damaged controls) heard 288 test sentences, each containing one member of one of 12 phonemically similar word pairs in one of three semantic contexts: congruent, neutral (unbiased), or noncongruent. Immediately after hearing a sentence, the subject saw a written representation of a word pair, one member of which had been in the sentence just heard; the subject's task was to indicate which word he thought had been in the sentence. Measures of latency and accuracy were obtained and analyzed with essentially parallel, nested factorial designs whose factors were condition, position, group, and subject within group, the last being the random variable. The data for each subject group on each measure were also analyzed with a simple three-way factorial (condition x position x subject). For accuracy scores, the variate analyzed was the arcsine transform of the percent error; for latency scores, it was the latency in msec of correct responses. Results indicated that normal subjects responded faster than brain-damaged control subjects but with a similar level of accuracy, while aphasic subjects were slower and less accurate than either control group. All subject groups responded in a similar manner to the semantic conditions: responses in the congruent and unbiased contexts were equal in accuracy and latency, while responses in the noncongruent context were less accurate and slower than in either of the other contexts. Both control groups demonstrated a floor effect on accuracy from the unbiased to the congruent condition; aphasic subjects did not. Inspection of the raw error data indicated that combining the data from all aphasic subjects masked certain response differences, so the aphasic subjects were divided into two subgroups (n = 5 each), one high comprehending and one low comprehending. The accuracy data from these subgroups were analyzed with a procedure parallel to that used for the subject groups overall. This analysis indicated that high comprehending aphasic subjects made fewer errors in the congruent than in the unbiased condition, whereas low comprehending aphasic subjects made a consistently high number of errors regardless of context. Aphasic subjects who comprehended speech adequately thus appeared to have slightly less accurate bottom-up processing of speech than controls, but an interactive strategy similar to that of control subjects. Low comprehending aphasic subjects, however, appeared to have decided deficits at both levels of processing, with the greater impairment occurring in their top-down processing of speech. Implications for normal strategies of processing ongoing speech and for aphasia therapy are also discussed.
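Note: the abstract does not spell out the form of the arcsine transform it mentions. Assuming the standard variance-stabilizing transform applied to the proportion of errors p (percent error divided by 100), it would be

y = \arcsin\!\left(\sqrt{p}\right), \qquad 0 \le p \le 1,

which stabilizes the variance of binomial proportions so that analysis-of-variance assumptions are better satisfied.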