Differential item functioning analyses of the biology subject test of the College Entrance Examination of Taiwan
Wang, Chen-Shih
Permalink
https://hdl.handle.net/2142/23163
Description
Title
Differential item functioning analyses of the biology subject test of the College Entrance Examination of Taiwan
Author(s)
Wang, Chen-Shih
Issue Date
1995
Doctoral Committee Chair(s)
Ackerman, Terry A.
Department of Study
Education
Discipline
Education
Degree Granting Institution
University of Illinois at Urbana-Champaign
Degree Name
Ph.D.
Degree Level
Dissertation
Keyword(s)
Education, Tests and Measurements
Education, Educational Psychology
Education, Social Sciences
Language
eng
Abstract
The College Entrance Examination of Taiwan (CEET) is a college-level examination program administered by the College Entrance Examination Center in Taiwan. The general purpose of the CEET program is to help colleges and universities in Taiwan select candidates for their freshman classes. This study investigated differential item functioning (DIF) in the Biology tests of the CEET.
The general purposes of this study were: (a) to assess DIF longitudinally, (b) to investigate the effects of two different scoring rubrics--the partial-credit-with-penalty and the trichotomous scoring rubrics--on DIF, and (c) to examine the relationship between dimensionality and DIF.
The 1991, 1992, and 1993 Biology tests of the CEET were used, with only single-answer multiple-choice (SMC) and multiple-answer multiple-choice (MMC) items included. The polytomous SIBTEST procedure was applied for the first two purposes, whereas Poly-DIMTEST, accompanied by hierarchical cluster analysis, was employed for the third.
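To make the scoring and DIF ideas above concrete, the following Python sketch illustrates (i) two hypothetical rubrics for scoring an MMC item and (ii) a simplified, stratified DIF index in the spirit of SIBTEST. The penalty weight, the trichotomous score points, the function names, and the simulated data are all illustrative assumptions; the study itself used the polytomous SIBTEST procedure (with its regression correction) and Poly-DIMTEST, not this code.

```python
# Illustrative sketch only; NOT the author's procedure. The rubric weights,
# cut-points, and data below are hypothetical assumptions.
import numpy as np

def score_partial_credit_with_penalty(selected, keyed, penalty=0.25):
    """Hypothetical partial-credit-with-penalty rubric for an MMC item:
    proportional credit for keyed options selected, minus a penalty for
    each non-keyed option selected, floored at zero."""
    hits = len(selected & keyed)
    false_alarms = len(selected - keyed)
    return max(hits / len(keyed) - penalty * false_alarms, 0.0)

def score_trichotomous(selected, keyed):
    """Hypothetical trichotomous rubric: 2 = exactly the keyed set,
    1 = a nonempty subset of the keyed set, 0 = otherwise."""
    if selected == keyed:
        return 2
    if selected and selected <= keyed:
        return 1
    return 0

def simple_stratified_dif(item_scores, group, matching_total, n_strata=10):
    """Simplified SIBTEST-flavoured DIF index: stratify examinees on a
    matching (total) score and accumulate the focal-group-weighted
    difference in mean item scores between the reference (group == 0)
    and focal (group == 1) groups. SIBTEST's regression correction is
    omitted here for brevity."""
    edges = np.quantile(matching_total, np.linspace(0, 1, n_strata + 1))
    strata = np.clip(np.searchsorted(edges, matching_total, side="right") - 1,
                     0, n_strata - 1)
    beta = 0.0
    n_focal_total = np.sum(group == 1)
    for k in range(n_strata):
        in_k = strata == k
        ref, foc = in_k & (group == 0), in_k & (group == 1)
        if ref.any() and foc.any():
            weight = foc.sum() / n_focal_total
            beta += weight * (item_scores[ref].mean() - item_scores[foc].mean())
    return beta  # positive values favour the reference group

# Purely illustrative usage with a made-up MMC key and simulated examinees.
keyed = {"A", "C"}
print(score_partial_credit_with_penalty({"A", "B"}, keyed),
      score_trichotomous({"A"}, keyed))

rng = np.random.default_rng(0)
n = 2000
group = rng.integers(0, 2, n)                    # 0 = reference, 1 = focal
matching_total = rng.normal(50, 10, n)           # simulated total scores
item_scores = np.clip(matching_total / 100 + rng.normal(0, 0.2, n), 0, 1)
print(simple_stratified_dif(item_scores, group, matching_total))
```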
Major findings were as follows: (a) almost half of the items displayed DIF in each year's data; however, the numbers of items favoring males and favoring females did not indicate that female examinees were more disadvantaged than males; (b) although the total number of DIF-flagged items appeared to decrease from year to year, the amount of DIF did not differ significantly across the three consecutive years; (c) the two scoring rubrics had significantly different effects on the amount of DIF displayed in each of the three years' data; (d) the 1991 complete test (including both SMC and MMC items) was assessed as unidimensional, whereas the 1992 and 1993 complete tests tended to be multidimensional; (e) the MMC-only data yielded significantly different dimensionality results under the two scoring rubrics.
Based on these results, the major conclusions were: (a) a content specification system should be developed to monitor DIF items over the years, identify the causes of DIF, and reduce the number of DIF items; (b) future research should examine DIF under different scoring rubrics using different DIF detection methods, and explore why some items displayed DIF under one scoring rubric but not the other; (c) the factors producing multidimensionality, and the major valid skill(s) measured under a specific scoring rubric, need to be explored; (d) dimensionality assessment procedures need to be developed to investigate the number of dimensions and the amount of multidimensionality in a test composed of polytomously scored items.